WorldWideScience

Sample records for preliminary tests probability

  1. On Preliminary Test Estimator for Median

    OpenAIRE

    Okazaki, Takeo; 岡崎, 威生

    1990-01-01

The purpose of the present paper is to discuss estimation of the median with a preliminary test. Two procedures are presented: one uses the Median test and the other uses the Wilcoxon two-sample test for the preliminary test. Sections 3 and 4 give mathematical formulations of such properties, including mean square errors for one specified case. Section 5 discusses the optimal significance levels of the preliminary test and proposes numerical values for them by the Monte Carlo method. In addition to mea...

  2. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

This paper presents a simplified version of the Freeman-Tukey test statistic for testing hypotheses about multinomial probabilities in one-, two- and multi-dimensional contingency tables that does not require calculating the expected cell frequencies before the test of significance. The simplified method established new criteria of ...

  3. Preliminary results of steel containment vessel model test

    International Nuclear Information System (INIS)

    Matsumoto, T.; Komine, K.; Arai, S.

    1997-01-01

A high pressure test of a mixed-scale model (1:10 in geometry and 1:4 in shell thickness) of a steel containment vessel (SCV), representing an improved boiling water reactor (BWR) Mark II containment, was conducted on December 11-12, 1996 at Sandia National Laboratories. This paper describes the preliminary results of the high pressure test. In addition, the preliminary post-test measurement data and a preliminary comparison of test data with pretest analysis predictions are also presented.

  4. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
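
    The probability ratio test described in this abstract can be sketched with a generic Wald SPRT. The decision thresholds below follow Wald's classical approximations; the Bernoulli observation model and every numerical value are illustrative assumptions, not parameters from the paper.

```python
import math
import random

def wald_sprt(samples, llr, alpha=0.01, beta=0.01):
    """Generic Wald sequential probability ratio test.

    Accumulates log-likelihood ratios and stops as soon as a boundary
    is crossed; alpha/beta are the targeted false alarm and missed
    detection rates (Wald's approximate thresholds)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    s, n = 0.0, 0
    for x in samples:
        n += 1
        s += llr(x)
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "undecided", n

# Assumed toy model: each screening is a Bernoulli indicator of a
# dangerously close approach, with base risk p0 (H0) or elevated
# risk p1 (H1).
p0, p1 = 0.001, 0.1

def llr(x):
    return math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))

rng = random.Random(1)
decisions = [wald_sprt((rng.random() < p1 for _ in range(10000)), llr)[0]
             for _ in range(200)]
```

    Because the thresholds depend only on alpha and beta, the error tolerances are respected regardless of how the underlying probability density is computed, which mirrors the independence property claimed in the abstract.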

  5. Test-retest reliability of the Middlesex Assessment of Mental State (MEAMS): a preliminary investigation in people with probable dementia.

    Science.gov (United States)

    Powell, T; Brooker, D J; Papadopolous, A

    1993-05-01

    Relative and absolute test-retest reliability of the MEAMS was examined in 12 subjects with probable dementia and 12 matched controls. Relative reliability was good. Measures of absolute reliability showed scores changing by up to 3 points over an interval of a week. A version effect was found to be in evidence.

  6. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

Full Text Available A generalized modified method is proposed to control the sum of the error probabilities in the sequential probability ratio test. The method minimizes the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed sample size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoullian law.

  7. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field, which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods that account for this type of uncertainty are usually computationally intensive and impractical for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, in parallel, and in any other user-defined structure. A case study illustrates an application in test case prioritization.
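
    The "equivalent failures and tests" idea can be illustrated on a deliberately tiny example. The two test categories and all counts below are assumptions for this sketch (a two-component series system), far simpler than the paper's general bipartite-graph model.

```python
def em_masked_series(nA, fA, nB, fB, iters=500):
    """Toy EM for masked binomial test data.

    Category A tests exercise component 1 alone (fA failures in nA
    tests). Category B tests exercise components 1 and 2 in series
    (fB failures in nB tests), and a category-B failure does not
    reveal which component failed -- the masked case. The E-step
    allocates each masked failure to the components in proportion to
    their current failure probabilities (the 'equivalent failures');
    the M-step re-estimates each component's probability."""
    p1, p2 = 0.5, 0.5                       # arbitrary starting guesses
    for _ in range(iters):
        pB = 1 - (1 - p1) * (1 - p2)        # P(series system fails)
        e1 = fB * p1 / pB                   # expected comp-1 failures in B
        e2 = fB * p2 / pB                   # expected comp-2 failures in B
        p1 = (fA + e1) / (nA + nB)          # comp 1 exercised in all tests
        p2 = e2 / nB                        # comp 2 exercised only in B
    return p1, p2

# Counts consistent with both components failing ~10% of the time:
# 190/1000 = P(series fails) = 1 - 0.9 * 0.9.
p1_hat, p2_hat = em_masked_series(nA=1000, fA=100, nB=1000, fB=190)
```

    With these counts the algorithm recovers a failure probability of about 0.1 for each component, even though category-B outcomes never identify the failed component.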

  8. A Preliminary Analysis of Reactor Performance Test (LOEP) for a Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonil; Park, Su-Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

The final phase of commissioning is the reactor performance test, which proves the integrated performance and safety of the research reactor at full power with fuel loaded: neutron power calibration, Control Absorber Rod/Second Shutdown Rod drop time, InC function tests, criticality, rod worth, core heat removal by natural circulation, and so forth. The last test is a safety-related one intended to assure that the safety analysis of the research reactor retains sufficient margin, by showing that the reactor satisfies the acceptance criteria for safety functions such as reactivity control, maintenance of auxiliaries, reactor pool water inventory control, core heat removal, and confinement isolation. Finally, fuel integrity is ensured by verifying that there is no meaningful change in radiation levels. To confirm the performance of safety equipment, loss of normal electric power (LOEP), possibly categorized as an Anticipated Operational Occurrence (AOO), is selected as a key experiment for establishing how safe the research reactor is before it is turned over to the owner. This paper presents a preliminary analysis of the reactor performance test (LOEP) for a research reactor. The results show how the transient differs between a conservative estimate and a best estimate. Preliminary analyses have shown all probable thermal-hydraulic transient behavior of importance, including the opening of the flap valve, the minimum critical heat flux ratio, the change of flow direction, and important values of thermal-hydraulic parameters.

  9. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
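
    For reference, the literature-derived benchmark the residents were measured against is a direct application of Bayes' theorem in likelihood-ratio form. The pre-test probability, sensitivity, and specificity below are illustrative numbers, not values from the study's vignettes.

```python
def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Convert a pre-test probability into a post-test probability
    using the likelihood-ratio form of Bayes' theorem."""
    pretest_odds = pretest / (1 - pretest)
    if positive:
        lr = sensitivity / (1 - specificity)       # LR+
    else:
        lr = (1 - sensitivity) / specificity       # LR-
    posttest_odds = pretest_odds * lr
    return posttest_odds / (1 + posttest_odds)

# Assumed example: 20% pre-test probability, a test with 90% sensitivity
# and 80% specificity.
p_pos = post_test_probability(0.20, 0.90, 0.80, positive=True)   # ~0.53
p_neg = post_test_probability(0.20, 0.90, 0.80, positive=False)  # ~0.03
```

    The pattern the study reports corresponds to estimates drifting toward the middle: above the Bayesian value when the calculation yields a low probability, and below it when the calculation yields a high one.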

  10. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output

  11. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  12. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.

  13. Comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1996-07-01

In this paper the classical sequential probability ratio testing method (SPRT) is reconsidered. Every individual boundary crossing event of the SPRT is regarded as a new piece of evidence about the problem under hypothesis testing. The Bayes method is applied for belief updating, i.e. for integrating these individual decisions. The procedure is recommended when the user (1) would like to be informed about the tested hypothesis continuously and (2) would like to reach the final conclusion with a high confidence level. (Author).
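
    A minimal sketch of this belief-updating scheme, assuming each completed SPRT run is a binary decision with known error rates (the prior and the rates below are illustrative, not taken from the paper):

```python
def update_belief(prior, decisions, alpha=0.05, beta=0.05):
    """Integrate individual SPRT decisions by repeated Bayes updates.

    Each boundary crossing is treated as a noisy binary observation of
    the true hypothesis: an SPRT accepts H1 with probability 1 - beta
    when H1 is true and with probability alpha when H0 is true."""
    belief = prior                           # current P(H1)
    for accepted_h1 in decisions:
        if accepted_h1:
            num = (1 - beta) * belief
            den = alpha * (1 - belief)
        else:
            num = beta * belief
            den = (1 - alpha) * (1 - belief)
        belief = num / (num + den)
    return belief

b_three = update_belief(0.5, [True, True, True])   # belief rises sharply
b_split = update_belief(0.5, [True, False])        # evidence cancels out
```

    Repeating the SPRT and folding each verdict into the belief provides both the continuous reporting and the high final confidence that the procedure aims for.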

  14. Bayesian noninferiority test for 2 binomial probabilities as the extension of Fisher exact test.

    Science.gov (United States)

    Doi, Masaaki; Takahashi, Fumihiro; Kawasaki, Yohei

    2017-12-30

Noninferiority trials have recently gained importance for the clinical trials of drugs and medical devices. In these trials, most statistical methods have been used from a frequentist perspective, and historical data have been used only for the specification of the noninferiority margin Δ>0. In contrast, Bayesian methods, which have been studied recently, are advantageous in that they can use historical data to specify prior distributions and are expected to enable more efficient decision making than frequentist methods by borrowing information from historical trials. In the case of noninferiority trials for response probabilities π1, π2, Bayesian methods evaluate the posterior probability of H1: π1 > π2 − Δ being true. To numerically calculate such a posterior probability, the complicated Appell hypergeometric function or approximation methods are used. Further, the theoretical relationship between Bayesian and frequentist methods is unclear. In this work, we give the exact expression of the posterior probability of noninferiority under some mild conditions and propose a Bayesian noninferiority test framework which can flexibly incorporate historical data by using the conditional power prior. Further, we show the relationship between the Bayesian posterior probability and the P value of the Fisher exact test. From this relationship, our method can be interpreted as a Bayesian noninferiority extension of the Fisher exact test, and we can treat superiority and noninferiority in the same framework. Our method is illustrated through Monte Carlo simulations to evaluate the operating characteristics, an application to real HIV clinical trial data, and a sample size calculation using historical data. Copyright © 2017 John Wiley & Sons, Ltd.
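
    The posterior probability in question can be approximated by straightforward Monte Carlo sampling from independent Beta posteriors. This sketch assumes uniform Beta(1, 1) priors and made-up trial counts; the paper itself derives the probability exactly and brings in historical data through a conditional power prior.

```python
import random

def posterior_noninferiority(x1, n1, x2, n2, delta, draws=20000, seed=0):
    """Monte Carlo estimate of P(p1 > p2 - delta | data) under
    independent uniform priors, i.e. posteriors Beta(x + 1, n - x + 1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(draws):
        p1 = rng.betavariate(x1 + 1, n1 - x1 + 1)
        p2 = rng.betavariate(x2 + 1, n2 - x2 + 1)
        hits += p1 > p2 - delta
    return hits / draws

# Assumed counts: 85/100 responders on the test arm, 80/100 on control,
# noninferiority margin delta = 0.10.
post_high = posterior_noninferiority(85, 100, 80, 100, 0.10)
# A clearly inferior test arm for contrast: 60/100 vs 80/100.
post_low = posterior_noninferiority(60, 100, 80, 100, 0.05)
```

    Declaring noninferiority when this posterior probability exceeds a threshold such as 0.975 is the kind of decision rule the exact expression in the paper supports without sampling error.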

  15. Impact of proof test interval and coverage on probability of failure of safety instrumented function

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Hu, Bin; Wang, Xiaodong

    2016-01-01

Highlights: • Introducing proof test coverage makes the calculation of the probability of failure for a SIF more accurate. • The probability of failure undetected by the proof test is independently defined as P_TIF and calculated. • P_TIF is quantified using a reliability block diagram and the simple formula for PFD_avg. • Improving proof test coverage and adopting a reasonable test period can reduce the probability of failure for a SIF. - Abstract: Imperfection of the proof test can result in failure of the safety function of a safety instrumented system (SIS) at any time in its life period. IEC 61508 and other references ignored or only superficially analyzed the imperfection of proof testing. In order to further study the impact of the imperfection of the proof test on the probability of failure for a safety instrumented function (SIF), the necessity of proof testing and the influence of its imperfection on system performance are first analyzed theoretically. The probability of failure for the safety instrumented function resulting from the imperfection of the proof test is defined as the probability of test independent failures (P_TIF), and P_TIF is calculated separately by introducing proof test coverage and adopting a reliability block diagram, with reference to the simplified calculation formula for the average probability of failure on demand (PFD_avg). Research results show that a shorter proof test period and higher proof test coverage yield a smaller probability of failure for the safety instrumented function. The probability of failure for the safety instrumented function calculated by introducing proof test coverage will be more accurate.
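
    The role of proof test coverage can be made concrete with a commonly used low-demand approximation for a single channel. The formula and all numbers below are textbook-style assumptions for illustration; the paper's own derivation via reliability block diagrams is more detailed.

```python
def pfd_avg(lambda_du, test_interval, lifetime, coverage):
    """Approximate average probability of failure on demand (PFD_avg)
    for a single channel with imperfect proof testing.

    A fraction `coverage` of dangerous undetected failures (rate
    lambda_du, per hour) is revealed at each proof test, so that part
    accumulates only over the proof test interval T1; the undetectable
    remainder (the P_TIF-type contribution) accumulates over the whole
    lifetime:

        PFD_avg ~= c*lambda_du*T1/2 + (1 - c)*lambda_du*T_life/2
    """
    c = coverage
    return (c * lambda_du * test_interval / 2
            + (1 - c) * lambda_du * lifetime / 2)

HOURS_PER_YEAR = 8760
# Assumed values: lambda_DU = 2e-6 /h, yearly proof test, 15-year life.
pfd_90 = pfd_avg(2e-6, HOURS_PER_YEAR, 15 * HOURS_PER_YEAR, coverage=0.90)
pfd_perfect = pfd_avg(2e-6, HOURS_PER_YEAR, 15 * HOURS_PER_YEAR, coverage=1.0)
```

    Both conclusions of the paper fall out directly: raising the coverage or shortening the test interval lowers PFD_avg.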

  16. Imperfection detection probability at ultrasonic testing of reactor vessels

    International Nuclear Information System (INIS)

    Kazinczy, F. de; Koernvik, L.Aa.

    1980-02-01

The report is a lecture given at a symposium organized by the Swedish nuclear power inspectorate in February 1980. Equipment, calibration and testing procedures are reported. The estimation of defect detection probability for ultrasonic tests and the reliability of literature data are discussed. Practical testing of reactor vessels and welded joints is described. Swedish test procedures are compared with those of other countries. Series of test data for welded joints of the OKG-2 reactor are presented. Future recommendations for testing procedures are made. (GBn)

  17. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...

  18. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or a normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
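
    The Monte Carlo idea behind SEQTEST can be sketched in a few lines. The background and contamination rates, the error probabilities, and the small Poisson sampler below are illustrative assumptions rather than details taken from the program.

```python
import math
import random

def sprt_poisson(counts, lam0, lam1, alpha=0.05, beta=0.05):
    """SPRT on successive Poisson counts: H0 background rate lam0
    versus H1 contamination rate lam1 (counts per fixed interval)."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    s, n = 0.0, 0
    for k in counts:
        n += 1
        s += k * math.log(lam1 / lam0) - (lam1 - lam0)  # Poisson LLR
        if s >= upper:
            return "contaminated", n
        if s <= lower:
            return "clean", n
    return "undecided", n

def poisson_sample(rng, lam):
    """Knuth's multiplication method; fine for the small rates here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(7)
trials = [sprt_poisson((poisson_sample(rng, 3.0) for _ in range(1000)),
                       lam0=1.0, lam1=3.0)
          for _ in range(200)]
detections = sum(result == "contaminated" for result, _ in trials)
counting_time = sum(n for _, n in trials) / len(trials)
```

    Tallying detections and average trial length over many simulated sources is exactly the comparison the program makes against tabulated single interval test values.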

  19. Preliminary test results for post irradiation examination on the HTTR fuel

    International Nuclear Information System (INIS)

    Ueta, Shohei; Umeda, Masayuki; Sawa, Kazuhiro; Sozawa, Shizuo; Shimizu, Michio; Ishigaki, Yoshinobu; Obata, Hiroyuki

    2007-01-01

The future post-irradiation program for the first-loading fuel of the HTTR is scheduled using the HTTR fuel handling facilities and the Hot Laboratory of the Japan Materials Testing Reactor (JMTR) to confirm its irradiation resistance and to obtain data on its irradiation characteristics in the core. This report describes the preliminary test results and the future plan for the post-irradiation examination of the HTTR fuel. In the preliminary test, fuel compacts made with the same SiC-coated fuel particles as the first-loading fuel were used; dimension, weight, fuel failure fraction, and burnup were measured, and X-ray radiograph, SEM, and EPMA observations were carried out. Finally, it was confirmed that the first-loading fuel of the HTTR showed good quality under irradiation conditions. The future plan for the post-irradiation tests is described to confirm its irradiation performance and to obtain data on its irradiation characteristics in the HTTR core. (author)

  20. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).

  21. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    Science.gov (United States)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. By using a modified version of Wald's sequential probability ratio test we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the concept of the sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.

  22. Limited test data: The choice between confidence limits and inverse probability

    International Nuclear Information System (INIS)

    Nichols, P.

    1975-01-01

For a unit which has been successfully designed to a high standard of reliability, any test programme of reasonable size will result in only a small number of failures. In these circumstances the failure rate estimated from the tests will depend on the statistical treatment applied. When a large number of units is to be manufactured, an unexpectedly high failure rate will certainly result in a large number of failures, so it is necessary to guard against optimistic unrepresentative test results by using a confidence limit approach. If only a small number of production units is involved, failures may not occur even with a higher than expected failure rate, and so one may be able to accept a method which allows for the possibility of either optimistic or pessimistic test results, and in this case an inverse probability approach, based on Bayes' theorem, might be used. The paper first draws attention to an apparently significant difference in the numerical results from the two methods, particularly for the overall probability of several units arranged in redundant logic. It then discusses a possible objection to the inverse method, followed by a demonstration that, for a large population and a very reasonable choice of prior probability, the inverse probability and confidence limit methods give the same numerical result. Finally, it is argued that a confidence limit approach is overpessimistic when a small number of production units is involved, and that both methods give the same answer for a large population. (author)
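
    The numerical gap between the two treatments is easy to reproduce for the zero-failure case; the uniform prior and the test counts below are assumptions chosen for illustration, not figures from the paper.

```python
def upper_confidence_limit(n, confidence=0.95):
    """Classical one-sided upper limit on the failure probability after
    n tests with zero failures: the largest p not rejected at the given
    confidence, i.e. the solution of (1 - p)**n = 1 - confidence."""
    return 1 - (1 - confidence) ** (1 / n)

def bayes_posterior_mean(n, failures=0):
    """Inverse-probability estimate with a uniform Beta(1, 1) prior:
    posterior mean (k + 1) / (n + 2), Laplace's rule of succession."""
    return (failures + 1) / (n + 2)

# 100 tests, no failures observed.
ucl = upper_confidence_limit(100)      # ~0.0295, the pessimistic bound
pm = bayes_posterior_mean(100)         # ~0.0098, the central estimate
```

    The roughly threefold difference for a single unit shows why the choice matters for small production runs, while for large test programmes the two treatments converge, as the paper argues.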

  23. Estimation of the common cause failure probabilities of the components under mixed testing schemes

    International Nuclear Information System (INIS)

    Kang, Dae Il; Hwang, Mee Jeong; Han, Sang Hoon

    2009-01-01

    For the case where trains or channels of standby safety systems consisting of more than two redundant components are tested in a staggered manner, the standby safety components within a train can be tested simultaneously or consecutively. In this case, mixed testing schemes, staggered and non-staggered testing schemes, are used for testing the components. Approximate formulas, based on the basic parameter method, were developed for the estimation of the common cause failure (CCF) probabilities of the components under mixed testing schemes. The developed formulas were applied to the four redundant check valves of the auxiliary feed water system as a demonstration study for their appropriateness. For a comparison, we estimated the CCF probabilities of the four redundant check valves for the mixed, staggered, and non-staggered testing schemes. The CCF probabilities of the four redundant check valves for the mixed testing schemes were estimated to be higher than those for the staggered testing scheme, and lower than those for the non-staggered testing scheme.

  24. Estimation of the common cause failure probabilities on the component group with mixed testing scheme

    International Nuclear Information System (INIS)

    Hwang, Meejeong; Kang, Dae Il

    2011-01-01

Highlights: ► This paper presents a method to estimate the common cause failure probabilities for a common cause component group with mixed testing schemes. ► The CCF probabilities depend on the testing scheme, i.e. staggered or non-staggered testing. ► There are many CCCGs with specific mixed testing schemes in real plant operation. ► Therefore, a general formula applicable to both the alternate periodic testing scheme and the train level mixed testing scheme was derived. - Abstract: This paper presents a method to estimate the common cause failure (CCF) probabilities for a common cause component group (CCCG) with mixed testing schemes such as the train level mixed testing scheme or the alternate periodic testing scheme. In the train level mixed testing scheme, the components are tested in a non-staggered way within the same train, but in a staggered way between the trains. The alternate periodic testing scheme means that all components in the same CCCG are tested in a non-staggered way during the planned maintenance period, but in a staggered way during normal plant operation. Since the CCF probabilities depend on the testing scheme, staggered or non-staggered, CCF estimators have two kinds of formulas in accordance with the testing schemes. Thus, there are general formulas to estimate the CCF probability under the staggered testing scheme and the non-staggered testing scheme. However, in real plant operation, there are many CCCGs with specific mixed testing schemes. Recently, Barros () and Kang () proposed CCF factor estimation methods to reflect the alternate periodic testing scheme and the train level mixed testing scheme. In this paper, a general formula applicable to both the alternate periodic testing scheme and the train level mixed testing scheme is derived.

  25. Seismic proving test of ultimate piping strength (current status of preliminary tests)

    International Nuclear Information System (INIS)

    Suzuki, K.; Namita, Y.; Abe, H.; Ichihashi, I.; Suzuki, K.; Ishiwata, M.; Fujiwaka, T.; Yokota, H.

    2001-01-01

In the 1998 fiscal year, a 6-year program of piping tests was initiated with the following objectives: i) to clarify the elasto-plastic response and ultimate strength of nuclear piping, ii) to ascertain the seismic safety margin of the current seismic design code for piping, and iii) to assess new allowable stress rules. In order to resolve extensive technical issues before proceeding to the seismic proving test of a large-scale piping system, a series of preliminary tests of materials, piping components and simplified piping systems is intended. In this paper, the current status of the material tests and the piping component tests is reported. (author)

  26. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  27. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to smaller p-values than to larger p-values, and a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...

  28. GRIST-2 preliminary test plan and requirements for fuel fabrication and preirradiation

    International Nuclear Information System (INIS)

    Tang, I.M.; Harmon, D.P.; Torri, A.

    1978-12-01

    The preliminary version of the GRIST-2 test plan has been developed for the planned initial 5 years (1984 to 1989) of TREAT-Upgrade in-pile tests. These tests will be employed to study the phenomenology and integral behavior of GCFR core disruptive accidents (CDAs) and to support the Final Safety Analysis Report (FSAR) CDA analyses for the demonstration plant licensing. The preliminary test plan is outlined. Test Phases I and II are for the fresh fuel (preconditioned or not) CDA behavior at the beginning-of-life (BOL) reactor state. Phase III is for the reactor state that contains irradiated fuel with a saturated content of helium and fission gas. Phase IV is for larger bundle tests and scaling effects

  29. Preliminary Tests of a New Low-Cost Photogrammetric System

    Science.gov (United States)

    Santise, M.; Thoeni, K.; Roncella, R.; Sloan, S. W.; Giacomini, A.

    2017-11-01

    This paper presents preliminary tests of a new low-cost photogrammetric system for 4D modelling of large scale areas for civil engineering applications. The system consists of five stand-alone units. Each of the units is composed of a Raspberry Pi 2 Model B (RPi2B) single board computer connected to a PiCamera Module V2 (8 MP) and is powered by a 10 W solar panel. The acquisition of the images is performed automatically using Python scripts and the OpenCV library. Images are recorded at different times during the day and automatically uploaded onto a FTP server from where they can be accessed for processing. Preliminary tests and outcomes of the system are discussed in detail. The focus is on the performance assessment of the low-cost sensor and the quality evaluation of the digital surface models generated by the low-cost photogrammetric systems in the field under real test conditions. Two different test cases were set up in order to calibrate the low-cost photogrammetric system and to assess its performance. First comparisons with a TLS model show a good agreement.

  30. PRELIMINARY TESTS OF A NEW LOW-COST PHOTOGRAMMETRIC SYSTEM

    Directory of Open Access Journals (Sweden)

    M. Santise

    2017-11-01

    Full Text Available This paper presents preliminary tests of a new low-cost photogrammetric system for 4D modelling of large scale areas for civil engineering applications. The system consists of five stand-alone units. Each of the units is composed of a Raspberry Pi 2 Model B (RPi2B) single board computer connected to a PiCamera Module V2 (8 MP) and is powered by a 10 W solar panel. The acquisition of the images is performed automatically using Python scripts and the OpenCV library. Images are recorded at different times during the day and automatically uploaded onto a FTP server from where they can be accessed for processing. Preliminary tests and outcomes of the system are discussed in detail. The focus is on the performance assessment of the low-cost sensor and the quality evaluation of the digital surface models generated by the low-cost photogrammetric systems in the field under real test conditions. Two different test cases were set up in order to calibrate the low-cost photogrammetric system and to assess its performance. First comparisons with a TLS model show a good agreement.

  11. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing

  12. Preliminary project of installation for separation tubes tests-ITTS

    International Nuclear Information System (INIS)

    Rocha, Z.

    1984-01-01

    A consolidation of current ideas about the installation entitled ''Installation for Separation Tubes Tests - ITTS'', planned for CDTN, is presented. The project bases, the tests to be performed, the procedures to be followed during operation, the components and space required by the installation and auxiliary equipment, and the probable origin of the components (national and international), including a preliminary estimate of construction and operating costs, are described. (author) [pt

  13. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect
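
    The idea of testing for variation in extinction probabilities can be illustrated, in a much-simplified form, by a chi-square homogeneity test on per-period extinction counts. This is not the authors' stratigraphic-range method (which also models encounter probabilities); the counts below are hypothetical and the critical value assumes 4 periods (df = 3, alpha = 0.05).

```python
import math

def extinction_chi2(extinct, total):
    """Chi-square homogeneity statistic for a constant extinction
    probability across time periods.
    extinct[i] = taxa going extinct in period i
    total[i]   = taxa at risk in period i
    """
    p_hat = sum(extinct) / sum(total)          # pooled extinction rate
    chi2 = 0.0
    for e, n in zip(extinct, total):
        exp_e = n * p_hat                      # expected extinctions
        exp_s = n * (1 - p_hat)                # expected survivors
        chi2 += (e - exp_e) ** 2 / exp_e + ((n - e) - exp_s) ** 2 / exp_s
    return chi2

# Hypothetical counts for 4 periods; chi-square critical value
# at df = 3, alpha = 0.05 is 7.815.
stat = extinction_chi2([5, 9, 30, 6], [100, 100, 100, 100])
print(stat > 7.815)  # True: reject the constant-rate hypothesis
```

    Under the constant-rate null every period shares the pooled extinction probability; a large statistic indicates that rates varied across periods.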

  14. Post-test probability for neonatal hyperbilirubinemia based on umbilical cord blood bilirubin, direct antiglobulin test, and ABO compatibility results.

    Science.gov (United States)

    Peeters, Bart; Geerts, Inge; Van Mullem, Mia; Micalessi, Isabel; Saegeman, Veroniek; Moerman, Jan

    2016-05-01

    Many hospitals opt for early postnatal discharge of newborns with a potential risk of readmission for neonatal hyperbilirubinemia. Assays/algorithms with the possibility to improve prediction of significant neonatal hyperbilirubinemia are needed to optimize screening protocols and safe discharge of neonates. This study investigated the predictive value of umbilical cord blood (UCB) testing for significant hyperbilirubinemia. Neonatal UCB bilirubin, UCB direct antiglobulin test (DAT), and blood group were determined, as well as the maternal blood group and the red blood cell antibody status. Moreover, in newborns with clinically apparent jaundice after visual assessment, plasma total bilirubin (TB) was measured. Clinical factors positively associated with UCB bilirubin were ABO incompatibility, positive DAT, presence of maternal red cell antibodies, alarming visual assessment and significant hyperbilirubinemia in the first 6 days of life. UCB bilirubin performed clinically well with an area under the receiver-operating characteristic curve (AUC) of 0.82 (95% CI 0.80-0.84). The combined UCB bilirubin, DAT, and blood group analysis outperformed results of these parameters considered separately to detect significant hyperbilirubinemia and correlated exponentially with hyperbilirubinemia post-test probability. Post-test probabilities for neonatal hyperbilirubinemia can be calculated using exponential functions defined by UCB bilirubin, DAT, and ABO compatibility results. What is Known: • The diagnostic value of the triad umbilical cord blood bilirubin measurement, direct antiglobulin testing and blood group analysis for neonatal hyperbilirubinemia remains unclear in literature. • Currently no guideline recommends screening for hyperbilirubinemia using umbilical cord blood. What is New: • Post-test probability for hyperbilirubinemia correlated exponentially with umbilical cord blood bilirubin in different risk groups defined by direct antiglobulin test and ABO blood group

  15. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
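
    The cumulative-distribution comparison underlying Kolmogorov-Smirnov tests can be sketched as follows. This is a generic illustration of the classical statistic the paper complements, not the authors' density-based tests; the sample sizes and seed are arbitrary.

```python
import math
import random

def ks_statistic(draws, cdf):
    """Kolmogorov-Smirnov statistic: the largest gap between the
    empirical CDF of the draws and the specified CDF."""
    xs = sorted(draws)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # compare the CDF against the empirical CDF just before
        # and just after the jump at x
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

random.seed(0)
normal_cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
good = ks_statistic([random.gauss(0, 1) for _ in range(2000)], normal_cdf)
bad = ks_statistic([random.uniform(-3, 3) for _ in range(2000)], normal_cdf)
print(good < bad)  # True: the uniform draws deviate far more from N(0,1)
```

    The statistic is small when the draws match the specified distribution and large when they do not, but, as the abstract notes, discrepancies confined to low-density regions can be smoothed over in the CDF.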

  16. Preliminary test conditions for KNGR SBLOCA DVI ECCS performance test

    International Nuclear Information System (INIS)

    Bae, Kyoo Whan; Song, Jin Ho; Chung, Young Jong; Sim, Suk Ku; Park, Jong Kyun

    1999-03-01

    The Korean Next Generation Reactor (KNGR) adopts a 4-train Direct Vessel Injection (DVI) configuration and injects the safety injection water directly into the downcomer through the 8.5'' DVI nozzle. Thus, the thermal hydraulic phenomena such as ECCS mixing and bypass are expected to be different from those observed in the cold leg injection. In order to investigate the realistic injection phenomena and modify the analysis code developed on the basis of cold leg injection, a thermal hydraulic test with performance evaluation is required. Preliminarily, the sequence of events and major thermal hydraulic phenomena during the small break LOCA for KNGR are identified from the analysis results calculated by the CEFLASH-4AS/REM. It is shown from the analysis results that the major transient behaviors including the core mixture level are largely affected by the downcomer modeling. Therefore, to investigate the relevant thermal hydraulic phenomena occurring in the downcomer with limited budget and time, a separate effects test focusing on this region is considered effective, and a conceptual test facility based on this is recommended. For this test facility, the initial and boundary conditions are developed using the CEFLASH-4AS/REM analysis results, which will be used as input for the preliminary test requirements. The final test requirements will be developed through further discussions with the test performance group. (Author). 10 refs., 18 tabs., 4 figs

  17. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made
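
    As one illustration of combining estimates from multiple experts, a common aggregation rule for probabilities that span orders of magnitude is the geometric mean, i.e. averaging the log-probabilities. The report reviews several such methods; the rule and the figures below are a hedged sketch, not the report's specific recommendation.

```python
import math

def geometric_mean_hep(estimates):
    """Aggregate several experts' human-error-probability estimates by
    the geometric mean (the average on a log10 scale), which keeps a
    single optimistic or pessimistic expert from dominating."""
    logs = [math.log10(p) for p in estimates]
    return 10 ** (sum(logs) / len(logs))

# Three hypothetical expert judgments of the same HEP:
print(geometric_mean_hep([1e-3, 1e-2, 1e-4]))
```

    Averaging on the log scale matches the intuition that an error probability of 1e-2 versus 1e-4 is a disagreement about magnitude, not about an additive offset.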

  18. Preliminary Test for Nonlinear Input Output Relations in SISO Systems

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2000-01-01

    This paper discusses and develops preliminary statistical tests for detecting nonlinearities in the deterministic part of SISO systems with noise. The most referenced method is unreliable for common noise processes as e.g.\\ colored. Therefore two new methods based on superposition and sinus input...

  19. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
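
    The event-probability calculation described above amounts to dividing categorized event counts by the number of FA movements. A minimal sketch with hypothetical counts (not the Framatome ANP data) follows, including a Jeffreys-prior variant often used in risk work so that the estimate stays nonzero even with few or no observed events.

```python
def point_estimate(k, n):
    """Maximum-likelihood estimate of a per-movement event probability:
    k observed events in n FA movements."""
    return k / n

def jeffreys_mean(k, n):
    """Posterior mean under a Jeffreys Beta(0.5, 0.5) prior; useful when
    k is small or zero and the raw ratio would understate the risk."""
    return (k + 0.5) / (n + 1)

# Hypothetical counts, for illustration only:
misloads, movements = 4, 2_000_000
print(point_estimate(misloads, movements))
print(jeffreys_mean(misloads, movements))
```

    The two estimates agree closely when events are plentiful and diverge exactly where it matters: for zero observed events the point estimate is 0 while the Jeffreys mean remains a small positive probability.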

  20. Preliminary test results and CFD analysis for moderator circulation test at Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, H.T. [Korea Atomic Energy Research Inst., Daejeon (Korea, Republic of); Im, S.H.; Sung, H.J. [Korea Advanced Inst. of Science and Tech., Daejeon (Korea, Republic of); Seo, H.; Bang, I.C. [Ulsan National Inst. of Science and Tech., Ulsan (Korea, Republic of)

    2014-07-01

    Korea Atomic Energy Research Institute (KAERI) is carrying out a scaled-down moderator test program to simulate the CANDU6 moderator circulation phenomena during steady state operation and accident conditions. This research program includes the construction of the Moderator Circulation Test (MCT) facility, production of the validation data for self-reliant CFD tools, and development of optical measurement system using the Particle Image Velocimetry (PIV). The MCT facility includes a primary circulation loop (pipe lines, a primary side pump, a heat exchanger, valves, flow meters) and a secondary side loop (pipe lines, a secondary side pump, and an external cooling tower). The loop leakage test and non-heating test are performed in the present work. In the present work the PIV technique is used to measure the velocity distributions in the scaled moderator tank of MCT under iso-thermal test conditions. The preliminary PIV measurement data are obtained and compared with CFX code predictions. (author)

  1. Preliminary test results and CFD analysis for moderator circulation test at Korea

    International Nuclear Information System (INIS)

    Kim, H.T.; Im, S.H.; Sung, H.J.; Seo, H.; Bang, I.C.

    2014-01-01

    Korea Atomic Energy Research Institute (KAERI) is carrying out a scaled-down moderator test program to simulate the CANDU6 moderator circulation phenomena during steady state operation and accident conditions. This research program includes the construction of the Moderator Circulation Test (MCT) facility, production of the validation data for self-reliant CFD tools, and development of optical measurement system using the Particle Image Velocimetry (PIV). The MCT facility includes a primary circulation loop (pipe lines, a primary side pump, a heat exchanger, valves, flow meters) and a secondary side loop (pipe lines, a secondary side pump, and an external cooling tower). The loop leakage test and non-heating test are performed in the present work. In the present work the PIV technique is used to measure the velocity distributions in the scaled moderator tank of MCT under iso-thermal test conditions. The preliminary PIV measurement data are obtained and compared with CFX code predictions. (author)

  2. A closer look at the effect of preliminary goodness-of-fit testing for normality for the one-sample t-test.

    Science.gov (United States)

    Rochon, Justine; Kieser, Meinhard

    2011-11-01

    Student's one-sample t-test is a commonly used method when inference about the population mean is made. As advocated in textbooks and articles, the assumption of normality is often checked by a preliminary goodness-of-fit (GOF) test. In a paper recently published by Schucany and Ng it was shown that, for the uniform distribution, screening of samples by a pretest for normality leads to a more conservative conditional Type I error rate than application of the one-sample t-test without preliminary GOF test. In contrast, for the exponential distribution, the conditional level is even more elevated than the Type I error rate of the t-test without pretest. We examine the reasons behind these characteristics. In a simulation study, samples drawn from the exponential, lognormal, uniform, Student's t-distribution with 2 degrees of freedom (t(2) ) and the standard normal distribution that had passed normality screening, as well as the ingredients of the test statistics calculated from these samples, are investigated. For non-normal distributions, we found that preliminary testing for normality may change the distribution of means and standard deviations of the selected samples as well as the correlation between them (if the underlying distribution is non-symmetric), thus leading to altered distributions of the resulting test statistics. It is shown that for skewed distributions the excess in Type I error rate may be even more pronounced when testing one-sided hypotheses. ©2010 The British Psychological Society.
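
    The two-stage procedure studied above (normality screen, then t-test) can be approximated by Monte Carlo. The sketch below substitutes a crude skewness cutoff for a formal GOF pretest and hard-codes the two-sided t critical value for n = 10 (df = 9, alpha = 0.05), so it only illustrates the conditional Type I error idea, not the paper's exact simulations.

```python
import math
import random

def t_statistic(xs, mu0):
    """One-sample t statistic against the null mean mu0."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    return (m - mu0) / (s / math.sqrt(n))

def skewness(xs):
    """Biased sample skewness, used here as a crude normality screen."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

def conditional_type1(draw, mu0, n=10, reps=20_000, t_crit=2.262, skew_cut=1.0):
    """Estimate the Type I error rate of the t-test restricted to the
    samples that pass the pretest (|skewness| < skew_cut)."""
    rejected = passed = 0
    for _ in range(reps):
        xs = [draw() for _ in range(n)]
        if abs(skewness(xs)) >= skew_cut:   # sample fails the pretest
            continue
        passed += 1
        if abs(t_statistic(xs, mu0)) > t_crit:
            rejected += 1
    return rejected / passed

random.seed(1)
# Exponential(1) has true mean 1, so every rejection is a Type I error.
rate = conditional_type1(lambda: random.expovariate(1.0), mu0=1.0)
print(round(rate, 3))
```

    Conditioning on passing the screen changes the joint distribution of the sample mean and standard deviation, which is exactly the mechanism the paper examines for skewed parent distributions.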

  3. Preliminary hazard analysis for the Brayton Isotope Ground Demonstration System (including vacuum test chamber)

    International Nuclear Information System (INIS)

    Miller, L.G.

    1975-01-01

    The Preliminary Hazard Analysis (PHA) of the BIPS-GDS is a tabular summary of hazards and undesired events which may lead to system damage or failure and/or hazard to personnel. The PHA reviews the GDS as it is envisioned to operate in the Vacuum Test Chamber (VTC) of the GDS Test Facility. The VTC and other equipment which will comprise the test facility are presently in an early stage of preliminary design and will undoubtedly undergo numerous changes before the design is frozen. The PHA and the FMECA to follow are intended to aid the design effort by identifying areas of concern which are critical to the safety and reliability of the BIPS-GDS and test facility

  4. Preliminary results of a test of a longitudinal phase-space monitor

    International Nuclear Information System (INIS)

    Kikutani, Eiji; Funakoshi, Yoshihiro; Kawamoto, Takashi; Mimashi, Toshihiro

    1994-01-01

    A prototype of a longitudinal phase-space monitor has been developed in TRISTAN Main Ring at KEK. The principle of the monitor and its basic components are explained. Also a result of a preliminary beam test is given. (author)

  5. Preliminary tests on a new near-infrared continuous-wave tissue oximeter

    Science.gov (United States)

    Casavola, Claudia; Cicco, Giuseppe; Pirrelli, Anna; Lugara, Pietro M.

    2000-11-01

    We present a preliminary study, in vitro and in vivo, with a novel device for near-infrared tissue oximetry. The light sources used are two quasi-continuous-wave LEDs, emitting at 656 and 851 nm, and the detector is a photodiode. The data are acquired in back-scattering configuration, thus allowing the non-invasive characterization of thick tissues. Stability tests were performed by placing the optical probe on a tissue-like phantom and acquiring data for periods of time ranging from 5 to 40 minutes. No significant drifts in the DC signal were observed after a warm-up period of no more than 10 minutes. We performed reproducibility tests by repositioning the optical probe on the phantom a number of times. We found a reproducibility better than 5% in the DC signal. We also present the results of a preliminary study conducted in vivo, on the calf muscle of human subjects. We report a comparison of the results obtained with the near-infrared oximeter with the values of blood oxygenation ctO2 measured with conventional chemical tests.

  6. Preliminary Test on Hydraulic Rotation Device for Neutron Transmutation Doping

    International Nuclear Information System (INIS)

    Park, Ki-Jung; Kang, Han-Ok; Kim, Seong Hoon; Park, Cheol

    2014-01-01

    The Korea Atomic Energy Research Institute (KAERI) is developing a new Research Reactor (KJRR) which will be located at KIJANG in the south-eastern province of Korea. The KJRR will be mainly utilized for isotope production, NTD production, and the related research activities. During the NTD process, the irradiation rig containing the silicon ingot rotates at a constant speed to ensure precisely defined homogeneity of the irradiation. A new NTD Hydraulic Rotation Device (NTDHRD) is being developed to rotate the irradiation rigs at the required speed. In this study, the preliminary test and analysis of the rotation characteristics of the NTDHRD, which was developed through the conceptual design, are described. The preliminary test and analysis are conducted in an experimental apparatus. The film thickness provided by the thrust bearing is measured and the minimum mass flow rate required for stable rotation is determined

  7. Preliminary results of the round-robin testing of F82H

    Energy Technology Data Exchange (ETDEWEB)

    Shiba, K.; Yamanouchi, N.; Tohyama, A.

    1996-10-01

    Preliminary results of metallurgical, physical and mechanical properties of low activation ferritic steel F82H (IEA heat) were obtained in the round-robin test in Japan. The properties of IEA heat F82H were almost the same as the original F82H.

  8. Preliminary design for hot dirty-gas control-valve test facility. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    This report presents the results of a preliminary design and cost estimating effort for a facility for the testing of control valves in Hot Dirty Gas (HDGCV) service. This design was performed by Mittelhauser Corporation for the United States Department of Energy's Morgantown Energy Technology Center (METC). The objective of this effort was to provide METC with a feasible preliminary design for a test facility which could be used to evaluate valve designs under simulated service conditions and provide a technology data base for DOE and industry. In addition to the actual preliminary design of the test facility, final design/construction/operating schedules and a facility cost estimate were prepared to provide METC sufficient information with which to evaluate this design. The bases, assumptions, and limitations of this study effort are given. The tasks carried out were as follows: METC Facility Review, Environmental Control Study, Gas Generation Study, Metallurgy Review, Safety Review, Facility Process Design, Facility Conceptual Layout, Instrumentation Design, Cost Estimates, and Schedules. The report provides information regarding the methods of approach used in the various tasks involved in the completion of this study. Section 5.0 of this report presents the results of the study effort. The results obtained from the above-defined tasks are described briefly. The turnkey cost of the test facility is estimated to be $9,774,700 in fourth quarter 1979 dollars, and the annual operating cost is estimated to be $960,000 plus utilities costs which are not included because unit costs per utility were not available from METC.

  9. The picture test of separation and individuation - preliminary research

    Directory of Open Access Journals (Sweden)

    Gregor Žvelc

    2000-06-01

    Full Text Available Authors introduce a new instrument, which they developed for measuring the separation-individuation process and attachment in adolescence and adulthood. The Picture Test of Separation and Individuation (PTSI is a semi-projective test. It consists of various pictures, which represent relationships with significant others. PTSI is divided into three subtests: Relationship with Mother, Relationship with Father and Attachment. In a preliminary study on a sample of college and university students, the authors examined the basic properties of the test. The results of the research indicate that PTSI is consistent with its theoretical background, has good sensitivity and is economical. The Picture Test of Separation and Individuation enables quick but complex insight into an individual's relationships with significant others as well as into his/her stage of the separation-individuation process. Considering the satisfactory results of the pilot study, the authors suggest further research for validation of the test.

  10. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is clearly more sensitive to small p-values than to large ones, and a single small p-value may overrule the other p-values
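
    Fisher's method combines k independent p-values via X = -2 Σ ln p_i, which is chi-square distributed with 2k degrees of freedom under the global null; because 2k is even, the tail probability has a closed form and no stats library is needed. A minimal sketch:

```python
import math

def fisher_combined(pvalues):
    """Fisher's combined probability test: returns the statistic
    X = -2 * sum(ln p_i) and its chi-square(2k) tail probability,
    using the closed-form survival function for even df."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    sf = math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))
    return x, sf

# One very small p-value dominates the combination, illustrating the
# asymmetry noted in the abstract:
_, combined = fisher_combined([0.001, 0.5, 0.6, 0.7])
print(combined < 0.05)  # True
```

    With a single p-value the method reduces to that p-value itself, which makes the closed form easy to sanity-check.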

  11. Thick Concrete Specimen Construction, Testing, and Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Dwight A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoegh, Kyle [Univ. of Minnesota, Minneapolis, MN (United States); Khazanovich, Lev [Univ. of Minnesota, Minneapolis, MN (United States)

    2015-03-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations. A preliminary report detailed some of the challenges associated with thick reinforced concrete sections and prioritized conceptual designs of specimens that could be fabricated to represent NPP concrete structures for using in NDE evaluation comparisons. This led to the construction of the concrete specimen presented in this report, which has sufficient reinforcement density and cross-sectional size to represent an NPP containment wall. Details on how a suitably thick concrete specimen was constructed are presented, including the construction materials, final nominal design schematic, as well as formwork and rigging required to safely meet the desired dimensions of the concrete structure. The report also details the type and methods of forming the concrete specimen as well as information on how the rebar and simulated defects were embedded. Details on how the resulting specimen was transported, safely anchored, and marked to allow access for systematic comparative NDE testing of defects in a representative NPP containment wall concrete specimen are also given. Data collection using the MIRA Ultrasonic NDE equipment and

  12. [New visual field testing possibilities (a preliminary report)].

    Science.gov (United States)

    Erichev, V P; Ermolaev, A P; Antonov, A A; Grigoryan, G L; Kosova, D V

    2018-01-01

    There are currently no portable mobile perimeters that allow visual field testing outside ophthalmologist's examination rooms. To develop a mobile perimetry technique based on use of a virtual reality headset (VR). The study involved 26 patients (30 eyes) with II-III stage primary open-angle glaucoma (POAG) with compensated IOP. Perimetry was performed for each patient twice - on Humphrey analyzer (test 30-2, 76 points) and employing similar strategy on a perimeter integrated into VR headset (Total Vision, Russia). Visual field testing was performed with an interval from 1 hour to 3 days. The results were comparatively analyzed. Patients tolerated the examination well. Comparative analysis of preliminary perimetry results obtained with both methods showed high degree of identity, so the results were concluded to be comparable. By visually isolating the wearer, VR headset achieves elimination of distractions and stable light conditions for visual field testing. The headset-perimeter is compact, mobile, easily transportable, can be used in the work of visiting medical teams and for examination at home.

  13. Preliminary observations of gate valve flow interruption tests, Phase 2

    International Nuclear Information System (INIS)

    Steele, R. Jr.; DeWall, K.G.

    1990-01-01

    This paper presents preliminary observations from the US Nuclear Regulatory Commission/Idaho National Engineering Laboratory Flexible Wedge Gate Valve Qualification and Flow Interruption Test Program, Phase 2. The program investigated the ability of selected boiling water reactor (BWR) process line valves to perform their containment isolation function at high energy pipe break conditions and other more normal flow conditions. The fluid and valve operating responses were measured to provide information concerning valve and operator performance at various valve loadings so that the information could be used to assess typical nuclear industry motor operator sizing equations. Six valves were tested, three 6-in. isolation valves representative of those used in reactor water cleanup systems in BWRs and three 10-in. isolation valves representative of those used in BWR high pressure coolant injection (HPCI) steam lines. The concern with these normally open isolation valves is whether they will close in the event of a downstream pipe break outside of containment. The results of this testing will provide part of the technical insights for NRC efforts regarding Generic Issue 87 (GI-87), Failure of the HPCI Steam Line Without Isolation, which includes concerns about the uncertainties in gate valve motor operator sizing and torque switch settings for these BWR containment isolation valves. As of this writing, the Phase 2 test program has just been completed. Preliminary observations made in the field confirmed most of the results from the Phase 1 test program. All six valves closing in high energy water, high energy steam, and high pressure cold water require more force to close than would be calculated using the typical variables in the standard industry motor operator sizing equations

  14. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
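
    The Equation reduces evaluating P(if p then q) to a conditional probability over the truth-table cases, ignoring the ¬p cases. A minimal sketch with hypothetical chip counts of the kind used in probabilistic truth-table tasks:

```python
from fractions import Fraction

def equation_probability(cases):
    """The 'Equation': P(if p then q) = P(q | p), computed from the
    frequencies of the four truth-table cases (pq, p&not-q, not-p&q,
    not-p&not-q). Only the p cases enter the ratio."""
    pq, p_not_q, not_p_q, not_p_not_q = cases
    return Fraction(pq, pq + p_not_q)

# A pack with 4 pq, 1 p&not-q, 2 not-p&q, and 3 not-p&not-q chips:
# under the Equation, P(if p then q) = 4 / (4 + 1).
print(equation_probability((4, 1, 2, 3)))  # 4/5
```

    Responses that instead use all ten chips (e.g. the conjunctive answer 4/10) are among the non-Equation patterns whose developmental course the study tracks.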

  15. Probability of background to produce a signal-like excess, for all Higgs masses tested.

    CERN Document Server

    ATLAS, collaboration

    2012-01-01

    The probability of background to produce a signal-like excess, for all the Higgs boson masses tested. At almost all masses, the probability (solid curve) is at least a few percent; however, at 126.5 GeV it dips to 3×10⁻⁷, or one chance in three million, the '5-sigma' gold standard normally used for the discovery of a new particle. A Standard Model Higgs boson with that mass would produce a dip to 4.6 sigma.
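
    The quoted significance levels translate between 'sigmas' and tail probabilities via the one-sided Gaussian tail. A quick check that 5 sigma corresponds to roughly 3×10⁻⁷:

```python
import math

def sigma_to_pvalue(z):
    """One-sided Gaussian tail probability for a z-sigma excess."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# 5 sigma is just under the one-in-3-million discovery threshold:
print(sigma_to_pvalue(5) < 3e-7)  # True
```

    The same function gives about 2×10⁻⁶ for the 4.6-sigma dip expected from a Standard Model Higgs boson at that mass.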

  16. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changing the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors and capable of calculating core damage probability in a short time, when various component failure probabilities are varied between 0 and 1 and when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by an order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by an order of magnitude from the base value; when American failure probability data are applied, however, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
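
The kind of sensitivity study described can be illustrated with a toy minimal-cut-set model. The structure and every number below are hypothetical, chosen only to show how varying one component's failure probability propagates to the core damage result; this is not the NRC ASP model:

```python
# Toy model: core damage requires the initiating event AND failure of a
# mitigating system; the system fails if the motor-driven pump fails, or
# if both the motor-operated valve and the turbine-driven pump fail.
def core_damage_probability(ie_freq, p_md_pump, p_mo_valve, p_td_pump):
    p_system_fails = 1.0 - (1.0 - p_md_pump) * (1.0 - p_mo_valve * p_td_pump)
    return ie_freq * p_system_fails

base = core_damage_probability(1e-2, 1e-3, 1e-3, 1e-2)
# Raising the motor-driven pump failure probability tenfold raises the
# result almost tenfold (the model is sensitive to that component), while
# the valve and turbine-driven pump only matter through their product.
shifted = core_damage_probability(1e-2, 1e-2, 1e-3, 1e-2)
print(base, shifted)
```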

  17. Radiation Testing on State-of-the-Art CMOS: Challenges, Plans, and Preliminary Results

    Science.gov (United States)

    LaBel, Kenneth A.; Cohn, Lewis M.

    2009-01-01

    At GOMAC 2007 and 2008, we discussed a variety of challenges for radiation testing of modern semiconductor devices and technologies [1, 2]. In this presentation, we provide more specific details in this on-going investigation focusing on out-of-the-box lessons observed for providing radiation effects assurances as well as preliminary test results.

  18. An Estimation of a Passive Infra-Red Sensor Probability of Detection

    International Nuclear Information System (INIS)

    Osman, E.A.; El-Gazar, M.I.; Shaat, M.K.; El-Kafas, A.A.; Zidan, W.I.; Wadoud, A.A.

    2009-01-01

    Passive Infra-Red (PIR) sensors are one of the many detection sensors used to detect intrusion at nuclear sites. In this work, an estimation of a PIR sensor's probability of detection for a hypothetical facility is presented. Sensor performance testing was performed to determine whether a particular sensor would be acceptable in a proposed design. We had access to a sensor test field in which the sensor of interest was already properly installed and the parameters had been set to optimal levels by preliminary testing. The PIR sensor construction, operation, and design for the investigated nuclear site are explained. Walking and running intrusion tests were carried out inside the field areas of the PIR sensor to evaluate its performance during an intrusion. Ten trials were performed for the intrusion process via a passive infra-red sensor network system. The detection performance of the PIR sensors inside the internal zones was recorded and evaluated.

  19. Preliminary data evaluation for thermal insulation characterization testing

    International Nuclear Information System (INIS)

    DeClue, J.F.; Moses, S.D.; Tollefson, D.A.

    1991-01-01

    The purpose of Thermal Insulation Characterization Testing is to provide physical data to support certain assumptions and calculational techniques used in the criticality safety calculations in Section 6 of the Safety Analysis Reports for Packaging (SARPs) for drum-type packaging at the Department of Energy's (DOE) Oak Ridge Y-12 Plant, managed by Martin Marietta Energy Systems, Inc. Results of a preliminary data evaluation for the fire-test condition reveal that realistic weight-loss consideration and residual-material characterization in developing calculational models for the hypothetical accident condition are necessary in order to prevent placement of unduly conservative restrictions on shipping requirements as a result of overly conservative modeling. This is particularly important for fast systems. Determination of the geometric arrangement of residual material is of secondary importance. Both the methodology used to determine the minimum thermal insulation mass remaining after the fire test and the treatment of the thermal insulation in the criticality safety calculational models require additional evaluation. Specific testing to be conducted will provide experimental data with which to validate the mass estimates and calculational modeling techniques for extrapolation to generic drum-type containers.

  20. Hydraulically driven control rod concept for integral reactors: fluid dynamic simulation and preliminary test

    International Nuclear Information System (INIS)

    Ricotti, M.E.; Cammi, A.; Lombardi, C.; Passoni, M.; Rizzo, C.; Carelli, M.; Colombo, E.

    2003-01-01

    The paper deals with a preliminary study of the Hydraulically Driven Control Rod concept, tailored for PWR control rods (spider type) with the hydraulic drive mechanism completely immersed in the primary water. A specific solution suitable for advanced versions of the IRIS integral reactor is under investigation. The configuration of the Hydraulic Control Rod device, made up of an external movable piston and an internal fixed cylinder, is described. After a brief description of the whole control system, particular attention is devoted to the Control Rod characterization via Computational Fluid Dynamics (CFD) analysis. The investigation of the system behavior, including dynamic equilibrium and stability properties, has been carried out. Finally, preliminary tests were performed in a low-pressure, low-temperature, reduced-length experimental facility. The results are compared with the dynamic control model and the CFD simulation model, showing good agreement between simulations and experimental data. During these preliminary tests, the control system performed correctly, allowing stable dynamic equilibrium positions for the Control Rod and stable behavior during withdrawal and insertion steps. (author)

  1. Preliminary Calculations of Bypass Flow Distribution in a Multi-Block Air Test

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Tak, Nam Il

    2011-01-01

    The development of a methodology for bypass flow assessment in a prismatic VHTR (Very High Temperature Reactor) core has been conducted at KAERI. A preliminary estimation of the variation of local bypass flow gap size between graphite blocks in the NHDD core was carried out. With the predicted gap sizes, their influence on the bypass flow distribution and the core hot spot was assessed. Due to the complexity of the gap distributions, a system thermo-fluid analysis code is suggested as a tool for the core thermo-fluid analysis, whose models and correlations should be validated. In order to generate data for validating the bypass flow analysis model, an experimental facility for a multi-block air test was constructed at Seoul National University (SNU). This study focuses on a preliminary evaluation of the flow distribution in the test section, to understand how the flow is distributed and to guide the selection of experimental cases. A commercial CFD code, ANSYS CFX, is used for the analyses.

  2. Preliminary investigation on determination of radionuclide distribution in field tracing test site

    International Nuclear Information System (INIS)

    Tanaka, Tadao; Mukai, Masayuki; Takebe, Shinichi; Guo Zede; Li Shushen; Kamiyama, Hideo.

    1993-12-01

    Field tracing tests of radionuclide migration have been conducted using 3H, 60Co, 85Sr and 134Cs in the natural unsaturated loess zone at the field test site of the China Institute for Radiation Protection. It is necessary to obtain reliable distribution data for the radionuclides in the test site in order to evaluate exactly the migration behavior of the radionuclides in situ. An available method to determine the distribution was proposed on the basis of a preliminary discussion of the method for sampling soils from the test site and the method for analyzing radioactivity in the soils. (author)

  3. Exploratory shaft facility preliminary designs - Permian Basin

    International Nuclear Information System (INIS)

    1983-09-01

    The purpose of the Preliminary Design Report, Permian Basin, is to provide a description of the preliminary design for an Exploratory Shaft Facility in the Permian Basin, Texas. This issue of the report describes the preliminary design for constructing the exploratory shaft using the Large Hole Drilling method of construction and outlines the preliminary design and estimates of probable construction cost. The Preliminary Design Report is prepared to complement and summarize the other documents that comprise the design at the preliminary stage of completion, December 1982. Other design documents include drawings, cost estimates, and schedules. The preliminary design drawing package, which includes the construction schedule drawing, depicts the descriptions in this report. For reference, a list of the drawing titles and corresponding numbers is included in the Appendix. The report is divided into three principal sections: Design Basis, Facility Description, and Construction Cost Estimate. 30 references, 13 tables

  4. Results of the Preliminary Test in the 1/4-Scale RCCS of the PMR200 VHTR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong-Hwan; Bae, Yoon-Yeong; Hong, Sung-Deok; Kim, Chan-Soo; Cho, Bong-Hyun; Kim, Min-Hwan [Nuclear Hydrogen Reactor Technology Development Dep., Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    The Reactor Cavity Cooling System (RCCS) is a key ex-vessel passive safety system that will ensure the safety of the PMR200, and its performance needs to be verified. Because a full-scale test is difficult, a 1/4-scale RCCS facility, NACEF (Natural Cooling Experimental Facility), has been constructed at KAERI, and a shakedown test has been performed. A brief description of the design and the preliminary test results of this facility are given. The 1/4-scale RCCS mockup of the PMR200 functioned quite well, and the preliminary test results show fairly good agreement with past work except in the conductive heat transfer region at the riser bottom. After remedies such as the installation of more precise flow meters and improved insulation, the test facility is expected to work well.

  5. Preliminary environmental analysis of a geopressured-geothermal test well in Brazoria County, Texas

    Energy Technology Data Exchange (ETDEWEB)

    White, W.A.; McGraw, M.; Gustavson, T.C.; Meriwether, J.

    1977-11-16

    Preliminary environmental data, including current land use, substrate lithology, soils, natural hazards, water resources, biological assemblages, meteorological data, and regulatory considerations, have been collected and analyzed for approximately 150 km² of land near Chocolate Bayou, Brazoria County, Texas, in which a geopressured-geothermal test well is to be drilled in the fall of 1977. The study was designed to establish an environmental data base and to determine, within spatial constraints set by subsurface reservoir conditions, environmentally suitable sites for the proposed well. Preliminary analyses of the data revealed the need for focusing on the following areas: potential for subsidence and fault activation, susceptibility of the test well and support facilities to fresh- and salt-water flooding, possible effects of produced saline waters on biological assemblages and groundwater resources, distribution of expansive soils, and the effect of drilling and associated support activities on known archeological-cultural resources.

  6. Test for Nonlinear Input Output Relations in SISO Systems by Preliminary Data Analysis

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2000-01-01

    This paper discusses and develops preliminary statistical tests for detecting nonlinearities in the deterministic part of SISO systems with noise. The most referenced method is unreliable for common noise processes, e.g. colored noise. Therefore two new methods based on superposition and sinus input...

  7. Preliminary engineering specifications for a test demonstration multilayer protective barrier cover system

    International Nuclear Information System (INIS)

    Phillips, S.J.; Gilbert, T.W.; Adams, M.R.

    1985-03-01

    This report presents preliminary engineering specifications for a test protective barrier cover system and support radiohydrology facility to be constructed at the Hanford Protective Barrier Test Facility (PBTF). Construction of this test barrier and related radiohydrology facility is part of a continuing effort to provide construction experience and performance evaluation of alternative barrier designs used for long-term isolation of disposed radioactive waste materials. Design specifications given in this report are tentative, based on interim engineering and computer simulation design efforts. Final definitive design specifications and engineering prints will be produced in FY 1986. 6 refs., 10 figs., 1 tab

  8. Preliminary site design for the SP-100 ground engineering test

    International Nuclear Information System (INIS)

    Cox, C.M.; Miller, W.C.; Mahaffey, M.K.

    1986-04-01

    In November, 1985, Hanford was selected by the Department of Energy (DOE) as the preferred site for a full-scale test of the integrated nuclear subsystem for SP-100. The Hanford Engineering Development Laboratory, operated by Westinghouse Hanford Company, was assigned as the lead contractor for the Test Site. The nuclear subsystem, which includes the reactor and its primary heat transport system, will be provided by the System Developer, another contractor to be selected by DOE in late FY-1986. In addition to reactor operations, test site responsibilities include preparation of the facility plus design, procurement and installation of a vacuum chamber to house the reactor, a secondary heat transport system to dispose of the reactor heat, a facility control system, and postirradiation examination. At the conclusion of the test program, waste disposal and facility decommissioning are required. The test site must also prepare appropriate environmental and safety evaluations. This paper summarizes the preliminary design requirements, the status of design, and plans to achieve full power operation of the test reactor in September, 1990

  9. Jet-Surface Interaction: High Aspect Ratio Nozzle Test, Nozzle Design and Preliminary Data

    Science.gov (United States)

    Brown, Clifford; Dippold, Vance

    2015-01-01

    The Jet-Surface Interaction High Aspect Ratio (JSI-HAR) nozzle test is part of an ongoing effort to measure and predict the noise created when an aircraft engine exhausts close to an airframe surface. The JSI-HAR test is focused on parameters derived from the Turbo-electric Distributed Propulsion (TeDP) concept aircraft, which include a high-aspect-ratio mailslot exhaust nozzle, internal septa, and an aft deck. The size and mass flow rate limits of the test rig also limited the test nozzle to a 16:1 aspect ratio, half the approximately 32:1 on the TeDP concept. Also, unlike the aircraft, the test nozzle must transition from a single round duct on the High Flow Jet Exit Rig, located in the AeroAcoustic Propulsion Laboratory at the NASA Glenn Research Center, to the rectangular shape at the nozzle exit. A parametric nozzle design method was developed to design three low-noise round-to-rectangular transitions, with 8:1, 12:1, and 16:1 aspect ratios, that minimize flow separations and shocks while providing a flat flow profile at the nozzle exit. These designs were validated using the WIND-US CFD code. A preliminary analysis of the test data shows that the actual flow profile is close to that predicted and that the noise results appear consistent with data from previous, smaller-scale tests. The JSI-HAR test is ongoing through October 2015. The results shown in the presentation are intended to provide an overview of the test and a first look at the preliminary results.

  10. Evaluation of test-strategies for estimating probability of low prevalence of paratuberculosis in Danish dairy herds

    DEFF Research Database (Denmark)

    Sergeant, E.S.G.; Nielsen, Søren S.; Toft, Nils

    2008-01-01

    of this study was to develop a method to estimate the probability of low within-herd prevalence of paratuberculosis for Danish dairy herds. A stochastic simulation model was developed using the R programming environment. Features of this model included: use of age-specific estimates of test-sensitivity and specificity; use of a distribution of observed values (rather than a fixed, low value) for design prevalence; and estimates of the probability of low prevalence (Pr-Low) based on a specific number of test-positive animals, rather than for a result less than or equal to a specified cut-point number of reactors. Using this model, five herd-testing strategies were evaluated: (1) milk-ELISA on all lactating cows; (2) milk-ELISA on lactating cows 4 years old; (4) faecal culture on all lactating cows; and (5) milk-ELISA plus faecal culture in series on all lactating cows. The five testing strategies were evaluated
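
The Pr-Low idea described above can be sketched with a small Bayesian calculation. All sensitivity, specificity, prevalence, and herd-size values below are assumed for illustration only, not taken from the study:

```python
from math import comb

def p_test_positive(prevalence, se, sp):
    # Probability that a randomly chosen cow tests positive, combining
    # true positives (prevalence * Se) and false positives ((1-prev) * (1-Sp)).
    return prevalence * se + (1.0 - prevalence) * (1.0 - sp)

def binomial_pmf(n, k, p):
    return comb(n, k) * p**k * (1.0 - p)**(n - k)

def pr_low(n, k, prev_low, prev_high, se, sp):
    # Pr(low within-herd prevalence | exactly k test-positives out of n),
    # with equal prior weight on the two candidate prevalence levels.
    like_low = binomial_pmf(n, k, p_test_positive(prev_low, se, sp))
    like_high = binomial_pmf(n, k, p_test_positive(prev_high, se, sp))
    return like_low / (like_low + like_high)

# 100 lactating cows, 2 ELISA-positives (hypothetical Se = 0.5, Sp = 0.98):
print(pr_low(100, 2, 0.02, 0.15, 0.5, 0.98))
```

The design choice mirrored here is the abstract's point about using a specific number of test-positive animals: the likelihoods are evaluated at exactly k reactors rather than at "k or fewer".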

  11. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  12. Accelerated stress testing of thin film solar cells: Development of test methods and preliminary results

    Science.gov (United States)

    Lathrop, J. W.

    1985-01-01

    If thin film cells are to be considered a viable option for terrestrial power generation, their reliability attributes will need to be explored and confidence in their stability obtained through accelerated testing. Development of a thin film accelerated test program will be more difficult than it was for crystalline cells because of the monolithic construction of the cells. Specially constructed test samples will need to be fabricated, requiring commitment to the concept of accelerated testing by the manufacturers. A new test schedule appropriate to thin film cells will need to be developed, different from that used in connection with crystalline cells. Preliminary work has been started to seek thin film schedule variations for two of the simplest tests: unbiased temperature and unbiased temperature-humidity. Still to be examined are tests which involve the passage of current during temperature and/or humidity stress, either by biasing in the forward (or reverse) direction or by the application of light during stress. Investigation of these current (voltage) accelerated tests will involve development of methods of reliably contacting the thin conductive films during stress.

  13. Preliminary Beam Irradiation Test for RI Production Targets at KOMAC

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Sang Pil; Kwon, Hyeok Jung; Kim, Han Sung; Cho, Yong Sub; Seol, Kyung Tae; Song, Young Gi; Kim, Dae Il; Jung, Myung Hwan; Kim, Kye Ryung; Min, Yi Sub [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    A new beamline and target irradiation facility has been constructed for the production of therapeutic radioisotopes. Sr-82 and Cu-67 were selected as the target isotopes for this facility; they are promising isotopes for PET imaging and cancer therapy. For the facility commissioning, irradiation tests of prototype targets were conducted to confirm the feasibility of radioisotope production. The prototype targets are made of RbCl pellets and natural Zn metal for Sr-82 and Cu-67 production, respectively. In this paper, an introduction to the RI production targetry system and the results of the preliminary beam irradiation tests are discussed. Low-flux beam irradiation tests of the prototype RI targets have been conducted. As a result of these tests, we obtained evidence of Sr-82 and Cu-67 production and confirmed the feasibility of Sr-82 and Cu-67 production at the KOMAC RI production facility.

  14. Preliminary Beam Irradiation Test for RI Production Targets at KOMAC

    International Nuclear Information System (INIS)

    Yoon, Sang Pil; Kwon, Hyeok Jung; Kim, Han Sung; Cho, Yong Sub; Seol, Kyung Tae; Song, Young Gi; Kim, Dae Il; Jung, Myung Hwan; Kim, Kye Ryung; Min, Yi Sub

    2016-01-01

    A new beamline and target irradiation facility has been constructed for the production of therapeutic radioisotopes. Sr-82 and Cu-67 were selected as the target isotopes for this facility; they are promising isotopes for PET imaging and cancer therapy. For the facility commissioning, irradiation tests of prototype targets were conducted to confirm the feasibility of radioisotope production. The prototype targets are made of RbCl pellets and natural Zn metal for Sr-82 and Cu-67 production, respectively. In this paper, an introduction to the RI production targetry system and the results of the preliminary beam irradiation tests are discussed. Low-flux beam irradiation tests of the prototype RI targets have been conducted. As a result of these tests, we obtained evidence of Sr-82 and Cu-67 production and confirmed the feasibility of Sr-82 and Cu-67 production at the KOMAC RI production facility.

  15. The Art Gallery Test: A Preliminary Comparison between Traditional Neuropsychological and Ecological VR-Based Tests

    Directory of Open Access Journals (Sweden)

    Pedro Gamito

    2017-11-01

    Ecological validity should be the cornerstone of any assessment of cognitive functioning. For this purpose, we have developed a preliminary study to test the Art Gallery Test (AGT) as an alternative to traditional neuropsychological testing. The AGT involves three visual search subtests displayed in a virtual reality (VR) art gallery, designed to assess visual attention within an ecologically valid setting. To evaluate the relation between the AGT and standard neuropsychological assessment scales, data were collected on a normative sample of healthy adults (n = 30). The measures consisted of concurrent paper-and-pencil neuropsychological measures [Montreal Cognitive Assessment (MoCA), Frontal Assessment Battery (FAB), and Color Trails Test (CTT)] along with the outcomes from the three subtests of the AGT. The results showed significant correlations between the AGT subtests, describing different visual search strategies, and global and specific cognitive measures. Comparative visual search was associated with attention and cognitive flexibility (CTT), whereas visual searches involving pictograms correlated with global cognitive function (MoCA).

  16. Evaluation of the preliminary auditory profile test battery in an international multi-centre study

    NARCIS (Netherlands)

    van Esch, T.E.M.; Kollmeier, B.; Vormann, M.; Lijzenga, J.; Houtgast, T.; Hallgren, M.; Larsby, B.; Athalye, S.P.; Lutman, M.E.; Dreschler, W.A.

    2013-01-01

    Objective: This paper describes the composition and international multi-centre evaluation of a battery of tests termed the preliminary auditory profile. It includes measures of loudness perception, listening effort, speech perception, spectral and temporal resolution, spatial hearing, self-reported

  17. Influence of mechanical stress level in preliminary stress-corrosion testing on fatigue strength of a low-carbon steel

    International Nuclear Information System (INIS)

    Aleskerova, S.A.; Pakharyan, V.A.

    1978-01-01

    The effect of the corrosion and mechanical factors of preliminary stress corrosion on the fatigue strength of a metal has been investigated. Smooth cylindrical samples of 20 steel were tested. Preliminary corrosion under stress was carried out under natural sea conditions. It is shown that mechanical stresses during preliminary corrosion affect the fatigue strength of low-carbon steels, decreasing the range of limited durability and the fatigue limit. This effect increases with increasing stress level and aggressiveness of the corrosive medium.

  18. Preliminary report on fire protection research program (July 6, 1977 test)

    International Nuclear Information System (INIS)

    Klamerus, L.J.

    1977-10-01

    This preliminary report describes a fire test performed at Sandia Laboratories on an array of cable trays filled with fire retardant (IEEE 383 qualified) electrical cable. The cable trays were arranged in an open-space horizontal configuration with the separation distances of Regulatory Guide 1.75 between those trays representing redundant safety divisions. Propane burners were used to produce a fully developed cable fire in one tray which then was allowed to interact with other trays. From this test it appears that it is possible for a fire to propagate across the vertical separation distance between safety divisions, if a fully developed cable fire is the initiating event

  19. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  20. Preliminary Test for Constitutive Models of CAP

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Lee, Keo Hyung; Kim, Min Ki; Lee, Byung Chul [FNC Tech., Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2010-05-15

    The development project for a domestic design code was launched for the safety and performance analysis of pressurized light water reactors. As part of this project, the CAP (Containment Analysis Package) code has been developed for containment safety and performance analysis, side by side with SPACE. The CAP code treats three fields (vapor, continuous liquid, and dispersed drops) for the assessment of containment-specific phenomena, and features assessment capabilities in both multi-dimensional and lumped-parameter thermal hydraulic cells. The thermal hydraulics solver has been developed and has made significant progress. Implementation of well-proven constitutive models and correlations is essential in order for a containment code to be used for general or optimized purposes. Generally, constitutive equations are composed of interfacial and wall transport models and correlations. These equations are included in the source terms of the governing field equations. In order to develop the best model and correlation package for the CAP code, various models currently used in major containment analysis codes, such as GOTHIC, CONTAIN 2.0, and CONTEMPT-LT, were reviewed. Several models and correlations were incorporated for a preliminary test of CAP's performance; the test results and future plans for improving the code are discussed in this paper

  1. Preliminary Test for Constitutive Models of CAP

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Lee, Keo Hyung; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The development project for a domestic design code was launched for the safety and performance analysis of pressurized light water reactors. As part of this project, the CAP (Containment Analysis Package) code has been developed for containment safety and performance analysis, side by side with SPACE. The CAP code treats three fields (vapor, continuous liquid, and dispersed drops) for the assessment of containment-specific phenomena, and features assessment capabilities in both multi-dimensional and lumped-parameter thermal hydraulic cells. The thermal hydraulics solver has been developed and has made significant progress. Implementation of well-proven constitutive models and correlations is essential in order for a containment code to be used for general or optimized purposes. Generally, constitutive equations are composed of interfacial and wall transport models and correlations. These equations are included in the source terms of the governing field equations. In order to develop the best model and correlation package for the CAP code, various models currently used in major containment analysis codes, such as GOTHIC, CONTAIN 2.0, and CONTEMPT-LT, were reviewed. Several models and correlations were incorporated for a preliminary test of CAP's performance; the test results and future plans for improving the code are discussed in this paper

  2. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    Science.gov (United States)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that, for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both hit-miss and signal-amplitude testing, where signal amplitudes are reduced to hit-miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are executed sequentially in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology's results is discussed. Validated guidelines for binomial estimation of POD for fracture-critical inspection are established.
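
The 90/95 criterion connects to a simple binomial fact: when every one of n inspections at a flaw size is a hit, the exact one-sided lower confidence bound on POD is (1 − confidence)^(1/n). A sketch of that standard binomial reasoning (not the DOEPOD code itself):

```python
def pod_lower_bound_all_hits(n_hits, confidence=0.95):
    # Exact one-sided lower confidence bound on POD when all n inspections
    # are hits: the POD value p solving p**n = 1 - confidence, i.e. the
    # largest p that the observed run of hits cannot reject.
    return (1.0 - confidence) ** (1.0 / n_hits)

# 29 consecutive hits are the classic minimum demonstrating 90/95 POD;
# 28 hits fall just short of the 0.90 bound.
print(pod_lower_bound_all_hits(29))
print(pod_lower_bound_all_hits(28))
```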

  3. Preliminary Design of a LSA Aircraft Using Wind Tunnel Tests

    Directory of Open Access Journals (Sweden)

    Norbert ANGI

    2015-12-01

    Full Text Available This paper presents preliminary results concerning the design and aerodynamic calculations of a light sport aircraft (LSA). These were performed for a new lightweight, low-cost, low-fuel-consumption and long-range aircraft. The design process was based on specific software tools such as Advanced Aircraft Analysis (AAA), XFLR5 for aerodynamic and dynamic stability analysis, and CATIA for design, according to CS-LSA requirements. The calculations were complemented by a series of tests performed in the wind tunnel in order to assess experimentally the aerodynamic characteristics of the airplane.

  4. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2010-06-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  5. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2007-08-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  6. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    International Nuclear Information System (INIS)

    Lee C. Cadwallader

    2007-01-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with 'generic' component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance

  7. Preliminary results of hydrologic testing: The composite Umtanum basalt flow top at borehole RRL-2 (3,568 - 3,781 feet)

    International Nuclear Information System (INIS)

    Strait, S.R.; Spane, F.A. Jr.

    1982-11-01

    This report presents preliminary results and a description of hydrologic test activities for the composite Umtanum basalt flow top (3,568-3,781 feet) at Borehole RRL-2. Hydrologic tests conducted include two constant-discharge air-lift tests and four slug tests. Preliminary results indicate an observed hydraulic head for the test interval of 405.7 feet above mean sea level. Transmissivity values determined from the hydrologic tests performed range between 244 and 481 ft²/day, with an assigned best estimate of 480 ft²/day. The best estimate of equivalent hydraulic conductivity, based on an effective test thickness of 157 feet, is 3.1 ft/day. 7 refs., 9 figs., 3 tabs
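
    The best-estimate conductivity quoted above follows from the standard relation K = T / b, the transmissivity divided by the effective test thickness. A trivial check of the reported numbers:

```python
def hydraulic_conductivity(transmissivity_ft2_per_day: float,
                           thickness_ft: float) -> float:
    # Equivalent hydraulic conductivity: K = T / b
    return transmissivity_ft2_per_day / thickness_ft

# Best-estimate transmissivity of 480 ft^2/day over 157 ft of effective
# test thickness reproduces the reported 3.1 ft/day.
print(round(hydraulic_conductivity(480, 157), 1))  # 3.1
```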

  8. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
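
    Simultaneous 1-α intervals of this kind can be approximated by simulation. The sketch below uses a crude Bonferroni split of α across the n plotted points (a conservative stand-in for the authors' construction, not their method); a sample passes the graphical test exactly when all of its ordered values fall inside the band:

```python
import numpy as np

def normal_plot_band(n: int, alpha: float = 0.05,
                     reps: int = 20000, seed: int = 0):
    """Simulated envelopes for the n order statistics of a standard normal
    sample, splitting alpha evenly across the n points (Bonferroni)."""
    rng = np.random.default_rng(seed)
    sims = np.sort(rng.standard_normal((reps, n)), axis=1)
    beta = alpha / n  # per-point error budget
    lo = np.quantile(sims, beta / 2, axis=0)
    hi = np.quantile(sims, 1 - beta / 2, axis=0)
    # Fraction of simulated normal samples lying entirely inside the band;
    # by the union bound this is at least about 1 - alpha.
    joint = np.all((sims >= lo) & (sims <= hi), axis=1).mean()
    return lo, hi, joint

lo, hi, joint = normal_plot_band(10)
```

Because adjacent order statistics are highly correlated, the Bonferroni band over-covers slightly, which is exactly the conservatism the paper's calibrated intervals avoid.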

  9. National Data Center Preparedness Exercise 2015 (NPE 2015): MY-NDC Preliminary Analysis Result

    International Nuclear Information System (INIS)

    Faisal Izwan Abdul Rashid; Muhammed Zulfakar Zolkaffly

    2016-01-01

    Malaysia established the CTBT National Data Centre (MY-NDC) in December 2005. MY-NDC is tasked to perform Comprehensive Nuclear-Test-Ban Treaty (CTBT) data management as well as to provide information on Treaty-related events to Nuclear Malaysia as the CTBT National Authority. In 2015, MY-NDC participated in the National Data Centre Preparedness Exercise 2015 (NPE 2015). This paper presents the MY-NDC preliminary analysis results for NPE 2015, in which MY-NDC performed five different analyses, namely radionuclide, atmospheric transport modelling (ATM), data fusion, seismic analysis and site forensics. The preliminary findings show that the hypothetical scenario in NPE 2015 was most probably an uncontained event that resulted in a high release of radionuclides to the air. (author)

  10. Preliminary analysis of accelerated space flight ionizing radiation testing

    Science.gov (United States)

    Wilson, J. W.; Stock, L. V.; Carter, D. J.; Chang, C. K.

    1982-01-01

    A preliminary analysis shows that a radiation dose equivalent to 30 years in the geosynchronous environment can be accumulated in a typical composite material exposed to space for 2 years or less onboard a spacecraft orbiting from a perigee of 300 km out to the peak of the inner electron belt (approximately 2750 km). Future work to determine spacecraft orbits better tailored to accelerated materials testing is indicated. It is predicted that 10^9 to 10^10 rads would be accumulated in 3-6-mil-thick epoxy/graphite exposed by a test spacecraft orbiting in the inner electron belt. This dose is equivalent to the accumulated dose that this material would be expected to receive after 30 years in a geosynchronous orbit. It is anticipated that material specimens would be brought back to Earth after 2 years in the radiation environment so that space radiation effects on materials could be analyzed by laboratory methods.

  11. Preliminary analysis of a 1:4 scale prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Dameron, R.A.; Rashid, Y.R.; Luk, V.K.; Hessheimer, M.F.

    1997-01-01

    Sandia National Laboratories is conducting a research program to investigate the integrity of nuclear containment structures. As part of the program, Sandia will construct an instrumented 1:4 scale model of a prestressed concrete containment vessel (PCCV) for pressurized water reactors (PWR), which will be pressure tested up to its ultimate capacity. One of the key program objectives is to develop validated methods to predict the structural performance of containment vessels when subjected to beyond-design-basis loadings. Analytical prediction of structural performance requires a stepwise, systematic approach that addresses all potential failure modes. The analysis effort includes two- and three-dimensional nonlinear finite element analyses of the PCCV test model to evaluate its structural performance under very high internal pressurization. Such analyses have been performed using the nonlinear concrete constitutive model, ANACAP-U, in conjunction with the ABAQUS general purpose finite element code. The analysis effort is carried out in three phases: preliminary analysis; pretest prediction; and post-test data interpretation and analysis evaluation. The preliminary analysis phase serves to provide instrumentation support and identify candidate failure modes. The associated tasks include the preliminary prediction of failure pressure and probable failure locations and the development of models to be used in the detailed failure analyses. This paper describes the modeling approaches and some of the results obtained in the first phase of the analysis effort

  12. Preliminary Study of Intravenous Amantadine Treatment for Ataxia Management in Patients with Probable Multiple System Atrophy with Predominant Cerebellar Ataxia

    Directory of Open Access Journals (Sweden)

    Jinyoung Youn

    2012-05-01

    Full Text Available Background and Purpose: Multiple system atrophy with predominant cerebellar ataxia is a disabling neurologic disease. However, effective management has not yet been established. We conducted a short-term, open-label preliminary study to assess the benefits of intravenous amantadine treatment in patients with probable multiple system atrophy with predominant cerebellar ataxia. Methods: Twenty patients (10 male, 10 female) with probable multiple system atrophy with predominant cerebellar ataxia received 400 mg of amantadine intravenously per day for 5 days. Ataxia severity was evaluated by the International Cooperative Ataxia Rating Scale before and after intravenous amantadine therapy, and all subjects reported subjective improvement after intravenous amantadine treatment using a patient global impression scale. We analyzed the total and subscale scores of the ataxia scale and the patient global impression scale. Results: The mean age was 57.4 years (range: 47-72) and the mean disease duration was 30.8 months (range: 11-79). Ataxia severity significantly decreased after intravenous amantadine therapy, from 42.5 to 37.3 (p < 0.001). The mean patient global impression scale for improvement was 2.9, and no side effects of intravenous amantadine treatment were observed. Among the responders, the duration of the intravenous amantadine effect was more than 1 month in 4 of 7 responders. Conclusions: Our findings suggest that intravenous amantadine treatment can be a safe management option in cerebellar ataxia, although the mechanism is unclear. Thus, further double-blind, long-term studies with a larger sample size are needed.

  13. Preliminary results of hydrologic testing of the Umtanum Basalt Fracture Zone at borehole RRL-2 (3,781 to 3,827 ft)

    International Nuclear Information System (INIS)

    Strait, S.R.; Spane, F.A. Jr.

    1983-02-01

    This report presents preliminary results and description of hydrologic test activities for the Umtanum Basalt Fracture Zone at Borehole RRL-2, within the test interval 3,781 to 3,827 feet. Hydrologic tests conducted include two short-term, constant-discharge pumping tests and two slug tests. Preliminary results indicate an observed hydraulic head for the test interval of 406.7 feet above mean sea level. Transmissivity values determined from the hydrologic tests performed range between 205 and 881 ft²/day. The best estimate of equivalent hydraulic conductivity, based on an effective test thickness of 6 feet, is 147 ft/day. 8 refs., 6 figs., 3 tabs

  14. Preliminary study on tensile properties and fractography of the recycled aluminum cast product

    International Nuclear Information System (INIS)

    Hishamuddin Hussain; Mohd Harun; Hafizal Yazid; Shaiful Rizam Shamsudin; Zaiton Selamat; Mohd Shariff Sattar

    2004-01-01

    Among the many mechanical properties of materials, tensile properties are probably the most frequently considered, evaluated, and referred to by industry. This paper presents the results of a preliminary study of the tensile properties and fractography of a recycled aluminum cast product. For this purpose, three sets of specimens were prepared for tensile testing using a permanent mold casting technique. The cast products are shaped tensile specimens with a gauge length of 50 mm. The tensile testing was conducted in accordance with the BS EN 10002-1 and ISO 6892 standards. Fracture surface analysis was also conducted to understand the material behaviour. (Author)

  15. Exploratory shaft facility preliminary designs - Paradox Basin. Technical report

    International Nuclear Information System (INIS)

    1983-09-01

    The purpose of the Preliminary Design Report, Paradox Basin, is to provide a description of the preliminary design for an Exploratory Shaft Facility in the Paradox Basin, Utah. This issue of the report describes the preliminary design for constructing the exploratory shaft using the Large Hole Drilling Method of construction and outlines the preliminary design and estimates of probable construction cost. The Preliminary Design Report is prepared to complement and summarize other documents that comprise the design at the preliminary stage of completion, December 1982. Other design documents include drawings, cost estimates and schedules. The preliminary design drawing package, which includes the construction schedule drawing, depicts the descriptions in this report. For reference, a list of the drawing titles and corresponding numbers is included in the Appendix. The report is divided into three principal sections: Design Basis, Facility Description, and Construction Cost Estimate. 30 references

  16. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
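
    The medical-screening use of Bayes' theorem mentioned here is easy to make concrete. With hypothetical sensitivity, specificity, and prevalence values (all made up for illustration), the positive predictive value P(disease | positive test) can be far below the test's accuracy:

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value P(disease | positive) via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 99%-sensitive, 95%-specific test for a condition with 1% prevalence:
# only about 1 in 6 positives actually has the condition.
print(round(ppv(0.99, 0.95, 0.01), 3))  # 0.167
```

The surprise here, that a highly accurate test yields mostly false positives at low prevalence, is exactly the kind of conditional-probability misconception the study investigates.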

  17. Preliminary design studies on the Broad Application Test Reactor

    International Nuclear Information System (INIS)

    Terry, W.J.; Terry, W.K.; Ryskamp, J.M.; Jahshan, S.N.; Fletcher, C.D.; Moore, R.L.; Leyse, C.F.; Ottewitte, E.H.; Motloch, C.G.; Lacy, J.M.

    1992-08-01

    This report describes progress made at the Idaho National Engineering Laboratory during the first three quarters of Fiscal Year (FY) 1992 on the Laboratory-Directed Research and Development (LDRD) project to perform preliminary design studies on the Broad Application Test Reactor (BATR). This work builds on the FY-91 BATR studies, which identified anticipated mission and safety requirements for BATR and assessed a variety of reactor concepts for their potential capability to meet those requirements. The main accomplishment of the FY-92 BATR program is the development of baseline reactor configurations for the two conventional conceptual test reactors recommended in the FY-91 report. Much of the present report consists of descriptions and neutronics and thermohydraulics analyses of these baseline configurations. In addition, we considered reactor safety issues, compared the consequences of steam explosions for alternative conventional fuel types, explored a Molten Chloride Fast Reactor concept as an alternate BATR design, and examined strategies for the reduction of operating costs. Work planned for the last quarter of FY-92 is discussed, and recommendations for future work are also presented

  18. Preliminary study of the interference of proteic compounds of radiopharmaceuticals in the Limulus amebocyte lysate (LAL) test

    International Nuclear Information System (INIS)

    Aldana, Claudia

    1997-01-01

    The objective of this thesis was to evaluate the interference of proteic compounds of radiopharmaceuticals in the LAL (Limulus amebocyte lysate) test. For this purpose, macroaggregates of albumin (MAA) were used, with methylene diphosphonate (MDP) as a control, MDP being the radiopharmaceutical most used in the nuclear medicine centers of the country. Initially, preliminary tests were carried out to assess whether either of the two radiopharmaceuticals would interfere with the LAL test; afterwards the test was validated and finally routine tests were made. The preliminary assays showed that the proteic compounds of the MAA (albumin at a concentration of 2 mg/dl) did not cause interference, whereas the MDP did interfere with the LAL test. The interference was eliminated with a 1:8 dilution of the sample. It was concluded that the success of the LAL test depends on conditions such as temperature, pH and constant incubation (without even minimal variations), and that it is a good test for the quality control of radiopharmaceuticals

  19. Mineralogical test as a preliminary step for metallurgical processing of Kalan ores

    International Nuclear Information System (INIS)

    Affandi, K.

    1998-01-01

    Mineralogical tests, as a preliminary step for hydrometallurgy of the Kalan ores, including Eko Remaja and Rirang, have been carried out to identify the element and mineral contents which affect the metallurgical process, especially the leaching and purification of uranium. The mineralogical tests comprised radioactive and radioluxugraph tests to identify radioactive minerals; thin-specimen analysis and Scanning Electron Microscopy (SEM) to identify elements and morphology; EPMA to analyse the elements qualitatively; X-ray Diffractometry (XRD) to identify the mineral content; and X-ray Fluorescence (XRF) and chemical analyses to determine total elements qualitatively and quantitatively. The experimental results show that the Eko Remaja ores contain uraninite and brannerite, iron and titanium oxides, and sulfide, phosphate and silicate minerals, while the Rirang ores contain uraninite, monazite and molybdenite

  20. Exploratory shaft facility preliminary designs - Gulf Interior Region salt domes

    International Nuclear Information System (INIS)

    1983-09-01

    The purpose of the Preliminary Design Report, Gulf Interior Region, is to provide a description of the preliminary design for an Exploratory Shaft Facility on the Richton Dome, Mississippi. This issue of the report describes the preliminary design for constructing the exploratory shaft using the Large Hole Drilling method of construction and outlines the preliminary design and estimates of probable construction cost. The Preliminary Design Report is prepared to complement and summarize other documents that comprise the design at the preliminary stage of completion, December 1982. Other design documents include drawings, cost estimates and schedules. The preliminary design drawing package, which includes the construction schedule drawing, depicts the descriptions in this report. For reference, a list of the drawing titles and corresponding numbers is included in the Appendix. The report is divided into three principal sections: Design Basis, Facility Description and Construction Cost Estimate

  1. Do preliminary chest X-ray findings define the optimum role of pulmonary scintigraphy in suspected pulmonary embolism?

    International Nuclear Information System (INIS)

    Forbes, Kirsten P.N.; Reid, John H.; Murchison, John T.

    2001-01-01

    AIM: To investigate if preliminary chest radiograph (CXR) findings can define the optimum role of lung scintigraphy in subjects investigated for pulmonary embolism (PE). MATERIALS AND METHODS: The CXR and scintigraphy findings from 613 consecutive subjects investigated for suspected PE were retrieved from a radiological database. Of 393 patients with abnormal CXRs, a subgroup of 238 was examined and individual radiographic abnormalities were characterized. CXR findings were related to the scintigraphy result. RESULTS: Scintigraphy was normal in 286 subjects (47%), non-diagnostic in 207 (34%) and high probability for PE in 120 (20%). In 393 subjects (64%) the preliminary CXR was abnormal and 188 (48%) of scintigrams in this group were non-diagnostic. Individual radiographic abnormalities were not associated with significantly different scintigraphic outcomes. If the preliminary CXR was normal (36%), the proportion of non-diagnostic scintigrams decreased to 9% (19 of 220 subjects) (P < 0.05). CONCLUSION: In subjects investigated for PE, an abnormal CXR increases the prevalence of non-diagnostic scintigrams. A normal pre-test CXR is more often associated with a definitive (normal or high probability) scintigram result. The chest radiograph may be useful in deciding the optimum sequence of investigations.

  2. Preliminary testing of a planar converter with uranium oxide pellets in the emitter

    International Nuclear Information System (INIS)

    Miskolczy, G.; Lieb, D.P.; Hatch, G.L.

    1992-01-01

    Nuclear reactor thermionic space power systems incorporating thermionic fuel elements generally use refractory metal emitters, which contain the nuclear fuel. The purpose of the current work is to determine the effect, if any, of the diffusion of uranium oxide fuel through chemically vapor deposited (CVD) tungsten on converter performance. This paper describes the preliminary testing of the converter to assess converter performance before any significant diffusion takes place. In testing, the emitter temperature was 1800 K and the collector temperature was varied from 1000 K to 1070 K. Experiments also examined the pressure-versus-loading characteristics of the graphite

  3. Preliminary test results from the HSST shallow-crack fracture toughness program

    International Nuclear Information System (INIS)

    Theiss, T.J.; Robinson, G.C.; Rolfe, S.T.

    1991-01-01

    The Heavy Section Steel Technology (HSST) Program, under the sponsorship of the Nuclear Regulatory Commission (NRC), is investigating the influence of crack depth on the fracture toughness of reactor pressure vessel steel. The ultimate goal of the investigation is the generation of a limited database of elastic-plastic fracture toughness values appropriate for shallow flaws in a reactor pressure vessel and the application of these data to reactor vessel life assessments. It has been shown that shallow flaws play a dominant role in the probabilistic fracture mechanics analysis of reactor pressure vessels during a pressurized-thermal-shock event. In addition, recent research has shown that the crack initiation toughness measured using specimens with shallow flaws is greater than the toughness determined with conventional, deeply notched specimens at temperatures within the transition region for non-nuclear steels. The influence of crack depth on the elastic-plastic fracture toughness for prototypic reactor material is being investigated. Preliminary results indicate a significant increase in the toughness associated with shallow flaws, which has the potential to significantly impact the conditional probability of vessel failure. 8 refs., 4 figs., 1 tab

  4. Frequency formats, probability formats, or problem structure? A test of the nested-sets hypothesis in an extensional reasoning task

    Directory of Open Access Journals (Sweden)

    William P. Neace

    2008-02-01

    Full Text Available Five experiments addressed a controversy in the probability judgment literature that centers on the efficacy of framing probabilities as frequencies. The natural frequency view predicts that frequency formats attenuate errors, while the nested-sets view predicts that highlighting the set-subset structure of the problem reduces error, regardless of problem format. This study tested these predictions using a conjunction task. Previous studies reporting that frequency formats reduced conjunction errors confounded reference class with problem format. After controlling this confound, the present study's findings show that conjunction errors can be reduced using either a probability or a frequency format, that frequency effects depend upon the presence of a reference class, and that frequency formats do not promote better statistical reasoning than probability formats.
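
    Whatever the presentation format, the nested-sets structure is a logical constraint: the A-and-B subset can never be larger than the A superset, so P(A and B) <= P(A). A tiny simulation with arbitrary, made-up trait probabilities illustrates the point in both frequency and probability formats:

```python
import random

random.seed(42)

# Hypothetical sample: each person independently has trait A and/or trait B.
people = [(random.random() < 0.6, random.random() < 0.3)
          for _ in range(10000)]

n_a = sum(1 for a, b in people if a)
n_ab = sum(1 for a, b in people if a and b)

# Frequency format: the subset count cannot exceed the superset count.
assert n_ab <= n_a
# Probability format: the same fact, stated as relative frequencies.
print(n_ab / 10000 <= n_a / 10000)  # True
```

A conjunction error is any judgment that ranks the A-and-B event above A itself, which no sample, however represented, can exhibit.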

  5. Results of Preliminary Tests of PAR Bunch Cleaning

    CERN Document Server

    Yao, Chihyuan; Grelick, Arthur; Lumpkin, Alex H; Sereno, Nicholas S

    2005-01-01

    A particle accumulator ring (PAR) is used at the Advanced Photon Source (APS) to collect multiple linac bunches and compress them into a 0.3-ns (rms) single bunch for booster injection. A 9.77-MHz fundamental rf system and a 117.3-MHz harmonic rf system are employed for initial beam capture and bunch-length compression. Satellite bunches with very low charge form due to rf phase drifts or beam-loading changes. These satellites, when injected into the booster and then into the storage ring (SR), cause bunch impurity at three buckets from the target bucket. Storage ring and booster bunch cleaning was tried but proved to be difficult due to the top-up mode of operation in the storage ring and tune drift in the booster synchrotron. Recently we implemented a PAR bunch-cleaning system with tune-modulated harmonic rf knockout. Preliminary tests gave a measured SR bunch purity of better than 10

  6. Static Aeroelastic Deformation Effects in Preliminary Wind-tunnel Tests of Silent Supersonic Technology Demonstrator

    OpenAIRE

    Makino, Yoshikazu; Ohira, Keisuke; Makimoto, Takuya; Mitomo, Toshiteru; 牧野, 好和; 大平, 啓介; 牧本, 卓也; 三友, 俊輝

    2011-01-01

    Effects of the static aeroelastic deformation of a wind-tunnel test model on the aerodynamic characteristics are discussed for wind-tunnel tests in the preliminary design phase of the silent supersonic technology demonstrator (S3TD). The static aeroelastic deformation of the main wing is estimated for the JAXA 2m x 2m transonic wind tunnel and 1m x 1m supersonic wind tunnel by a finite element method (FEM) structural analysis in which its structural model is tuned with the model deformation calibratio...

  7. Persian competing word test: Development and preliminary results in normal children

    Directory of Open Access Journals (Sweden)

    Mohammad Ebrahim Mahdavi

    2008-12-01

    Full Text Available Background and Aim: Assessment of central auditory processing skills requires various behavioral tests in the format of a test battery. There are few Persian speech tests for documenting central auditory processing disorders. The purpose of this study was to develop a dichotic test composed of monosyllabic words suitable for the evaluation of central auditory processing in Persian-language children, and to report its preliminary results in a group of normal children. Materials and Methods: The Persian words-in-competing-manner test was developed using the most frequent monosyllabic words in children's storybooks reported in previous research. The test was performed at MCL on forty-five normal children (39 right-handed and 6 left-handed) aged 5-11 years. The children did not show any obvious problem in hearing, speech, language or learning. Free recall (n=28) and directed listening (n=17) tasks were investigated. Results: The results show that in the directed listening task there is a significant advantage for the performance of the pre-cued ear relative to the opposite side. A right-ear advantage is evident in the free recall condition. The average performance of the children in directed recall is significantly better than in free recall. The average raw score of the test increases with the children's age. Conclusion: The Persian words-in-competing-manner test, as a dichotic test, can show the major characteristics of dichotic listening and the effect of maturation of the central auditory system on it in normal children.

  8. Design and preliminary testing of a MEMS microphone phased array for aeroacoustic testing of a small-scale wind turbine airfoil

    Energy Technology Data Exchange (ETDEWEB)

    Bale, A.; Orlando, S.; Johnson, D. [Waterloo Univ., ON (Canada). Wind Energy Group

    2010-07-01

    One of the barriers preventing the widespread utilization of wind turbines is the audible sound that they produce. Developing quieter wind turbines will increase the amount of available land onto which wind farms can be built. Noise emissions from wind turbines can be attributed to the aerodynamic effects between the turbine blades and the air surrounding them. A dominant source of these aeroacoustic emissions from wind turbines is known to originate at the trailing edges of the airfoils. This study investigated the flow physics of noise generation in an effort to reduce noise from small-scale wind turbine airfoils. The trailing edge noise was studied on scale models in wind tunnels and applied to full-scale conditions. Microphone phased arrays are popular research tools in wind tunnel aeroacoustic studies because they can measure and locate noise sources. However, large arrays of microphones can be prohibitively expensive. This paper presented preliminary testing of micro-electro-mechanical systems (MEMS) microphones in phased arrays for aeroacoustic testing on a small wind turbine airfoil. Preliminary results showed that MEMS microphones are an acceptable low-cost alternative to costly condenser microphones. 19 refs., 1 tab., 11 figs.

  9. Preliminary Feasibility, Design, and Hazard Analysis of a Boiling Water Test Loop Within the Idaho National Laboratory Advanced Test Reactor National Scientific User Facility

    International Nuclear Information System (INIS)

    Gerstner, Douglas M.

    2009-01-01

    The Advanced Test Reactor (ATR) is a pressurized light-water reactor with a design thermal power of 250 MW. The principal function of the ATR is to provide a high neutron flux for testing reactor fuels and other materials. The ATR and its support facilities are located at the Idaho National Laboratory (INL). A Boiling Water Test Loop (BWTL) is being designed for one of the irradiation test positions within the ATR. The objective of the new loop will be to simulate boiling water reactor (BWR) conditions to support clad corrosion and related reactor material testing. Further, it will accommodate power-ramping tests of candidate high-burnup fuels and fuel pins/rods for the commercial BWR utilities. The BWTL will be much like the pressurized water loops already in service in 5 of the 9 'flux traps' (regions of enhanced neutron flux) in the ATR. The loop coolant will be isolated from the primary coolant system so that the loop's temperature, pressure, flow rate, and water chemistry can be independently controlled. This paper presents the proposed general design of the in-core and auxiliary BWTL systems; the preliminary results of the neutronics and thermal hydraulics analyses; and the preliminary hazard analysis for safe normal and transient BWTL and ATR operation

  10. Preliminary Test of Friction Disk Type Turbine for S-CO2 Cycle Application

    Energy Technology Data Exchange (ETDEWEB)

    Baik, Seungjoon; Kim, Hyeon Tae; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    Due to the relatively mild sodium-CO2 interaction, the S-CO2 Brayton cycle can reduce the accident consequences compared to the steam Rankine cycle. The S-CO2 power conversion cycle can also achieve high efficiency at SFR core thermal conditions, and it can reduce the total cycle footprint due to the high density of the working fluid. However, the high-pressure operating conditions and low viscosity of the fluid cause difficulties in designing appropriate seals and multi-stage turbomachinery. To solve the turbomachinery design problem in a creative way, a KAIST research team tested a friction-disk-type turbine concept, named the Tesla turbine, for S-CO2 cycle applications. This paper covers the investigation of the Tesla turbine and preliminary test results with compressed air. Due to the robust design of the friction-disk type, the Tesla turbine technology can be utilized not only for S-CO2 turbomachinery but also for multi-phase or sludge-flow turbomachinery. A preliminary test of a lab-scale Tesla turbine with compressed air was conducted, and a high-pressure vessel was manufactured for the S-CO2 operating condition. Future tests will concentrate on turbine efficiency measurements under various conditions and on the development of the design methodology.

  11. Standing the Test of Time: Reference for a Preliminary Ruling

    DEFF Research Database (Denmark)

    Butler, Graham

    2017-01-01

    It is often too easy to forget just how important the preliminary reference procedure is for the functionality of European Union law. For the Court of Justice, there are both formal and informal means of judicial dialogue. This article focuses on the formal means of dialogue through the preliminary...

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Preliminary results on tests of a Cerenkov ring imaging device employing a photoionizing PWC

    Energy Technology Data Exchange (ETDEWEB)

    Durkin, S.; Honma, A.; Leith, D.W.G.S.

    1978-08-01

    Techniques and problems of ring imaging Cerenkov detectors employing photoionizing PWCs are briefly discussed. Preliminary results on a one-dimensional ring imaging device tested at SLAC in May and June of 1978 are then presented. These results include rough measurements of the Cerenkov ring in nitrogen, argon, neon, and helium produced by a collimated positron beam.

  14. The development and preliminary testing of a multimedia patient–provider survivorship communication module for breast cancer survivors

    Science.gov (United States)

    Wen, Kuang-Yi; Miller, Suzanne M.; Stanton, Annette L.; Fleisher, Linda; Morra, Marion E.; Jorge, Alexandra; Diefenbach, Michael A.; Ropka, Mary E.; Marcus, Alfred C.

    2012-01-01

    Objective This paper describes the development of a theory-guided and evidence-based multimedia training module to facilitate breast cancer survivors’ preparedness for effective communication with their health care providers after active treatment. Methods The iterative developmental process used included: (1) theory and evidence-based content development and vetting; (2) user testing; (3) usability testing; and (4) participant module utilization. Results Formative evaluation of the training module prototype occurred through user testing (n = 12), resulting in modification of the content and layout. Usability testing (n = 10) was employed to improve module functionality. Preliminary web usage data (n = 256, mean age = 53, 94.5% White, 75% college graduate and above) showed that 59% of the participants accessed the communication module, for an average of 7 min per login. Conclusion The iterative developmental process was informative in enhancing the relevance of the communication module. Preliminary web usage results demonstrate the potential feasibility of such a program. Practice implications Our study demonstrates survivors’ openness to the use of a web-based communication skills training module and outlines a systematic iterative user and interface program development and testing process, which can serve as a prototype for others considering such an approach. PMID:22770812

  15. The development and preliminary testing of a multimedia patient-provider survivorship communication module for breast cancer survivors.

    Science.gov (United States)

    Wen, Kuang-Yi; Miller, Suzanne M; Stanton, Annette L; Fleisher, Linda; Morra, Marion E; Jorge, Alexandra; Diefenbach, Michael A; Ropka, Mary E; Marcus, Alfred C

    2012-08-01

    This paper describes the development of a theory-guided and evidence-based multimedia training module to facilitate breast cancer survivors' preparedness for effective communication with their health care providers after active treatment. The iterative developmental process used included: (1) theory and evidence-based content development and vetting; (2) user testing; (3) usability testing; and (4) participant module utilization. Formative evaluation of the training module prototype occurred through user testing (n = 12), resulting in modification of the content and layout. Usability testing (n = 10) was employed to improve module functionality. Preliminary web usage data (n = 256, mean age = 53, 94.5% White, 75% college graduate and above) showed that 59% of the participants accessed the communication module, for an average of 7 min per login. The iterative developmental process was informative in enhancing the relevance of the communication module. Preliminary web usage results demonstrate the potential feasibility of such a program. Our study demonstrates survivors' openness to the use of a web-based communication skills training module and outlines a systematic iterative user and interface program development and testing process, which can serve as a prototype for others considering such an approach. Copyright © 2012. Published by Elsevier Ireland Ltd.

  16. Preliminary scenarios for the release of radioactive waste from a hypothetical repository in basalt of the Columbia Plateau

    International Nuclear Information System (INIS)

    Hunter, R.L.

    1983-10-01

    Nine release phenomena - normal flow of water, tectonic disturbance of the fracture network, intersection of a new fault with the repository, glaciation, fluvial erosion, thermomechanical disturbances, subsidence, seal failure, and drilling - give rise to 318 preliminary scenarios for the release of waste from a hypothetical high-level-waste repository in basalt. The scenarios have relative probabilities that range over several orders of magnitude. The relative probabilities provide a means of screening the scenarios for the more limited set to be subjected to consequence analysis. Lack of data and of preliminary modeling, however, leads to great uncertainties in the highly conservative probabilities assigned here. As a result, the recommendations in this report are directed at resolution of the major uncertainties in the relative probabilities of the preliminary scenarios. The resolution of some of the uncertainties should help in the selection of the suite of scenarios for final consequence analysis. 38 references, 22 figures, 18 tables
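    The screening step described in the abstract, keeping only scenarios whose relative probability is within a few orders of magnitude of the most probable one, can be sketched as a simple ranking-and-cutoff filter. The scenario names and probabilities below are invented for illustration; they are not values from the report.

```python
# Hedged sketch of probability-based scenario screening. All numbers
# and names here are illustrative placeholders, not report data.
scenarios = {
    "normal groundwater flow": 1e-1,
    "seal failure":            1e-3,
    "new fault intersection":  1e-6,
    "glaciation":              1e-8,
}

def screen(scens, orders_of_magnitude=4):
    """Keep scenarios within N orders of magnitude of the most probable one."""
    cutoff = max(scens.values()) / 10 ** orders_of_magnitude
    return sorted((s for s, p in scens.items() if p >= cutoff),
                  key=lambda s: -scens[s])

print(screen(scenarios))
# → ['normal groundwater flow', 'seal failure']
```

With a four-decade cutoff, only the two most probable scenarios survive for consequence analysis; widening the cutoff admits more of the low-probability tail.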

  17. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience, and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 x 10^-7 accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9/mile.
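    The rate estimate above is elementary arithmetic: one recorded accident divided by cumulative shipment mileage. A minimal sketch follows, with the mileage figure chosen as a placeholder so the quotient reproduces the quoted 5 x 10^-7 per mile; the report's actual shipping-volume data are not given here.

```python
import math

# Illustrative sketch of the rate estimate in the abstract. The
# shipment-mileage figure is an assumed placeholder, not report data.
accidents_observed = 1      # spent fuel transport accidents since 1971
shipment_miles = 2.0e6      # assumed cumulative shipment miles (placeholder)

accident_rate_per_mile = accidents_observed / shipment_miles
print(f"estimated accident rate: {accident_rate_per_mile:.1e} per mile")

# Probability that a shipment of d miles sees at least one accident,
# treating accidents as a Poisson process along the route:
d = 1000.0                  # shipment distance in miles (assumed)
p_at_least_one = 1.0 - math.exp(-accident_rate_per_mile * d)
print(f"P(accident on a {d:.0f}-mile shipment): {p_at_least_one:.1e}")
```

For rates this small, the Poisson expression is essentially the rate times the distance, which is why per-mile rates can be compared directly across transport modes.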

  18. Characteristics, finite element analysis, test description, and preliminary test results of the STM4-120 kinematic Stirling engine

    Science.gov (United States)

    Linker, K. L.; Rawlinson, K. S.; Smith, G.

    1991-10-01

    The Department of Energy's Solar Thermal Program has, as one of its program elements, the development and evaluation of conversion device technologies applicable to dish-electric systems. The primary research and development combines a conversion device (heat engine), solar receiver, and generator mounted at the focus of a parabolic dish concentrator. The Stirling-cycle heat engine was identified as the conversion device for dish-electric with the most potential for meeting the program's goals for efficiency, reliability, and installed cost. To advance the technology toward commercialization, Sandia National Laboratories has acquired a Stirling Thermal Motors, Inc. kinematic Stirling engine, STM4-120, for evaluation. The engine is being bench-tested at Sandia's Engine Test Facility and will be combined later with a solar receiver for on-sun evaluation. This report presents the engine characteristics, finite element analyses of critical engine components, test system layout, instrumentation, and preliminary performance results from the bench test.

  19. Manufacturing and preliminary tests of a 12 T ''wind and react'' coil

    International Nuclear Information System (INIS)

    Corte, A. della; Pasotti, G.; Sacchetti, N.; Spadoni, M.; Oliva, A.B.; Penco, R.; Parodi, S.; Valle, N.; Specking, W.

    1994-01-01

    As already reported, ENEA is engaged in the realization of a 12 T wind and react Nb3Sn coil, a subsize magnet designed to simulate many of the technological problems to be faced in NET-ITER magnets. EM-LMI and Ansaldo are the industrial partners in this project. A preliminary winding has been built and successfully tested. This winding was cut into pieces and carefully inspected to verify that the impregnation process after the heat treatment works well. No particular flaws were detected. Manufacturing of the 12 T magnet was then started and completed in about three months. Heat treatment, impregnation and electrical tests at 300 K have been successfully performed, and the magnet is now ready for final tests. In order to obtain the most significant scientific and technological information from this magnet, the original test program (insertion of the coil in the SULTAN facility) has been modified according to a decision of the Fusion Technology Steering Committee (FTSC) of EURATOM. Details of the new test programs are given in the paper.

  20. Preliminary Process Theory does not validate the Comparison Question Test: A comment on Palmatier and Rovner

    NARCIS (Netherlands)

    Ben-Shakar, G.; Gamer, M.; Iacono, W.; Meijer, E.; Verschuere, B.

    2015-01-01

    Palmatier and Rovner (2015) attempt to establish the construct validity of the Comparison Question Test (CQT) by citing extensive research ranging from modern neuroscience to memory and psychophysiology. In this comment we argue that merely citing studies on the preliminary process theory (PPT) of

  1. Stress analysis of HLW containers. Preliminary ring test exercise Compas project

    International Nuclear Information System (INIS)

    1989-01-01

    This document describes the series of experiments and associated calculations performed as the Compas preliminary ring test exercise. A number of mild steel rings, representative of sections through HLW containers, some notched and pre-cracked, were tested in compression right up to and beyond their ultimate load. The Compas project partners independently modelled the behaviour of these rings using their finite element codes. Four different ring types were tested, and each test was repeated three times. For three of the ring types, the three test repetitions gave identical results. The fourth ring, which was not modelled by the partners, had a 4 mm thick layer of weld metal deposited on its surface. The three tests on this ring did not give identical results and suggested that the effect of welding methods should be addressed at a later stage of the project. Fracture was not found to be a significant cause of ring failure. The results of the ring tests were compared with the partners' predictions, and additionally some time was spent assessing where the use of the codes could be improved. This exercise showed that the partners' codes have the ability to produce results within acceptable limits. Most codes were unable to model stable crack growth. There were indications that some codes would not be able to cope with a significantly more complex three-dimensional analysis.

  2. Probability density fittings of corrosion test-data: Implications on ...

    Indian Academy of Sciences (India)

    Steel-reinforced concrete; probability distribution functions; corrosion ... to be present in the corrosive system at a suitable concentration (Holoway et al 2004; Söylev & ..... voltage, equivalent to voltage drop, across a resistor divided by the ...

  3. Preliminary Tests Of The Decris-sc Ion Source

    CERN Document Server

    Efremov, A; Bechterev, V; Bogomolov, S L; Bondarenko, P G; Datskov, V I; Dmitriev, S; Drobin, V; Lebedev, A; Leporis, M; Malinowski, H; Nikiforov, A; Paschenko, S V; Seleznev, V; Shishov, Yu A; Smirnov, Yu; Tsvineva, G; Yakovlev, B; Yazvitsky, N Yu

    2004-01-01

    A new "liquid He-free" superconducting Electron Cyclotron Resonance ion source, DECRIS-SC, to be used as injector for the IC-100 small cyclotron, has been designed by FLNR and LHE JINR. Its main feature is that a compact refrigerator of Gifford-McMahon type is used to cool the solenoid coils. Because of the very small cooling power at 4.2 K (about 1 W), efforts were made to optimize the magnetic structure and minimize external heating of the coils. The maximum magnetic field strength is 3 T in the injection region and 2 T in the extraction region. For radial plasma confinement a hexapole made of NdFeB permanent magnet is used. The source will be capable of ECR plasma heating using different frequencies (14 GHz or 18 GHz). To be able to deliver usable intensities of solids, the design also allows axial access for an evaporation oven and metal samples using the plasma sputtering technique. Very preliminary results of the source test are presented.

  4. Preliminary Single-Phase Mixing Test using Wire Mesh System in a wire-wrapped 37-rod Bundle

    International Nuclear Information System (INIS)

    Bae, Hwang; Kim, Hyungmo; Lee, Dong Won; Choi, Hae Seob; Choi, Sun Rock; Chang, Seokkyu; Kim, Seok; Euh, Dongjin; Lee, Hyeongyeon

    2014-01-01

    In this paper, preliminary tests of the wire-mesh sensor are introduced before measurement of the mixing coefficient in the wire-wrapped 37-pin fuel assembly for a sodium-cooled fast reactor. Through this preliminary test, it was confirmed that city water can be used as a tracer, with demineralized water as a base. A simple test was performed to evaluate the characteristics of a wire mesh sensor with a short pipe shape. The conductivity of demineralized water and city water increases linearly with temperature over the limited temperature range tested. The reliability of the wire mesh sensor was estimated based on the averages and standard deviations of the plane image using the cross points. A wire mesh sensor is suitable for single-phase flow measurement of a mixture of demineralized water and city water. Wire mesh sensors and systems have traditionally been used to measure the void fraction of a two-phase flow field with gas and liquid. Recently, Ylonen et al. successfully designed and commissioned a measurement system for a single-phase flow using a wire mesh sensor.
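    The linearity observation in the abstract (conductivity increasing linearly with temperature over the tested range) amounts to a straight-line calibration. A minimal ordinary-least-squares sketch follows; the data points are invented placeholders, not measurements from the experiment.

```python
# Hedged sketch of a linear conductivity-vs-temperature calibration.
# The temperature and conductivity values are invented placeholders.
def linear_fit(xs, ys):
    """Ordinary least squares fit y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

temps = [20.0, 30.0, 40.0, 50.0]    # °C (assumed)
conds = [50.0, 61.0, 71.5, 82.0]    # µS/cm (assumed)
slope, intercept = linear_fit(temps, conds)
print(f"slope = {slope:.3f} µS/cm per °C, intercept = {intercept:.2f} µS/cm")
```

Once the slope and intercept are known, a conductivity reading can be corrected to a reference temperature before it is interpreted as a tracer concentration.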

  5. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States's Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to have comparable Type I and Type II error rates as the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as SPRT to current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
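    The sequential decision rule the abstract contrasts with the fixed-sample binomial test can be sketched as a standard Wald SPRT over binary (impaired / not impaired) sample outcomes. The exceedance rates and error tolerances below are illustrative placeholders, not California's actual 303(d) listing parameters.

```python
import math

def sprt(samples, p0=0.05, p1=0.25, alpha=0.05, beta=0.20):
    """Wald SPRT for a Bernoulli rate: H0: p = p0 vs H1: p = p1.

    samples: iterable of 0/1 outcomes (1 = sample exceeds the water
    quality threshold). Returns (decision, number of samples used).
    All parameter defaults are illustrative assumptions.
    """
    lower = math.log(beta / (1 - alpha))      # accept H0 at/below this
    upper = math.log((1 - beta) / alpha)      # accept H1 at/above this
    llr = 0.0                                 # cumulative log-likelihood ratio
    for n, x in enumerate(samples, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= lower:
            return "accept H0", n
        if llr >= upper:
            return "accept H1", n
    return "continue", len(samples)

# A run of clean samples drives the LLR down toward accepting H0
# well before all 20 scheduled samples are measured.
decision, n = sprt([0] * 20)
print(decision, n)
# → accept H0 7
```

This illustrates the efficiency claim in the abstract: the test can stop as soon as the accumulated evidence crosses a boundary, rather than waiting for the full fixed sample.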

  6. In vitro preliminary cytotoxicity testing of vegetal extracts, using colorimetric methods

    Directory of Open Access Journals (Sweden)

    Claudia Patricia Cordero Camacho

    2002-01-01

    To advance the study of Colombian vegetal biodiversity, considered a potential source of pharmacologically active products, it is necessary to establish biological activity evaluation systems that allow the detection of products active against pathologies with high social and economic impact, such as cancer. This work describes the implementation of a preliminary in vitro methodology for determining potential anticancer activity in vegetal extracts, by cytotoxicity testing upon human tumor cell lines, measuring cellular mass indirectly with the colorimetric assays of MTT (methyl tetrazolium tiazole) reduction and SRB (sulforhodamine B) staining. HT-29, MCF-7, SiHa and HEp-2 cell line cultures were adapted, and the MTT concentration, cellular density and treatment period parameters for the cytotoxicity assay were selected. Cell line sensitivity to the chemotherapeutic agent Doxorubicin HCl was determined. The cytotoxicity of extracts of Colombian vegetal species was tested, and the usefulness of the assay as a tool to bioguide the search for active products was evidenced.

  7. In vitro preliminary cytotoxicity testing of vegetal extracts, using colorimetric methods

    Directory of Open Access Journals (Sweden)

    Claudia Patricia Cordero Camacho

    2011-12-01

    To advance the study of Colombian vegetal biodiversity, considered a potential source of pharmacologically active products, it is necessary to establish biological activity evaluation systems that allow the detection of products active against pathologies with high social and economic impact, such as cancer. This work describes the implementation of a preliminary in vitro methodology for determining potential anticancer activity in vegetal extracts, by cytotoxicity testing upon human tumor cell lines, measuring cellular mass indirectly with the colorimetric assays of MTT (methyl tetrazolium tiazole) reduction and SRB (sulforhodamine B) staining. HT-29, MCF-7, SiHa and HEp-2 cell line cultures were adapted, and the MTT concentration, cellular density and treatment period parameters for the cytotoxicity assay were selected. Cell line sensitivity to the chemotherapeutic agent Doxorubicin HCl was determined. The cytotoxicity of extracts of Colombian vegetal species was tested, and the usefulness of the assay as a tool to bioguide the search for active products was evidenced.

  8. Preliminary performance test of control rod position indicator for ballscrew type CEDM

    International Nuclear Information System (INIS)

    Yoo, J. Y.; Kim, J. H.; Hu, H.; Lee, J. S.; Kim, J. I.

    2003-01-01

    The reliability and accuracy of the information on control rod position are very important to the reactor safety and the design of the core protection system. A survey on the RSPT(Reed Switch Position Transmitter) type control rod position indication system and its actual implementation in the exiting nuclear power plants in Korea was performed first. The prototype of control rod position indicator having the high performance for the ballscrew type CEDM was developed on the basis of RSPT technology identified through the survey. The characteristics of control rod position indicator was defined and documented through design procedure and preliminary performance test

  9. Reflooding Experimental On Beta Test Loop : The Characterisation And Preliminary Experiment

    International Nuclear Information System (INIS)

    Khairul, H.; Antariksawan, Anhar R.; Sumamo, Edy; Kiswanta; Giarno; Joko, P.; H, Ismu

    2001-01-01

    The characterisation and a preliminary reflooding experiment have been conducted. The characteristics of the main system and components were identified completely. From these characteristics the experimental conditions were set up: the heated rod voltage was 20 V, the pump frequency was 19 Hz, and the flow rate was 1 m3/h. The first experiment did not show the rewetting phenomenon, possibly because the heated rod temperature was too low. For the second experiment, the heated rod voltage was increased to 22 V and the flow rate was decreased. As a result, nucleate boiling on the surface of the heated rod was observed while the water reflooded the test section.

  10. Evaluation of high pressure Freon decontamination. I. Preliminary tests

    International Nuclear Information System (INIS)

    Rankin, W.N.

    1983-01-01

    High-pressure Freon blasting techniques are being evaluated for applications involving the removal of non-adherent radioactive particulate contamination at SRP. Very little waste is generated by this technique because the used Freon can be easily distilled and reused. One of the principal advantages of this technique is that decontaminated electrical equipment can be returned to service immediately without drying, unlike high-pressure water blasting techniques. Preliminary scouting tests evaluating high-pressure Freon blasting for decontamination at SRP were carried out at Quadrex Co., Oak Ridge, TN, October 12 and 13. DWPF-type contamination (raw sludge plus volatiles) and separations area-type contamination (diluted ...) were tested. The Freon's boiling point (47.6°C) allows it to rapidly separate from higher-boiling contaminants; via distillation with filtration to remove particulate material, and distillation with condensation, the solvent may be recovered for indefinite reuse while reducing the radioactive waste to a minimum. 3 references, 5 figures, 6 tables

  11. Preliminary results for HIP bonding Ta to W targets for the materials test station

    Energy Technology Data Exchange (ETDEWEB)

    Dombrowski, David E [Los Alamos National Laboratory; Maloy, Stuart A [Los Alamos National Laboratory

    2009-01-01

    Tungsten targets for the Materials Test Station (MTS) were clad with thin tantalum cover plates and a tantalum frame using hot isostatic pressing (HIP). A preliminary HIP parameter study showed good bonding and intimate mechanical contact for Ta cover plate thicknesses of 0.25 mm (0.010 inch) and 0.38 mm (0.015 inch). The temperature for full HIP runs was 1500°C (2732°F), and the HIP pressure was 203 MPa (30 ksi).

  12. Preliminary tests of the electrostatic plasma accelerator

    Science.gov (United States)

    Aston, G.; Acker, T.

    1990-01-01

    This report describes the results of a program to verify an electrostatic plasma acceleration concept and to identify those parameters most important in optimizing an Electrostatic Plasma Accelerator (EPA) thruster based upon this thrust mechanism. Preliminary performance measurements of thrust, specific impulse and efficiency were obtained using a unique plasma exhaust momentum probe. Reliable EPA thruster operation was achieved using one power supply.

  13. Current status of VEGA program and a preliminary test with cesium iodide

    International Nuclear Information System (INIS)

    Hidaka, A.; Nakamura, T.; Kudo, T.; Hayashida, R.; Nakamura, J.; Otomo, T.; Uetsuka, H.

    2000-01-01

    The VEGA program has been performed at JAERI to clarify the mechanism of FP release from irradiated PWR/BWR fuels, including MOX fuel, and to improve predictability of the source term. The principal purposes are to investigate the release of actinides and FPs, including non-volatile radionuclides, from irradiated fuel at 3000°C under high-pressure conditions up to 1.0 MPa. Short-lived radionuclides will be accumulated by re-irradiation of the test fuel just before the experiment using a JAERI research reactor such as JRR-3 or NSRR. The test facility was installed in the beta/gamma concrete No. 5 cell at RFEF and completed in February 1999. Before the first VEGA-1 test in September 1999, a preliminary test using a cold simulant, cesium iodide (CsI), was performed to confirm the fundamental capabilities of the test facility. The test results showed that the trapping efficiency of the aerosol filters is about 98%. The amount of CsI that arrived at the downstream pipe of the filters was quite small, while a small amount of I2 gas, which can pass through the filters, was condensed just before the cold condenser as expected in the design. (author)

  14. ERG and GRG review of the draft of ''preliminary test plan for in situ testing from an exploratory shaft in salt - October 1983''

    International Nuclear Information System (INIS)

    Kalia, H.N.

    1986-03-01

    The Engineering Review Group (ERG) and Geologic Review Group (GRG) were established by the Office of Nuclear Waste Isolation (ONWI) to help evaluate engineering- and geologic-related issues in the US Department of Energy's nuclear waste repository program. The January 1984 meeting of the ERG and GRG reviewed the In Situ Test Plan (ISTP) titled ''Preliminary Test Plan for In Situ Testing From an Exploratory Shaft in Salt - October 1983.'' This report documents the ERG's and GRG's comments and recommendations on this subject and the ONWI responses to the specific points raised by the ERG and GRG. 6 refs., 2 figs., 1 tab

  15. Preliminary definition of the remote handling system for the current IFMIF Test Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Queral, V., E-mail: vicentemanuel.queral@ciemat.es [Laboratorio Nacional de Fusion, EURATOM-CIEMAT, 28040 Madrid (Spain); Urbon, J. [Laboratorio Nacional de Fusion, EURATOM-CIEMAT, 28040 Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid, 28006 Madrid (Spain); Garcia, A.; Cuarental, I.; Mota, F. [Laboratorio Nacional de Fusion, EURATOM-CIEMAT, 28040 Madrid (Spain); Micciche, G. [CR ENEA Brasimone, I-40035 Camugnano (BO) (Italy); Ibarra, A. [Laboratorio Nacional de Fusion, EURATOM-CIEMAT, 28040 Madrid (Spain); Rapisarda, D. [Laboratorio Nacional de Fusion, EURATOM-CIEMAT, 28040 Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid, 28006 Madrid (Spain); Casal, N. [Laboratorio Nacional de Fusion, EURATOM-CIEMAT, 28040 Madrid (Spain)

    2011-10-15

    A coherent design of the remote handling system with the design of the components to be manipulated is vital for reliable, safe and fast maintenance, having a decisive impact on availability, occupational exposures and operational cost of the facility. Highly activated components in the IFMIF facility are found at the Test Cell, a shielded pit where the samples are accurately located. The remote handling system for the Test Cell reference design was outlined in some past IFMIF studies. Currently a new preliminary design of the Test Cell in the IFMIF facility is being developed, introducing important modifications with respect to the reference one. This recent design separates the previous Vertical Test Assemblies in three functional components: Test Modules, shielding plugs and conduits. Therefore, it is necessary to adapt the previous design of the remote handling system to the new maintenance procedures and requirements. This paper summarises such modifications of the remote handling system, in particular the assessment of the feasibility of a modified commercial multirope crane for the handling of the weighty shielding plugs for the new Test Cell and a quasi-commercial grapple for the handling of the new Test Modules.

  16. Preliminary definition of the remote handling system for the current IFMIF Test Facilities

    International Nuclear Information System (INIS)

    Queral, V.; Urbon, J.; Garcia, A.; Cuarental, I.; Mota, F.; Micciche, G.; Ibarra, A.; Rapisarda, D.; Casal, N.

    2011-01-01

    A coherent design of the remote handling system with the design of the components to be manipulated is vital for reliable, safe and fast maintenance, having a decisive impact on availability, occupational exposures and operational cost of the facility. Highly activated components in the IFMIF facility are found at the Test Cell, a shielded pit where the samples are accurately located. The remote handling system for the Test Cell reference design was outlined in some past IFMIF studies. Currently a new preliminary design of the Test Cell in the IFMIF facility is being developed, introducing important modifications with respect to the reference one. This recent design separates the previous Vertical Test Assemblies in three functional components: Test Modules, shielding plugs and conduits. Therefore, it is necessary to adapt the previous design of the remote handling system to the new maintenance procedures and requirements. This paper summarises such modifications of the remote handling system, in particular the assessment of the feasibility of a modified commercial multirope crane for the handling of the weighty shielding plugs for the new Test Cell and a quasi-commercial grapple for the handling of the new Test Modules.

  17. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
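    The three-band score stratification quoted above (score ≤ 4 low, 5-8 intermediate, ≥ 9 high) maps directly to a lookup. A minimal sketch follows, with the prevalence figures taken from the abstract; the band boundaries are as stated there, while the function name is ours.

```python
# Sketch of the three-band clinical probability stratification
# described in the abstract. Prevalence figures are as quoted there.
def clinical_probability(score):
    """Map a clinical score to its probability band."""
    if score <= 4:
        return "low"            # ~10% prevalence of PE in this band
    elif score <= 8:
        return "intermediate"   # ~38% prevalence
    else:
        return "high"           # ~81% prevalence

print([clinical_probability(s) for s in (3, 6, 10)])
# → ['low', 'intermediate', 'high']
```

In a diagnostic workflow, the band serves as the pretest probability that is then combined with objective test results (e.g., lung scintigraphy) to yield a post-test probability.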

  18. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
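For orientation, a classical one-sample P-P plot pairs the empirical CDF with the hypothesized CDF at each sorted observation; the paper's generalization indexes over closed intervals instead of half-lines. A minimal sketch of only the classical construction, with a standard normal null assumed for illustration:

```python
# Classical one-sample P-P plot points: empirical CDF vs hypothesized CDF at
# each sorted observation. The standard normal null is an illustrative choice.
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pp_points(sample):
    """Return (empirical, theoretical) probability pairs for a P-P plot."""
    xs = sorted(sample)
    n = len(xs)
    return [((i + 1) / n, normal_cdf(x)) for i, x in enumerate(xs)]

pts = pp_points([-1.2, -0.3, 0.1, 0.8, 1.5])
# For data drawn from the null distribution the points lie near the diagonal.
```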

  19. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  20. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  1. Preliminary Design Progress of the HCCR TBM for ITER testing

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won; Park, Sung Dae; Kim, Dong Jun; Jin, Hyung Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

Korea has designed a helium cooled ceramic reflector (HCCR) test blanket module (TBM), including the TBM-shield, together called the TBM-set, to be tested in ITER, a Nuclear Facility (INB-174). Through the conceptual design review (CDR), its design integrity was successfully demonstrated at the conceptual design level under various loads. After CDR approval, the preliminary design (PD) was started, and its progress is presented in this study. After PD review and approval, the final design and then fabrication will begin. The main purposes of the PD are to refine the TBM-set design from the fabrication standpoint and to detail its interfaces with the ITER machine, such as the TBM port plug and frame. With these considerations, the PD of the TBM-set was started and has been carried out (so far to version v0.24) from the CD model. The FW, BZ, SW, TES/NAS, BM, and connecting supports were designed, supported by analyses where necessary. Manufacturability was the main concern in developing the PD model. Thermal-hydraulic analysis will be performed to evaluate the temperature and pressure drop in the TBM-set, and the structural integrity of the TBM-set will be confirmed under combined load conditions.

  2. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

Full Text Available Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of "stripped" prospect theory (subjectively weighted additive utility without the editing rules).

  3. Conditional non-independence of radiographic image features and the derivation of post-test probabilities – A mammography BI-RADS example

    International Nuclear Information System (INIS)

    Benndorf, Matthias

    2012-01-01

    Bayes' theorem has proven to be one of the cornerstones in medical decision making. It allows for the derivation of post-test probabilities, which in case of a positive test result become positive predictive values. If several test results are observed successively Bayes' theorem may be used with assumed conditional independence of test results or with incorporated conditional dependencies. Herein it is examined whether radiographic image features should be considered conditionally independent diagnostic tests when post-test probabilities are to be derived. For this purpose the mammographic mass dataset from the UCI (University of California, Irvine) machine learning repository is analysed. It comprises the description of 961 (516 benign, 445 malignant) mammographic mass lesions according to the BI-RADS (Breast Imaging: Reporting and Data System) lexicon. Firstly, an exhaustive correlation matrix is presented for mammography BI-RADS features among benign and malignant lesions separately; correlation can be regarded as measure for conditional dependence. Secondly, it is shown that the derived positive predictive values for the conjunction of the two features “irregular shape” and “spiculated margin” differ significantly depending on whether conditional dependencies are incorporated into the decision process or not. It is concluded that radiographic image features should not generally be regarded as conditionally independent diagnostic tests.
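The effect described above can be reproduced with a toy calculation. The counts below are invented for illustration (they are not the UCI mammographic mass data); they only show how the post-test probability for the conjunction of "irregular shape" and "spiculated margin" shifts when conditional dependence between the two features is ignored:

```python
# Illustration (with made-up counts, NOT the UCI dataset) of how assuming
# conditional independence of two image features changes the post-test
# probability. S = "irregular shape", M = "spiculated margin".

# Hypothetical joint occurrence counts among 100 malignant and 100 benign
# lesions; the features are deliberately correlated within each class.
mal = {"S&M": 60, "S_only": 10, "M_only": 10, "neither": 20}
ben = {"S&M": 5,  "S_only": 20, "M_only": 20, "neither": 55}
prior_mal = 0.5  # equal group sizes

def ppv_joint() -> float:
    """PPV for observing S and M, using the joint frequencies directly."""
    p_sm_mal = mal["S&M"] / 100
    p_sm_ben = ben["S&M"] / 100
    num = p_sm_mal * prior_mal
    return num / (num + p_sm_ben * (1 - prior_mal))

def ppv_naive() -> float:
    """PPV assuming S and M are conditionally independent given the class."""
    p_s_mal = (mal["S&M"] + mal["S_only"]) / 100
    p_m_mal = (mal["S&M"] + mal["M_only"]) / 100
    p_s_ben = (ben["S&M"] + ben["S_only"]) / 100
    p_m_ben = (ben["S&M"] + ben["M_only"]) / 100
    num = p_s_mal * p_m_mal * prior_mal
    return num / (num + p_s_ben * p_m_ben * (1 - prior_mal))

print(round(ppv_joint(), 3), round(ppv_naive(), 3))  # 0.923 0.887
```

With these correlated counts the naive-independence PPV understates the joint-count PPV, which is exactly the kind of discrepancy the record reports for the BI-RADS features.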

  4. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods.

  5. Preliminary Flight Results of the Microelectronics and Photonics Test Bed: NASA DR1773 Fiber Optic Data Bus Experiment

    Science.gov (United States)

    Jackson, George L.; LaBel, Kenneth A.; Marshall, Cheryl; Barth, Janet; Seidleck, Christina; Marshall, Paul

    1998-01-01

NASA Goddard Space Flight Center's (GSFC) Dual Rate 1773 (DR1773) Experiment on the Microelectronics and Photonics Test Bed (MPTB) has provided valuable information on the performance of the AS 1773 fiber optic data bus in the space radiation environment. Correlation of preliminary experiment data to ground based radiation test results shows the AS 1773 bus is employable in future spacecraft applications requiring radiation tolerant communication links.

  6. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  7. Performance of Small Bore 60NiTi Hybrid Ball Bearings: Preliminary Life Test Results

    Science.gov (United States)

    Dellacorte, Christopher; Howard, S. Adam

    2016-01-01

Small bore (R8 size) hybrid ball bearings made with 60NiTi races and silicon nitride balls are under development for highly corrosive aerospace applications that are also exposed to heavy static (shock) loads. The target application is the vacuum pump used inside the wastewater recycling system on the International Space Station. To verify bearing longevity, life tests are run at 2000 rpm for time periods up to 5000 hours. Accelerometers with data tracking are used to monitor operation, and the bearings are disassembled and inspected at intervals to assess wear. Preliminary tests show that bearings made from 60NiTi are feasible for this aerospace application and potentially for other industrial applications that must endure similar operating environments.

  8. A Preliminary Analysis for SMART-ITL SBLOCA Tests using the MARS/KS Code

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yeon Sik; Ko, Yung Joo; Suh, Jae Seung [System Engineering and Technology Co., Ltd., Daejeon (Korea, Republic of)

    2013-10-15

In this paper, a preliminary analysis was conducted for SMART-ITL SBLOCA tests using the MARS/KS Code. The results of this work are expected to be good guidelines for SBLOCA tests with the SMART-ITL, and to be used to understand the various thermal-hydraulic phenomena expected to occur in the integral-type reactor, SMART. An integral-effect test (IET) loop for SMART, SMART-ITL (or FESTA), has been designed using a volume scaling methodology. It was installed at KAERI and its commissioning tests were finished in 2012. Its height was preserved and its area and volume were scaled down to 1/49 compared with the prototype plant, SMART. The SMART-ITL consists of a primary system including a reactor pressure vessel with a pressurizer, four steam generators and four main coolant pumps, a secondary system, a safety system, and an auxiliary system. The objectives of IET using the SMART-ITL facility are to investigate the integral performance of the inter-connected components and possible thermal-hydraulic phenomena occurring in the SMART design, and to validate its safety for various design basis accidents (DBAs).

  9. A Preliminary Analysis for SMART-ITL SBLOCA Tests using the MARS/KS Code

    International Nuclear Information System (INIS)

    Cho, Yeon Sik; Ko, Yung Joo; Suh, Jae Seung

    2013-01-01

In this paper, a preliminary analysis was conducted for SMART-ITL SBLOCA tests using the MARS/KS Code. The results of this work are expected to be good guidelines for SBLOCA tests with the SMART-ITL, and to be used to understand the various thermal-hydraulic phenomena expected to occur in the integral-type reactor, SMART. An integral-effect test (IET) loop for SMART, SMART-ITL (or FESTA), has been designed using a volume scaling methodology. It was installed at KAERI and its commissioning tests were finished in 2012. Its height was preserved and its area and volume were scaled down to 1/49 compared with the prototype plant, SMART. The SMART-ITL consists of a primary system including a reactor pressure vessel with a pressurizer, four steam generators and four main coolant pumps, a secondary system, a safety system, and an auxiliary system. The objectives of IET using the SMART-ITL facility are to investigate the integral performance of the inter-connected components and possible thermal-hydraulic phenomena occurring in the SMART design, and to validate its safety for various design basis accidents (DBAs).

  10. The Use of Conditional Probability Integral Transformation Method for Testing Accelerated Failure Time Models

    Directory of Open Access Journals (Sweden)

    Abdalla Ahmed Abdel-Ghaly

    2016-06-01

Full Text Available This paper suggests the use of the conditional probability integral transformation (CPIT) method as a goodness of fit (GOF) technique in the field of accelerated life testing (ALT), specifically for validating the underlying distributional assumption in the accelerated failure time (AFT) model. The method is based on transforming the data into independent and identically distributed (i.i.d.) Uniform(0, 1) random variables and then applying the modified Watson statistic to test the uniformity of the transformed random variables. This technique is used to validate each of the exponential, Weibull and lognormal distributions' assumptions in the AFT model under constant stress and complete sampling. The performance of the CPIT method is investigated via a simulation study. It is concluded that this method performs well in case of exponential and lognormal distributions. Finally, a real life example is provided to illustrate the application of the proposed procedure.
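The core transformation step can be sketched in a few lines: if F is the true CDF, then U = F(X) is Uniform(0, 1), so goodness of fit reduces to a uniformity test. For brevity the sketch below substitutes a plain Kolmogorov-Smirnov distance for the modified Watson statistic used in the paper, and tests a known (not fitted) exponential CDF:

```python
# Probability integral transform + a simple uniformity check. The KS distance
# stands in for the paper's modified Watson statistic; the exponential rate is
# assumed known here rather than estimated, purely for illustration.
import math
import random

random.seed(42)
rate = 2.0
data = [random.expovariate(rate) for _ in range(500)]

# Transform with the (here, known) exponential CDF: U = 1 - exp(-rate * x).
u = sorted(1.0 - math.exp(-rate * x) for x in data)

# Kolmogorov-Smirnov distance of the transformed sample from Uniform(0, 1).
n = len(u)
ks = max(max((i + 1) / n - ui, ui - i / n) for i, ui in enumerate(u))
print(f"KS distance: {ks:.3f}")  # small when the assumed model fits
# The asymptotic 5% critical value is about 1.36 / sqrt(n) ~= 0.061 for n=500.
```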

  11. Computer simulation of probability of detection

    International Nuclear Information System (INIS)

    Fertig, K.W.; Richardson, J.M.

    1983-01-01

This paper describes an integrated model for assessing the performance of a given ultrasonic inspection system for detecting internal flaws, where the performance of such a system is measured by probability of detection. The effects of real part geometries on sound propagation are accounted for, and the noise spectra due to various noise mechanisms are measured. An ultrasonic inspection simulation computer code has been developed to be able to detect flaws with attributes ranging over an extensive class. The detection decision is considered to be a binary decision based on one received waveform obtained in a pulse-echo or pitch-catch setup. This study focuses on the detectability of flaws using an amplitude thresholding detector. Some preliminary results on the detectability of radially oriented cracks in IN-100 for bore-like geometries are given.
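A toy version of the amplitude-threshold decision illustrates the trade-off involved. The Gaussian signal and noise amplitudes below are assumed for illustration only and are far simpler than the paper's simulation model:

```python
# Toy probability-of-detection model for an amplitude-threshold detector.
# Assumes (for illustration) Gaussian received amplitudes: N(mu_signal, sigma)
# when a flaw is present, N(mu_noise, sigma) when it is not.
import math

def normal_sf(x: float, mu: float, sigma: float) -> float:
    """Survival function P(X > x) for X ~ N(mu, sigma)."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

def pod_and_false_calls(threshold, mu_signal=3.0, mu_noise=0.0, sigma=1.0):
    """Return (probability of detection, false-call rate) at a threshold."""
    return (normal_sf(threshold, mu_signal, sigma),
            normal_sf(threshold, mu_noise, sigma))

pod, fcr = pod_and_false_calls(threshold=1.5)
# Raising the threshold lowers both the false-call rate and the POD.
```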

  12. Nuclear maintenance strategy and first steps for preliminary maintenance plan of the EU HCLL & HCPB Test Blanket Systems

    Energy Technology Data Exchange (ETDEWEB)

    Galabert, Jose, E-mail: jose.galabert@f4e.europa.eu [F4E Fusion for Energy, EU Domestic Agency, c/Josep Pla, 2. B3, 08019, Barcelona (Spain); Hopper, Dave [AMEC Foster Wheeler, Faraday Street, Birchwood Park, WA3 6GN (United Kingdom); Neviere, Jean-Cristophe [ITER Organization, Route de Vinon-sur-Verdon, CS 90046, 13067, St. Paul Lez Durance Cedex (France); Nodwell, David [CCFE, Culham Science Centre, Abingdon, OX14 3DB, Oxfordshire (United Kingdom); Pascal, Romain [ITER Organization, Route de Vinon-sur-Verdon, CS 90046, 13067, St. Paul Lez Durance Cedex (France); Poitevin, Yves; Ricapito, Italo [F4E Fusion for Energy, EU Domestic Agency, c/Josep Pla, 2. B3, 08019, Barcelona (Spain); White, Gareth [AMEC Foster Wheeler, Faraday Street, Birchwood Park, WA3 6GN (United Kingdom)

    2017-03-15

Highlights: • Nuclear maintenance strategy for the two European (EU) Test Blanket Systems (TBS): i/. Helium Cooled Lead Lithium (HCLL) and ii/. Helium Cooled Pebble Bed (HCPB). • Preliminary identification of maintenance tasks for most relevant components of the EU HCLL & HCPB TBS. • Preliminary feasibility analysis for hands-on maintenance tasks of some relevant components of the European Test Blanket Systems. • Design recommendations for enhancement of the European Test Blanket Systems maintainability. - Abstract: This paper gives an overview of nuclear maintenance strategy to be followed for the European HCLL & HCPB Test Blanket Systems (TBS) to be installed in ITER. One of the several core documents to prepare in view of their licensing is their respective ‘Maintenance Plan’. This document is fundamental for ensuring sound performance and safety of the TBS during ITER’s operational phase and shall include, amongst others, relevant information on: maintenance organization, preventive and corrective maintenance task procedures, condition monitoring for key components, maintenance work planning, and a spare parts plan, just to mention some of the key topics. In compliance with the ITER Plant Maintenance policy, first steps have been taken aimed at defining nuclear maintenance strategy for some of the most relevant HCLL & HCPB TBS components, conducted by F4E in collaboration with industry. After a brief recall of maintenance strategy of the TBM Program (PBS-56), this paper analyses main features of EU HCLL & HCPB TBS maintainability and identifies, at their conceptual design phase, a preliminary list of maintenance tasks to be developed for their most representative components. In addition, the paper also presents the first nuclear maintenance studies conducted for replacement of the Q_2 Getter Beds, identifying some design recommendations for their sound maintainability.

  13. Nuclear maintenance strategy and first steps for preliminary maintenance plan of the EU HCLL & HCPB Test Blanket Systems

    International Nuclear Information System (INIS)

    Galabert, Jose; Hopper, Dave; Neviere, Jean-Cristophe; Nodwell, David; Pascal, Romain; Poitevin, Yves; Ricapito, Italo; White, Gareth

    2017-01-01

    Highlights: • Nuclear maintenance strategy for the two European (EU) Test Blanket Systems (TBS): i/. Helium Cooled Lead Lithium (HCLL) and ii/. Helium Cooled Pebble Bed (HCPB). • Preliminary identification of maintenance tasks for most relevant components of the EU HCLL & HCPB TBS. • Preliminary feasibility analysis for hands-on maintenance tasks of some relevant components of the European Test Blanket Systems. • Design recommendations for enhancement of the European Test Blanket Systems maintainability. - Abstract: This paper gives an overview of nuclear maintenance strategy to be followed for the European HCLL & HCPB Test Blanket Systems (TBS) to be installed in ITER. One of the several core documents to prepare in view of their licensing is their respective ‘Maintenance Plan’. This document is fundamental for ensuring sound performance and safety of the TBS during ITER’s operational phase and shall include, amongst others, relevant information on: maintenance organization, preventive and corrective maintenance task procedures, condition monitoring for key components, maintenance work planning, and a spare parts plan, just to mention some of the key topics. In compliance with the ITER Plant Maintenance policy, first steps have been taken aimed at defining nuclear maintenance strategy for some of the most relevant HCLL & HCPB TBS components, conducted by F4E in collaboration with industry. After a brief recall of maintenance strategy of the TBM Program (PBS-56), this paper analyses main features of EU HCLL & HCPB TBS maintainability and identifies, at their conceptual design phase, a preliminary list of maintenance tasks to be developed for their most representative components. In addition, the paper also presents the first nuclear maintenance studies conducted for replacement of the Q_2 Getter Beds, identifying some design recommendations for their sound maintainability.

  14. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  15. Preliminary evaluation of 30 potential granitic rock sites for a radioactive waste storage facility in southern Nevada

    International Nuclear Information System (INIS)

    Boardman, C.R.; Knutson, C.F.

    1978-01-01

Results are presented of a preliminary study performed under subtask 2.7 of the NTS Terminal Waste Storage Program Plan for 1978. Subtask 2.7 examines the feasibility of locating a nuclear waste repository in a granitic stock or pluton in southern Nevada near the Nevada Test Site (NTS). It is assumed for the purposes of this study that such a repository cannot be located at NTS. This assumption may or may not be correct. This preliminary report does not identify a particular site as being a suitable location for a repository, nor does it absolutely eliminate a particular site from further consideration. It does, however, answer the basic question of probable suitability of some of the sites and present a systematic method for site evaluation. Since the findings of this initial study have been favorable, it will be followed by more exhaustive and detailed studies of the original 30 sites and perhaps others. In future studies some of the evaluation criteria used in the preliminary study may be modified or eliminated, and new criteria may be introduced.

  16. Preliminary evaluation of 30 potential granitic rock sites for a radioactive waste storage facility in southern Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Boardman, C.R.; Knutson, C.F.

    1978-02-15

Results are presented of a preliminary study performed under subtask 2.7 of the NTS Terminal Waste Storage Program Plan for 1978. Subtask 2.7 examines the feasibility of locating a nuclear waste repository in a granitic stock or pluton in southern Nevada near the Nevada Test Site (NTS). It is assumed for the purposes of this study that such a repository cannot be located at NTS. This assumption may or may not be correct. This preliminary report does not identify a particular site as being a suitable location for a repository, nor does it absolutely eliminate a particular site from further consideration. It does, however, answer the basic question of probable suitability of some of the sites and present a systematic method for site evaluation. Since the findings of this initial study have been favorable, it will be followed by more exhaustive and detailed studies of the original 30 sites and perhaps others. In future studies some of the evaluation criteria used in the preliminary study may be modified or eliminated, and new criteria may be introduced.

  17. The large-scale vented combustion test facility at AECL-WL: description and preliminary test results

    International Nuclear Information System (INIS)

    Loesel Sitar, J.; Koroll, G.W.; Dewit, W.A.; Bowles, E.M.; Harding, J.; Sabanski, C.L.; Kumar, R.K.

    1997-01-01

Implementation of hydrogen mitigation systems in nuclear reactor containments requires testing the effectiveness of the mitigation system, reliability and availability of the hardware, potential consequences of its use and the technical basis for hardware placement, on a meaningful scale. Similarly, the development and validation of containment codes used in nuclear reactor safety analysis require detailed combustion data from medium- and large-scale facilities. A Large-Scale Combustion Test Facility measuring 10 m x 4 m x 3 m (volume, 120 m³) has been constructed and commissioned at Whiteshell Laboratories to perform a wide variety of combustion experiments. The facility is designed to be versatile so that many geometrical configurations can be achieved. The facility incorporates extensive capabilities for instrumentation and high speed data acquisition, on-line gas sampling and analysis. Other features of the facility include operation at elevated temperatures up to 150 degrees C, easy access to the interior, and remote operation. Initial thermodynamic conditions in the facility can be controlled to within 0.1 vol% of constituent gases. The first series of experiments examined vented combustion in the full 120 m³ volume configuration with vent areas in the range of 0.56 to 2.24 m². The experiments were performed at ∼27 degrees C and near-atmospheric pressures, with hydrogen concentrations in the range of 8 to 12% by volume. This paper describes the Large-Scale Vented Combustion Test Facility and preliminary results from the first series of experiments. (author)

  18. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation in this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive drivers or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
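Time-to-collision, one ingredient of the proposed risk measure, can be sketched in a few lines. The rear-end geometry and numbers below are illustrative; the full driver-risk-response-time definition, which also incorporates the observed braking response, is not reproduced here:

```python
# Minimal time-to-collision (TTC) sketch for a rear-end scenario.
# Values and the function interface are illustrative assumptions.

def time_to_collision(gap_m: float, v_follow: float, v_lead: float) -> float:
    """TTC in seconds; infinite when the following car is not closing the gap.

    gap_m: bumper-to-bumper distance in metres.
    v_follow, v_lead: speeds of the following and lead vehicles in m/s.
    """
    closing_speed = v_follow - v_lead
    if closing_speed <= 0:
        return float("inf")  # not converging: no collision at constant speeds
    return gap_m / closing_speed

# Following at 20 m/s, lead at 12 m/s, 40 m apart -> 5 s to impact if
# neither vehicle changes speed.
print(time_to_collision(40.0, 20.0, 12.0))  # 5.0
```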

  19. Optimal selection for BRCA1 and BRCA2 mutation testing using a combination of ' easy to apply ' probability models

    NARCIS (Netherlands)

    Bodmer, D.; Ligtenberg, M. J. L.; van der Hout, A. H.; Gloudemans, S.; Ansink, K.; Oosterwijk, J. C.; Hoogerbrugge, N.

    2006-01-01

    To establish an efficient, reliable and easy to apply risk assessment tool to select families with breast and/or ovarian cancer patients for BRCA mutation testing, using available probability models. In a retrospective study of 263 families with breast and/or ovarian cancer patients, the utility of

  20. Preliminary irradiation test results from the Yankee Atomic Electric Company reactor vessel test irradiation program

    International Nuclear Information System (INIS)

    Biemiller, E.C.; Fyfitch, S.; Campbell, C.A.

    1993-01-01

    The Yankee Atomic Electric Company test irradiation program was implemented to characterize the irradiation response of representative Yankee Rowe reactor vessel beltline plate materials and to remove uncertainties in the analysis of existing irradiation data on the Yankee Rowe reactor vessel steel. Plate materials each containing 0.24 w/o copper, but different nickel contents at 0.63 w/o and 0.19 w/o, were heat treated to simulate the Yankee vessel heat treatment (austenitized at 1800 deg F) and to simulate Regulatory Guide 1.99 database materials (austenitized at 1600 deg. F). These heat treatments produced different microstructures so the effect of microstructure on irradiation damage sensitivity could be tested. Because the nickel content of the test plates varied and the copper level was constant, the effect of nickel on irradiation embrittlement was also tested. Correlation monitor material, HSST-02, was included in the program to benchmark the Ford Nuclear Reactor (U. of Michigan Test Reactor) which had never been used for this type of irradiation program. Materials taken from plate surface locations (vs. 1/4T) were included to test whether or not the improved toughness properties of the plate surface layer, resulting from the rapid quench, is maintained after irradiation. If the improved properties are maintained, pressurized thermal shock calculations could utilize this margin. Finally, for one experiment, irradiations were conducted at two irradiation temperatures (500 deg. F and 550 deg. F) to determine the effect of irradiation temperature on embrittlement. The preliminary results of the irradiation program show an increase in T 30 shift of 69 deg. F for a decrease in irradiation temperature of 50 deg. F. The results suggest that for nickel bearing steels, the superior toughness of plate surface material is maintained after irradiation and for the copper content tested, nickel had no apparent effect on irradiation response. No apparent microstructure

  1. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection.
However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
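
The two-pass logic described in this abstract can be sketched in a small simulation. Everything below is an illustrative assumption, not the paper's code: an equal-variance Gaussian signal detection observer with sensitivity d', a posterior-matching responder that answers "signal" with probability equal to the posterior, and a deterministic maximum-a-posteriori responder for comparison. The point it demonstrates is the abstract's first finding: because the matching observer re-randomizes its response, its agreement with itself across two passes of identical stimuli collapses toward 50% at low performance, while a deterministic observer always agrees with itself.

```python
import math, random

random.seed(0)

def posterior_signal(x, d):
    # P(signal | x) for equal-prior, unit-variance Gaussians N(d, 1) vs N(0, 1)
    return 1.0 / (1.0 + math.exp(-d * x + d * d / 2.0))

def two_pass_agreement(d, matching, n_trials=20000):
    """Fraction of trials on which an observer repeats its own response when
    the identical stimulus is shown on two passes."""
    agree = 0
    for _ in range(n_trials):
        signal = random.random() < 0.5
        x = random.gauss(d if signal else 0.0, 1.0)  # same stimulus on both passes
        p = posterior_signal(x, d)
        if matching:                       # respond "signal" with probability p
            r1, r2 = random.random() < p, random.random() < p
        else:                              # deterministic MAP observer
            r1 = r2 = p > 0.5
        agree += r1 == r2
    return agree / n_trials

print(two_pass_agreement(0.1, matching=True))   # near 0.5 at low performance
print(two_pass_agreement(0.1, matching=False))  # exactly 1.0
```

At higher sensitivity the posterior is usually far from 0.5, so the matching observer's two-pass agreement rises again, which is why the effect is diagnostic specifically at low performance levels.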

  2. In situ vitrification: Preliminary results from the first large-scale radioactive test

    International Nuclear Information System (INIS)

    Buelt, J.L.; Westsik, J.H.

    1988-02-01

    The first large-scale radioactive test (LSRT) of In Situ Vitrification (ISV) has been completed. In Situ Vitrification is a process whereby joule heating immobilizes contaminated soil in place by converting it to a durable glass and crystalline waste form. The LSRT was conducted at an actual transuranic contaminated soil site on the Department of Energy's Hanford Site. The test had two objectives: (1) determine large-scale processing performance and (2) produce a waste form that can be fully evaluated as a potential technique for the final disposal of transuranic-contaminated soil sites at Hanford. This accomplishment has provided technical data to evaluate the ISV process for its potential in the final disposition of transuranic-contaminated soil sites at Hanford. Because of the test's successful completion, within a year technical data on the vitrified soil will be available to determine how well the process incorporates transuranics into the waste form and how well the form resists leaching of transuranics. Preliminary results available include retention of transuranics and other elements within the waste form during processing and the efficiency of the off-gas treatment system in removing contaminants from the gaseous effluents. 13 refs., 10 figs., 5 tabs

  3. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    Science.gov (United States)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.
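
A minimal sketch of the Wald sequential probability ratio test underlying this record, reduced to a Bernoulli observable rather than the paper's filter-bank likelihoods. The hypothesized rates p0 and p1, the error rates, and the function names are assumptions for illustration; what the sketch shows is how the false-alarm rate (alpha) and missed-detection rate (beta) enter the test directly through Wald's decision thresholds.

```python
import math, random

def sprt(samples, p0=0.1, p1=0.5, alpha=0.001, beta=0.001):
    """Wald SPRT for Bernoulli data: H0: p = p0 versus H1: p = p1.

    alpha is the tolerated false-alarm rate, beta the tolerated
    missed-detection rate; both appear only in the thresholds.
    """
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

random.seed(1)
stream_h1 = [random.random() < 0.5 for _ in range(1000)]  # true rate 0.5
stream_h0 = [random.random() < 0.1 for _ in range(1000)]  # true rate 0.1
print(sprt(stream_h1), sprt(stream_h0))
```

The sequential character is the operational appeal: the test usually terminates after a handful of samples rather than a fixed-size batch, which matters under a conjunction assessment decision timeline.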

  4. Modeling and Testing of EVs - Preliminary Study and Laboratory Development

    DEFF Research Database (Denmark)

    Yang, Guang-Ya; Marra, Francesco; Nielsen, Arne Hejde

    2010-01-01

    Electric vehicles (EVs) are expected to play a key role in the future energy management system to stabilize both supply and consumption with the presence of high penetration of renewable generation. A reasonably accurate model of battery is a key element for the study of EVs behavior and the grid impact at different geographical areas, as well as driving and charging patterns. An electric circuit model is deployed in this work to represent the electrical properties of a lithium-ion battery. This paper reports the preliminary modeling and validation work based on manufacturer data sheet and realistic tests, followed by suggestions towards a feasible battery model for further studies.
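
As a hedged illustration of the electric-circuit modeling approach this record mentions, the sketch below simulates a first-order Thevenin equivalent circuit (open-circuit voltage source, series resistance R0, one RC polarization pair). Every parameter value and the linear OCV curve are invented for illustration and are not taken from any manufacturer data sheet or from this paper.

```python
# Hypothetical first-order Thevenin model of a lithium-ion cell.
R0 = 0.05               # series resistance, ohm (assumed)
R1, C1 = 0.03, 1200.0   # polarization resistance (ohm) and capacitance (F) (assumed)
Q = 8280.0              # capacity in coulombs, i.e. 2.3 Ah (assumed)

def ocv(soc):
    # Crude linear open-circuit-voltage curve (assumed shape)
    return 3.0 + 1.2 * soc

def simulate(current_profile, dt=1.0):
    """Terminal voltage over a discharge-current profile (positive = discharge)."""
    soc, v1, volts = 1.0, 0.0, []
    for i in current_profile:
        soc -= i * dt / Q                     # coulomb counting
        v1 += dt * (i / C1 - v1 / (R1 * C1))  # RC polarization voltage
        volts.append(ocv(soc) - i * R0 - v1)
    return volts

# 10 min of 1C (2.3 A) discharge, then 10 min of rest
volts = simulate([2.3] * 600 + [0.0] * 600)
```

The qualitative behavior such a model is meant to reproduce against data-sheet or test curves is visible in the trace: the terminal voltage sags under load (ohmic drop plus slow RC polarization) and relaxes back toward the open-circuit voltage during rest.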

  5. To test or not to test

    DEFF Research Database (Denmark)

    Rochon, Justine; Gondan, Matthias; Kieser, Meinhard

    2012-01-01

    Background: Student's two-sample t test is generally used for comparing the means of two independent samples, for example, two treatment arms. Under the null hypothesis, the t test assumes that the two samples arise from the same normally distributed population with unknown variance. Adequate control of the Type I error requires that the normality assumption holds, which is often examined by means of a preliminary Shapiro-Wilk test. The following two-stage procedure is widely accepted: If the preliminary test for normality is not significant, the t test is used; if the preliminary test rejects the null hypothesis of normality, a nonparametric test is applied in the main analysis. Methods: Equally sized samples were drawn from exponential, uniform, and normal distributions. The two-sample t test was conducted if either both samples (Strategy I) or the collapsed set of residuals from both samples...
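
The two-stage procedure this record studies can be sketched as follows. Two stand-ins are assumed for illustration: a Jarque-Bera statistic replaces the Shapiro-Wilk preliminary test, and large-sample normal approximations replace exact critical values for both the Welch t test and the Mann-Whitney test; all function names are hypothetical.

```python
import math, random
from statistics import mean, stdev

def looks_normal(x, crit=5.99):
    # Preliminary normality check via the Jarque-Bera statistic against the
    # chi-square(2) 5% critical value (a stand-in for Shapiro-Wilk).
    n, m, s = len(x), mean(x), stdev(x)
    skew = sum((v - m) ** 3 for v in x) / (n * s ** 3)
    exkurt = sum((v - m) ** 4 for v in x) / (n * s ** 4) - 3.0
    return n / 6.0 * (skew ** 2 + exkurt ** 2 / 4.0) < crit

def welch_t(x, y):
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / math.sqrt(vx / len(x) + vy / len(y))

def mann_whitney_z(x, y):
    # Normal approximation to the Mann-Whitney U statistic (no tie correction)
    u = sum(1.0 if a > b else 0.5 if a == b else 0.0 for a in x for b in y)
    mu = len(x) * len(y) / 2.0
    sd = math.sqrt(len(x) * len(y) * (len(x) + len(y) + 1) / 12.0)
    return (u - mu) / sd

def two_stage(x, y, z_crit=1.96):
    # Stage 1: preliminary normality test on each sample.
    # Stage 2: t test if both pass, otherwise the nonparametric test.
    if looks_normal(x) and looks_normal(y):
        return "t", abs(welch_t(x, y)) > z_crit
    return "nonparametric", abs(mann_whitney_z(x, y)) > z_crit

random.seed(3)
exp_x = [random.expovariate(1.0) for _ in range(200)]
exp_y = [random.expovariate(1.0) + 0.5 for _ in range(200)]
print(two_stage(exp_x, exp_y))  # skewed data routes to the nonparametric branch
```

The paper's concern is precisely that this conditional routing changes the overall Type I error of the main analysis, which is why the unconditional behavior of the two-stage pipeline has to be simulated rather than read off either test in isolation.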

  6. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distributions for a software component's failure probability influence the number of tests required to obtain adequate confidence in a software component. In the evaluation, both the effect of the shape of the prior distribution and the effect of one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components are given. One of the main challenges when assessing systems consisting of multiple software components is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution which correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc). (Author)

  7. Test results with the Transrapid 06. System data from preliminary trials

    Energy Technology Data Exchange (ETDEWEB)

    Heinrich, K; Mnich, P

    1987-10-01

    Following the takeover of the Transrapid maglev facility by MVP, and in spite of remaining preparatory work and the conversion of the support and guidance system to a new electronic generation, interesting system data could be obtained experimentally before the planned continuous trials phase. Although the full test track length is not yet available - it is at present only 20.5 km - more than 25,000 km have already been covered in almost 1,200 test runs. Some 200 of these were for the purpose of demonstrating the Transrapid technology to visiting German and foreign experts. The system data obtained from the preliminary trials were positive. Any weak points noted were mainly site-specific and not maglev-specific, but in spite of generally satisfactory results there are still many individual aspects calling for improvement and optimization before the technology can be declared ready for service. Proceeding from the positive trend of the system data obtained at up to 355 km/h, it can be said that proof of serviceability of the Transrapid transport system at speeds of up to 400 km/h can be provided in the next two years. (orig.).

  8. Optimization of continuous ranked probability score using PSO

    Directory of Open Access Journals (Sweden)

    Seyedeh Atefeh Mohammadi

    2015-07-01

    Weather forecasting has been a major concern in various industries such as agriculture, aviation, maritime, tourism, and transportation. A good weather prediction may reduce the impact of natural disasters and unexpected events. This paper presents an empirical investigation to predict weather temperature using the continuous ranked probability score (CRPS). The mean and standard deviation of the normal density function are linear combinations of the components of the ensemble system. The resulting optimization model has been solved using particle swarm optimization (PSO), and the results are compared with the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. The preliminary results indicate that the proposed PSO provides better results in terms of the root-mean-square deviation criterion than the alternative BFGS method.
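
The objective this record optimizes can be sketched end to end: the closed-form CRPS of a Gaussian forecast, minimized by a bare-bones PSO. The swarm parameters, the two-parameter (mu, sigma) forecast, and the synthetic observations are all assumptions for illustration; the paper's actual model maps ensemble members to mu and sigma through learned linear combinations.

```python
import math, random

random.seed(42)

def crps_normal(mu, sigma, y):
    # Closed-form CRPS of a N(mu, sigma^2) forecast against observation y.
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

obs = [random.gauss(5.0, 2.0) for _ in range(1000)]  # synthetic "temperatures"

def mean_crps(params):
    mu, sigma = params
    sigma = max(sigma, 0.05)  # keep the scale parameter positive
    return sum(crps_normal(mu, sigma, y) for y in obs) / len(obs)

def pso(f, bounds, n_particles=20, iters=40, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer: inertia plus cognitive/social pulls."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

(mu, sigma), score = pso(mean_crps, [(-10.0, 20.0), (0.1, 10.0)])
print(round(mu, 1), round(sigma, 1))  # close to the true values 5 and 2
```

Because CRPS is a proper scoring rule, the minimizer of its average recovers the distribution that generated the observations, which is what makes it a sensible training objective for ensemble post-processing.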

  9. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  10. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    Science.gov (United States)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    The Tunguska and Chelyabinsk impact events occurred inside a geographical area of only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand if this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the so-called ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the time of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while in the antapex direction RIP is slightly larger than average. We present here preliminary maps of RIP at the time of the Tunguska and Chelyabinsk events and found no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the location and time of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  11. Preliminary test results from a telescope of Hughes pixel arrays at FNAL

    International Nuclear Information System (INIS)

    Jernigan, J.G.; Arens, J.; Vezie, D.; Collins, T.; Krider, J.; Skubic, P.

    1992-09-01

    In December of 1991, three silicon hybrid pixel detectors, each having 256 x 256 pixels 30 μm square, made by the Hughes Aircraft Company, were placed in a high energy muon beam at the Fermi National Accelerator Laboratory. Straight tracks were recorded in these detectors at angles to the normal to the plane of the silicon ranging from 0 to 45 degrees. In this note, preliminary results are presented on the straight-through tracks, i.e., those passing through the telescope at normal incidence. Pulse height data, signal-to-noise data, and preliminary straight line fits to the data resulting in residual distributions are presented. Preliminary calculations show spatial resolution of less than 5 μm in two dimensions.

  12. Preliminary test results from a free-piston Stirling engine technology demonstration program to support advanced radioisotope space power applications

    International Nuclear Information System (INIS)

    White, Maurice A.; Qiu Songgang; Augenblick, Jack E.

    2000-01-01

    Free-piston Stirling engines offer a relatively mature, proven, long-life technology that is well-suited for advanced, high-efficiency radioisotope space power systems. Contracts from DOE and NASA are being conducted by Stirling Technology Company (STC) for the purpose of demonstrating the Stirling technology in a configuration and power level that is representative of an eventual space power system. The long-term objective is to develop a power system with an efficiency exceeding 20% that can function with a high degree of reliability for up to 15 years on deep space missions. The current technology demonstration convertors (TDC's) are completing shakedown testing and have recently demonstrated performance levels that are virtually identical to projections made during the preliminary design phase. This paper describes preliminary test results for power output, efficiency, and vibration levels. These early results demonstrate the ability of the free-piston Stirling technology to exceed objectives by approximately quadrupling the efficiency of conventional radioisotope thermoelectric generators (RTG's)

  13. Preliminary test results from a free-piston Stirling engine technology demonstration program to support advanced radioisotope space power applications

    Science.gov (United States)

    White, Maurice A.; Qiu, Songgang; Augenblick, Jack E.

    2000-01-01

    Free-piston Stirling engines offer a relatively mature, proven, long-life technology that is well-suited for advanced, high-efficiency radioisotope space power systems. Contracts from DOE and NASA are being conducted by Stirling Technology Company (STC) for the purpose of demonstrating the Stirling technology in a configuration and power level that is representative of an eventual space power system. The long-term objective is to develop a power system with an efficiency exceeding 20% that can function with a high degree of reliability for up to 15 years on deep space missions. The current technology demonstration convertors (TDC's) are completing shakedown testing and have recently demonstrated performance levels that are virtually identical to projections made during the preliminary design phase. This paper describes preliminary test results for power output, efficiency, and vibration levels. These early results demonstrate the ability of the free-piston Stirling technology to exceed objectives by approximately quadrupling the efficiency of conventional radioisotope thermoelectric generators (RTG's).

  14. Safety of high speed ground transportation systems: X2000 US demonstration vehicle dynamics trials, preliminary test report. Report for October 1992-January 1993

    Energy Technology Data Exchange (ETDEWEB)

    Whitten, B.T.; Kesler, J.K.

    1993-01-01

    The report documents the procedures, events, and results of vehicle dynamic tests carried out on the ASEA-Brown Boveri (ABB) X2000 tilt body trainset in the US between October 1992 and January 1993. These tests, sponsored by Amtrak and supported by the FRA, were conducted to assess the suitability of the X2000 trainset for safe operation at elevated cant deficiencies and speeds in Amtrak's Northeast Corridor under existing track conditions in a revenue service demonstration. The report describes the safety criteria against which the performance of the X2000 test train was examined, the instrumentation used, the test locations, and the track conditions. Preliminary results are presented from tests conducted on Amtrak lines between Philadelphia and Harrisburg, PA, and between Washington DC and New York NY, in which cant deficiencies of 12.5 inches and speeds of 154 mph were reached in a safe and controlled manner. The significance of the results is discussed, and preliminary conclusions and recommendations are presented.

  15. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  16. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  17. A probability-based multi-cycle sorting method for 4D-MRI: A simulation study.

    Science.gov (United States)

    Liang, Xiao; Yin, Fang-Fang; Liu, Yilin; Cai, Jing

    2016-12-01

    To develop a novel probability-based sorting method capable of generating multiple breathing cycles of 4D-MRI images and to evaluate the performance of this new method by comparing it with conventional phase-based methods in terms of image quality and tumor motion measurement. Based on previous findings that the breathing motion probability density function (PDF) of a single breathing cycle is dramatically different from the true stabilized PDF that results from many breathing cycles, it is expected that a probability-based sorting method capable of generating multiple breathing cycles of 4D images may capture breathing variation information missing from conventional single-cycle sorting methods. The overall idea is to identify a few main breathing cycles (and their corresponding weightings) that can best represent the main breathing patterns of the patient and then reconstruct a set of 4D images for each of the identified main breathing cycles. This method is implemented in three steps: (1) the breathing signal is decomposed into individual breathing cycles, characterized by amplitude and period; (2) individual breathing cycles are grouped based on amplitude and period to determine the main breathing cycles. If a group contains more than 10% of all breathing cycles in a breathing signal, it is determined as a main breathing pattern group and is represented by the average of the individual breathing cycles in the group; (3) for each main breathing cycle, a set of 4D images is reconstructed using a result-driven sorting method adapted from our previous study. The probability-based sorting method was first tested on 26 patients' breathing signals to evaluate its feasibility for improving the target motion PDF. The new method was subsequently tested for a sequential image acquisition scheme on the 4D digital extended cardiac torso (XCAT) phantom.
Performance of the probability-based and conventional sorting methods was evaluated in terms of target volume precision and accuracy as measured
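
The three-step decomposition and grouping described in this abstract can be sketched as follows. The cycle-detection rule (splitting the trace at local minima), the bin widths, and the synthetic breathing trace are all illustrative assumptions, not the paper's implementation; the 10% group-size threshold is the one stated in the abstract.

```python
import math

def cycles_from_signal(sig, dt):
    """Step 1: split a breathing trace at local minima (assumed inhale starts)
    and characterize each cycle by its amplitude and period."""
    minima = [i for i in range(1, len(sig) - 1)
              if sig[i] <= sig[i - 1] and sig[i] < sig[i + 1]]
    cycles = []
    for a, b in zip(minima, minima[1:]):
        seg = sig[a:b + 1]
        cycles.append((max(seg) - min(seg), (b - a) * dt))
    return cycles

def main_cycles(cycles, amp_bin=0.5, per_bin=1.0, frac=0.10):
    """Steps 2-3: group cycles by binned (amplitude, period); any group holding
    more than `frac` of all cycles is a main breathing pattern, represented by
    its mean amplitude, mean period, and weighting."""
    groups = {}
    for amp, per in cycles:
        groups.setdefault((round(amp / amp_bin), round(per / per_bin)), []).append((amp, per))
    mains = []
    for members in groups.values():
        if len(members) > frac * len(cycles):
            n = len(members)
            mains.append((sum(a for a, _ in members) / n,
                          sum(p for _, p in members) / n,
                          n / len(cycles)))
    return mains

# Synthetic trace: 20 shallow 4 s cycles followed by 20 deep 6 s cycles
dt = 0.1
sig = []
for _ in range(20):
    sig += [1.0 - math.cos(2.0 * math.pi * k * dt / 4.0) for k in range(40)]
for _ in range(20):
    sig += [2.0 * (1.0 - math.cos(2.0 * math.pi * k * dt / 6.0)) for k in range(60)]

mains = main_cycles(cycles_from_signal(sig, dt))
print(sorted((round(p, 1), round(a, 1), round(w, 2)) for a, p, w in mains))
```

On this two-pattern trace the procedure recovers both breathing patterns with roughly equal weightings, which is exactly the variation information a single representative cycle would discard.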

  18. Mercury exposure on potential plant Ludwigia octovalvis L. - Preliminary toxicological testing

    Science.gov (United States)

    Alrawiq, Huda S. M.; Mushrifah, I.

    2013-11-01

    The preliminary test in phytoremediation is necessary to determine the ability of a plant to survive in media with different concentrations of contaminant. It was conducted to determine the maximum concentration of the contaminant that is harmful to the plant and suppresses plant growth. This study showed the ability of Ludwigia octovalvis to resist mercury (Hg) contamination in sand containing different concentrations of Hg (0, 0.5, 1, 2, 4, 6 and 8 mg/L). The experimental work was performed under greenhouse conditions for an observation period of 4 weeks. Throughout the 4-week duration, the results showed that 66.66% of the plants withered on exposure to an Hg concentration of 4 mg/L and 100% withered at the higher concentrations of 6 and 8 mg/L. The results of this study may serve as a basis for research that aims to study uptake and accumulation of Hg using potential phytoremediation plants.

  19. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698).
    * Incorporates more than 1,000 engaging problems with answers
    * Includes more than 300 solved examples
    * Uses varied problem solving methods

  20. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  1. Preliminary Analysis of the Transient Reactor Test Facility (TREAT) with PROTEUS

    Energy Technology Data Exchange (ETDEWEB)

    Connaway, H. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lee, C. H. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-30

    The neutron transport code PROTEUS has been used to perform preliminary simulations of the Transient Reactor Test Facility (TREAT). TREAT is an experimental reactor designed for the testing of nuclear fuels and other materials under transient conditions. It operated from 1959 to 1994, when it was placed on non-operational standby. The restart of TREAT to support the U.S. Department of Energy’s resumption of transient testing is currently underway. Both single assembly and assembly-homogenized full core models have been evaluated. Simulations were performed using a historic set of WIMS-ANL-generated cross-sections as well as a new set of Serpent-generated cross-sections. To support this work, further analyses were also performed using additional codes in order to investigate particular aspects of TREAT modeling. DIF3D and the Monte-Carlo codes MCNP and Serpent were utilized in these studies. MCNP and Serpent were used to evaluate the effect of geometry homogenization on the simulation results and to support code-to-code comparisons. New meshes for the PROTEUS simulations were created using the CUBIT toolkit, with additional meshes generated via conversion of selected DIF3D models to support code-to-code verifications. All current analyses have focused on code-to-code verifications, with additional verification and validation studies planned. The analysis of TREAT with PROTEUS-SN is an ongoing project. This report documents the studies that have been performed thus far, and highlights key challenges to address in future work.

  2. Testing a Preliminary Live with Love Conceptual Framework for cancer couple dyads: A mixed-methods study.

    Science.gov (United States)

    Li, Qiuping; Xu, Yinghua; Zhou, Huiya; Loke, Alice Yuen

    2015-12-01

    The purpose of this study was to test the previously proposed Preliminary Live with Love Conceptual Framework (P-LLCF), which focuses on spousal caregiver-patient couples in their journey of coping with cancer as dyads. A mixed-methods study that included qualitative and quantitative approaches was conducted. Methods of concept and theory analysis, and structural equation modeling (SEM), were applied in testing the P-LLCF. In the qualitative approach to testing the concepts included in the P-LLCF, a comparison was made between the P-LLCF and a preliminary conceptual framework derived from focus group interviews among Chinese couples coping with cancer. The comparison showed that the concepts identified in the P-LLCF are relevant to the phenomenon under scrutiny, and the attributes of the concepts are consistent with those identified among Chinese cancer couple dyads. In the quantitative study, 117 cancer couples were recruited. The findings showed that inter-relationships exist among the components included in the P-LLCF (event situation, dyadic mediators, dyadic appraisal, dyadic coping, and dyadic outcomes), in that the event situation will impact the dyadic outcomes directly or indirectly through the dyadic mediators. The dyadic mediators, dyadic appraisal, and dyadic coping are interrelated and work together to benefit the dyadic outcomes. This study provides evidence that supports the interlinked components and relationships included in the P-LLCF. The findings of this study are important in that they provide healthcare professionals with guidance and directions, according to the P-LLCF, on how to plan supportive programs for couples coping with cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. PLANT COMMUNITIES OF ALBANIA - A PRELIMINARY OVERVIEW

    Directory of Open Access Journals (Sweden)

    J. DRING

    2002-04-01

    The phytosociological analysis of Albania was initiated by F. Markgraf in the 1930s, but still remains incomplete. This is a preliminary list of the plant communities resulting from the literature and from field research carried out during the last years, and may represent a first contribution to further research. Many communities are described only by dominant species; others are quoted as nomina nuda. Some further syntaxa, probably present in the study area, are added.

  4. Contingency bias in probability judgement may arise from ambiguity regarding additional causes.

    Science.gov (United States)

    Mitchell, Chris J; Griffiths, Oren; More, Pranjal; Lovibond, Peter F

    2013-09-01

    In laboratory contingency learning tasks, people usually give accurate estimates of the degree of contingency between a cue and an outcome. However, if they are asked to estimate the probability of the outcome in the presence of the cue, they tend to be biased by the probability of the outcome in the absence of the cue. This bias is often attributed to an automatic contingency detection mechanism, which is said to act via an excitatory associative link to activate the outcome representation at the time of testing. We conducted 3 experiments to test alternative accounts of contingency bias. Participants were exposed to the same outcome probability in the presence of the cue, but different outcome probabilities in the absence of the cue. Phrasing the test question in terms of frequency rather than probability and clarifying the test instructions reduced but did not eliminate contingency bias. However, removal of ambiguity regarding the presence of additional causes during the test phase did eliminate contingency bias. We conclude that contingency bias may be due to ambiguity in the test question, and therefore it does not require postulation of a separate associative link-based mechanism.

  5. Implementing national nuclear safety plan at the preliminary stage of nuclear power project development

    International Nuclear Information System (INIS)

    Xue Yabin; Cui Shaozhang; Pan Fengguo; Zhang Lizhen; Shi Yonggang

    2014-01-01

    This study discusses, from a professional perspective, the importance of nuclear power project design and engineering methods at the preliminary stage of development for a nuclear power plant's operational safety. Specifically, we share our understanding of the national nuclear safety plan's requirements on new reactor accident probability, technology, and site selection, as well as on building and improving nuclear safety culture and strengthening public participation, with a focus on the plan's implications for the preliminary stage of nuclear power project development. Last, we introduce China Huaneng Group's work on preliminary nuclear power project development and the experience accumulated during the process. By analyzing the siting philosophy of nuclear power plants and the necessity of building a nuclear safety culture at the preliminary stage of nuclear power project development, this study explicates how to fully implement the nuclear safety plan's requirements at that stage. (authors)

  6. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  7. Long-term retention of a divided attention psycho-motor test combining choice reaction test and postural balance test: A preliminary study.

    Science.gov (United States)

    Rossi, R; Pascolo, P B

    2015-09-01

    Driving in degraded psychophysical conditions, such as under the influence of alcohol or drugs but also in a state of fatigue or drowsiness, is a growing problem. The current roadside tests used for detecting drugs in drivers suffer from various limitations, while impairment is subjective and does not necessarily correlate with the drug metabolite concentration found in body fluids. This work is a validation step towards a study of the feasibility of a novel test conceived to assess the psychophysical condition of individuals performing at-risk activities. Motor gestures, long-term retention, and the learning phase related to the protocol are analysed in unimpaired subjects. The protocol is a divided attention test, which combines a critical tracking test achieved with postural movements and a visual choice reaction test. Ten healthy subjects participated in a first set of trials and in a second set after about six months. Each session required carrying out the test ten times in order to investigate the learning effect and performance over repetitions. In the first set the subjects showed a learning trend up to the third trial, whilst in the second set of trials they showed motor retention. Nevertheless, the overall performance did not significantly improve. Gestures are probably retained due to the type of tasks and the way in which the instructions are conveyed to the subjects. Moreover, motor retention after a short training suggests that the protocol is easy to learn and understand. Implications for roadside test usage and comparison with current tests are also discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. The Valuation of Insurance under Uncertainty: Does Information about Probability Matter?

    OpenAIRE

    Carmela Di Mauro; Anna Maffioletti

    2001-01-01

    In a laboratory experiment we test the hypothesis that consumers' valuation of insurance is sensitive to the amount of information available on the probability of a potential loss. In order to test this hypothesis we simulate a market in which we elicit individuals' willingness to pay to insure against a loss characterised either by known or else vague probabilities. We use two distinct treatments by providing subjects with different information over the vague probabilities of loss. In genera...

  9. Environmental Survey preliminary report, Nevada Test Site, Mercury, Nevada

    International Nuclear Information System (INIS)

    1988-04-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy (DOE) Nevada Test Site (NTS), conducted June 22 through July 10, 1987. The Survey is being conducted by a multidisciplinary team of environmental specialists led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team members are outside experts being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with the NTS. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations and activities performed at the NTS, and interviews with site personnel. The Survey team developed a Sampling and Analysis Plan to assist in further assessing certain environmental problems identified during its on-site activities. The Sampling and Analysis Plan is being executed by the Battelle Columbus Division under contract with DOE. When completed, the results will be incorporated into the NTS Environmental Survey Interim Report. The Interim Report will reflect the final determinations of the NTS Survey. 165 refs., 42 figs., 52 tabs.

  10. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis.
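The distinction between pattern probabilities and transitional probabilities can be made concrete with a toy calculation. The sketch below (plain Python on a hypothetical two-tone 'H'/'L' stream, not the authors' stimulus code) estimates first-order transitional probabilities from a sequence and shows why an L-L-L deviant introduces a low-probability transition in an H-L-H stream, while a reversal deviant does not:

```python
from collections import Counter

def transition_probabilities(seq):
    """Estimate first-order transition probabilities P(next | current)."""
    pair_counts = Counter(zip(seq, seq[1:]))
    first_counts = Counter(seq[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Hypothetical two-tone streams built from H-L-H standard triplets.
standard = list("HLH" * 20)                      # standard triplets only
deviant = list("HLH" * 10 + "LLL" + "HLH" * 10)  # one L-L-L deviant triplet

p_std = transition_probabilities(standard)
p_dev = transition_probabilities(deviant)
# The L->L transition never occurs in the standard stream, so the L-L-L
# deviant introduces a low-probability transition. A reversal deviant
# (L-H-L inside an H-L-H stream) only reuses transitions (H->L, L->H)
# that are already frequent, which is why it is the hardest to detect.
```

Note that the triplet-boundary transition H->H occurs in the standard stream too, so only transitions genuinely absent from the standard (here L->L) are informative.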

  11. Preliminary topical report on comparison reactor disassembly calculations

    International Nuclear Information System (INIS)

    McLaughlin, T.P.

    1975-11-01

    Preliminary results of comparison disassembly calculations for a representative LMFBR model (2100-l voided core) and arbitrary accident conditions are described. The analytical methods employed were the computer programs: FX2-POOL, PAD, and VENUS-II. The calculated fission energy depositions are in good agreement, as are measures of the destructive potential of the excursions, kinetic energy, and work. However, in some cases the resulting fuel temperatures are substantially divergent. Differences in the fission energy deposition appear to be attributable to residual inconsistencies in specifying the comparison cases. In contrast, temperature discrepancies probably stem from basic differences in the energy partition models inherent in the codes. Although explanations of the discrepancies are being pursued, the preliminary results indicate that all three computational methods provide a consistent, global characterization of the contrived disassembly accident

  12. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  13. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    International Nuclear Information System (INIS)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-01-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application
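The posttest probabilities described here follow from Bayes' theorem applied to a dichotomous test result. The sketch below is a minimal illustration with invented sensitivity/specificity figures, not the pooled literature values used in the study:

```python
def posttest_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' theorem for a dichotomous test result.

    Positive result: P(D|+) = sens*p / (sens*p + (1-spec)*(1-p))
    Negative result: P(D|-) = (1-sens)*p / ((1-sens)*p + spec*(1-p))
    """
    p = pretest
    if positive:
        num = sensitivity * p
        den = num + (1 - specificity) * (1 - p)
    else:
        num = (1 - sensitivity) * p
        den = num + specificity * (1 - p)
    return num / den

# Illustrative figures only: 30% pretest probability of coronary disease,
# and a stress test assumed to have 70% sensitivity / 80% specificity.
p1 = posttest_probability(0.30, 0.70, 0.80, positive=True)   # 0.60
# Provided the tests are conditionally independent (the condition checked
# in the study), a second test is chained by reusing p1 as the new pretest.
p2 = posttest_probability(p1, 0.85, 0.75, positive=True)
```

The conditional-independence check in the abstract matters precisely because this chaining step treats the second test's likelihood ratio as unaffected by the first test's result.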

  14. Preliminary tests of a high speed vertical axis windmill model

    Energy Technology Data Exchange (ETDEWEB)

    South, P; Rangi, R S

    1971-01-01

    This report discusses a fixed-pitch vertical axis windmill that combines the inherent simplicity of this type of machine with a high aerodynamic efficiency and a high relative velocity. A three-bladed rotor was selected as the basic design, having constant chord symmetric airfoil blades configured in a catenary curve such that the rotor diameter is equal to the rotor height. In wind tunnel tests using a 30 inch scale model, it was found that once this rotor was given a very low rotational speed, it picked up speed and ran at a rotor tip velocity/wind speed ratio greater than 1. The number of blades was varied in the testing. A maximum power coefficient of 0.67 was achieved at 17 ft/s wind speed at a tip speed/wind speed ratio of 7.25 for a 2-bladed rotor. Increasing the number of blades above 3 did not result in higher power. The rotor could operate in gusts which double the mean wind velocity. Examination of Reynolds number effects, and taking into account the scale of the model, it was concluded that a full-scale windmill could run at lower velocity ratios than those predicted by the model tests, and that it could self-start under no-load conditions if the cut-in rpm are at least half the rpm for maximum power at the prevailing wind speed. Preliminary estimates show that a 15 ft diameter windmill of this design, designed to operate with a safety factor of 2.5 up to a maximum wind speed of 60 ft/s, would weigh ca 150 lb and could be marketed for ca $60.00, excluding the driven unit, if sufficient quantities were produced to make tooling costs negligible. Similarly, a 30 ft windmill would weigh ca 1000 lb and cost ca $400.00. 2 refs., 6 figs.

  15. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    OpenAIRE

    Matha, Denis; Sandner, Frank; Molins i Borrell, Climent; Campos Hortigüela, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provide...

  16. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations that transition probabilities of the hopping algorithms in surface hopping calculations must obey to assure the equality between the average quantum and classical populations are derived. These equations are solved for two particular cases. In the first it is assumed that probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations for all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is, then, a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoiding crossings which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method

  17. Preliminary research on eddy current bobbin quantitative test for heat exchange tube in nuclear power plant

    Science.gov (United States)

    Qi, Pan; Shao, Wenbin; Liao, Shusheng

    2016-02-01

    For research on the quantitative detection of defects in heat transfer tubes in nuclear power plants (NPP), two parts of work were carried out, with cracks as the main research object. (1) Production optimization of calibration tubes. Firstly, ASME, RSEM and homemade crack calibration tubes were applied to quantitatively analyze defect depths on other designed crack test tubes, and a judgment of which crack calibration tube yields more accurate quantitative results is given. Based on that, a weight analysis of factors influencing quantitative crack-depth measurement, such as crack orientation, length and volume, can be undertaken, which will optimize the manufacturing technology of calibration tubes. (2) Quantitative optimization of crack depth. A neural network model with multiple calibration curves, adopted to optimize the measured depth of natural cracks generated in in-service tubes, shows a preliminary ability to improve quantitative accuracy.

  18. Preliminary results of Resistive Plate Chambers operated with eco-friendly gas mixtures for application in the CMS experiment

    CERN Document Server

    Abbrescia, Marcello; Benussi, Luigi; Bianco, Stefano; Cauwenbergh, Simon Marc D; Ferrini, Mauro; Muhammad, Saleh; Passamonti, L; Pierluigi, Daniele; Piccolo, Davide; Primavera, Federica; Russo, Alessandro; Saviano, G; Tytgat, Michael

    2016-01-01

    The operation of Resistive Plate Chambers in LHC experiments requires F-based gases for optimal performance. Recent regulations demand that the use of environmentally unfriendly F-based gases be limited or banned. In view of the CMS experiment upgrade, several tests are ongoing to measure the performance of the detector in terms of efficiency, streamer probability, induced charge and time resolution. Prototype chambers with readout pads and with the standard CMS electronic setup are under test. In this talk preliminary results on the performance of RPCs operated with a potential eco-friendly gas candidate, 1,3,3,3-Tetrafluoropropene (commercially known as HFO-1234ze), and with CO2-based gas mixtures are presented and discussed for possible application in the CMS experiment.

  19. MALLARD REPRODUCTIVE TESTING IN A POND ENVIRONMENT: A PRELIMINARY STUDY

    Science.gov (United States)

    A 2-year preliminary study was conducted on mallard ducks to determine the feasibility of using outdoor pond enclosures for reproductive studies and to evaluate the effects of the insecticide chlorpyrifos on mallard reproduction. No significant reproductive effects were observed ...

  20. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  1. Adaptation and Preliminary Testing of the Developmental Coordination Disorder Questionnaire (DCDQ) for Children in India.

    Science.gov (United States)

    Patel, Priya; Gabbard, Carl

    2017-05-01

    While Developmental Coordination Disorder (DCD) has gained worldwide attention, in India it is relatively unknown. The revised DCD Questionnaire (DCDQ'07) is one of the most utilized screening tools for DCD. The aim of this study was to translate the DCDQ'07 into the Hindi language (DCDQ-Hindi) and test its basic psychometric properties. The DCDQ'07 was translated following guidelines for cross-cultural adaptation of instruments. Parents of 1100 children (5-15 years) completed the DCDQ-Hindi, of which 955 were considered for data analysis and 60 were retested randomly after 3 weeks for test-retest reliability. The DCDQ-Hindi showed high internal consistency (α = .86) and moderate test-retest reliability (.73). Confirmatory factor analysis showed equivalence to the DCDQ'07. The percentage of probable DCD using DCDQ'07 cutoff scores (≤57) ranged from 22% to 68%. Using more stringent cutoffs (≤36), it ranged from 5% to 9%. A significant difference was seen for gender (p < .05) in subset 1 (gross-motor skills) total scores. The DCDQ-Hindi reveals promise for initial identification of Hindi-speaking Indian children with DCD. Based on more stringent cutoff scores, the "probable prevalence" of children at risk of DCD in India appears to be around 6-7%. Research with a larger sample and comparison with the MABC-2 or equivalent is needed.

  2. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  3. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
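The regressive effect described here can be sketched with a small simulation. Assuming the noise takes the form of a probability-d read error on each remembered instance (one common formalization of this kind of model; the authors' exact implementation may differ), the expected estimate becomes (1 - 2d)p + d, which pulls estimates toward 0.5:

```python
import random

def noisy_estimate(p, d, n=10_000, rng=None):
    """Simulated descriptive probability estimate under a
    'probability theory plus noise' account: the reasoner counts
    event occurrences over n remembered instances, but each instance
    is misread (flipped) with probability d."""
    rng = rng or random.Random()
    hits = 0
    for _ in range(n):
        occurred = rng.random() < p      # did the event actually occur?
        if rng.random() < d:             # random read error flips the record
            occurred = not occurred
        hits += occurred
    return hits / n

# Expected estimate is (1 - 2*d)*p + d, regressing toward 0.5:
# e.g. p = 0.1 with d = 0.2 gives an expected estimate of 0.26.
```

Note how a true probability of 0.1 is over-estimated and, symmetrically, 0.9 is under-estimated, which is the regressive pattern the model explains.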

  4. Sequential probability ratio controllers for safeguards radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles
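A Wald sequential probability ratio test for Poisson count data, of the kind these controllers implement, can be sketched in a few lines. The rates, error tolerances and interval structure below are illustrative assumptions, not the monitors' actual parameters:

```python
import math

def poisson_sprt(counts, bkg_rate, alarm_rate, alpha=0.001, beta=0.1):
    """Wald sequential probability ratio test on Poisson counts.

    H0: mean counts per interval = bkg_rate   (normal background)
    H1: mean counts per interval = alarm_rate (source present)
    alpha/beta are the targeted false-alarm and missed-detection rates.
    """
    upper = math.log((1 - beta) / alpha)   # crossing -> accept H1 (alarm)
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0 (background)
    llr = 0.0
    for i, n in enumerate(counts, start=1):
        # log-likelihood ratio contribution of one Poisson observation
        llr += n * math.log(alarm_rate / bkg_rate) - (alarm_rate - bkg_rate)
        if llr >= upper:
            return "alarm", i
        if llr <= lower:
            return "background", i
    return "undecided", len(counts)

# With a background of 5 counts/interval and an alarm level of 8, a steady
# stream of 8-count intervals crosses the alarm boundary after a handful
# of short intervals rather than one long fixed counting time.
```

This is exactly the mechanism by which the fixed-interval controller in the abstract shortens the average monitoring time: the test stops as soon as either Wald boundary is crossed.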

  5. Emotional profiles to the Rorschach test in subjects affected by Central Serous Chorioretinopathy: preliminary observations

    Directory of Open Access Journals (Sweden)

    Giovanna Gioffrè

    2013-05-01

    Psychological variables could be related to disorders of vision, with particular interest in depressive features, but little attention has been paid to dimensions such as stress and anxiety. Psychological stress, associated with hyperactivation of the sympathetic autonomic nervous system, is considered the most important risk factor for a rare disorder of vision, Central Serous Chorioretinopathy (CSC), whose etiology has not yet been clarified. This study examines the psychological literature regarding CSC and explores in a preliminary way, through the projective methods of the Rorschach test, any correlations between personality variables and predisposition to CSC.

  6. First simultaneous measurement of fission and gamma probabilities of 237U and 239Np via surrogate reactions

    Directory of Open Access Journals (Sweden)

    Marini P.

    2016-01-01

    Fission and gamma decay probabilities of 237U and 239Np have been measured, for the first time simultaneously in dedicated experiments, via the surrogate reactions 238U(3He,4He) and 238U(3He,d), respectively. While a good agreement between our data and neutron-induced data is found for fission probabilities, gamma decay probabilities are several times higher than the corresponding neutron-induced data for each studied nucleus. We study the role of the different spin distributions populated in the surrogate and neutron-induced reactions. The compound-nucleus spin distribution populated in the surrogate reaction is extracted from the measured gamma-decay probabilities and used as an input parameter in the statistical model to predict fission probabilities, which are compared to our data. A strong disagreement between our data and the prediction is obtained. Preliminary results from an additional dedicated experiment confirm the observed discrepancies, indicating the need for a better understanding of the formation and decay processes of the compound nucleus.

  7. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
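COVAL itself performs a numerical transformation of the input distributions; a Monte Carlo sketch of the same problem (the distribution of a function of random variables, here a strength-minus-load safety margin echoing the reliability application mentioned above) might look as follows. All distributions and parameters are invented for illustration:

```python
import random
import statistics

def compound_distribution(func, samplers, n=50_000, rng=None):
    """Monte Carlo sample of the distribution of func(X1, ..., Xk),
    given a sampler for each input variable."""
    rng = rng or random.Random()
    return [func(*(draw(rng) for draw in samplers)) for _ in range(n)]

# Reliability-flavored example: safety margin = strength - load,
# both normally distributed (all parameters invented for illustration).
load = lambda rng: rng.gauss(100.0, 15.0)
strength = lambda rng: rng.gauss(160.0, 20.0)

margins = compound_distribution(lambda s, l: s - l, [strength, load],
                                rng=random.Random(1))
p_fail = sum(m < 0 for m in margins) / len(margins)  # P(load > strength)
# The margin is N(60, 25), so p_fail should be near Phi(-2.4), about 0.008.
```

Monte Carlo trades COVAL's deterministic numerical accuracy for generality: any function and any samplable input distributions can be combined without analytical work.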

  8. Closed Form Aliasing Probability For Q-ary Symmetric Errors

    Directory of Open Access Journals (Sweden)

    Geetani Edirisooriya

    1996-01-01

    In Built-In Self-Test (BIST) techniques, test data reduction can be achieved using Linear Feedback Shift Registers (LFSRs). A faulty circuit may escape detection due to loss of information inherent to data compaction schemes. This is referred to as aliasing. The probability of aliasing in Multiple-Input Shift-Registers (MISRs) has been studied under various bit error models. By modeling the signature analyzer as a Markov process we show that the closed-form expression derived previously for the aliasing probability of MISRs with primitive polynomials under the q-ary symmetric error model holds for all MISRs irrespective of their feedback polynomials, and for group cellular automata signature analyzers as well. If the erroneous behaviour of a circuit can be modelled with q-ary symmetric errors, then the test circuit complexity and propagation delay associated with the signature analyzer can be minimized by using a set of m single-bit LFSRs without increasing the probability of aliasing.
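The Markov-process view of the signature analyzer can be checked numerically. The sketch below treats only the binary (q = 2) special case with a hypothetical 4-bit LFSR, propagating the exact distribution of the error remainder; for long sequences the chain mixes toward the uniform distribution, so the aliasing probability approaches 2**-k. The polynomial and parameters are illustrative, not taken from the paper:

```python
def lfsr_step(state, taps=0b0011, nbits=4):
    """One shift of an LFSR; the default taps encode x^4 + x + 1 (primitive)."""
    feedback = (state >> (nbits - 1)) & 1
    state = (state << 1) & ((1 << nbits) - 1)
    if feedback:
        state ^= taps
    return state

def aliasing_probability(eps, n_cycles, nbits=4):
    """Exact aliasing probability of a single-bit LFSR signature register
    under i.i.d. input bit errors of probability eps, computed by
    propagating the Markov chain over error-remainder states."""
    size = 1 << nbits
    dist = [0.0] * size
    dist[0] = 1.0            # error remainder starts at zero
    p_no_error = 1.0         # probability that no error ever occurred
    for _ in range(n_cycles):
        nxt = [0.0] * size
        for s, p in enumerate(dist):
            t = lfsr_step(s, nbits=nbits)
            nxt[t] += p * (1 - eps)   # error-free cycle
            nxt[t ^ 1] += p * eps     # error bit folded into the remainder
        dist = nxt
        p_no_error *= 1 - eps
    # Aliasing: the error remainder returns to zero although errors occurred.
    return (dist[0] - p_no_error) / (1 - p_no_error)

# For long test sequences the aliasing probability tends to 2**-nbits
# (here 1/16), independent of eps, illustrating the kind of closed-form
# limit the paper generalizes.
```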

  9. Preliminary design of steam reformer in out-pile demonstration test facility for HTTR heat utilization system

    Energy Technology Data Exchange (ETDEWEB)

    Haga, Katsuhiro; Hino, Ryutaro; Inagaki, Yosiyuki; Hata, Kazuhiko; Aita, Hideki; Sekita, Kenji; Nishihara, Tetsuo; Sudo, Yukio [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Yamada, Seiya

    1996-11-01

    One of the key objectives of HTTR is to demonstrate effectiveness of high-temperature nuclear heat utilization system. Prior to connecting a heat utilization system to HTTR, an out-pile demonstration test is indispensable for the development of experimental apparatuses, operational control and safety technology, and verification of the analysis code of safety assessment. For the first heat utilization system of HTTR, design of the hydrogen production system by steam reforming is going on. We have proposed the out-pile demonstration test plan of the heat utilization system and conducted preliminary design of the test facility. In this report, design of the steam reformer, which is the principal component of the test facility, is described. In the course of the design, two types of reformers are considered. The one reformer contains three reactor tubes and the other contains one reactor tube to reduce the construction cost of the test facility. We have selected the steam reformer operational conditions and structural specifications by analyzing the steam reforming characteristics and component structural strength for each type of reformer. (author)

  10. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  11. A preliminary study in osteoinduction by a nano-crystalline hydroxyapatite in the mini pig

    Directory of Open Access Journals (Sweden)

    Werner Götz

    2010-04-01

    To test the probable osteoinductive properties of NanoBone®, a new highly non-sintered porous nano-crystalline hydroxylapatite bone substitute embedded into a silica gel matrix, granules were implanted subcutaneously and intramuscularly into the back region of 18 mini pigs. After periods of 5 and 10 weeks as well as 4 and 8 months, implantation sites were investigated using histological and histomorphometric procedures. Signs of early osteogenesis could already be detected after 5 weeks. The later periods were characterized by increasing membranous osteogenesis in and around the granules, leading to the formation of bone-like structures showing periosteal and tendon-like structures with bone marrow and focal chondrogenesis. Bone formation was better in the subcutaneous than in the intramuscular implantation sites. This ectopic osteogenesis is discussed with regard to the nanoporosity and microporosity of the material, physico-chemical interactions at its surface, the differentiation of osteoblasts, the role of angiogenesis and the probable involvement of growth factors. The results of this preliminary study indicate that this biomaterial has osteoinductive potential and induces the formation of bone structures, mainly in subcutaneous adipose tissue in the pig.

  12. Using probability modelling and genetic parentage assignment to test the role of local mate availability in mating system variation.

    Science.gov (United States)

    Blyton, Michaela D J; Banks, Sam C; Peakall, Rod; Lindenmayer, David B

    2012-02-01

    The formal testing of mating system theories with empirical data is important for evaluating the relative importance of different processes in shaping mating systems in wild populations. Here, we present a generally applicable probability modelling framework to test the role of local mate availability in determining a population's level of genetic monogamy. We provide a significance test for detecting departures in observed mating patterns from model expectations based on mate availability alone, allowing the presence and direction of behavioural effects to be inferred. The assessment of mate availability can be flexible and in this study it was based on population density, sex ratio and spatial arrangement. This approach provides a useful tool for (1) isolating the effect of mate availability in variable mating systems and (2) in combination with genetic parentage analyses, gaining insights into the nature of mating behaviours in elusive species. To illustrate this modelling approach, we have applied it to investigate the variable mating system of the mountain brushtail possum (Trichosurus cunninghami) and compared the model expectations with the outcomes of genetic parentage analysis over an 18-year study. The observed level of monogamy was higher than predicted under the model. Thus, behavioural traits, such as mate guarding or selective mate choice, may increase the population level of monogamy. We show that combining genetic parentage data with probability modelling can facilitate an improved understanding of the complex interactions between behavioural adaptations and demographic dynamics in driving mating system variation. © 2011 Blackwell Publishing Ltd.

  13. Development and Preliminary Testing of a High Precision Long Stroke Slit Change Mechanism for the SPICE Instrument

    Science.gov (United States)

    Paciotti, Gabriel; Humphries, Martin; Rottmeier, Fabrice; Blecha, Luc

    2014-01-01

    In the frame of ESA's Solar Orbiter scientific mission, Almatech has been selected to design, develop and test the Slit Change Mechanism of the SPICE (SPectral Imaging of the Coronal Environment) instrument. In order to guarantee the optical cleanliness level while fulfilling stringent positioning accuracy and repeatability requirements for slit positioning in the optical path of the instrument, a linear guiding system based on a double flexible blade arrangement has been selected. The four different slits to be used for the SPICE instrument resulted in a total stroke of 16.5 mm in this linear slit changer arrangement. The combination of long stroke and high precision positioning requirements has been identified as the main design challenge to be validated through breadboard models testing. This paper presents the development of SPICE's Slit Change Mechanism (SCM) and the two-step validation tests successfully performed on breadboard models of its flexible blade support system. The validation test results have demonstrated the full adequacy of the flexible blade guiding system implemented in SPICE's Slit Change Mechanism in a stand-alone configuration. Further breadboard test results, studying the influence of the compliant connection to the SCM linear actuator on an enhanced flexible guiding system design, have shown significant enhancements in the positioning accuracy and repeatability of the selected flexible guiding system. Preliminary evaluation of the linear actuator design, including a detailed tolerance analysis, has shown the suitability of this satellite roller screw based mechanism for the actuation of the tested flexible guiding system and compliant connection. The presented development and preliminary testing of the high-precision long-stroke Slit Change Mechanism for the SPICE Instrument are considered fully successful such that future tests considering the full Slit Change Mechanism can be performed, with the gained confidence, directly on a

  14. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1991-11-01

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; these preliminary conclusions are based on the possible occurrence of three events and the presence of one feature: namely, the events ''attempted boreholes over rooms and drifts,'' ''mining alters ground-water regime,'' and ''water-withdrawal wells provide alternate pathways,'' and the feature ''brine pocket below room or drift.'' Calculation of the WIPP system's overall probability distribution was performed for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events and features.
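
    The scenario-based Monte Carlo idea described above can be sketched as follows. All event probabilities and release distributions here are invented for illustration only (they are not WIPP assessment values): each sampled future decides which of the three events occur, and an empirical complementary cumulative distribution (CCDF) of cumulative release is accumulated over the sampled scenarios.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical occurrence probabilities for the three events (illustrative
    # only, not values from the WIPP assessment)
    p_event = {"boreholes": 0.3, "mining": 0.2, "wells": 0.1}

    # Hypothetical cumulative release per sampled future: an undisturbed
    # baseline, scaled up by each event that occurs in that future
    def sample_release(events_occurring, rng):
        base = rng.lognormal(mean=-4.0, sigma=1.0)   # undisturbed performance
        return base * (10.0 ** events_occurring)     # each event scales release

    n = 20_000
    occurs = np.column_stack([rng.random(n) < p for p in p_event.values()])
    releases = np.array([sample_release(k, rng) for k in occurs.sum(axis=1)])

    # Empirical CCDF: probability that cumulative release exceeds R
    grid = np.logspace(-4, 3, 50)
    ccdf = [(releases > R).mean() for R in grid]
    ```

    The CCDF is the quantity compared against the EPA containment requirement; by construction it is non-increasing in the release level.
    
    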

  15. Environmental Survey preliminary report, Nevada Test Site, Mercury, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    1988-04-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy (DOE) Nevada Test Site (NTS), conducted June 22 through July 10, 1987. The Survey is being conducted by a multidisciplinary team of environmental specialists led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team members are outside experts being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with the NTS. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations and activities performed at the NTS, and interviews with site personnel. The Survey team developed a Sampling and Analysis Plan to assist in further assessing certain environmental problems identified during its on-site activities. The Sampling and Analysis Plan is being executed by the Battelle Columbus Division under contract with DOE. When completed, the results will be incorporated into the NTS Environmental Survey Interim Report. The Interim Report will reflect the final determinations of the NTS Survey. 165 refs., 42 figs., 52 tabs.

  16. Preliminary I&C Design for LORELEI

    International Nuclear Information System (INIS)

    Korotkin, S.; Kaufman, Y.; Guttmann, E. B.; Levy, S.; Amidan, D.; Gdalyho, B.; Cahana, T.; Ellenbogen, A.; Arad, M.; Weiss, Y.; Sasson, A.; Ferry, L.; Bourrelly, F.; Cohen, Y.

    2014-01-01

    This document summarizes the preliminary I&C design for the LORELEI experiment. The preliminary design deals with considerations regarding appropriate safety and service instrumentation. The closed-loop control rules determined for temperature and position will be implemented in the detailed design. The Computer Aided Operator Decisions System (CAODS) will be used for prediction of the hot spot temperature and the thickness of the oxidation layer using the Baker-Just correlation. The proposed hybrid simulation system, comprising both virtual and real hardware, will be incorporated for LORELEI verification. It will perform both integration cold tests for a partial hardware loop and virtual tests for the final I&C design.

  17. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    Approaches to software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs encountered in actual use. The test inputs for a safety-critical application such as the reactor protection system (RPS) of a nuclear power plant are the inputs that cause activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that the test need not be repeated for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
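
    The core observation above — discrete digital inputs plus a deterministic response mean each input need only be tested once, and the failure probability is the profile-weighted fraction of failing inputs — can be sketched in a few lines. The trip logic, setpoint, and uniform profile below are toy assumptions for illustration, not the paper's RPS model.

    ```python
    # Toy illustration: a 4-bit ADC yields 16 discrete digital input values;
    # the software should trip when the value reaches the setpoint.
    # A seeded off-by-one bug misses the boundary value.
    SETPOINT = 12

    def software_trip(x):   # implementation under test (buggy at x == SETPOINT)
        return x > SETPOINT

    def oracle_trip(x):     # required behaviour
        return x >= SETPOINT

    # Hypothetical input profile: probability of each digital value on a demand
    inputs = list(range(16))
    profile = [1 / 16] * 16          # uniform, for simplicity

    # Deterministic response => one test per distinct input value suffices;
    # failure probability = sum of profile weights of the failing inputs
    failure_prob = sum(p for x, p in zip(inputs, profile)
                       if software_trip(x) != oracle_trip(x))
    # Only the boundary input x == 12 exposes the fault, so failure_prob == 1/16
    ```

    In a realistic application the profile would be derived from plant data rather than assumed uniform, which is exactly the input-profile development step the abstract describes.
    
    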

  18. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    Science.gov (United States)

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1%; error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved.
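
    A minimal sketch of the normal-mixture idea underlying the abstract, with simulated data and invented parameters (this is not the paper's fitted E. coli model): fit a two-component normal mixture to zone diameters with EM, then compute the categorization error rate implied by a candidate susceptible breakpoint.

    ```python
    import math
    import numpy as np

    rng = np.random.default_rng(1)
    # Simulated inhibition zone diameters (mm): a resistant and a susceptible
    # subpopulation (illustrative values only)
    data = np.concatenate([rng.normal(12.0, 1.5, 300), rng.normal(24.0, 2.0, 700)])

    # EM for a two-component one-dimensional normal mixture
    w, mu, sd = np.array([0.5, 0.5]), np.array([10.0, 25.0]), np.array([3.0, 3.0])
    for _ in range(200):
        dens = w * np.exp(-0.5 * ((data[:, None] - mu) / sd) ** 2) \
               / (sd * math.sqrt(2 * math.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)       # E-step
        n = resp.sum(axis=0)                                # M-step
        w = n / len(data)
        mu = (resp * data[:, None]).sum(axis=0) / n
        sd = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / n)

    def norm_cdf(x, m, s):
        return 0.5 * (1.0 + math.erf((x - m) / (s * math.sqrt(2.0))))

    # Categorization error rate for a susceptible breakpoint c (diameter >= c
    # called susceptible): resistant component above c plus susceptible below c
    def error_rate(c):
        return w[0] * (1 - norm_cdf(c, mu[0], sd[0])) + w[1] * norm_cdf(c, mu[1], sd[1])
    ```

    Minimizing `error_rate` over candidate breakpoints, or declaring a ZMU where no breakpoint keeps it acceptably low, mirrors the optimization the abstract describes.
    
    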

  19. Constraint-based Student Modelling in Probability Story Problems with Scaffolding Techniques

    Directory of Open Access Journals (Sweden)

    Nabila Khodeir

    2018-01-01

    Constraint-based student modelling (CBM) is an important technique employed in intelligent tutoring systems to model student knowledge to provide relevant assistance. This paper introduces the Math Story Problem Tutor (MAST), a Web-based intelligent tutoring system for probability story problems, which is able to generate problems of different contexts, types and difficulty levels for self-paced learning. Constraints in MAST are specified at a low level of granularity to allow fine-grained diagnosis of the student error. Furthermore, MAST extends CBM to address errors due to misunderstanding of the narrative story. It can locate and highlight keywords that may have been overlooked or misunderstood, leading to an error. This is achieved by utilizing the role of sentences and keywords that are defined through the Natural Language Generation (NLG) methods deployed in the story problem generation. MAST also integrates CBM with scaffolding questions and feedback to provide various forms of help and guidance to the student. This allows the student to discover and correct any errors in his/her solution. MAST has been preliminarily evaluated empirically and the results show its potential effectiveness in tutoring students, with a decrease in the percentage of violated constraints along the learning curve. Additionally, there is a significant improvement in the results of the post-test exam in comparison to the pre-test exam for the students using MAST in comparison to those relying on the textbook.

  20. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
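
    The additivity violation mentioned above can be demonstrated numerically in the simplest case, d = 2, with two distinct one-dimensional subspaces (the angle theta and the state below are arbitrary illustrative choices). For such subspaces the lattice meet H1 ∧ H2 is {0} and the join H1 ∨ H2 is the whole space, so Kolmogorov additivity q(H1 ∨ H2) = q(H1) + q(H2) − q(H1 ∧ H2) generally fails, and the projectors do not commute.

    ```python
    import numpy as np

    # Two 1-D subspaces of a 2-D Hilbert space, spanned by |0> and by
    # cos(theta)|0> + sin(theta)|1>
    theta = 0.7
    u = np.array([1.0, 0.0])
    v = np.array([np.cos(theta), np.sin(theta)])
    P1 = np.outer(u, u)                # projector onto H1
    P2 = np.outer(v, v)                # projector onto H2

    psi = np.array([0.6, 0.8])         # normalized state vector

    p1 = psi @ P1 @ psi                # quantum probability q(H1)
    p2 = psi @ P2 @ psi                # q(H2)
    # For distinct 1-D subspaces: meet H1 ∧ H2 = {0}, join H1 ∨ H2 = H
    p_meet, p_join = 0.0, 1.0

    # Deviation from Kolmogorov additivity, alongside the projector commutator
    deviation = p_join - (p1 + p2 - p_meet)
    commutator = P1 @ P2 - P2 @ P1
    ```

    Both the deviation and the commutator vanish only when the subspaces are compatible (commuting projectors), which is the abstract's link between D(H1, H2) and [P(H1), P(H2)].
    
    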

  1. Preliminary tests in skull pediatric phantom for dosimetry in computerized tomography

    International Nuclear Information System (INIS)

    Martins, Elaine Wirney; Potiens, Maria da Penha de A.

    2014-01-01

    Computed tomography (CT) is one of the techniques in the field of radiology with the most striking technological advances in recent years. One reason for this is the increased number of channels associated with the increased power of the X-ray tube. These conditions allow the equipment to acquire slices at high speed, reducing the patient exposure time, an essential characteristic for the increase of its use in pediatric patients. In this context, a new pediatric skull simulator was developed, and the results of measurements performed in laboratory and clinical beams were analyzed, with the objective of establishing and using diagnostic reference levels, observing the risks of stochastic effects and assessing the reduction of absorbed doses in growing pediatric patients. Preliminary tests performed in clinical beams showed C_w values of 2.525 ± 0.212 mGy for the newly developed simulator and 3.362 ± 0.282 mGy for a simulator developed by IPEN called the standard, both within the uncertainty values of 8.4% and 14.4% suggested by TRS No. 457.

  2. IRIS: Proceeding Towards the Preliminary Design

    International Nuclear Information System (INIS)

    Carelli, M.; Miller, K.; Lombardi, C.; Todreas, N.; Greenspan, E.; Ninokata, H.; Lopez, F.; Cinotti, L.; Collado, J.; Oriolo, F.; Alonso, G.; Morales, M.; Boroughs, R.; Barroso, A.; Ingersoll, D.; Cavlina, N.

    2002-01-01

    The IRIS (International Reactor Innovative and Secure) project has completed the conceptual design phase and is moving towards completion of the preliminary design, scheduled for the end of 2002. Several other papers presented in this conference provide details on major aspects of the IRIS design. The three most innovative features which uniquely characterize IRIS are, in descending order of impact: 1. Safety-by-design, which takes maximum advantage of the integral configuration to eliminate from consideration some accidents, greatly lessen the consequence of other accident scenarios and decrease their probability of occurring; 2. Optimized maintenance, where the interval between maintenance shutdowns is extended to 48 months; and 3. Long core life, of at least four years without shuffling or partial refueling. Regarding feature 1, design and analyses will be supplemented by an extensive testing campaign to verify and demonstrate the performance of the integral components, individually as well as interactive systems. Test planning is being initiated. Test results will be factored into PRA analyses under an overall risk informed regulation approach, which is planned to be used in the IRIS licensing. Pre-application activities with NRC are also scheduled to start in mid 2002. Regarding feature 2, effort is being focused on advanced online diagnostics for the integral components, first of all the steam generators, which are the most critical component; several techniques are being investigated. Finally, a four year long life core design is well underway and some of the IRIS team members are examining higher enrichment, eight to ten year life cores which could be considered for reloads. (authors)

  3. Preliminary Study on Mg content of hard part(Test) of a benthic foraminifer from the inner shelf, off West Coast of India

    Digital Repository Service at National Institute of Oceanography (India)

    Khare, N.; Nigam, R.; Iyer, S.D.

    A preliminary study has been made of the trace element magnesium (Mg) in the test of the benthic species Bulimina exilis from a shallow sediment core (at 22 m water depth) off the west coast of India using an Electron Probe Microanalyser (EPMA). The Mg content is selected...

  4. A method for estimating failure rates for low probability events arising in PSA

    International Nuclear Information System (INIS)

    Thorne, M.C.; Williams, M.M.R.

    1995-01-01

    The authors develop a method for predicting failure rates and failure probabilities per event when, over a given test period or number of demands, no failures have occurred. A Bayesian approach is adopted to calculate a posterior probability distribution for the failure rate or failure probability per event subsequent to the test period. This posterior is then used to estimate effective failure rates or probabilities over a subsequent period of time or number of demands. In special circumstances, the authors' results reduce to the well-known rules of thumb, viz. 1/N and 1/T, where N is the number of demands during the test period with no failures and T is the test period with no failures. However, the authors are able to give strict conditions on the validity of these rules of thumb and to improve on them when necessary.
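
    A minimal sketch of the zero-failure Bayesian calculation, assuming flat priors (the paper's treatment is more general, but this choice recovers the 1/N and 1/T rules of thumb quoted above):

    ```python
    import math

    def rate_posterior_mean(T):
        # Zero failures in test time T, flat prior on the rate lambda:
        # the posterior is Exponential with scale 1/T, so the mean is 1/T
        return 1.0 / T

    def rate_upper_bound(T, conf=0.95):
        # Upper credible bound b with P(lambda < b) = conf:
        # b = -ln(1 - conf) / T  (about 3/T at 95%)
        return -math.log(1.0 - conf) / T

    def demand_posterior_mean(N):
        # Zero failures in N demands, flat prior on p: posterior is
        # Beta(1, N + 1) with mean 1/(N + 2), close to 1/N for large N
        return 1.0 / (N + 2)
    ```

    For example, 1000 failure-free test hours give a posterior mean rate of 1e-3 per hour and a 95% upper bound of about 3e-3 per hour.
    
    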

  5. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  6. Posterior probability of linkage and maximal lod score.

    Science.gov (United States)

    Génin, E; Martinez, M; Clerget-Darpoux, F

    1995-01-01

    To detect linkage between a trait and a marker, Morton (1955) proposed to calculate the lod score z(theta1) at a given value theta1 of the recombination fraction. If z(theta1) reaches +3 then linkage is concluded. However, in practice, lod scores are calculated for different values of the recombination fraction between 0 and 0.5 and the test is based on the maximum value of the lod score Zmax. The impact of this deviation of the test on the probability that in fact linkage does not exist, when linkage was concluded, is documented here. This posterior probability of no linkage can be derived by using Bayes' theorem. It is less than 5% when the lod score at a predetermined theta1 is used for the test. But, for a Zmax of +3, we showed that it can reach 16.4%. Thus, considering a composite alternative hypothesis instead of a single one decreases the reliability of the test. The reliability decreases rapidly when Zmax is less than +3. Given a Zmax of +2.5, there is a 33% chance that linkage does not exist. Moreover, the posterior probability depends not only on the value of Zmax but also jointly on the family structures and on the genetic model. For a given Zmax, the chance that linkage exists may then vary.
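
    The Bayes'-theorem step for the simple (fixed recombination fraction) case can be sketched as follows. The prior probability of linkage of 0.05 is an assumed illustrative value; the paper's actual figures for Zmax depend on the family structures and the genetic model and are not reproduced by this two-hypothesis sketch.

    ```python
    def posterior_no_linkage(lod, prior_linkage):
        """Posterior probability that linkage does NOT exist, given a lod score
        at a fixed recombination fraction (simple two-hypothesis case).
        A lod score z corresponds to a likelihood ratio of 10**z for linkage."""
        lr = 10.0 ** lod
        no_link = 1.0 - prior_linkage
        return no_link / (no_link + prior_linkage * lr)

    # With an assumed prior probability of linkage of 0.05, a lod score of +3
    # leaves under a 2% posterior chance of no linkage, below the 5% level
    p = posterior_no_linkage(3.0, 0.05)
    ```

    Lower lod scores raise the posterior probability of no linkage quickly, which is the qualitative behaviour the abstract reports for Zmax below +3.
    
    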

  7. Design and preliminary testing of a Bottom-Mounted Second Shutdown Drive Mechanism for the KJRR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sanghaun; Lee, Jin Haeng; Yoo, Yeon-Sik, E-mail: yooys@kaeri.re.kr; Cho, Yeong-Garp; Lee, Hyokwang; Sun, Jongoh; Ryu, Jeong Soo

    2016-10-15

    Highlights: • The basic design principle, features and characteristics of the BMSSDM for KJRR are described. • The current development status based on practical fabrications, performance tests, and evaluations is described. • We have verified that all of the BMSSDM components satisfied their design requirements. • All of the performance requirements are satisfied from the performance test results. • The endurance test results show there are no structural failures and the wear of the impact parts in the hydraulic cylinder assembly is negligible. - Abstract: The KiJang Research Reactor (KJRR) is now being designed and undergoing preliminary construction by the Korea Atomic Energy Research Institute (KAERI). The driving parts of the Second Shutdown Drive Mechanism (SSDM) for the KJRR are located in a Reactivity Control Mechanism (RCM) room below the reactor pool bottom. In this paper, the design principle and concept of the Bottom-Mounted SSDM (BMSSDM) for the KJRR are introduced. From the experimental evaluations of the design, fabrication and performance, we verified that all of the BMSSDM components in the current design and development status satisfy their design requirements.

  8. Preliminary results of Resistive Plate Chambers operated with eco-friendly gas mixtures for application in the CMS experiment

    International Nuclear Information System (INIS)

    Abbrescia, M.; Muhammad, S.; Saviano, G.; Auwegem, P. Van; Cauwenbergh, S.; Tytgat, M.; Benussi, L.; Bianco, S.; Passamonti, L.; Pierluigi, D.; Piccolo, D.; Primavera, F.; Russo, A.; Ferrini, M.

    2016-01-01

    The operation of Resistive Plate Chambers in the LHC experiments requires fluorine-based (F-based) gases for optimal performance. Recent European regulations demand that the use of environmentally unfriendly F-based gases be limited or banned. In view of the CMS experiment upgrade, several tests are ongoing to measure the performance of the detector with these new ecological gas mixtures, in terms of efficiency, streamer probability, induced charge and time resolution. Prototype chambers with readout pads and with the standard CMS electronic setup are under test. In this paper preliminary results on the performance of RPCs operated with a potential eco-friendly gas candidate, 1,3,3,3-Tetrafluoropropene (commercially known as HFO-1234ze), with CO2 and CF3I based gas mixtures are presented and discussed for possible application in the CMS experiment.

  10. Preliminary Validation of a New Measure of Negative Response Bias: The Temporal Memory Sequence Test.

    Science.gov (United States)

    Hegedish, Omer; Kivilis, Naama; Hoofien, Dan

    2015-01-01

    The Temporal Memory Sequence Test (TMST) is a new measure of negative response bias (NRB) that was developed to enrich the forced-choice paradigm. The TMST does not resemble the common structure of forced-choice tests and is presented as a temporal recall memory test. The validation sample consisted of 81 participants: 21 healthy control participants, 20 coached simulators, and 40 patients with acquired brain injury (ABI). The TMST had high reliability and significantly high positive correlations with the Test of Memory Malingering and Word Memory Test effort scales. Moreover, the TMST effort scales exhibited high negative correlations with the Glasgow Coma Scale, thus validating the previously reported association between probable malingering and mild traumatic brain injury. A suggested cutoff score yielded acceptable classification rates in the ABI group as well as in the simulator and control groups. The TMST appears to be a promising measure of NRB detection, with respectable rates of reliability and construct and criterion validity.

  11. Probability of a surface rupture offset beneath a nuclear test reactor

    International Nuclear Information System (INIS)

    Reed, J.W.; Meehan, R.L.; Crellin, G.L.

    1981-01-01

    A probabilistic analysis was conducted to determine the likelihood of a surface rupture offset of any size beneath the 50 megawatt General Electric Test Reactor (GETR), which is located at the Vallecitos Nuclear Center near Pleasanton, California. Geologic faults have been observed at the GETR site. These faults may be due to surface folds, landslides, or deep tectonic movement. They are referred to in the paper as 'existing faults;' however, use of this term does not imply that they are tectonic in origin. The objective of the analysis was to evaluate whether a conservative estimate of the probability of occurrence of a future fault movement is sufficiently low so that movement beneath the reactor building need not be considered as a design basis event. The reactor building is located between two existing faults which are approximately 1320 feet apart. If a fault movement occurs in the future, it is conservatively assumed to occur either on the existing faults or between the faults, or on a fault(s) and between the two faults at the same time. The probabilistic model included the possibility of movements occurring due to unknown, undiscovered faults in the region. For this part, movements were assumed to occur according to a Poisson process. For the possibility of new faults occurring due to the two existing faults, a hazard function was used which increases with time since the last offset. (orig./RW)

  12. Predicting non-square 2D dice probabilities

    Science.gov (United States)

    Pender, G. A. T.; Uhrin, M.

    2014-07-01

    The prediction of the final state probabilities of a general cuboid randomly thrown onto a surface is a problem that naturally arises in the minds of men and women familiar with regular cubic dice and the basic concepts of probability. Indeed, it was considered by Newton in 1664 (Newton 1967 The Mathematical Papers of Isaac Newton vol I (Cambridge: Cambridge University Press) pp 60-1). In this paper we make progress on the 2D problem (which can be realized in 3D by considering a long cuboid, or alternatively a rectangular cross-sectioned dreidel). For the two-dimensional case we suggest that the ratio of the probabilities of landing on each of the two sides is given by ((sqrt(k^2 + l^2) - k)/(sqrt(k^2 + l^2) - l)) * (arctan(l/k)/arctan(k/l)), where k and l are the lengths of the two sides. We test this theory both experimentally and computationally, and find good agreement between our theory, experimental and computational results. Our theory is known, from its derivation, to be an approximation for particularly bouncy or 'grippy' surfaces where the die rolls through many revolutions before settling. On real surfaces we would expect (and we observe) that the true probability ratio for a 2D die is somewhat closer to unity than predicted by our theory. This problem may also have wider relevance in the testing of physics engines.
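
    The quoted ratio is straightforward to evaluate numerically; the sanity checks below (which side the ratio favours is a convention of the formula, so only symmetry properties are asserted) are a quick way to exercise it:

    ```python
    import math

    def side_probability_ratio(k, l):
        """Ratio of landing probabilities for a 2-D rectangular die with side
        lengths k and l, per the formula quoted in the abstract."""
        d = math.hypot(k, l)                       # diagonal sqrt(k^2 + l^2)
        return ((d - k) / (d - l)) * (math.atan2(l, k) / math.atan2(k, l))

    # Symmetry checks implied by the formula:
    r_square = side_probability_ratio(1.0, 1.0)    # square die -> ratio of 1
    r = side_probability_ratio(2.0, 1.0)
    r_swap = side_probability_ratio(1.0, 2.0)      # swapping sides inverts it
    ```

    Note that the denominator d − l never vanishes for positive side lengths, since the diagonal strictly exceeds either side.
    
    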

  13. Preliminary safety evaluation (PSE) for Sodium Storage Facility at the Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Bowman, B.R.

    1994-01-01

    This evaluation was performed for the Sodium Storage Facility (SSF) which will be constructed at the Fast Flux Test Facility (FFTF) in the area adjacent to the South and West Dump Heat Exchanger (DHX) pits. The purpose of the facility is to allow unloading the sodium from the FFTF plant tanks and piping. The significant conclusion of this Preliminary Safety Evaluation (PSE) is that the only Safety Class 2 components are the four sodium storage tanks and their foundations. The building, because of its imminent risk to the tanks under an earthquake or high winds, will be Safety Class 3/2, which means the building has a Safety Class 3 function with the Safety Class 2 loads of seismic and wind factored into the design

  14. The preliminary tests of the superconducting electron cyclotron resonance ion source DECRIS-SC2.

    Science.gov (United States)

    Efremov, A; Bekhterev, V; Bogomolov, S; Drobin, V; Loginov, V; Lebedev, A; Yazvitsky, N; Yakovlev, B

    2012-02-01

    A new compact version of the "liquid He-free" superconducting ECR ion source, to be used as an injector of highly charged heavy ions for the MC-400 cyclotron, is designed and built at the Flerov Laboratory of Nuclear Reactions in collaboration with the Laboratory of High Energy Physics of JINR. The axial magnetic field of the source is created by the superconducting magnet and the NdFeB hexapole is used for the radial plasma confinement. The microwave frequency of 14 GHz is used for ECR plasma heating. During the first tests, the source shows a good enough performance for the production of medium charge state ions. In this paper, we will present the design parameters and the preliminary results with gaseous ions.

  15. Individual variation in social aggression and the probability of inheritance: theory and a field test.

    Science.gov (United States)

    Cant, Michael A; Llop, Justine B; Field, Jeremy

    2006-06-01

    Recent theory suggests that much of the wide variation in individual behavior that exists within cooperative animal societies can be explained by variation in the future direct component of fitness, or the probability of inheritance. Here we develop two models to explore the effect of variation in future fitness on social aggression. The models predict that rates of aggression will be highest toward the front of the queue to inherit and will be higher in larger, more productive groups. A third prediction is that, in seasonal animals, aggression will increase as the time available to inherit the breeding position runs out. We tested these predictions using a model social species, the paper wasp Polistes dominulus. We found that rates of both aggressive "displays" (aimed at individuals of lower rank) and aggressive "tests" (aimed at individuals of higher rank) decreased down the hierarchy, as predicted by our models. The only other significant factor affecting aggression rates was date, with more aggression observed later in the season, also as predicted. Variation in future fitness due to inheritance rank is the hidden factor accounting for much of the variation in aggressiveness among apparently equivalent individuals in this species.

  16. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  17. Cell emulation and preliminary results.

    Science.gov (United States)

    2016-07-01

    This report details preliminary results of the testing plan implemented by the Hawaii Natural Energy Institute to evaluate Electric Vehicle (EV) battery durability and reliability under electric utility grid operations. Commercial EV battery cells ar...

  18. AGR-2 Irradiated Test Train Preliminary Inspection and Disassembly First Look

    Energy Technology Data Exchange (ETDEWEB)

    Ploger, Scott [Idaho National Lab. (INL), Idaho Falls, ID (United States); Demkowciz, Paul [Idaho National Lab. (INL), Idaho Falls, ID (United States); Harp, Jason [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    The AGR 2 irradiation experiment began in June 2010 and was completed in October 2013. The test train was shipped to the Materials and Fuels Complex in July 2014 for post-irradiation examination (PIE). The first PIE activities included nondestructive examination of the test train, followed by disassembly of the test train and individual capsules and detailed inspection of the capsule contents, including the fuel compacts and their graphite fuel holders. Dimensional metrology was then performed on the compacts, graphite holders, and steel capsule shells. AGR 2 disassembly and metrology were performed with the same equipment used successfully on AGR 1 test train components. Gamma spectrometry of the intact test train gave a preliminary look at the condition of the interior components. No evidence of damage to compacts or graphite components was evident from the isotopic and gross gamma scans. Disassembly of the AGR 2 test train and its capsules was conducted rapidly and efficiently by employing techniques refined during the AGR 1 disassembly campaign. Only one major difficulty was encountered while separating the test train into capsules when thermocouples (of larger diameter than used in AGR 1) and gas lines jammed inside the through tubes of the upper capsules, which required new tooling for extraction. Disassembly of individual capsules was straightforward with only a few minor complications. On the whole, AGR 2 capsule structural components appeared less embrittled than their AGR 1 counterparts. Compacts from AGR 2 Capsules 2, 3, 5, and 6 were in very good condition upon removal. Only relatively minor damage or markings were visible using high resolution photographic inspection. Compact dimensional measurements indicated radial shrinkage between 0.8 to 1.7%, with the greatest shrinkage observed on Capsule 2 compacts that were irradiated at higher temperature. Length shrinkage ranged from 0.1 to 0.9%, with by far the lowest axial shrinkage on Capsule 3 compacts

  19. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  20. Development and preliminary testing of a computerized Animated Activity Questionnaire (AAQ) in patients with hip and knee osteoarthritis

    DEFF Research Database (Denmark)

    Peter, Wf; Loos, M; de Vet, Hcw

    2015-01-01

    Objective: To develop an Animated Activity Questionnaire (AAQ), based on video animations, for assessing activity limitations in patients with hip/knee osteoarthritis (OA), which combines the advantages of self-reported questionnaires and performance-based tests without many of their limitations, and to preliminarily assess its reliability and validity. We hypothesize that the AAQ correlates highly with performance-based tests, and moderately with self-reports. Methods: Item selection was based on 1) the pilot AAQ; 2) pre-specified conditions; 3) the International Classification of Functioning core set for OA; 4) existing measurement instruments; and 5) focus groups of patients. Test-retest reliability was assessed in 30/110 patients. In 110 patients, correlations were calculated between the AAQ and the self-reported Hip disability and Knee injury Osteoarthritis Outcome ADL subscale (H/KOOS). In 45/110 patients...

  1. Construction of PREMUX and preliminary experimental results, as preparation for the HCPB breeder unit mock-up testing

    Energy Technology Data Exchange (ETDEWEB)

    Hernández, F., E-mail: francisco.hernandez@kit.edu [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany); Kolb, M. [Karlsruhe Institute of Technology (KIT), Institute for Applied Materials (IAM-WPT) (Germany); Annabattula, R. [Indian Institute of Technology Madras (IITM), Department of Mechanical Engineering (India); Weth, A. von der [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany)

    2014-10-15

    Highlights: • PREMUX has been constructed in preparation for a future out-of-pile thermo-mechanical qualification of a HCPB breeder unit mock-up. • The rationale and constructive details of PREMUX are reported in this paper. • PREMUX serves as a test rig for the new heater system developed for the HCPB-BU mock-up. • PREMUX will be used as a benchmark for the thermal and thermo-mechanical models developed in ANSYS for the pebble beds of the HCPB-BU. • Preliminary results show the functionality of PREMUX and the good agreement of the measured temperatures with the thermal model developed in ANSYS. - Abstract: One of the European blanket designs for ITER is the Helium Cooled Pebble Bed (HCPB) blanket. The core of the HCPB-TBM consists of so-called breeder units (BUs), which enclose beryllium as a neutron multiplier and lithium orthosilicate (Li₄SiO₄) as a tritium breeder in the form of pebble beds. After the design phase of the HCPB-BU, a non-nuclear thermal and thermo-mechanical qualification program for this device is under way at the Karlsruhe Institute of Technology. Before the complex full-scale BU testing, a pre-test mock-up experiment (PREMUX) has been constructed, which consists of a slice of the BU containing the Li₄SiO₄ pebble bed. PREMUX will be operated under highly ITER-relevant conditions and has the following goals: (1) to serve as a test rig for the new heater concept based on a matrix of wire heaters, (2) to benchmark the existing finite element method (FEM) codes used for the thermo-mechanical assessment of the Li₄SiO₄ pebble bed, and (3) to measure in situ the thermal conductivity of the Li₄SiO₄ pebble bed during the tests. This paper describes the construction of PREMUX, its rationale, and the experimental campaign planned with the device. Preliminary results testing the algorithm used for the temperature reconstruction of the pebble bed are reported and compared qualitatively with first analyses

  2. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    The cumulative-binomial computer program CUMBIN, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557) are used independently of one another to analyze the reliabilities and availabilities of k-out-of-n systems. The programs are used by statisticians, users of statistical procedures, test planners, designers, and numerical analysts for calculations of reliability and availability. Program written in C.
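    The quantity CUMBIN computes can be sketched in a few lines. The direct summation below is an illustrative stand-in (the program's actual algorithm is not described in the record), together with the k-out-of-n system reliability use case mentioned in the abstract:

```python
import math

def cum_binom(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), by direct summation."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def k_out_of_n(k, n, p):
    """Reliability of a k-out-of-n system: it works if at least k of n
    components work, each independently with probability p."""
    return 1.0 - cum_binom(k - 1, n, p)
```

    For example, a 1-out-of-2 redundant system with component reliability 0.9 has system reliability 1 - 0.1² = 0.99.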

  3. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper

  4. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    Science.gov (United States)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

    Sagar Island, sitting on the continental shelf of the Bay of Bengal, is one of the deltas most vulnerable to extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best fit distribution models for the annual, seasonal and monthly time series based on maximum rank with minimum value of test statistics, three statistical goodness of fit tests, viz. the Kolmogorov-Smirnov (K-S), Anderson-Darling (A²) and Chi-Square (χ²) tests, were employed. The best fit probability distribution was then identified from the highest overall score obtained from the three goodness of fit tests. Results revealed that the normal distribution was best fitted for annual, post-monsoon and summer season MDR, while the Lognormal, Weibull and Pearson 5 distributions were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3% levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85%) for MDR of >100 mm and moderate probabilities (37 to 46%) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
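    To illustrate how a fitted distribution yields exceedance probabilities and return periods, the sketch below uses a normal model (the family reported as best fit for annual MDR). The mean and standard deviation in the example are hypothetical placeholders, not the study's fitted parameters:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def exceedance_prob(x, mu, sigma):
    """P(annual maximum daily rainfall > x mm) under the fitted normal model."""
    return 1.0 - normal_cdf(x, mu, sigma)

def return_period(x, mu, sigma):
    """Average recurrence interval (years) of an annual MDR exceeding x mm."""
    return 1.0 / exceedance_prob(x, mu, sigma)

# With hypothetical parameters mu = 75 mm, sigma = 30 mm, the median
# annual MDR (75 mm) has a return period of exactly 2 years by construction.
```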

  5. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  6. Joint Probability-Based Neuronal Spike Train Classification

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2009-01-01

    Full Text Available Neuronal spike trains are used by the nervous system to encode and transmit information. Euclidean distance-based methods (EDBMs) have been applied to quantify the similarity between temporally discretized spike trains and model responses. In this study, using the same discretization procedure, we developed and applied a joint probability-based method (JPBM) to classify individual spike trains of slowly adapting pulmonary stretch receptors (SARs). The activity of individual SARs was recorded in anaesthetized, paralysed adult male rabbits, which were artificially ventilated at a constant rate and one of three different volumes. Two-thirds of the responses to the 600 stimuli presented at each volume were used to construct three response models (one for each stimulus volume) consisting of a series of time bins, each with spike probabilities. The remaining one-third of the responses were used as test responses to be classified into one of the three model responses. This was done by computing the joint probability of observing the same series of events (spikes or no spikes) dictated by the test response in a given model, and determining which of the three probabilities was highest. The JPBM generally produced better classification accuracy than the EDBM, and both performed well above chance. Both methods were similarly affected by variations in discretization parameters, response epoch duration, and two different response alignment strategies. Increasing bin widths increased classification accuracy, which also improved with increased observation time, but primarily during periods of increasing lung inflation. Thus, the JPBM is a simple and effective method for spike train classification.
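    The classification rule described above — compute, for each model, the joint probability of the test response's spike/no-spike sequence and pick the largest — can be sketched as follows. The bin probabilities in the example are toy values, not recorded SAR data:

```python
import math

def joint_log_prob(response, model):
    """Log joint probability of a binary spike train (1 = spike, 0 = no spike)
    under a model giving a spike probability for each time bin."""
    eps = 1e-9  # guard against log(0) for bins the model deems (im)possible
    total = 0.0
    for spike, p in zip(response, model):
        p = min(max(p, eps), 1.0 - eps)
        total += math.log(p if spike else 1.0 - p)
    return total

def classify(response, models):
    """Index of the model under which the response is most probable."""
    return max(range(len(models)), key=lambda i: joint_log_prob(response, models[i]))
```

    Working in log-probabilities avoids numerical underflow for long spike trains; the argmax is unchanged.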

  7. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  8. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  9. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
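    A minimal sketch of the nearest-neighbour "probability machine" idea — estimating P(Y=1 | X=x) as the fraction of positive labels among the k nearest training points — is given below for one-dimensional data. It illustrates the principle only, not the specific algorithms or R packages evaluated in the paper:

```python
def knn_probability(x, data, labels, k=5):
    """Estimate P(Y=1 | X=x) as the fraction of 1-labels among the k nearest
    training points: nonparametric regression on a 0/1 response."""
    ranked = sorted(range(len(data)), key=lambda i: abs(data[i] - x))
    nearest = ranked[:k]
    return sum(labels[i] for i in nearest) / k
```

    With k growing (but k/n shrinking) as the sample size n grows, this estimator is consistent, which is the property the paper exploits.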

  10. Modeling and preliminary thermal analysis of the capsule for a creep test in HANARO

    International Nuclear Information System (INIS)

    Choi, Myoung Hwan; Cho, Man Soon; Choo, Kee Nam; Kang, Young Hwan; Sohn, Jae Min; Shin, Yoon Taeg; Park, Sung Jae; Kim, Bong Goo; Kim, Young Jin

    2005-01-01

    A creep capsule is a device to investigate the creep characteristics of nuclear materials during inpile irradiation tests. To obtain the design data of the capsule through a preliminary thermal analysis, a 2-dimensional model for the cross section of the capsule including the specimens and components is generated, and an analysis using the ANSYS program is performed. The gamma-heating rates of the materials for the HANARO power of 30MW are considered, and the effect of the gap size and the control rod position on the temperature of the specimen is discussed. From the analysis it is found that the gap between the thermal media and the external tube has a significant effect on the temperature of the specimen. The temperature by increasing the position of the control rod is decreased

  11. Fracture strength and probability of survival of narrow and extra-narrow dental implants after fatigue testing: In vitro and in silico analysis.

    Science.gov (United States)

    Bordin, Dimorvan; Bergamo, Edmara T P; Fardin, Vinicius P; Coelho, Paulo G; Bonfante, Estevam A

    2017-07-01

    To assess the probability of survival (reliability) and failure modes of narrow implants with different diameters. For fatigue testing, 42 implants with the same macrogeometry and internal conical connection were divided, according to diameter, as follows: narrow (Ø3.3×10 mm) and extra-narrow (Ø2.9×10 mm) (21 per group). Identical abutments were torqued to the implants, and standardized maxillary incisor crowns were cemented and subjected to step-stress accelerated life testing (SSALT) in water. The use-level probability Weibull curves and the reliability for missions of 50,000 and 100,000 cycles at 50, 100, 150 and 180 N were calculated. For the finite element analysis (FEA), two virtual models, simulating the samples tested in fatigue, were constructed. Loads of 50 and 100 N were applied 30° off-axis at the crown. The von Mises stress was calculated for implant and abutment. The beta (β) values were 0.67 for narrow and 1.32 for extra-narrow implants, indicating that failure rates did not increase with fatigue in the former, but were more likely associated with damage accumulation and wear-out failures in the latter. Both groups showed high reliability (up to 97.5%) at 50 and 100 N. A decreased reliability was observed for both groups at 150 and 180 N (ranging from 0 to 82.3%), but no significant difference was observed between groups. Failure predominantly involved abutment fracture for both groups. At the 50 N load, FEA showed higher von Mises stress in the abutment (7.75%) and implant (2%) of the Ø3.3 mm model compared with the Ø2.9 mm model. There was no significant difference between narrow and extra-narrow implants regarding probability of survival. The failure mode was similar for both groups, restricted to abutment fracture. Copyright © 2017 Elsevier Ltd. All rights reserved.
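    The "reliability for a mission" quantity above follows from the Weibull survival function R(n) = exp(-(n/η)^β). The sketch below uses the β values quoted in the abstract but a hypothetical characteristic life η, since the fitted η values are not given in the record:

```python
import math

def weibull_reliability(cycles, beta, eta):
    """Probability of surviving `cycles` under a Weibull(shape=beta,
    characteristic life=eta) fatigue-life model: R(n) = exp(-(n/eta)**beta)."""
    return math.exp(-((cycles / eta) ** beta))

# Hypothetical eta of 1e6 cycles with the extra-narrow group's beta = 1.32:
# reliability decreases monotonically with the mission length.
```

    A β below 1 (as for the narrow group, β = 0.67) implies a failure rate that does not increase with accumulated cycles, matching the abstract's interpretation.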

  12. A preliminary study in osteoinduction by a nano-crystalline hydroxyapatite in the mini pig.

    Directory of Open Access Journals (Sweden)

    Karsten K H Gundlach

    2011-04-01

    Full Text Available To test the probable osteoinductive properties of NanoBone, a new highly non-sintered porous nano-crystalline hydroxylapatite bone substitute embedded into a silica gel matrix, granules were implanted subcutaneously and intramuscularly into the back region of 18 mini pigs. After periods of 5 and 10 weeks as well as 4 and 8 months, implantation sites were investigated using histological and histomorphometric procedures. Signs of early osteogenesis could already be detected after 5 weeks. The later periods were characterized by increasing membranous osteogenesis in and around the granules leading to the formation of bone-like structures showing periosteal and tendon-like structures with bone marrow and focal chondrogenesis. Bone formation was better in the subcutaneous than in the intramuscular implantation sites. This ectopic osteogenesis is discussed with regard to the nanoporosity and microporosity of the material, physico-chemical interactions at its surface, the differentiation of osteoblasts, the role of angiogenesis and the probable involvement of growth factors. The results of this preliminary study indicate that this biomaterial has osteoinductive potential and induces the formation of bone structures, mainly in subcutaneous adipose tissue in the pig.

  13. A preliminary study in osteoinduction by a nano-crystalline hydroxyapatite in the mini pig.

    Science.gov (United States)

    Götz, Werner; Lenz, Solvig; Reichert, Christoph; Henkel, Kai-Olaf; Bienengräber, Volker; Pernicka, Laura; Gundlach, Karsten K H; Gredes, Tomasz; Gerber, Thomas; Gedrange, Tomasz; Heinemann, Friedhelm

    2010-12-01

    To test the probable osteoinductive properties of NanoBone, a new highly non-sintered porous nano-crystalline hydroxylapatite bone substitute embedded into a silica gel matrix, granules were implanted subcutaneously and intramuscularly into the back region of 18 mini pigs. After periods of 5 and 10 weeks as well as 4 and 8 months, implantation sites were investigated using histological and histomorphometric procedures. Signs of early osteogenesis could already be detected after 5 weeks. The later periods were characterized by increasing membranous osteogenesis in and around the granules leading to the formation of bone-like structures showing periosteal and tendon-like structures with bone marrow and focal chondrogenesis. Bone formation was better in the subcutaneous than in the intramuscular implantation sites. This ectopic osteogenesis is discussed with regard to the nanoporosity and microporosity of the material, physico-chemical interactions at its surface, the differentiation of osteoblasts, the role of angiogenesis and the probable involvement of growth factors. The results of this preliminary study indicate that this biomaterial has osteoinductive potential and induces the formation of bone structures, mainly in subcutaneous adipose tissue in the pig.

  14. Methodology and preliminary models for analyzing nuclear safeguards decisions

    International Nuclear Information System (INIS)

    1978-11-01

    This report describes a general analytical tool designed to assist the NRC in making nuclear safeguards decisions. The approach is based on decision analysis--a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material, demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria), and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  15. Analyses of moments in pseudorapidity intervals at √s = 546 GeV by means of two probability distributions in pure-birth process

    International Nuclear Information System (INIS)

    Biyajima, M.; Shirane, K.; Suzuki, N.

    1988-01-01

    Moments in pseudorapidity intervals at the CERN Sp̄pS collider (√s = 546 GeV) are analyzed by means of two probability distributions in the pure-birth stochastic process. Our results show that a probability distribution obtained from the Poisson distribution as an initial condition is more useful than that obtained from the Kronecker δ function. Analyses of moments by Koba-Nielsen-Olesen scaling functions derived from solutions of the pure-birth stochastic process are also made. Moreover, analyses of preliminary data at √s = 200 and 900 GeV are added

  16. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in fields as diverse as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness of fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to threshold values. Calculating the goodness of fit statistics also enables us to rank the fitted distributions according to how well they fit the data, which is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness of fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large data set is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness of fit tests.
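    Of the three tests compared, the Kolmogorov-Smirnov statistic is the simplest to sketch: the largest vertical distance between the empirical CDF and a candidate model CDF, with smaller values indicating a better fit. A minimal illustration with toy data (not the paper's data set):

```python
def ks_statistic(sample, cdf):
    """Kolmogorov-Smirnov distance: max gap between the empirical CDF of
    `sample` and a model `cdf`, checked just before and after each step."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        fx = cdf(x)
        d = max(d, abs((i + 1) / n - fx), abs(i / n - fx))
    return d

# Ranking candidate models: the fitted distribution with the smallest
# K-S distance is the best fit under this criterion.
```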

  17. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  18. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  19. Preliminary Tests in the NACA Tank to Investigate the Fundamental Characteristics of Hydrofoils

    Science.gov (United States)

    Ward, Kenneth E.; Land, Norman S.

    1940-01-01

    This preliminary investigation was made to study the hydrodynamic properties and general behavior of simple hydrofoils. Six 5- by 30-inch plain, rectangular hydrofoils were tested in the NACA tank at various speeds, angles of attack, and depths below the water surface. Two of the hydrofoils had sections representing the sections of commonly used airfoils, one had a section similar to one developed by Guidoni for use with hydrofoil-equipped seaplane floats, and three had sections designed to have constant chordwise pressure distributions at given values of the lift coefficient for the purpose of delaying the speed at which cavitation begins. The experimental results are presented as curves of the lift and drag coefficients plotted against speed for the various angles of attack and depths at which the hydrofoils were tested. A number of derived curves are included to better compare the characteristics of the hydrofoils and to show the effects of depth. Several representative photographs show the development of cavitation on the upper surface of the hydrofoils. The results indicate that properly designed hydrofoil sections will have excellent characteristics and that the speed at which cavitation occurs may be delayed to an appreciable extent by the use of suitable sections.

  20. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
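    The PGF approach can be illustrated for the simplest composite case: the number of compliant items among independent checks with unequal success probabilities (the Poisson-binomial distribution). Multiplying out the generating function ∏ᵢ(1 - pᵢ + pᵢz) yields the exact pmf with no approximation; this is a sketch of the principle, not the authors' chart construction:

```python
def poisson_binomial_pmf(probs):
    """Exact pmf of the number of successes among independent Bernoulli(p_i)
    trials. The PGF is prod_i (1 - p_i + p_i*z); multiplying the factors
    out as polynomials makes the coefficient of z^k equal P(K = k)."""
    coeffs = [1.0]
    for p in probs:
        nxt = [0.0] * (len(coeffs) + 1)
        for k, c in enumerate(coeffs):
            nxt[k] += c * (1.0 - p)   # trial fails: success count unchanged
            nxt[k + 1] += c * p       # trial succeeds: count increases by one
        coeffs = nxt
    return coeffs
```

    With equal probabilities this reduces to the ordinary binomial pmf; control limits follow by summing tail coefficients.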

  1. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.

  2. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on board ships are highly demanding. They are intensive activities requiring high man-machine interaction in challenging and evolving conditions, which include weather, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on board ships; such a model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data from a questionnaire survey of more than 200 seafarers with more than 5 years of experience. The model developed in this study is used to find the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities are dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions
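    The Bayesian updating of error probabilities described in the conclusion can be caricatured with a simple naive-Bayes odds update: a base human error probability is raised or lowered by a likelihood ratio for each observed performance-shaping factor. The numbers and factor structure below are illustrative assumptions, not the paper's actual network:

```python
def update_hep(prior, likelihood_ratios):
    """Update a human error probability given evidence on performance-shaping
    factors (e.g., fatigue, bad weather): posterior odds = prior odds times
    the product of the likelihood ratios (naive-Bayes assumption)."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)
```

    With a hypothetical base error probability of 0.01, observing one factor with likelihood ratio 2 roughly doubles the estimate; each additional adverse factor raises it further.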

  3. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  4. Gaseous electron multiplier-based soft x-ray plasma diagnostics development: Preliminary tests at ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Chernyshova, M., E-mail: maryna.chernyshova@ipplm.pl; Malinowski, K.; Czarski, T.; Kowalska-Strzęciwilk, E. [Institute of Plasma Physics and Laser Microfusion, Hery 23, 01-497 Warsaw (Poland); Wojeński, A.; Poźniak, K. T.; Kasprowicz, G.; Krawczyk, R.; Kolasiński, P.; Zabołotny, W.; Zienkiewicz, P. [Institute of Electronic Systems, Warsaw University of Technology, Nowowiejska 15/19, 00-665 Warsaw (Poland); Vezinet, D.; Herrmann, A. [Max Planck Institute for Plasma Physics, Boltzmannstr. 2, 85748 Garching (Germany); Mazon, D.; Jardin, A. [CEA, IRFM, F-13108 Saint-Paul-lez-Durance (France)

    2016-11-15

    A Gaseous Electron Multiplier (GEM)-based detector is being developed for soft X-ray diagnostics on tokamaks. Its main goal is to facilitate transport studies of impurities like tungsten. Such studies are very relevant to ITER, where excessive accumulation of impurities in the plasma core must be avoided. This contribution provides details of the preliminary tests at ASDEX Upgrade (AUG), with a focus on the aspects most important for detector operation in a harsh radiation environment. It was shown that both spatially and spectrally resolved data could be collected, in reasonable agreement with other AUG diagnostics. Contributions to the GEM signal also include hard X-rays, gammas, and neutrons; first simulations of the effect of high-energy photons have helped in understanding these contributions.

  5. Preliminary code development for seismic signal analysis related to test ban treaty questions

    International Nuclear Information System (INIS)

    Brolley, J.E.

    1977-01-01

    Forensic seismology, from a present day viewpoint, appears to be divided into several areas. Overwhelmingly important, in view of current Complete Test Ban (CTB) discussions, is the seismological study of waves generated in the earth by underground nuclear explosions. Over the last two decades intensive effort has been devoted to developing improved observational apparatus and to the interpretation of the data produced by this equipment. It is clearly desirable to extract the maximum amount of information from seismic signals. It is, therefore, necessary to quantitatively compare various modes of analysis to establish which mode or combination of modes provides the most useful information. Preliminary code development for application of some modern developments in signal processing to seismic signals is described. Applications of noncircular functions are considered and compared with circular function results. The second portion of the discussion concerns maximum entropy analysis. Lastly, the multivariate aspects of the general problem are considered

  6. Bayes factor and posterior probability: Complementary statistical evidence to p-value.

    Science.gov (United States)

    Lin, Ruitao; Yin, Guosheng

    2015-09-01

    As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although the smaller the p-value, the more significant the statistical test, it is difficult to perceive the p-value in a probability scale and quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strong the data support the null. We make a comparison of the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value only provides one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
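
    The paper's central quantity, the posterior probability of the null, follows directly from the Bayes factor and the prior odds. A hedged sketch of that conversion (the prior and Bayes factor values here are illustrative, not the trial's):

```python
def posterior_null(bf10, prior_null=0.5):
    """Posterior P(H0 | data) from the Bayes factor BF10 = P(data|H1) / P(data|H0)."""
    prior_alt = 1.0 - prior_null
    return prior_null / (prior_null + prior_alt * bf10)

# With even prior odds, data favouring H1 by a factor of 4 still leave
# P(H0 | data) = 0.2 -- a "rejected" null can remain roughly 20% probable,
# echoing the abstract's point about complementary evidence to the p-value.
print(posterior_null(4.0))
```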

  7. Preliminary HECTOR analysis by Dragon

    Energy Technology Data Exchange (ETDEWEB)

    Presser, W; Woloch, F

    1972-06-02

    Of the different cores measured in HECTOR, only ACH 4/B-B was selected for the Dragon analysis, since it presented the largest extent of uniform fuel loading in the central test region and is therefore nearest to an infinite lattice. Preliminary results are discussed.

  8. Web-based experiments controlled by JavaScript: an example from probability learning.

    Science.gov (United States)

    Birnbaum, Michael H; Wakcher, Sandra V

    2002-05-01

    JavaScript programs can be used to control Web experiments. This technique is illustrated by an experiment that tested the effects of advice on performance in the classic probability-learning paradigm. Previous research reported that people tested via the Web or in the lab tended to match the probabilities of their responses to the probabilities that those responses would be reinforced. The optimal strategy, however, is to consistently choose the more frequent event; probability matching produces suboptimal performance. We investigated manipulations we reasoned should improve performance. A horse race scenario in which participants predicted the winner in each of a series of races between two horses was compared with an abstract scenario used previously. Ten groups of learners received different amounts of advice, including all combinations of (1) explicit instructions concerning the optimal strategy, (2) explicit instructions concerning a monetary sum to maximize, and (3) accurate information concerning the probabilities of events. The results showed minimal effects of horse race versus abstract scenario. Both advice concerning the optimal strategy and probability information contributed significantly to performance in the task. This paper includes a brief tutorial on JavaScript, explaining with simple examples how to assemble a browser-based experiment.
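
    The suboptimality of probability matching described above can be made concrete with a two-line expected-value calculation. A small sketch (the reinforcement probability 0.7 is an arbitrary illustrative value):

```python
def expected_accuracy_matching(p):
    """Expected accuracy when response probabilities match outcome probabilities."""
    return p * p + (1 - p) * (1 - p)

def expected_accuracy_maximizing(p):
    """Expected accuracy when the more frequent event is always chosen."""
    return max(p, 1 - p)

p = 0.7  # assumed probability that the more frequent event is reinforced
print(round(expected_accuracy_matching(p), 2))   # 0.58: matching is suboptimal
print(expected_accuracy_maximizing(p))           # 0.7: the optimal strategy
```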

  9. Learning difficulties of senior high school students based on probability understanding levels

    Science.gov (United States)

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

    Identifying students' difficulties in learning the concept of probability is important for teachers in preparing appropriate learning processes and in overcoming obstacles that may arise in later learning. This study revealed the levels of students' understanding of the concept of probability and identified their difficulties as part of identifying the epistemological obstacles attached to the concept. The study employed a qualitative, descriptive approach involving 55 students of class XII, using a diagnostic test of probability-concept learning difficulties, observation, and interviews to collect the data needed to determine levels of understanding and the learning difficulties experienced. From the students' test results and classroom observations, the mean cognitive level was found to be at level 2, indicating that students had appropriate quantitative information about the probability concept but used it incompletely or incorrectly. The difficulties found concern constructing sample spaces, events, and mathematical models related to probability problems; in addition, students had difficulty understanding the principles of events and the prerequisite concepts.

  10. Site study plan for Deep Hydronest Test Wells, Deaf Smith County Site, Texas: Preliminary draft

    International Nuclear Information System (INIS)

    1987-05-01

    Wells called Deep Hydronest Wells will be installed at six locations at the Deaf Smith County Site to characterize hydraulic parameters in the geologic column between the top of the San Andres Formation and the base of the Pennsylvanian System. Three hydronests will be drilled during the early stages of site characterization to provide data for performance assessment modeling; four wells are proposed for each of these 3 nests. Results of drilling, testing, and preliminary modeling will direct drilling and testing activities at the last 3 nests, where two wells are proposed at each nest, for a total of 18 wells. The Salt Repository Project (SRP) Networks specify the schedule under which this program will operate. Drilling and hydrologic testing of the first Deep Hydronest will begin early in the Surface Investigation Program. Drilling and testing of the first three Deep Hydronests will require about 18 months. After 12 months of evaluating and analyzing data from the first three hydronests, the remaining three hydronests will be drilled during a 12-month period. The Technical Field Services Contractor is responsible for conducting the field program. Samples and data will be handled and reported in accordance with established SRP procedures. A quality assurance program will be used to assure that activities affecting quality are performed correctly and that the appropriate documentation is maintained. 36 refs., 20 figs., 6 tabs

  11. In Situ Vitrification preliminary results from the first large-scale radioactive test

    International Nuclear Information System (INIS)

    Buelt, J.L.; Westsik, J.H.

    1988-01-01

    The first large-scale radioactive test (LSRT) of In Situ Vitrification (ISV) has been completed. In Situ Vitrification is a process whereby joule heating immobilizes contaminated soil in place by converting it to a durable glass and crystalline waste form. The LSRT was conducted at an actual transuranic-contaminated soil site on the Department of Energy's Hanford Site. The test had two objectives: 1) determine large-scale processing performance, and 2) produce a waste form that can be fully evaluated, so that ISV's potential for the final disposal of transuranic-contaminated soil sites at Hanford can be assessed. This accomplishment has provided technical data for evaluating the ISV process. The LSRT was completed in June 1987 after 295 hours of operation and 460 MWh of electrical energy dissipated to the molten soil. This resulted in a block of vitrified soil of at least 450 t, extending to a depth of 7.3 m (24 ft). The primary contaminants vitrified during the demonstration were Pu and Am transuranics, but also included up to 26,000 ppm fluorides. Preliminary data show that their retention in the vitrified product exceeded predictions, meaning that fewer contaminants needed to be removed from the gaseous effluents by the processing equipment. The gaseous effluents were contained and treated throughout the run; that is, no radioactive or hazardous chemical releases were detected

  12. Bayesian probability theory applications in the physical sciences

    CERN Document Server

    Linden, Wolfgang von der; Toussaint, Udo von

    2014-01-01

    From the basics to the forefront of modern research, this book presents all aspects of probability theory, statistics and data analysis from a Bayesian perspective for physicists and engineers. The book presents the roots, applications and numerical implementation of probability theory, and covers advanced topics such as maximum entropy distributions, stochastic processes, parameter estimation, model selection, hypothesis testing and experimental design. In addition, it explores state-of-the art numerical techniques required to solve demanding real-world problems. The book is ideal for students and researchers in physical sciences and engineering.

  13. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability when the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  14. Preliminary design of GDT-based 14 MeV neutron source

    International Nuclear Information System (INIS)

    Du Hongfei; Chen Dehong; Wang Hui; Wang Fuqiong; Jiang Jieqiong; Wu Yican; Chen Yiping

    2012-01-01

    To meet the need for a D-T fusion neutron source for fusion material testing, design goals were presented in this paper according to the international requirements for such a source. A preliminary design scheme of a GDT-based 14 MeV neutron source was proposed, and a physics model of the neutron source was built based on the progress of GDT experiments. Two preliminary design schemes (i.e., FDS-GDT1 and FDS-GDT2) were produced, of which FDS-GDT2 can be used for fusion material testing with a neutron first-wall loading of 2 MW/m². (authors)

  15. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One bias that has been shown to affect the accuracy of judgements is representativeness with base-rate neglect, where the saliency of a person's features leads to overestimation of the likelihood that they belong to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form; understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information in textual and graphic form, 46.7% and 45.5% of clinicians, respectively, made judgements of statistical probability. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p<0.001), indicating a propensity to substitute subjective for statistical probability. One of the causes of this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
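
    The statistical probability that clinicians tended to neglect is simply Bayes' theorem applied to the base rate. A hedged numerical sketch, with all rates invented for illustration:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | typical presentation) by Bayes' theorem."""
    p_typical = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_typical

# Assumed rates: a condition with a 1% base rate whose "classic" presentation
# also appears in 10% of people without it. However salient the features,
# the statistical probability of the condition stays low.
print(round(posterior(0.01, 0.90, 0.10), 3))  # 0.083
```

    A representativeness judgement would put this probability near the 90% sensitivity figure; the base rate drags it down by an order of magnitude.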

  16. The neutral beam test facility cryopumping operation: preliminary analysis and design of the cryogenic system

    International Nuclear Information System (INIS)

    Gravil, B.; Henry, D.; Cordier, J.J.; Hemsworth, R.; Van Houtte, D.

    2004-01-01

    The ITER neutral beam heating and current drive system is to be equipped with a cryosorption cryopump made up of 12 panels connected in parallel, refrigerated by supercritical helium at 4.5 K and 0.4 MPa. The pump is subjected to a non-homogeneous flux of H₂ or D₂ molecules, and the absorbed flux varies from 3 Pa·m³·s⁻¹ to 35 Pa·m³·s⁻¹. In the frame of the 'ITER first injector and test facility CSU-EFDA task' (TW3-THHN-IITF1), the ITER reference cryo-system and cryo-plant designs have been assessed and compared to optimised designs devoted to the Neutral Beam Test Facility (NBTF). The 4.5 K cryo-panel, which has a mass of about 1000 kg, must be periodically regenerated up to 90 K and occasionally to 470 K. The cool-down time after regeneration depends strongly on the refrigeration capacity. Fast regeneration and cool-down of the cryo-panels are not considered a priority for test facility operation, so an analysis of the effect of a refrigerator with limited cold power on the cool-down time has been carried out and is discussed. This paper presents a preliminary evaluation of the NBTF cryo-plant and the associated process flow diagram. (authors)

  17. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used.

  18. Probability-density-function characterization of multipartite entanglement

    International Nuclear Information System (INIS)

    Facchi, P.; Florio, G.; Pascazio, S.

    2006-01-01

    We propose a method to characterize and quantify multipartite entanglement for pure states. The method hinges upon the study of the probability density function of bipartite entanglement and is tested on an ensemble of qubits in a variety of situations. This characterization is also compared to several measures of multipartite entanglement

  19. Serologic test systems development. Progress report, October 1, 1978-September 30, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Seawright, G.L.; Sanders, W.M.; Hollstein, U.; Butler, J.E.; Mills, K.W.; Despommier, D.D.; Zimmerman, W.J.; Martinez, E.; Hindman, K.R.; Payne, R.J.

    1980-12-01

    Work has continued on the development and automation of enzyme immunoassays (EIA) for detecting diseases and toxic agents in food animals. Further evaluations were made of the Technicon Autoanalyzer II (AAII) for conducting totally automated EIAs. The problems investigated were machine carryover and assay variation; modifications greatly reduced or eliminated carryover and produced acceptable levels of test variation. The EIA for swine trichinosis was significantly improved by the use of a new, partially purified antigen preparation, resulting in improved detection of early seroconversions and reduced probabilities of false negatives and false positives. The amplified EIA was adapted as a diagnostic test for bovine brucellosis, and studies were initiated for differentiating vaccinated and infected animals. Preliminary data indicate that the IgG₁ response may be diagnostic, but further studies are necessary. Development of the EIA for detecting low molecular weight contaminants and residues in food products was also initiated. Compounds studied were the antibiotics chloramphenicol, tetracycline, and gentamicin; the mycotoxin aflatoxin; and the shale oil toxin, 2-aminofluorene. Results indicate that chloramphenicol nonspecifically binds to antibody and interferes with antibody activity; thus, the test is not yet satisfactory. Initial attempts to automate the gentamicin test were unsuccessful because of machine carryover, but modifications of the AAII have produced encouraging preliminary data. Work is continuing on the development of EIAs for all of the compounds mentioned above. (ERB)

  20. Use of probabilistic methods for estimating failure probabilities and directing ISI-efforts

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, F; Brickstad, B [University of Uppsala, (Switzerland)

    1988-12-31

    Some general aspects of the effect of Non-Destructive Testing (NDT) efforts on the resulting probability of core damage are discussed. A simple model for the estimation of the pipe break probability due to IGSCC is presented; it is based partly on analytical procedures and partly on service experience from the Swedish BWR program. Estimates of the break probabilities indicate that further studies are urgently needed. It is found that the uncertainties about the initial crack configuration are large contributors to the total uncertainty. Some effects of in-service inspection are studied, and it is found that the detection probabilities influence the failure probabilities. (authors).
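
    The qualitative claim that detection probabilities influence failure probabilities can be illustrated with a deliberately simplified model (not the authors' model; all numbers are invented for illustration):

```python
def break_probability(p_crack, pod, p_break_given_crack):
    """Break probability when in-service inspection finds (and repairs) a crack
    with probability of detection `pod`; purely illustrative structure."""
    return p_crack * (1.0 - pod) * p_break_given_crack

no_isi = break_probability(1e-2, 0.0, 1e-2)  # no in-service inspection
isi_90 = break_probability(1e-2, 0.9, 1e-2)  # 90% probability of detection
print(no_isi, isi_90)  # inspection lowers the estimate by a factor of ten
```

    In a real assessment the crack-existence and growth terms carry large uncertainties of their own, which is the point the abstract makes about the initial crack configuration.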

  1. Evaluation of Correlation between Pretest Probability for Clostridium difficile Infection and Clostridium difficile Enzyme Immunoassay Results.

    Science.gov (United States)

    Kwon, Jennie H; Reske, Kimberly A; Hink, Tiffany; Burnham, C A; Dubberke, Erik R

    2017-02-01

    The objective of this study was to evaluate the clinical characteristics and outcomes of hospitalized patients tested for Clostridium difficile and determine the correlation between pretest probability for C. difficile infection (CDI) and assay results. Patients with testing ordered for C. difficile were enrolled and assigned a high, medium, or low pretest probability of CDI based on clinical evaluation, laboratory, and imaging results. Stool was tested for C. difficile by toxin enzyme immunoassay (EIA) and toxigenic culture (TC). Chi-square analyses and the log rank test were utilized. Among the 111 patients enrolled, stool samples from nine were TC positive and four were EIA positive. Sixty-one (55%) patients had clinically significant diarrhea, 19 (17%) patients did not, and clinically significant diarrhea could not be determined for 31 (28%) patients. Seventy-two (65%) patients were assessed as having a low pretest probability of having CDI, 34 (31%) as having a medium probability, and 5 (5%) as having a high probability. None of the patients with low pretest probabilities had a positive EIA, but four were TC positive. None of the seven patients with a positive TC but a negative index EIA developed CDI within 30 days after the index test or died within 90 days after the index toxin EIA date. Pretest probability for CDI should be considered prior to ordering C. difficile testing and must be taken into account when interpreting test results. CDI is a clinical diagnosis supported by laboratory data, and the detection of toxigenic C. difficile in stool does not necessarily confirm the diagnosis of CDI. Copyright © 2017 American Society for Microbiology.
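
    The interplay of pretest probability and assay result described above follows the standard likelihood-ratio update. A sketch with assumed sensitivity and specificity values (not the study's measured assay performance):

```python
def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Update a pretest probability with a test result via likelihood ratios."""
    lr = (sensitivity / (1 - specificity) if positive
          else (1 - sensitivity) / specificity)
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Assumed assay: sensitivity 0.85, specificity 0.97. With a low 5% pretest
# probability, even a positive result leaves substantial doubt -- the study's
# argument for weighing pretest probability before ordering the test.
print(round(post_test_probability(0.05, 0.85, 0.97), 3))  # 0.599
```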

  2. Fractal supersymmetric QM, Geometric Probability and the Riemann Hypothesis

    CERN Document Server

    Castro, C

    2004-01-01

    The Riemann hypothesis (RH) states that the nontrivial zeros of the Riemann zeta-function are of the form $ s_n = 1/2 + i\lambda_n $. Earlier work on the RH based on supersymmetric QM, whose potential was related to the Gauss-Jacobi theta series, provides the proper framework for constructing a well-defined algorithm to compute the probability of finding a zero (an infinity of zeros) in the critical line. Geometric probability theory furnishes the answer to the very difficult question of whether the probability that the RH is true is indeed equal to unity or not. To test the validity of this geometric probabilistic framework for computing the probability that the RH is true, we apply it directly to the hyperbolic sine function $ \sinh (s) $, which obeys a trivial analog of the RH (the HSRH). Its zeros are equally spaced on the imaginary axis, $ s_n = 0 + i n \pi $. The geometric probability of finding a zero (and an infinity of zeros) on the imaginary axis is exactly unity. We proceed with a fractal supersymme...

  3. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and in this article several influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner, and the month of birth: the probability increases with the given sum, decreases with the proximity of the customer, and is higher for people born at the beginning of the year and lower for people born at the end of the year.
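
    A binary logit model of the kind applied in the article maps a linear score of the predictors to a probability through the logistic function. A minimal sketch; the coefficients below are invented to mirror the reported signs (given sum positive, remoteness positive, birth month negative), not the article's fitted values:

```python
import math

def logit_probability(coeffs, intercept, features):
    """P(loan repaid) from a binary logit model: sigmoid of the linear score."""
    score = intercept + sum(b * x for b, x in zip(coeffs, features))
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical coefficients for [scaled loan sum, remoteness, birth month];
# the intercept and feature values are likewise illustrative assumptions.
coeffs = [0.3, 0.2, -0.1]
print(round(logit_probability(coeffs, -0.5, [2.0, 1.0, 6.0]), 3))  # 0.426
```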

  4. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability, arguing that their denial rested solely upon Richard von Mises' exceptionally restrictive definition of probability. The paper challenges that definition by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  5. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.

    Science.gov (United States)

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies have increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to be deemed significant. However, a small P-value is not equivalent to a small chance of a spurious finding, and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity; however, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties: the POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set, and POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of the POFIG method via analysis of GWAS associations with Crohn's disease.
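
    For comparison with POFIG, the FPRP quantity discussed above has a closed form (Wacholder et al.): the chance that a significant report is spurious given the significance level, the power, and the prior probability of a true association. POFIG itself is not reproduced here; this sketch uses illustrative inputs:

```python
def fprp(alpha, power, prior):
    """False Positive Report Probability: P(spurious | significant at level alpha),
    given statistical power and the prior probability of a true association."""
    return alpha * (1 - prior) / (alpha * (1 - prior) + power * prior)

# With a genome-wide prior of 1 in 10,000 true associations, even p < 1e-4
# at 80% power leaves roughly a 56% chance that the report is false --
# the abstract's point that small P-values are weak filters on their own.
print(round(fprp(1e-4, 0.8, 1e-4), 2))  # 0.56
```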

  6. Methodology and preliminary models for analyzing nuclear-safeguards decisions

    International Nuclear Information System (INIS)

    Judd, B.R.; Weissenberger, S.

    1978-11-01

    This report describes a general analytical tool developed at Lawrence Livermore Laboratory to assist the Nuclear Regulatory Commission in making nuclear safeguards decisions. The approach is based on decision analysis, a quantitative procedure for making decisions under uncertain conditions. The report describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material; demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria); and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  7. Obtaining 64Cu in a nuclear reactor from a Zn matrix: Preliminary tests

    International Nuclear Information System (INIS)

    Aguirre, Andrea; Bedregal, Patricia; Montoya, Eduardo; Cohen, Marcos Isaac

    2014-01-01

    The design and feasibility of a method for obtaining 64Cu in a nuclear reactor, via the 64Zn(n,p)64Cu threshold reaction induced by the fast component of the neutron spectrum, are presented. The product obtained will be used in positron emission tomography (PET). The preliminary experiments were performed using the RP-10 research reactor at a power of 3.5 MW, followed by radiochemical separation by solvent extraction using a chloroform solution of dithizone. The radioisotope was identified and quantified through the full-energy peak at 1345.77 keV, using a high-resolution gamma spectrometry system. The preliminary yield achieved demonstrates the feasibility of the proposed method. (authors).

  8. Preliminary design report for the NAC combined transport cask

    International Nuclear Information System (INIS)

    1990-04-01

    Nuclear Assurance Corporation (NAC) is under contract to the United States Department of Energy (DOE) to design, license, develop and test models, and fabricate a prototype cask transportation system for nuclear spent fuel. The design of this combined transport (rail/barge) transportation system has been divided into two phases, a preliminary design phase and a final design phase. This Preliminary Design Package (PDP) describes the NAC Combined Transport Cask (NAC-CTC), the results of work completed during the preliminary design phase and identifies the additional detailed analyses, which will be performed during final design. Preliminary analytical results are presented in the appropriate sections and supplemented by summaries of procedures and assumptions for performing the additional detailed analyses of the final design. 60 refs., 1 fig., 2 tabs

  9. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    Science.gov (United States)

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed, and show very good agreement for the relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  10. Preliminary test of an imaging probe for nuclear medicine using hybrid pixel detectors

    International Nuclear Information System (INIS)

    Bertolucci, E.; Maiorino, M.; Mettivier, G.; Montesi, M.C.; Russo, P.

    2002-01-01

    We are investigating the feasibility of an intraoperative imaging probe for lymphoscintigraphy with a Tc-99m tracer, for sentinel node radioguided surgery, using the Medipix series of hybrid detectors coupled to a collimator. These detectors are pixelated semiconductor detectors bump-bonded to the Medipix1 photon counting read-out chip (64x64 pixels, 170 μm pitch) or to the Medipix2 chip (256x256 pixels, 55 μm pitch), developed by the European Medipix collaboration. The pixel detector we plan to use in the final version of the probe is a semi-insulating GaAs detector or a 1-2 mm thick CdZnTe detector. For the preliminary tests presented here, we used 300-μm thick silicon detectors, hybridized via bump-bonding to the Medipix1 chip. We used a tungsten parallel-hole collimator (7 mm thick, matrix array of 64x64 100-μm circular holes with 170 μm pitch) and 22, 60 and 122 keV point-like (1 mm diameter) radioactive sources, placed at various distances from the detector. These tests were conducted in order to investigate the general feasibility of this imaging probe and its resolving power. Measurements show the high resolution but low efficiency of the detector-collimator set, which is able to image the 122 keV source with <1 mm FWHM resolution.

  11. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  12. Failure probability of PWR reactor coolant loop piping

    International Nuclear Information System (INIS)

    Lo, T.; Woo, H.H.; Holman, G.S.; Chou, C.K.

    1984-02-01

    This paper describes the results of assessments performed on the PWR coolant loop piping of Westinghouse and Combustion Engineering plants. For direct double-ended guillotine break (DEGB), consideration was given to crack existence probability, initial crack size distribution, hydrostatic proof testing, preservice inspection, leak detection probability, crack growth characteristics, and failure criteria based on net section stress failure and the tearing modulus stability concept. For indirect DEGB, fragilities of major component supports were estimated. The system-level fragility was then calculated from the Boolean expression involving these fragilities. The probability of indirect DEGB due to seismic effects was calculated by convolving the system-level fragility with the seismic hazard curve. The results indicate that the probability of occurrence of both direct and indirect DEGB is extremely small; thus, the postulation of DEGB in design should be eliminated and replaced by more realistic criteria.
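    The convolution of a fragility curve with a seismic hazard curve mentioned in the abstract can be sketched numerically. The curves and all parameters below are hypothetical placeholders, not values from the study: the annual failure frequency is the integral of the conditional failure probability (fragility) weighted by the hazard density, i.e. the negative derivative of the annual exceedance frequency.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def annual_failure_freq(hazard, fragility, a_lo, a_hi, n=2000):
    """Convolve a seismic hazard curve with a fragility curve:
    freq = int fragility(a) * (-dH/da) da, via the trapezoidal rule,
    where H(a) is the annual frequency of exceeding acceleration a."""
    h = (a_hi - a_lo) / n
    total = 0.0
    for i in range(n + 1):
        a = a_lo + i * h
        # central-difference derivative of the exceedance frequency H(a)
        dH = (hazard(a + 1e-6) - hazard(a - 1e-6)) / 2e-6
        w = 0.5 if i in (0, n) else 1.0
        total += w * fragility(a) * (-dH)
    return total * h

# Hypothetical curves: power-law hazard, lognormal fragility (PGA in g)
hazard = lambda a: 1e-4 * (0.1 / a) ** 2            # annual P(PGA > a)
fragility = lambda a: phi(math.log(a / 0.8) / 0.4)  # P(failure | PGA = a)
print(annual_failure_freq(hazard, fragility, 0.05, 5.0))
```

With these toy curves the result lands in the 1e-6 per year range; the same scheme applies to any monotone hazard and fragility pair.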

  13. Preliminary Results From a Heavily Instrumented Engine Ice Crystal Icing Test in a Ground Based Altitude Test Facility

    Science.gov (United States)

    Flegel, Ashlie B.; Oliver, Michael J.

    2016-01-01

    Preliminary results from the heavily instrumented ALF502R-5 engine test conducted in the NASA Glenn Research Center Propulsion Systems Laboratory are discussed. The effects of ice crystal icing on a full-scale engine are examined and documented. This same model engine, serial number LF01, was used during the inaugural icing test in the Propulsion Systems Laboratory facility. The uncommanded reduction of thrust (rollback) events experienced by this engine in flight were simulated in the facility. Limited instrumentation was used to detect icing on the LF01 engine. Metal temperatures on the exit guide vanes and outer shroud and the load measurement were the only indicators of ice formation. The current study features a similar engine, serial number LF11, which is instrumented to characterize the cloud entering the engine, detect/characterize ice accretion, and visualize the ice accretion in the region of interest. Data were acquired at key LF01 test points and additional points that explored icing threshold regions, low altitude, high altitude, spinner heat effects, and the influence of varying the facility and engine parameters. For each condition of interest, data were obtained from some selected variations of ice particle median volumetric diameter, total water content, fan speed, and ambient temperature. For several cases the NASA in-house engine icing risk assessment code was used to find conditions that would lead to a rollback event. This study further helped NASA develop necessary icing diagnostic instrumentation, expand the capabilities of the Propulsion Systems Laboratory, and generate a dataset that will be used to develop and validate in-house icing prediction and risk mitigation computational tools. The ice accretion on the outer shroud region was acquired by internal video cameras. The heavily instrumented engine showed good repeatability of icing responses when compared to the key LF01 test points and during day-to-day operation. Other noticeable

  14. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical, as they have been shown to be significant contributors to core melt probability in pressurized water reactors.

  15. Probability of initiation and extinction in the Mercury Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    McKinley, M. S.; Brantley, P. S. [Lawrence Livermore National Laboratory, 7000 East Ave., Livermore, CA 94551 (United States)

    2013-07-01

    A Monte Carlo method for computing the probability of initiation has previously been implemented in Mercury. Recently, a new method based on the probability of extinction has been implemented as well. The methods have similarities, from counting progeny to cycling in time, but they also differ in areas such as population control and statistical uncertainty reporting. The two methods agree very well for several test problems. Since each method has advantages and disadvantages, we currently recommend that both methods be used to compute the probability of criticality. (authors)
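    As a hedged illustration of the underlying idea (not Mercury's actual algorithm), the extinction probability q of a simple Galton-Watson branching process is the smallest fixed point of q = g(q), where g is the progeny probability generating function; the probability of initiation is then 1 - q. The progeny distribution below is invented for the example.

```python
def extinction_probability(progeny_pmf, iterations=200):
    """Fixed-point iteration q_{n+1} = g(q_n) for the extinction
    probability of a Galton-Watson branching process, where g is the
    progeny probability generating function and progeny_pmf[k] = P(k progeny)."""
    def g(s):
        return sum(p * s**k for k, p in enumerate(progeny_pmf))
    q = 0.0  # starting from 0 converges to the smallest root
    for _ in range(iterations):
        q = g(q)
    return q

# Illustrative progeny distribution: P(0)=0.2, P(1)=0.3, P(2)=0.5
# Mean progeny = 1.3 > 1 (supercritical), so extinction probability < 1.
q = extinction_probability([0.2, 0.3, 0.5])
print("extinction probability:", q)   # 0.4 for this pmf
print("initiation probability:", 1.0 - q)
```

For a subcritical or critical progeny distribution the iteration converges to 1, i.e. extinction is certain.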

  16. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
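    The Bayes'-theorem obstacle the abstract describes can be made concrete with a minimal sketch. All numbers below are hypothetical: given a population base rate of copying (the prior) and the likelihoods of the observed answer-copying statistic under each hypothesis, the posterior follows directly; the authors' contribution is estimating these otherwise-unknown quantities.

```python
def posterior_probability_of_copying(prior, lik_copying, lik_no_copying):
    """Bayes' theorem: P(copying | statistic) from the population base
    rate of copying (prior) and the likelihoods of the observed
    answer-copying statistic under copying and under independent responding."""
    numerator = prior * lik_copying
    return numerator / (numerator + (1.0 - prior) * lik_no_copying)

# Hypothetical numbers: 1% base rate of copying; the observed statistic
# is 50x more likely under copying than under independent responding.
p = posterior_probability_of_copying(0.01, 0.50, 0.01)
print(p)  # ~0.336: a striking statistic still leaves substantial doubt
```

The example shows why a small p value alone is not a posterior: with a low base rate, even a likelihood ratio of 50 yields a posterior probability of copying well below one half.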

  17. Probability of fracture and life extension estimate of the high-flux isotope reactor vessel

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-01-01

    The state of vessel steel embrittlement as a result of neutron irradiation can be measured by the increase in its ductile-brittle transition temperature (DBTT) for fracture, often denoted by RT_NDT for carbon steel. This transition temperature can be calibrated by the drop-weight test and, sometimes, by the Charpy impact test. The life extension for the high-flux isotope reactor (HFIR) vessel is calculated using a fracture mechanics method that incorporates the effect of the DBTT change. The failure probability of the HFIR vessel over its life is limited to the reactor core melt probability of 10^-4. The operating safety of the reactor is ensured by a periodic hydrostatic pressure test (hydrotest), which is performed in order to determine a safe vessel static pressure. The fracture probability resulting from the hydrostatic pressure test is calculated and used to determine the life of the vessel; failure to perform the hydrotest imposes a limit on the life of the vessel. Conventional methods of fracture probability calculation, such as those used by the NRC-sponsored PRAISE code and the FAVOR code developed in this laboratory, are based on Monte Carlo simulation and require heavy computation. An alternative method of fracture probability calculation by direct probability integration is developed in this paper. The present approach offers a simple and expedient way to obtain numerical results without losing generality. In this paper, numerical results are obtained on (1) the probability of vessel fracture, (2) the hydrotest time interval, and (3) the hydrotest pressure as a result of the DBTT increase.
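    The contrast between Monte Carlo simulation and direct probability integration can be sketched on a toy stress-strength problem. This is not the paper's actual vessel model; both distributions and all parameters are hypothetical normals chosen so that the answer is also available in closed form for checking.

```python
import math, random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def failure_prob_direct(mu_s, sig_s, mu_c, sig_c, n=4000):
    """Direct probability integration of P(stress > strength):
    P_f = int f_stress(x) * P(strength < x) dx  (trapezoidal rule)."""
    lo, hi = mu_s - 8 * sig_s, mu_s + 8 * sig_s
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * normal_pdf(x, mu_s, sig_s) * phi((x - mu_c) / sig_c)
    return total * h

def failure_prob_mc(mu_s, sig_s, mu_c, sig_c, trials=200_000, seed=1):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    hits = sum(rng.gauss(mu_s, sig_s) > rng.gauss(mu_c, sig_c) for _ in range(trials))
    return hits / trials

# Hypothetical stress N(100, 20) vs strength N(150, 15);
# for normals the closed form is Phi((mu_s - mu_c) / sqrt(sig_s^2 + sig_c^2)).
exact = phi((100 - 150) / math.sqrt(20**2 + 15**2))
print(failure_prob_direct(100, 20, 150, 15), failure_prob_mc(100, 20, 150, 15), exact)
```

The direct integration matches the closed form to several digits with a few thousand function evaluations, while the Monte Carlo estimate still scatters at the third decimal after 200,000 samples, which is the computational gap the abstract alludes to.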

  18. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  19. Method of boundary testing of the electric circuits and its application for calculating electric tolerances. [electric equipment tests

    Science.gov (United States)

    Redkina, N. P.

    1974-01-01

    Boundary testing of electric circuits includes preliminary and limiting tests. Preliminary tests permit determination of the critical parameters causing the greatest deviation of the output parameter of the system. The boundary tests offer the possibility of determining the limits of the fitness of the system with simultaneous variation of its critical parameters.

  20. Small-World Behavior of the Worldwide Active Volcanoes Network: Preliminary Results

    Science.gov (United States)

    Spata, A.; Bonforte, A.; Nunnari, G.; Puglisi, G.

    2009-12-01

    We propose a preliminary complex-network-based approach to model and characterize the correlation of volcanic activity observed on a planetary scale over the last two thousand years. Worldwide volcanic activity is in fact related to general plate tectonics, which locally drives fault activity, which in turn controls the magma upraise beneath volcanoes. Finding correlations among different volcanoes could indicate a common underlying mechanism driving their activity and could help us interpret the deeper common dynamics controlling their unrest. All the first evidence found while testing the procedure suggests the suitability of this analysis for investigating global volcanism related to plate tectonics. The first correlations found indicate that an underlying common large-scale dynamics seems to drive volcanic activity at least around the Pacific plate, where it collides with and subducts beneath the American, Eurasian and Australian plates. From this still preliminary analysis, more complex relationships among volcanoes lying on different tectonic margins have also been found, suggesting more complex interrelationships between different plates. The detected correlations could also be used to further improve warning systems, relating the unrest probability of a specific volcano to the ongoing activity of correlated ones. Our preliminary results suggest that, as for many other physical and biological systems, an underlying organizing principle of planetary volcanic activity might exist, and it could be a small-world principle. In fact we found that, from a topological perspective, volcano correlations are characterized by the typical features of a small-world network: a high clustering coefficient and a low characteristic path length. These features confirm that global volcanic activity is characterized by both short- and long-range correlations.
We stress here the fact that numerical simulation carried out in
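    The two small-world diagnostics named in the abstract, clustering coefficient and characteristic path length, can be computed on any undirected graph with a few lines of code. The toy graph below is invented for illustration and has nothing to do with the volcano network itself.

```python
from collections import deque
from itertools import combinations

def clustering_coefficient(adj):
    """Average local clustering coefficient of an undirected graph,
    given as {vertex: set of neighbours}."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue  # vertices with < 2 neighbours contribute 0
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def average_path_length(adj):
    """Mean shortest-path length over all connected vertex pairs (BFS)."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(d for v, d in dist.items() if v != src)
        pairs += len(dist) - 1
    return total / pairs

# Toy graph: a triangle (0,1,2) with one pendant vertex (3)
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(clustering_coefficient(adj), average_path_length(adj))
```

A small-world network combines a clustering coefficient far above that of a random graph with a path length comparable to one, which is the signature the authors report.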

  1. Preliminary Validation of Composite Material Constitutive Characterization

    Science.gov (United States)

    John G. Michopoulos; Athanasios lliopoulos; John C. Hermanson; Adrian C. Orifici; Rodney S. Thomson

    2012-01-01

    This paper describes the preliminary results of an effort to validate a methodology developed for composite material constitutive characterization. This methodology involves using massive amounts of data produced from multiaxially tested coupons via a 6-DoF robotic system called NRL66.3, developed at the Naval Research Laboratory. The testing is followed by...

  2. Using a situational judgement test for selection into dental core training: a preliminary analysis.

    Science.gov (United States)

    Rowett, E; Patterson, F; Cousans, F; Elley, K

    2017-05-12

    Objective and setting This paper describes the evaluation of a pilot situational judgement test (SJT) for selection into UK Dental Core Training (DCT). The SJT's psychometric properties, group differences based on gender and ethnicity, and candidate reactions were assessed. Methods The SJT targets four non-academic attributes important for success in DCT. Data were collected alongside live selection processes from five Health Education England local teams in the UK (N = 386). Candidates completed the pilot SJT and an evaluation questionnaire to examine their reactions to the test. Results SJT scores were relatively normally distributed and showed acceptable levels of internal reliability (α = 0.68). Difficulty level and partial correlations between scenarios and SJT total score were in the expected ranges (64.61% to 90.03% and r = 0.06 to 0.41, respectively). No group differences were found for gender, and group differences between White and BME candidates were minimal. Most candidates perceived the SJT as relevant to the target role, appropriate and fair. Conclusions This study demonstrated the potential suitability of an SJT for use in DCT selection. Future research should replicate these preliminary findings in other cohorts, and assess the predictive validity of the SJT for predicting key training and practice-based outcomes.

  3. Probability of defect detection of Posiva's electron beam weld

    International Nuclear Information System (INIS)

    Kanzler, D.; Mueller, C.; Pitkaenen, J.

    2013-12-01

    The report 'Probability of Defect Detection of Posiva's electron beam weld' describes POD curves for four NDT methods: radiographic testing, ultrasonic testing, eddy current testing and visual testing. The POD curves are based on artificial defects in reference blocks. The results are devoted to demonstrating the suitability of the methods for EB weld testing. The report describes the methodology and procedure applied by BAM, creates a link from the assessment of reliability and inspection performance to the risk assessment process of the canister final disposal project, and confirms the basic quality of the NDT methods and their capability to describe the quality of the EB weld. The probability of detection curves are determined based on the MIL-1823 standard and its reliability guidelines. The MIL-1823 standard was developed for the determination of the integrity of gas turbine engines for the US military. In the POD process, the key parameter for defect detectability is the a90/95 magnitude, i.e. the size measure a of the defect for which the lower 95% confidence band crosses the 90% POD level. In this way it can be confirmed that defects of size a90/95 will be detected with 90% probability; if the experiment were repeated, 5% might fall outside this confidence limit. (orig.)
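    The a90/95 quantity can be illustrated with a toy parametric POD model. This is only a sketch of the concept, not BAM's fitting procedure: assume a lognormal POD curve POD(a) = Phi((ln a - mu)/sigma), solve POD(a) = 0.9 for a90, and approximate the lower 95% confidence band by shifting mu upward by 1.645 times an assumed standard error. The fitted parameters and standard error below are invented.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pod(a, mu, sigma):
    """Lognormal POD model: POD(a) = Phi((ln a - mu) / sigma)."""
    return phi((math.log(a) - mu) / sigma)

def a_for_pod(target, mu, sigma):
    """Invert POD(a) = target by bisection in log space (POD is monotone)."""
    lo, hi = 1e-9, 1e9
    for _ in range(200):
        mid = math.sqrt(lo * hi)  # geometric mean bisects in log space
        if pod(mid, mu, sigma) < target:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Hypothetical fitted parameters (defect size in mm) and an assumed
# standard error on mu from the POD regression:
mu, sigma, se_mu = math.log(0.5), 0.25, 0.05
a90 = a_for_pod(0.90, mu, sigma)
# Lower 95% confidence band approximated by shifting mu up by 1.645 * se_mu:
a90_95 = a_for_pod(0.90, mu + 1.645 * se_mu, sigma)
print(a90, a90_95)
```

By construction a90/95 is larger than a90: the confidence band concedes that the true curve may sit to the right of the fitted one, so a somewhat larger defect size is quoted as reliably detectable.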

  4. Innovative thin silicon detectors for monitoring of therapeutic proton beams: preliminary beam tests

    Science.gov (United States)

    Vignati, A.; Monaco, V.; Attili, A.; Cartiglia, N.; Donetti, M.; Fadavi Mazinani, M.; Fausti, F.; Ferrero, M.; Giordanengo, S.; Hammad Ali, O.; Mandurrino, M.; Manganaro, L.; Mazza, G.; Sacchi, R.; Sola, V.; Staiano, A.; Cirio, R.; Boscardin, M.; Paternoster, G.; Ficorella, F.

    2017-12-01

    To fully exploit the physics potential of particle therapy in delivering dose with high accuracy and selectivity, charged particle therapy needs further improvement. To this end, a multidisciplinary project (MoVeIT) of the Italian National Institute for Nuclear Physics (INFN) aims at translating research in charged particle therapy into clinical outcomes. New models in the treatment planning system are being developed and validated, using dedicated devices for beam characterization and monitoring in radiobiological and clinical irradiations. Innovative silicon detectors with an internal gain layer (LGADs) represent a promising option, overcoming the limits of the currently used ionization chambers. Two devices are being developed: one to directly count individual protons at high rates, exploiting the large signal-to-noise ratio and fast collection time of LGADs in small thicknesses (1 ns in 50 μm), and a second to measure the beam energy with time-of-flight techniques, using LGADs optimized for excellent time resolution (Ultra Fast Silicon Detectors, UFSDs). The preliminary results of the first beam tests with a therapeutic beam will be presented and discussed.
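    The time-of-flight energy measurement mentioned above reduces to relativistic kinematics. As a hedged sketch (the baseline and flight time below are hypothetical, not values from the MoVeIT setup), the proton kinetic energy follows from the measured velocity over a known baseline:

```python
import math

M_P = 938.272   # proton rest mass [MeV/c^2]
C = 299.792458  # speed of light [mm/ns]

def kinetic_energy_from_tof(distance_mm, tof_ns):
    """Relativistic proton kinetic energy from a time-of-flight
    measurement over a known baseline: T = m_p * (gamma - 1)."""
    beta = distance_mm / (tof_ns * C)      # v/c from the measured flight time
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return M_P * (gamma - 1.0)

# Hypothetical baseline and flight time for a therapeutic-energy proton
print(kinetic_energy_from_tof(970.0, 6.0))  # kinetic energy in MeV
```

The strong sensitivity of the recovered energy to the flight time is why the abstract emphasizes the excellent time resolution of UFSDs.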

  5. Five-Kilometers Time Trial: Preliminary Validation of a Short Test for Cycling Performance Evaluation.

    Science.gov (United States)

    Dantas, Jose Luiz; Pereira, Gleber; Nakamura, Fabio Yuzo

    2015-09-01

    The five-kilometer time trial (TT5km) has been used to assess aerobic endurance performance without further investigation of its validity. This study aimed to perform a preliminary validation of the TT5km to rank well-trained cyclists based on aerobic endurance fitness and to assess changes in aerobic endurance performance. After the incremental test, 20 cyclists (age = 31.3 ± 7.9 years; body mass index = 22.7 ± 1.5 kg/m^2; maximal aerobic power = 360.5 ± 49.5 W) performed the TT5km twice, collecting performance (time to complete, absolute and relative power output, average speed) and physiological responses (heart rate and electromyography activity). The validation criteria were pacing strategy, absolute and relative reliability, validity, and sensitivity. The sensitivity index was obtained from the ratio between the smallest worthwhile change and the typical error. The TT5km showed high absolute (coefficient of variation 0.95) reliability of performance variables, whereas it presented low reliability of physiological responses. The TT5km performance variables were highly correlated with the aerobic endurance indices obtained from the incremental test (r > 0.70). These variables showed an adequate sensitivity index (> 1). The TT5km is a valid test to rank the aerobic endurance fitness of well-trained cyclists and to differentiate changes in aerobic endurance performance. Coaches can detect performance changes through either absolute (± 17.7 W) or relative power output (± 0.3 W·kg^-1), the time to complete the test (± 13.4 s) and the average speed (± 1.0 km·h^-1). Furthermore, TT5km performance can also be used to rank athletes according to their aerobic endurance fitness.
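    The sensitivity index the abstract defines is a one-line computation. The numbers below are hypothetical, and the 0.2 factor for the smallest worthwhile change is a common sports-science convention assumed here, not a value stated in the study:

```python
def sensitivity_index(between_subject_sd, typical_error, swc_factor=0.2):
    """SI = smallest worthwhile change / typical error, with the SWC
    taken as a fraction (conventionally 0.2) of the between-subject SD."""
    return (swc_factor * between_subject_sd) / typical_error

# Hypothetical power-output values (W): between-subject SD 30, typical error 5
print(sensitivity_index(30.0, 5.0))  # 1.2 -> adequate by the > 1 criterion
```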

  6. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  7. Thermal and mechanical quantitative sensory testing in Chinese patients with burning mouth syndrome--a probable neuropathic pain condition?

    Science.gov (United States)

    Mo, Xueyin; Zhang, Jinglu; Fan, Yuan; Svensson, Peter; Wang, Kelun

    2015-01-01

    To explore the hypothesis that burning mouth syndrome (BMS) is probably a neuropathic pain condition, thermal and mechanical sensory and pain thresholds were tested and compared with those of age- and gender-matched control participants using a standardized battery of psychophysical techniques. Twenty-five BMS patients (men: 8, women: 17, age: 49.5 ± 11.4 years) and 19 age- and gender-matched healthy control participants were included. The cold detection threshold (CDT), warm detection threshold (WDT), cold pain threshold (CPT), heat pain threshold (HPT), mechanical detection threshold (MDT) and mechanical pain threshold (MPT), in accordance with the German Network of Neuropathic Pain guidelines, were measured at the following four sites: the dorsum of the left hand (hand), the skin at the mental foramen (chin), the tip of the tongue (tongue), and the mucosa of the lower lip (lip). Statistical analysis was performed using ANOVA with repeated measures to compare the means within and between groups. Furthermore, Z-score profiles were generated, and exploratory correlation analyses between QST and clinical variables were performed. Two-tailed tests with a significance level of 5% were used throughout. CDTs (P < 0.02) were significantly lower (less sensitivity) and HPTs (P < 0.001) were significantly higher (less sensitivity) at the tongue and lip in BMS patients compared to control participants. WDT (P = 0.007) was also significantly higher at the tongue in BMS patients compared to control subjects. There were no significant differences in MDT and MPT between the BMS patients and healthy subjects at any of the four test sites. Z-scores showed that a significant loss of function can be identified for CDT (Z-scores = -0.9±1.1) and HPT (Z-scores = 1.5±0.4). There were no significant correlations between QST and clinical variables (pain intensity, duration, depression scores). BMS patients had a significant loss of thermal function but not

  8. Preliminary rock mechanics laboratory: Investigation plan

    International Nuclear Information System (INIS)

    Oschman, K.P.; Hummeldorf, R.G.; Hume, H.R.; Karakouzian, M.; Vakili, J.E.

    1987-01-01

    This document presents the rationale for rock mechanics laboratory testing (including the supporting analysis and numerical modeling) planned for the site characterization of a nuclear waste repository in salt. This plan first identifies what information is required for regulatory and design purposes, and then presents the rationale for the testing that satisfies the required information needs. A preliminary estimate of the minimum sampling requirements for rock laboratory testing during site characterization is also presented. Periodic revision of this document is planned

  9. New tests of cumulative prospect theory and the priority heuristic

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-04-01

    Previous tests of cumulative prospect theory (CPT) and of the priority heuristic (PH) found evidence contradicting these two models of risky decision making. However, those tests were criticized because they had characteristics that might "trigger" use of other heuristics. This paper presents new tests that avoid those characteristics. Expected values of the gambles are nearly equal in each choice. In addition, if a person followed expected value (EV), expected utility (EU), CPT, or PH in these tests, she would shift her preferences in the same direction as shifts in EV or EU. In contrast, the transfer of attention exchange model (TAX) and a similarity model predict that people will reverse preferences in the opposite direction. Results contradict the PH, even when PH is modified to include a preliminary similarity evaluation using the PH parameters. New tests of probability-consequence interaction were also conducted. Strong interactions were observed, contrary to PH. These results add to the growing body of evidence showing that neither CPT nor PH is an accurate description of risky decision making.

  10. Preliminary results of ecotoxicological assessment of an Acid Mine Drainage (AMD) passive treatment system testing water quality of depurated lixiviates

    OpenAIRE

    Miguel Sarmiento, Aguasanta; Bonnail, Estefanía; Nieto Liñán, José Miguel; Valls Casillas, Tomás Ángel del

    2017-01-01

    The current work reports on the preliminary results of a toxicity test using screening experiments to check the efficiency of an innovative passive treatment plant designed for acid mine drainage purification. Bioassays took place with water samples before and after the treatment system and in the river, once treated water is discharged. Due to the high toxicity of the water collected at the mouth of the mine (before the treatment plant), the bioassay was designed and developed with respect t...

  11. Application of tests of goodness of fit in determining the probability density function for spacing of steel sets in tunnel support system

    Directory of Open Access Journals (Sweden)

    Farnoosh Basaligheh

    2015-12-01

    One of the conventional methods for the temporary support of tunnels is to use steel sets with shotcrete. The nature of a temporary support system demands quick installation of its structures. As a result, the spacing between steel sets is not a fixed amount and can be considered a random variable. Hence, in the reliability analysis of these types of structures, the selection of an appropriate probability distribution function for the spacing of steel sets is essential. In the present paper, the distances between steel sets are collected from an under-construction tunnel and the collected data are used to suggest a proper Probability Distribution Function (PDF) for the spacing of steel sets. The tunnel has two different excavation sections. In this regard, different distribution functions were investigated, and three common tests of goodness of fit were used to evaluate each function for each excavation section. Results from all three methods indicate that the Wakeby distribution function can be suggested as the proper PDF for the spacing between steel sets. It is also noted that, although the probability distribution function is the same for the two tunnel sections, the parameters of the PDF for the individual sections differ from each other.
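    One of the standard goodness-of-fit measures used in this kind of distribution selection, the one-sample Kolmogorov-Smirnov statistic, is simple to compute from scratch. Fitting a Wakeby distribution needs a statistics package, so the sketch below checks a sample against a uniform candidate CDF purely for illustration; the sample is synthetic.

```python
def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic:
    D = sup_x |ECDF(x) - F(x)|, evaluated at the sorted sample points."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = cdf(x)
        # ECDF jumps from (i-1)/n to i/n at x, so check both sides
        d = max(d, i / n - f, f - (i - 1) / n)
    return d

# Illustrative check: midpoints of [0, 1] against the uniform CDF F(x) = x
sample = [(i + 0.5) / 10 for i in range(10)]
print(ks_statistic(sample, lambda x: x))  # D = 0.5/n = 0.05 here
```

The candidate distribution whose fitted CDF minimizes D (subject to the test's critical value) is the one retained, which is the selection logic behind the Wakeby recommendation above.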

  12. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...

  13. Preliminary Evaluation of the Effects of Buried Volcanoes on Estimates of Volcano Probability for the Proposed Repository Site at Yucca Mountain, Nevada

    Science.gov (United States)

    Hill, B. E.; La Femina, P. C.; Stamatakos, J.; Connor, C. B.

    2002-12-01

    Probability models that calculate the likelihood of new volcano formation in the Yucca Mountain (YM) area depend on the timing and location of past volcanic activity. Previous spatio-temporal patterns indicated a 10^-4 to 10^-3 probability of volcanic disruption of the proposed radioactive waste repository site at YM during the 10,000 year post-closure performance period (Connor et al. 2000, JGR 105:1). A recent aeromagnetic survey (Blakely et al. 2000, USGS OFR 00-188), however, identified up to 20 anomalies in alluvium-filled basins, which have characteristics indicative of buried basalt (O'Leary et al. 2002, USGS OFR 02-020). Independent evaluation of these data, combined with new ground magnetic surveys, shows that these anomalies may represent at least ten additional buried basaltic volcanoes, which have not been included in previous probability calculations. This interpretation, if true, nearly doubles the number of basaltic volcanoes within 30 km [19 mi] of YM. Moreover, the magnetic signature of about half of the recognized basaltic volcanoes in the YM area cannot be readily identified in areas where bedrock also produces large amplitude magnetic anomalies, suggesting that additional volcanoes may be present but undetected in the YM area. In the absence of direct age information, we evaluate the potential effects of alternative age assumptions on spatio-temporal probability models. Interpreted burial depths of >50 m [164 ft] suggest ages >2 Ma, based on sedimentation rates typical for these alluvial basins (Stamatakos et al., 1997, J. Geol. 105). Defining volcanic events as individual points, previous probability models generally used recurrence rates of 2-5 volcanoes/million years (v/Myr). If the identified anomalies are buried volcanoes that are all >5 Ma or uniformly distributed between 2-10 Ma, calculated probabilities of future volcanic disruption at YM change by <30%.
However, a uniform age distribution between 2-5 Ma for the presumed buried volcanoes
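    The order of magnitude of disruption probabilities like those quoted above can be sketched with a homogeneous-Poisson model. This is an illustration only, not the authors' spatio-temporal model: the regional recurrence rate is in the range the abstract cites, but the site-disruption fraction is a hypothetical placeholder.

```python
import math

def disruption_probability(rate_per_myr, p_site, years):
    """Homogeneous-Poisson sketch: probability of at least one new
    volcano forming AND disrupting the site within the given period,
    P = 1 - exp(-lambda * p_site * T)."""
    lam = rate_per_myr * p_site * (years / 1e6)
    return 1.0 - math.exp(-lam)

# 4 volcanoes/Myr regional rate (within the 2-5 v/Myr range cited),
# a hypothetical 1% chance that a new volcano disrupts the repository
# footprint, and the 10,000-year post-closure performance period.
print(disruption_probability(4.0, 0.01, 10_000))
```

With these inputs the result falls in the 10^-4 to 10^-3 band mentioned in the abstract, which is why the recurrence rate (and hence the count of past volcanoes) drives the probability estimate nearly linearly at these small values.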

  14. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    Science.gov (United States)

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by a random noise in the retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions by the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.
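    The key prediction at issue, that sums such as P(A) + P(B) - P(A or B) - P(A and B) stay zero on average even under retrieval noise, can be checked with a small simulation. This is a toy version of the noise account (constituent events are assumed independent for the demo), not the exact model of either paper.

    ```python
    import random

    random.seed(42)
    D = 0.25          # assumed noise rate in the memory read-out
    N_SAMPLES = 100   # memory samples per judgment
    N_EVENTS = 2000

    def judge(p_true):
        """Noisy frequency estimate: each retrieved sample flips with
        probability D, so E[judgment] = (1 - 2D) * p_true + D."""
        hits = 0
        for _ in range(N_SAMPLES):
            occurred = random.random() < p_true
            if random.random() < D:
                occurred = not occurred
            hits += occurred
        return hits / N_SAMPLES

    # The probability-theory identity P(A) + P(B) - P(A or B) - P(A and B) = 0
    # survives the noise in expectation, because the +D biases cancel.
    total = 0.0
    for _ in range(N_EVENTS):
        pa, pb = random.random(), random.random()
        p_and = pa * pb               # independence assumed for the demo
        p_or = pa + pb - p_and
        total += judge(pa) + judge(pb) - judge(p_or) - judge(p_and)

    mean_deviation = total / N_EVENTS
    ```

    The point of the demo is exactly the one made in the comment: satisfying this identity on average does not by itself discriminate between noisy probability theory and noisy heuristic integration.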

  15. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied to statistical analysis.
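    As a minimal illustration of a probability distribution, the sketch below evaluates the normal density and numerically recovers the familiar rule that about 68.3% of the probability mass lies within one standard deviation of the mean.

    ```python
    import math

    def normal_pdf(x, mu=0.0, sigma=1.0):
        """Density of the normal distribution N(mu, sigma^2)."""
        z = (x - mu) / sigma
        return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

    # P(mu - sigma < X < mu + sigma) ~ 0.683, approximated here by
    # numerically integrating the standard normal density over [-1, 1].
    step = 0.001
    prob = sum(normal_pdf(-1 + i * step) * step for i in range(int(2 / step)))
    ```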

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  17. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
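    As a concrete illustration of such a computation, the sketch below estimates a collision probability by Monte Carlo integration of a 2-D Gaussian position uncertainty over the combined hard-body circle in the encounter plane. This isotropic-Gaussian model is a common simplification, not necessarily the method of the paper, and the miss distance, uncertainty, and radius values are arbitrary assumptions.

    ```python
    import random

    random.seed(0)

    def collision_probability(miss, sigma, hard_body_radius, n=200_000):
        """Monte Carlo collision probability in the encounter plane.

        The relative position at closest approach is modelled as an isotropic
        2-D Gaussian centred on the nominal miss vector; a collision occurs
        when the realized miss distance is below the combined radius.
        """
        r2 = hard_body_radius ** 2
        hits = 0
        for _ in range(n):
            x = miss + random.gauss(0.0, sigma)
            y = random.gauss(0.0, sigma)
            if x * x + y * y < r2:
                hits += 1
        return hits / n

    # Hypothetical encounter: 200 m nominal miss, 100 m position uncertainty,
    # 20 m combined hard-body radius.
    p = collision_probability(miss=200.0, sigma=100.0, hard_body_radius=20.0)
    ```

    Sweeping `miss` in such a sketch reproduces the qualitative behaviour studied in the paper: collision probability falls off steeply once the nominal miss distance exceeds a few position-uncertainty standard deviations.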

  18. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  19. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  1. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Kuusela, P.

    2011-06-01

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability means here that the welding of the canister lid has no critical defects from the long-term safety point of view. From the reliability point of view, both the reliability of the welding process (that no critical defects are introduced) and the non-destructive testing (NDT) process (that all critical defects are detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types. The possibility of human errors in the NDT process was also taken into account in a simple manner. At this moment there is very little representative data to determine the reliability of welding, and the data on NDT is not well suited for the needs of this study. Therefore the calculations presented here are based on expert judgements and on several assumptions that have not yet been verified. The Bayesian probability model shows the importance of the uncertainty in the estimation of the reliability parameters. The effect of uncertainty is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters compared to the binomial probability distribution in the case of known parameter values. In order to reduce the uncertainty, more information is needed on both the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since their role is not reflected in typical test data which is used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)
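    The "flattening" effect of parameter uncertainty described above can be shown analytically: if the defect probability is itself Beta-distributed rather than known, the count of defective canisters follows a beta-binomial distribution whose variance exceeds the binomial variance. The canister count and the Beta(1, 999) prior below are assumed for illustration only; they are not values from the report.

    ```python
    # Known defect probability: number of defective canisters ~ Binomial(n, p).
    def binomial_var(n, p):
        return n * p * (1 - p)

    # Uncertain defect probability p ~ Beta(a, b): the count is beta-binomial,
    # and the extra factor (a + b + n) / (a + b + 1) > 1 inflates ("flattens")
    # the distribution relative to the known-parameter binomial.
    def beta_binomial_var(n, a, b):
        p = a / (a + b)
        return n * p * (1 - p) * (a + b + n) / (a + b + 1)

    n = 3000              # hypothetical number of canisters (assumption)
    a, b = 1.0, 999.0     # assumed Beta prior, mean defect probability 0.001
    v_known = binomial_var(n, a / (a + b))
    v_uncertain = beta_binomial_var(n, a, b)
    ```

    With these assumed numbers the variance roughly quadruples, which is the mechanism behind the report's observation that more welding and NDT data would tighten the predicted number of defective canisters.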

  2. Preliminary seismic design of dynamically coupled structural systems

    International Nuclear Information System (INIS)

    Pal, N.; Dalcher, A.W.; Gluck, R.

    1977-01-01

    In this paper, the analysis criteria for coupling and decoupling, which are most commonly used in nuclear design practice, are briefly reviewed and a procedure outlined and demonstrated with examples. Next, a criterion judged to be practical for preliminary seismic design purposes is defined. Subsequently, a technique compatible with this criterion is suggested. A few examples are presented to test the proposed procedure for preliminary seismic design purposes. Limitations of the procedure are also discussed and finally, the more important conclusions are summarized

  3. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse-gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m². This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to give an average density of 1 landslide km⁻², i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road of 1 x 4000 cells (5 m x 20 km) joined by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km² region, the number of road blocks per iteration, NBL, ranges from 0 to 7.
The average blockage area for the 500 iterations (ĀBL) is about 3000 m
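    A stripped-down version of this Monte-Carlo procedure can be sketched as follows. It simplifies heavily: landslides are treated as circles, the network is a single straight road spanning the region, and the inverse-gamma parameters are approximate stand-ins for the three-parameter fit cited above, not the fitted values.

    ```python
    import math
    import random

    random.seed(7)

    # Inverse-gamma landslide-area model (illustrative parameter values).
    SHAPE, SCALE = 1.4, 1.28e-3          # areas in km^2

    def sample_area_km2():
        # If G ~ Gamma(SHAPE), then SCALE / G is an inverse-gamma draw.
        return SCALE / random.gammavariate(SHAPE, 1.0)

    REGION = 20.0                        # 20 km x 20 km region (400 km^2)
    ROAD_Y = 10.0                        # one straight east-west road

    def blocked_count(n_landslides=400):
        """One triggering event: drop circular landslides at random north-south
        positions and count how many overlap the road line."""
        blocks = 0
        for _ in range(n_landslides):
            radius = math.sqrt(sample_area_km2() / math.pi)
            y = random.uniform(0.0, REGION)
            if abs(y - ROAD_Y) < radius:
                blocks += 1
        return blocks

    counts = [blocked_count() for _ in range(500)]
    mean_blocks = sum(counts) / len(counts)
    ```

    Even this toy version reproduces the flavour of the reported result: most simulated events block the road zero or one time, with an occasional event producing several blockages.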

  4. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  5. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  6. ELIPGRID-PC: A PC program for calculating hot spot probabilities

    International Nuclear Information System (INIS)

    Davidson, J.R.

    1994-10-01

    ELIPGRID-PC, a new personal computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability-of-hit versus cost data for graphing with spreadsheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program
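    The quantity ELIPGRID computes analytically can also be approximated by brute force, which is one way to sanity-check such a program. The sketch below is not Singer's algorithm: it is a Monte Carlo estimate of the probability that a square grid of sample points hits an elliptical hot spot of random position and orientation, with arbitrary assumed grid spacing and hot-spot dimensions.

    ```python
    import math
    import random

    random.seed(3)

    def hit_probability(grid_spacing, semi_major, semi_minor, n=100_000):
        """Monte Carlo P(a square sampling grid detects an elliptical hot spot).

        The hot-spot centre is uniform within one grid cell (by symmetry) with
        uniform orientation. Assumes semi_major < grid_spacing, so only the
        four corners of the containing cell can fall inside the ellipse.
        """
        hits = 0
        for _ in range(n):
            cx = random.uniform(0.0, grid_spacing)
            cy = random.uniform(0.0, grid_spacing)
            theta = random.uniform(0.0, math.pi)
            cos_t, sin_t = math.cos(theta), math.sin(theta)
            found = False
            for gx in (0.0, grid_spacing):
                for gy in (0.0, grid_spacing):
                    # Rotate the corner into the ellipse's principal axes.
                    dx, dy = gx - cx, gy - cy
                    u = dx * cos_t + dy * sin_t
                    v = -dx * sin_t + dy * cos_t
                    if (u / semi_major) ** 2 + (v / semi_minor) ** 2 <= 1.0:
                        found = True
            hits += found
        return hits / n

    # Circular hot spot of radius 6 on a 10-unit square grid.
    p = hit_probability(grid_spacing=10.0, semi_major=6.0, semi_minor=6.0)
    ```

    For the circular case the answer can be cross-checked geometrically (union of quarter-disks at the cell corners), which is the same style of verification described above against Singer's published results.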

  7. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for 'closure' of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive 'eruption' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  8. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Vol. 1: Third comparison with 40 CFR 191, Subpart B

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-12-15

    Before disposing of transuranic radioactive wastes in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments of the WIPP for the DOE to provide interim guidance while preparing for final compliance evaluations. This volume contains an overview of WIPP performance assessment and a preliminary comparison with the long-term requirements of the Environmental Radiation Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Detailed information about the technical basis for the preliminary comparison is contained in Volume 2. The reference data base and values for input parameters used in the modeling system are contained in Volume 3. Uncertainty and sensitivity analyses related to 40 CFR 191B are contained in Volume 4. Volume 5 contains uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance. Finally, guidance derived from the entire 1992 performance assessment is presented in Volume 6. Results of the 1992 performance assessment are preliminary, and are not suitable for final comparison with 40 CFR 191, Subpart B. Portions of the modeling system and the data base remain incomplete, and the level of confidence in the performance estimates is not sufficient for a defensible compliance evaluation. Results are, however, suitable for providing guidance to the WIPP Project. All results are conditional on the models and data used, and are presented for preliminary comparison to the Containment Requirements of 40 CFR 191, Subpart B as mean complementary cumulative distribution functions (CCDFs) displaying estimated probabilistic releases of radionuclides to the accessible environment. Results compare three conceptual models for

  9. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Vol. 1: Third comparison with 40 CFR 191, Subpart B

    International Nuclear Information System (INIS)

    1992-12-01

    Before disposing of transuranic radioactive wastes in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments of the WIPP for the DOE to provide interim guidance while preparing for final compliance evaluations. This volume contains an overview of WIPP performance assessment and a preliminary comparison with the long-term requirements of the Environmental Radiation Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Detailed information about the technical basis for the preliminary comparison is contained in Volume 2. The reference data base and values for input parameters used in the modeling system are contained in Volume 3. Uncertainty and sensitivity analyses related to 40 CFR 191B are contained in Volume 4. Volume 5 contains uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance. Finally, guidance derived from the entire 1992 performance assessment is presented in Volume 6. Results of the 1992 performance assessment are preliminary, and are not suitable for final comparison with 40 CFR 191, Subpart B. Portions of the modeling system and the data base remain incomplete, and the level of confidence in the performance estimates is not sufficient for a defensible compliance evaluation. Results are, however, suitable for providing guidance to the WIPP Project. All results are conditional on the models and data used, and are presented for preliminary comparison to the Containment Requirements of 40 CFR 191, Subpart B as mean complementary cumulative distribution functions (CCDFs) displaying estimated probabilistic releases of radionuclides to the accessible environment. Results compare three conceptual models for

  10. Heart sounds analysis using probability assessment.

    Science.gov (United States)

    Plesinger, F; Viscor, I; Halamek, J; Jurco, J; Jurak, P

    2017-07-31

    This paper describes a method for automated discrimination of heart sound recordings according to the Physionet Challenge 2016. The goal was to decide whether a recording refers to normal or abnormal heart sounds, or whether it is not possible to decide (i.e. 'unsure' recordings). Heart sounds S1 and S2 are detected using amplitude envelopes in the band 15-90 Hz. The averaged shape of the S1/S2 pair is computed from amplitude envelopes in five different bands (15-90 Hz; 55-150 Hz; 100-250 Hz; 200-450 Hz; 400-800 Hz). A total of 53 features are extracted from the data. The largest group of features is extracted from the statistical properties of the averaged shapes; other features are extracted from the symmetry of averaged shapes, and the last group of features is independent of S1 and S2 detection. Generated features are processed using logical rules and probability assessment, a prototype of a new machine-learning method. The method was trained using 3155 records and tested on 1277 hidden records. It resulted in a training score of 0.903 (sensitivity 0.869, specificity 0.937) and a testing score of 0.841 (sensitivity 0.770, specificity 0.913). The revised method led to a test score of 0.853 in the follow-up phase of the challenge. The presented solution achieved 7th place out of 48 competing entries in the Physionet Challenge 2016 (official phase). In addition, the PROBAfind software for probability assessment was introduced.

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    Science.gov (United States)

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. We then applied a decision-making model, the sequential probability ratio test (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this proposed classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781
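    The accumulative decision rule at the heart of SPRT can be sketched for the simplest case of Gaussian observations with known variance; the paper's EEG feature extraction is not reproduced here, and the thresholds follow Wald's classical approximations relating error rates to the stopping boundaries.

    ```python
    import math
    import random

    random.seed(11)

    def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
        """Wald's sequential probability ratio test for a Gaussian mean.

        Accumulates the log-likelihood ratio of H1: N(mu1, sigma^2) versus
        H0: N(mu0, sigma^2) and stops at Wald's approximate boundaries.
        Returns ('H1' | 'H0' | 'undecided', number of samples consumed).
        """
        upper = math.log((1 - beta) / alpha)    # accept H1 at or above this
        lower = math.log(beta / (1 - alpha))    # accept H0 at or below this
        llr = 0.0
        for i, x in enumerate(samples, start=1):
            llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
            if llr >= upper:
                return "H1", i
            if llr <= lower:
                return "H0", i
        return "undecided", len(samples)

    # Evidence accumulates trial by trial: data generated under H1 (mean 1).
    data = [random.gauss(1.0, 1.0) for _ in range(200)]
    decision, n_used = sprt(data, mu0=0.0, mu1=1.0, sigma=1.0)
    ```

    The explicit link between thresholds and error rates mentioned in the abstract is visible in the two `math.log` expressions: tightening alpha and beta pushes the boundaries apart, which trades longer stopping times for fewer errors.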

  13. Automated Search Method for Statistical Test Probability Distribution Generation

    Institute of Scientific and Technical Information of China (English)

    周晓莹; 高建华

    2013-01-01

    A strategy based on automated search for probability distribution construction is proposed, comprising the design of a representation format and an evaluation function for the probability distribution. Combined with a simulated annealing algorithm, an indicator is defined to formalize the automated search process based on the Markov model. Experimental results show that the method effectively improves the accuracy of the automated search: it finds a near-optimal probability distribution within a certain time, providing the statistical test with efficient test data and thereby reducing the cost of statistical testing.
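    The search strategy can be illustrated with a toy version: simulated annealing over a probability distribution on a handful of input partitions. The evaluation function below, and its weights, are made-up stand-ins for the paper's Markov-model-based indicator; only the annealing skeleton (neighbour moves on the simplex, cooling schedule, Metropolis acceptance) reflects the described approach.

    ```python
    import math
    import random

    random.seed(5)

    # Assumed failure-detection weights for 4 input partitions (illustrative).
    WEIGHTS = [0.1, 0.4, 0.3, 0.2]

    def score(dist):
        # sum(sqrt(p_i * w_i)) is maximized (value 1.0) when p_i = w_i, so it
        # rewards matching the test distribution to the partition weights.
        return sum(math.sqrt(p * w) for p, w in zip(dist, WEIGHTS))

    def neighbour(dist, step=0.05):
        """Move a small amount of probability mass between two partitions."""
        i, j = random.sample(range(len(dist)), 2)
        delta = min(step, dist[i]) * random.random()
        new = list(dist)
        new[i] -= delta
        new[j] += delta
        return new

    def anneal(n_iter=20000, t0=1.0, cooling=0.9995):
        current = [1.0 / len(WEIGHTS)] * len(WEIGHTS)
        current_s = score(current)
        best, best_s, t = current, current_s, t0
        for _ in range(n_iter):
            cand = neighbour(current)
            cand_s = score(cand)
            # Metropolis rule: always accept improvements, sometimes accept
            # worse moves while the temperature is high.
            if cand_s > current_s or random.random() < math.exp((cand_s - current_s) / t):
                current, current_s = cand, cand_s
            if current_s > best_s:
                best, best_s = current, current_s
            t *= cooling
        return best, best_s

    best_dist, best_score = anneal()
    ```

    Because this toy objective is concave on the simplex, the annealer should end very close to the optimum `p_i = w_i`; in the paper's setting the landscape is rougher, which is what motivates annealing over plain hill-climbing.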

  14. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    Science.gov (United States)

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.
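    The inverse-S distortion described here is commonly modelled with the one-parameter Tversky-Kahneman weighting function. The study's exact functional form and fitted parameters may differ, so the sketch below is generic; the gamma value 0.61 is the original 1992 median estimate, used only for illustration.

    ```python
    def weight(p, gamma=0.61):
        """Tversky-Kahneman (1992) probability weighting function.

        With gamma < 1 this produces the classic inverse-S shape: small
        probabilities are overweighted, moderate-to-large probabilities
        underweighted; gamma = 1 recovers objective weighting w(p) = p.
        """
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    overweighted = weight(0.05)    # exceeds the objective 0.05
    underweighted = weight(0.8)    # falls below the objective 0.8
    ```

    In this parameterization, the attenuation of distortion reported under sulpiride corresponds to gamma moving toward 1, i.e. the weighting curve straightening toward the diagonal.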

  15. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    Science.gov (United States)

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870

  16. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  17. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. 
The high constant adult-survival probabilities estimated

  18. PHOEBUS/UHTREX: a preliminary study of a low-cost facility for transient tests of LMFBR fuel

    International Nuclear Information System (INIS)

    Kirk, W.L.

    1976-08-01

The results of a brief preliminary design study of a facility for transient nuclear tests of fast breeder reactor fuel are described. The study is based on the use of a reactor building originally built for the UHTREX reactor, and the use of some reactor hardware and reactor design and fabrication technology remaining from the Phoebus-2 reactor of the Rover nuclear rocket propulsion program. The facility is therefore currently identified as the PHOEBUS/UHTREX facility. This facility is believed capable of providing early information regarding fast reactor core accident energetics issues which will be very valuable to the overall LMFBR safety program. Facility performance in conjunction with a reference 127-fuel pin experiment is described. Low cost and early availability of the facility were emphasized in the selection of design features and parameters.

  19. PHOEBUS/UHTREX: a preliminary study of a low-cost facility for transient tests of LMFBR fuel

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, W.L. (comp.)

    1976-08-01

The results of a brief preliminary design study of a facility for transient nuclear tests of fast breeder reactor fuel are described. The study is based on the use of a reactor building originally built for the UHTREX reactor, and the use of some reactor hardware and reactor design and fabrication technology remaining from the Phoebus-2 reactor of the Rover nuclear rocket propulsion program. The facility is therefore currently identified as the PHOEBUS/UHTREX facility. This facility is believed capable of providing early information regarding fast reactor core accident energetics issues which will be very valuable to the overall LMFBR safety program. Facility performance in conjunction with a reference 127-fuel pin experiment is described. Low cost and early availability of the facility were emphasized in the selection of design features and parameters.

  20. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  1. Some uses of predictive probability of success in clinical drug development

    Directory of Open Access Journals (Sweden)

    Mauro Gasparini

    2013-03-01

Full Text Available Predictive probability of success is a (subjective) Bayesian evaluation of the probability of a future successful event in a given state of information. In the context of pharmaceutical clinical drug development, successful events relate to the accrual of positive evidence on the therapy which is being developed, like demonstration of superior efficacy or ascertainment of safety. Positive evidence will usually be obtained via standard frequentist tools, according to the regulations imposed in the world of pharmaceutical development. Within a single trial, predictive probability of success can be identified with expected power, i.e. the evaluation of the success probability of the trial. Success means, for example, obtaining a significant result of a standard superiority test. Across trials, predictive probability of success can be the probability of a successful completion of an entire part of clinical development, for example a successful phase III development in the presence of phase II data. Calculations of predictive probability of success in the presence of normal data with known variance will be illustrated, both for within-trial and across-trial predictions.
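The expected-power calculation described in this record can be sketched by averaging the conditional power of a planned trial over the posterior of the true effect. The sketch below assumes a two-arm z-test with known variance and a flat prior; all numerical values (effect size, sample sizes, sigma) are hypothetical, not taken from the record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical inputs: phase II observed effect d_hat from m patients/arm,
# known SD sigma; planned phase III with n patients/arm, one-sided alpha.
sigma, m, d_hat = 1.0, 50, 0.3
n, alpha = 100, 0.025
z_crit = stats.norm.ppf(1 - alpha)

# Posterior of the true effect under a flat prior: N(d_hat, 2*sigma^2/m).
delta = rng.normal(d_hat, sigma * np.sqrt(2.0 / m), size=100_000)

# Conditional power of the phase III z-test given each sampled true effect;
# its average is the predictive probability of success (expected power).
power = stats.norm.cdf(delta * np.sqrt(n / 2.0) / sigma - z_crit)
print(f"predictive probability of success = {power.mean():.3f}")
```

Note that the average of the power curve is pulled toward 0.5 relative to the power at the point estimate, which is the usual qualitative effect of acknowledging uncertainty in the true effect.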

  2. Method of preliminary localization of the iris in biometric access control systems

    Science.gov (United States)

    Minacova, N.; Petrov, I.

    2015-10-01

This paper presents a method of preliminary localization of the iris, based on stable brightness features of the iris in images of the eye. In tests on eye images from publicly available databases, the method showed good accuracy and speed compared to existing preliminary localization methods.

  3. Preliminary piping layout and integration of European test blanket modules subsystems in ITER CVCS area

    Energy Technology Data Exchange (ETDEWEB)

    Tarallo, Andrea, E-mail: andrea.tarallo@unina.it [CREATE, University of Naples Federico II, DII, P.le Tecchio, 80, 80125 Naples (Italy); Mozzillo, Rocco; Di Gironimo, Giuseppe [CREATE, University of Naples Federico II, DII, P.le Tecchio, 80, 80125 Naples (Italy); Aiello, Antonio; Utili, Marco [ENEA UTIS, C.R. Brasimone, Bacino del Brasimone, I-40032 Camugnano, BO (Italy); Ricapito, Italo [TBM& MD Project, Fusion for Energy, EU Commission, Carrer J. Pla, 2, Building B3, 08019 Barcelona (Spain)

    2015-04-15

    Highlights: • The use of human modeling tools for piping design in view of maintenance is discussed. • A possible preliminary layout for TBM subsystems in CVCS area has been designed with CATIA. • A DHM-based method to quickly check for maintainability of piping systems is suggested. - Abstract: This paper explores a possible integration of some ancillary systems of helium-cooled lithium lead (HCLL) and helium-cooled pebble-bed (HCPB) test blanket modules in ITER CVCS area. Computer-aided design and ergonomics simulation tools have been fundamental not only to define suitable routes for pipes, but also to quickly check for maintainability of equipment and in-line components. In particular, accessibility of equipment and systems has been investigated from the very first stages of the design using digital human models. In some cases, the digital simulations have resulted in changes in the initial space reservations.

  4. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

Being able to accurately estimate rare event probabilities is a challenging issue for improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. This yields a reduced set of tuning parameters that may lead to reliable estimation of rare event probabilities for various problems. The relevance of the obtained result is assessed on a series of real-world aerospace problems.
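Importance sampling, one of the variance-reduction methods the record mentions, can be illustrated with a Gaussian tail probability: crude Monte Carlo almost never hits the rare region, while sampling from a distribution shifted into that region and reweighting by the likelihood ratio recovers the probability accurately. The threshold and sample size below are illustrative choices, not values from the record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
t = 4.0                      # rare event: P(X > 4) for X ~ N(0, 1)
true_p = stats.norm.sf(t)    # exact tail probability for reference

n = 100_000
# Crude Monte Carlo: with p ~ 3e-5, most runs see a handful of hits at best.
crude = (rng.normal(size=n) > t).mean()

# Importance sampling: draw from N(t, 1), shifted into the rare region,
# and reweight each sample by the likelihood ratio f(x)/g(x).
x = rng.normal(t, 1.0, size=n)
w = stats.norm.pdf(x) / stats.norm.pdf(x, loc=t)
is_est = np.mean((x > t) * w)

print(f"crude = {crude:.2e}, importance sampling = {is_est:.2e}, exact = {true_p:.2e}")
```

The choice of the shifted proposal mean is exactly the kind of tuning parameter whose selection the record's empirical campaign addresses; a poor shift can make the estimator worse than crude Monte Carlo.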

  5. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes -the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  6. Preliminary design and definition of field experiments for welded tuff rock mechanics program

    International Nuclear Information System (INIS)

    Zimmerman, R.M.

    1982-06-01

    The preliminary design contains objectives, typical experiment layouts, definitions of equipment and instrumentation, test matrices, preliminary design predictive modeling results for five experiments, and a definition of the G-Tunnel Underground Facility (GTUF) at the Nevada Test Site where the experiments are to be located. Experiments described for investigations in welded tuff are the Small Diameter Heater, Unit Cell-Canister Scale, Heated Block, Rocha Slot, and Miniature Heater

  7. Cytologic diagnosis: expression of probability by clinical pathologists.

    Science.gov (United States)

    Christopher, Mary M; Hotz, Christine S

    2004-01-01

Clinical pathologists use descriptive terms or modifiers to express the probability or likelihood of a cytologic diagnosis. Words are imprecise in meaning, however, and may be used and interpreted differently by pathologists and clinicians. The goals of this study were to 1) assess the frequency of use of 18 modifiers, 2) determine the probability of a positive diagnosis implied by the modifiers, 3) identify preferred modifiers for different levels of probability, 4) ascertain the importance of factors that affect expression of diagnostic certainty, and 5) evaluate differences based on gender, employment, and experience. We surveyed 202 clinical pathologists who were board-certified by the American College of Veterinary Pathologists (Clinical Pathology). Surveys were distributed in October 2001 and returned by e-mail, fax, or surface mail over a 2-month period. Results were analyzed by parametric and nonparametric tests. Survey response rate was 47.5% (n = 96) and primarily included clinical pathologists at veterinary schools (n = 58) and diagnostic laboratories (n = 31). Eleven of 18 terms were used "often" or "sometimes" by ≥ 50% of respondents. Broad variability was found in the probability assigned to each term, especially those with median values of 75 to 90%. Preferred modifiers for 7 numerical probabilities ranging from 0 to 100% included 68 unique terms; however, a set of 10 terms was used by ≥ 50% of respondents. Cellularity and quality of the sample, experience of the pathologist, and implications of the diagnosis were the most important factors affecting the expression of probability. Because of wide discrepancy in the implied likelihood of a diagnosis using words, defined terminology and controlled vocabulary may be useful in improving communication and the quality of data in cytology reporting.

  8. A preliminary investigation of the imaging performance of photostimulable phosphor computed radiography using a new design of mammographic quality control test object

    International Nuclear Information System (INIS)

    Cowen, A.R.; Brettle, D.S.; Coleman, N.J.; Parkin, G.J.S.

    1992-01-01

Leeds Test Object TOR[MAM] has been designed to supplement the current FAXIL mammography test object TOR[MAX]. It contains a range of details that have a more natural radiographic appearance and has been designed as a test that more closely approximates the image quality achieved in clinical mammography. Physical aspects of the design and implementation of TOR[MAM] are presented. The TOR[MAM] has been used in a preliminary physical evaluation of the comparative image qualities produced by conventional (screen-film) and photostimulable phosphor computed mammography and the results are discussed. TOR[MAX] results are also presented. The influence of digital image processing (enhancement) on the image quality of computed mammograms is also considered. The results presented indicate the sensitivity of TOR[MAM]. (author)

  9. Probable maximum flood analysis, Richton Dome, Mississippi-Phase I: Technical report

    International Nuclear Information System (INIS)

    1987-03-01

This report presents results of a preliminary analysis of the extent of inundation that would result from a probable maximum flood (PMF) event in the overdome area of Richton Dome, Mississippi. Bogue Homo and Thompson Creek watersheds drain the overdome area. The US Army Corps of Engineers' HEC-1 Flood Hydrograph Package was used to calculate runoff hydrographs, route computed flood hydrographs, and determine maximum flood stages at cross sections along overdome tributaries. The area and configuration of stream cross sections were determined from US Geological Survey topographic maps. Using maximum flood stages calculated by the HEC-1 analysis, areas of inundation were delineated on 10-ft (3-m) contour interval topographic maps. Approximately 10% of the overdome area, or 0.9 mi² (2 km²), would be inundated by a PMF event. 34 refs., 3 figs., 1 tab

  10. Probability encoding of hydrologic parameters for basalt. Elicitation of expert opinions from a panel of three basalt waste isolation project staff hydrologists

    International Nuclear Information System (INIS)

    Runchal, A.K.; Merkhofer, M.W.; Olmsted, E.; Davis, J.D.

    1984-11-01

The present study implemented a probability encoding method to estimate the probability distributions of selected hydrologic variables for the Cohassett basalt flow top and flow interior, and the anisotropy ratio of the interior of the Cohassett basalt flow beneath the Hanford Site. Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. However, this information is required to complete preliminary performance assessment studies. Rockwell chose a probability encoding method developed by SRI International to generate credible and auditable estimates of the probability distributions of effective porosity and hydraulic conductivity anisotropy. The results indicate significant differences of opinion among the experts. This was especially true of the values of the effective porosity of the Cohassett basalt flow interior, for which estimates differ by more than five orders of magnitude. The experts are in greater agreement about the values of effective porosity of the Cohassett basalt flow top; their estimates for this variable are generally within one to two orders of magnitude of each other. For anisotropy ratio, the expert estimates are generally within two or three orders of magnitude of each other. Based on this study, the Rockwell hydrologists estimate the effective porosity of the Cohassett basalt flow top to be generally higher than do the independent experts. For the effective porosity of the Cohassett basalt flow top, the estimates of the Rockwell hydrologists indicate a smaller uncertainty than do the estimates of the independent experts. On the other hand, for the effective porosity and anisotropy ratio of the Cohassett basalt flow interior, the estimates of the Rockwell hydrologists indicate a larger uncertainty than do the estimates of the independent experts.

  11. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  12. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  13. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.

    Directory of Open Access Journals (Sweden)

    Amber M Sprenger

    2011-06-01

Full Text Available We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that both divided attention at encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgment made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.
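The additivity violation studied here is easy to state concretely: when mutually exclusive, exhaustive hypotheses are judged one at a time, the judged probabilities often sum to more than 1. A minimal sketch, with entirely hypothetical numbers (not the study's data):

```python
# Hypothetical judged probabilities for four mutually exclusive,
# exhaustive hypotheses elicited one at a time (illustrative only).
judged = {"H1": 0.55, "H2": 0.40, "H3": 0.30, "H4": 0.10}

total = sum(judged.values())
violation = total - 1.0   # > 0 indicates subadditivity of the judgments

print(f"sum of judged probabilities = {total:.2f}")
print(f"additivity violation        = {violation:+.2f}")
```

A coherent probability assignment over an exhaustive partition would make `violation` exactly zero; the experiments in the record manipulate attentional load and measure how far it departs from zero.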

  14. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment

    Science.gov (United States)

    Sprenger, Amber M.; Dougherty, Michael R.; Atkins, Sharona M.; Franco-Watkins, Ana M.; Thomas, Rick P.; Lange, Nicholas; Abbs, Brandon

    2011-01-01

    We tested the predictions of HyGene (Thomas et al., 2008) that both divided attention at encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgment made under full attention. The effect of divided attention during encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments. PMID:21734897

  15. Experimental study of sodium droplet burning in free fall. Evaluation of preliminary test results

    International Nuclear Information System (INIS)

    Miyahara, Shinya; Ara, Kuniaki

    1998-08-01

To study a sodium leak and combustion behavior phenomenologically and to construct the mechanistic evaluation method, an experimental series of a sodium droplet burning in free fall is under way. In this study, the accuracy of measurement technique used in the preliminary test was assessed and the modified technique was proposed for the next test series. Analytical study of the test results was also conducted to deduce dominant parameters and important measurement items which would play an important role in the droplet combustion behavior. The results and conclusions are as follows: (1) Assessment of measurement accuracy and modified technique proposed for the next test series. a) Control accuracy of sodium supply system using β-alumina solid electrolyte was sufficient for generation of objective size of single droplet. However, it is necessary to calibrate the correlation between the quantity of electric charge for sodium supply system and that of supplied sodium. b) Measurement accuracy of falling velocity using high-speed video was ±0.33 m/s at an upper part and ±0.48 m/s at a lower part of the measurement. To reduce the error, a high-speed stroboscopic method is recommended to measure the falling velocity of droplet. (2) Results of analytical study and deduced dominant parameters and important measurement items. a) The falling behavior of a burning droplet was described solving the equation of free falling motion for a rigid sphere. In the case of higher falling height, it is necessary to study the burning effects on the falling behavior. b) The mass burned of a falling droplet was calculated using the combustion model according to the 'D²' law during the full burning phase. It is necessary to study the dominant chemical reaction in the burning flame because the mass burned depends on the composition of the reaction products. c) The mass burned was calculated using surface oxidation model for preignition phase together with above model. However, it is
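The 'D²' law the record invokes for the full burning phase says the squared droplet diameter decreases linearly in time, D(t)² = D0² − K·t, so burnout occurs at t = D0²/K. A minimal sketch; the initial diameter and burning-rate constant below are illustrative placeholders, not values from the sodium tests.

```python
import numpy as np

# D-squared law for droplet burning: D(t)^2 = D0^2 - K * t.
D0 = 1.0e-3   # initial droplet diameter, m (illustrative)
K = 1.0e-6    # burning-rate constant, m^2/s (illustrative)

t_burnout = D0**2 / K   # time at which the droplet is fully consumed, s

# Diameter history over the burn; clamp at zero to avoid sqrt of negatives.
t = np.linspace(0.0, t_burnout, 5)
D = np.sqrt(np.maximum(D0**2 - K * t, 0.0))
print(f"burnout time = {t_burnout:.2f} s")
```

The linear-in-D² behavior is why a high-speed diameter history (the measurement the record discusses) suffices to extract K directly from a straight-line fit.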

  16. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
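For the familiar multinomial logit special case of an ARUM, the CPGF is the log-sum-exp function G(u) = log Σ_j exp(u_j), and its gradient is the vector of choice probabilities (the softmax). The sketch below checks that identity numerically; the utility values are arbitrary illustrations.

```python
import numpy as np
from scipy.special import logsumexp, softmax

# Systematic utilities for three alternatives (illustrative values).
u = np.array([1.0, 0.5, -0.2])

G = logsumexp(u)      # CPGF for the multinomial logit ARUM
p = softmax(u)        # choice probabilities = gradient of G at u

# Forward-difference check that p really is the gradient of the CPGF.
eps = 1e-6
grad = np.array([(logsumexp(u + eps * e) - G) / eps for e in np.eye(3)])
print(p, grad)
```

The same gradient property is what the record generalizes: any ARUM's choice probabilities arise as the gradient of some CPGF, with logit corresponding to the extreme value case.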

  17. Preliminary studies on the behavioural effects of the methanol ...

    African Journals Online (AJOL)

    The behavioural tests employed were diazepam-induced sleep onset and duration, hole board assay for exploratory activity, mouse beam walk assay for motor coordination, and the staircase test for the detection of anxiolytic compounds. Preliminary phytochemical screening was also carried out on the extract. Results: The ...

  18. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  19. What probabilities tell about quantum systems, with application to entropy and entanglement

    CERN Document Server

    Myers, John M

    2010-01-01

    The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parametrized probabilities?”
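The probabilities the record discusses are generated from a parameterized density operator and detection operators via the Born rule, p_k(θ) = Tr(ρ(θ) E_k). A one-qubit sketch with a projective computational-basis measurement (the state family and angle are illustrative, not from the paper):

```python
import numpy as np

def rho(theta):
    """Density operator for the pure state cos(theta)|0> + sin(theta)|1>."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(psi, psi.conj())

# Detection operators forming a POVM (here projective): E0 + E1 = I.
E0 = np.diag([1.0, 0.0])
E1 = np.diag([0.0, 1.0])

theta = 0.3
p = [np.trace(rho(theta) @ E).real for E in (E0, E1)]
print(p)   # cos^2(theta) and sin^2(theta)
```

Only the probabilities `p` are compared against experimental tallies; the operators themselves are accessible to test only through such parameterized probability functions, which is the separation the record emphasizes.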

  20. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992

    International Nuclear Information System (INIS)

    1992-12-01

Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume, Volume 2, contains the technical basis for the 1992 PA. Specifically, it describes the conceptual basis for consequence modeling and the PA methodology, including the selection of scenarios for analysis, the determination of scenario probabilities, and the estimation of scenario consequences using a Monte Carlo technique and a linked system of computational models. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with the long-term requirements of the EPA's Environmental Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses related to the preliminary comparison with 40 CFR 191B. Volume 5 contains uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance. Finally, guidance derived from the entire 1992 PA is presented in Volume 6.

  1. Hydrogen Gas Retention and Release from WTP Vessels: Summary of Preliminary Studies

    Energy Technology Data Exchange (ETDEWEB)

    Gauglitz, Phillip A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bontha, Jagannadha R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daniel, Richard C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mahoney, Lenna A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rassat, Scot D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wells, Beric E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bao, Jie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Boeringa, Gregory K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Buchmiller, William C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Burns, Carolyn A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chun, Jaehun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Karri, Naveen K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Li, Huidong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tran, Diana N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    The Hanford Waste Treatment and Immobilization Plant (WTP) is currently being designed and constructed to pretreat and vitrify a large portion of the waste in the 177 underground waste storage tanks at the Hanford Site. A number of technical issues related to the design of the pretreatment facility (PTF) of the WTP have been identified. These issues must be resolved prior to the U.S. Department of Energy (DOE) Office of River Protection (ORP) reaching a decision to proceed with engineering, procurement, and construction activities for the PTF. One of the issues is Technical Issue T1 - Hydrogen Gas Release from Vessels (hereafter referred to as T1). The focus of T1 is identifying controls for hydrogen release and completing any testing required to close the technical issue. In advance of selecting specific controls for hydrogen gas safety, a number of preliminary technical studies were initiated to support anticipated future testing and to improve the understanding of hydrogen gas generation, retention, and release within PTF vessels. These activities supported the development of a plan defining an overall strategy and approach for addressing T1 and achieving technical endpoints identified for T1. Preliminary studies also supported the development of a test plan for conducting testing and analysis to support closing T1. Both of these plans were developed in advance of selecting specific controls, and in the course of working on T1 it was decided that the testing and analysis identified in the test plan were not immediately needed. However, planning activities and preliminary studies led to significant technical progress in a number of areas. This report summarizes the progress to date from the preliminary technical studies. The technical results in this report should not be used for WTP design or safety and hazards analyses and technical results are marked with the following statement: “Preliminary Technical Results for Planning – Not to be used for WTP Design

  2. Preliminary development of an advanced modular pressure relief cushion: Testing and user evaluation.

    Science.gov (United States)

    Freeto, Tyler; Mitchell, Steven J; Bogie, Kath M

    2018-02-01

Effective pressure relief cushions are identified as a core assistive technology need by the World Health Organization Global Cooperation on Assistive Technology. High quality affordable wheelchair cushions could provide effective pressure relief for many individuals with limited access to advanced assistive technology. Value driven engineering (VdE) principles were employed to develop a prototype modular cushion. Low cost dynamically responsive gel balls were arranged in a close packed array and seated in bilayer foam for containment and support. Two modular cushions, one with high compliance balls and one with moderate compliance balls, were compared with High Profile and Low Profile Roho® and Jay® Medical 2 cushions. ISO 16480-2 biomechanical standardized tests were applied to assess cushion performance. A preliminary materials cost analysis was carried out. A prototype modular cushion was evaluated by 12 participants who reported satisfaction using a questionnaire based on the Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST 2.0) instrument. Overall the modular cushions performed better than, or on par with, the most widely prescribed commercially available cushions under ISO 16480-2 testing. Users rated the modular cushion highly for overall appearance, size and dimensions, comfort, safety, stability, ease of adjustment and general ease of use. Cost analysis indicated that every modular cushion component could be replaced several times and still maintain cost-efficacy over the complete cushion lifecycle. A VdE modular cushion has the potential to provide effective pressure relief for many users at a low lifetime cost. Copyright © 2017. Published by Elsevier Ltd.

  3. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  4. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation
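The two-parameter Weibull form underlying such a risk-of-rupture calculation can be sketched as follows; the characteristic strength and Weibull modulus below are hypothetical illustration values, not the ASTM specification properties discussed in the record.

```python
import math

def weibull_fracture_probability(stress_mpa, char_strength_mpa, modulus):
    """Two-parameter Weibull probability of fracture at a given tensile stress."""
    if stress_mpa <= 0.0:
        return 0.0
    return 1.0 - math.exp(-((stress_mpa / char_strength_mpa) ** modulus))

# Hypothetical values for a nuclear-grade graphite (illustration only)
SIGMA_0 = 20.0  # characteristic strength, MPa
M = 10.0        # Weibull modulus

for stress in (5.0, 10.0, 15.0, 20.0):
    pf = weibull_fracture_probability(stress, SIGMA_0, M)
    reliability = 1.0 - pf  # survival probability
    print(f"stress = {stress:5.1f} MPa  P_fracture = {pf:.6f}  reliability = {reliability:.6f}")
```

At the characteristic strength the fracture probability is 1 - exp(-1) ≈ 0.632 by construction; the Weibull modulus controls how sharply risk rises around that stress.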

  5. Growth and profitability in small privately held biotech firms: preliminary findings.

    Science.gov (United States)

    Brännback, Malin; Carsrud, Alan; Renko, Maija; Ostermark, Ralf; Aaltonen, Jaana; Kiviluoto, Niklas

    2009-06-01

    This paper reports preliminary findings from a study of the relationship between growth and profitability among small, privately held Finnish life science firms. Previous research results concerning growth and profitability are mixed, ranging from a strongly positive to a negative relationship. The conventional wisdom states that growth is a prerequisite for profitability. Our results suggest that the reverse is the case. A high profitability-low growth biotech firm is more likely to make the transition to high profitability-high growth than a firm that starts off with low profitability and high growth.

  6. Preliminary Mass Spectrometric Analysis of Uranium on Environmental Swipe Materials

    International Nuclear Information System (INIS)

    Cheong, Chang-Sik; Jeong, Youn-Joong; Ryu, Jong-Sik; Shin, Hyung-Seon; Cha, Hyun-Ju; Ahn, Gil-Hoon; Park, Il-Jin; Min, Gyung-Sik

    2006-01-01

    It is well known that the uranium and plutonium isotopic compositions of safeguards samples are very useful for investigating the history of nuclear activities. To strengthen the capabilities for environmental sampling analysis in the ROK through MOST/DOE collaboration, a round robin test for uranium and plutonium was designed in 2003. As the first round robin test, a set of dried uranium-containing solutions (∼35 ng and ∼300 ng) was distributed to the participating laboratories in November of 2003, with results reported in April of 2004. The KBSI (Korea Basic Science Institute) and ORNL (Oak Ridge National Laboratory) are currently in the process of analyzing uranium on cotton swipes for the second round robin test. As a preliminary test for the second round, KBSI intends to analyze home-made swipe samples into which international uranium standards are added. Here we describe the technical steps of sample preparation and mass spectrometry at KBSI, and report some results of the preliminary test

  7. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  8. Phytochemical Screening and Preliminary Evaluation of Analgesic ...

    African Journals Online (AJOL)

    In this study, the methanolic root extract of Cissus polyantha was subjected to preliminary phytochemical screening, analgesic and anti-inflammatory studies. Phytochemical studies was carried out using standard phytochemical protocol while the analgesic studies was carried out using acetic acid-induced writhing tests in ...

  9. Preliminary safety analysis report for the TFTR

    International Nuclear Information System (INIS)

    Lind, K.E.; Levine, J.D.; Howe, H.J.

    A Preliminary Safety Analysis Report has been prepared for the Tokamak Fusion Test Reactor. No accident scenarios have been identified which would result in exposures to on-site personnel or the general public in excess of the guidelines defined for the project by DOE

  10. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  11. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  12. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  13. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    Science.gov (United States)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

    Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time-consuming and costly. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventories of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction: lowering the driver probe frequency and changing the material's electrical conductivity, followed by discussion and examination of the implications of using simulated data in PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide results similar to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement the inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.
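As background on how PoD curves of this kind are commonly parameterized, the sketch below evaluates a signal-response ("â versus a") PoD model and the crack size detected with 90% probability; all model parameters here are invented for illustration and are not taken from the bolt hole study.

```python
import math
from statistics import NormalDist

# Assumed signal-response model (all parameters invented):
#   ln(signal) = B0 + B1 * ln(crack_size) + scatter,  scatter ~ Normal(0, TAU)
# A crack is called "detected" when ln(signal) exceeds LN_THRESHOLD.
B0, B1, TAU = 0.5, 1.2, 0.4
LN_THRESHOLD = 1.7

def pod(crack_size_mm):
    """Probability of detection for a given crack size under the assumed model."""
    mean_ln_signal = B0 + B1 * math.log(crack_size_mm)
    # P(ln(signal) > threshold) for normally distributed scatter
    return 1.0 - NormalDist(mean_ln_signal, TAU).cdf(LN_THRESHOLD)

def a90():
    """Crack size detected with 90% probability (solves pod(a) = 0.9)."""
    z = NormalDist().inv_cdf(0.90)
    return math.exp((LN_THRESHOLD + z * TAU - B0) / B1)

for a in (1.0, 2.0, 4.0, 8.0):
    print(f"a = {a:.1f} mm  PoD = {pod(a):.3f}")
print(f"a90 = {a90():.2f} mm")
```

Simulated inspection signals, such as those from the boundary element model, would feed the fit of B0, B1 and TAU in place of (or alongside) empirical hit/miss data.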

  14. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0<α<1; w(1/e)=1/e, w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
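The one-parameter Prelec function can be evaluated directly; the α value below is an arbitrary illustrative choice in the empirically typical inverse-S range, not a value from this study.

```python
import math

def prelec_weight(p, alpha=0.65):
    """One-parameter Prelec (1998) weighting: w(p) = exp(-(-ln p)**alpha), 0 < alpha < 1."""
    if p <= 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

# Fixed points and the characteristic inverse-S shape:
# small probabilities are overweighted, large ones underweighted.
for p in (0.01, 0.1, 1 / math.e, 0.5, 0.9, 1.0):
    print(f"p = {p:.3f}  w(p) = {prelec_weight(p):.3f}")
```

The fixed points w(1/e) = 1/e and w(1) = 1 hold for any α, which is what makes this one-parameter family convenient in fitting choice data.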

  15. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. 

  16. Prospective Evaluation of Serum β-Glucan Testing in Patients With Probable or Proven Fungal Diseases

    Science.gov (United States)

    Angebault, Cécile; Lanternier, Fanny; Dalle, Frédéric; Schrimpf, Cécile; Roupie, Anne-Laure; Dupuis, Aurélie; Agathine, Aurélie; Scemla, Anne; Paubelle, Etienne; Caillot, Denis; Neven, Bénédicte; Frange, Pierre; Suarez, Felipe; d'Enfert, Christophe; Lortholary, Olivier; Bougnoux, Marie-Elisabeth

    2016-01-01

    Background. Early diagnosis and treatment are crucial in invasive fungal diseases (IFD). Serum (1-3)-β-d-glucan (BG) is believed to be an early IFD marker, but its diagnostic performance has been ambiguous, with insufficient data regarding sensitivity at the time of IFD diagnosis (TOD) and according to outcome. Whether its clinical utility is equivalent for all types of IFD remains unknown. Methods. We included 143 patients with proven or probable IFD (49 invasive candidiasis, 45 invasive aspergillosis [IA], and 49 rare IFD) and analyzed serum BG (Fungitell) at TOD and during treatment. Results. (1-3)-β-d-glucan was undetectable at TOD in 36% and 48% of patients with candidemia and IA, respectively; there was no correlation between negative BG results at TOD and patients' characteristics, localization of infection, or prior antifungal use. Nevertheless, patients with candidemia due to Candida albicans were more likely to test positive for BG at TOD (odds ratio = 25.4, P = .01) than patients infected with other Candida species. In 70% of the patients with follow-up, BG reversion to negative took >1 month for candidemia and >3 months for IA. A slower BG decrease in patients with candidemia was associated with deep-seated localizations (P = .04). Thirty-nine percent of patients with rare IFD had undetectable BG at TOD; nonetheless, all patients with chronic subcutaneous IFD tested positive at TOD. Conclusions. Undetectable serum BG does not rule out an early IFD when the clinical suspicion is high. After IFD diagnosis, the kinetics of serum BG are difficult to relate to clinical outcome. PMID:27419189

  17. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  18. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on the accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring and adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as representativeness and anchoring and adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning, or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.
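For contrast with the heuristics described above, the normative Bayesian update can be written in odds form: post-test odds = pre-test odds × likelihood ratio. A non-discriminating feature has a likelihood ratio of 1 and should leave the estimate unchanged, which is exactly the pattern the residents violated. A minimal sketch with invented numbers:

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Odds-form Bayes: posterior odds = prior odds * likelihood ratio."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A discriminating finding (LR = 4) raises a 20% prior substantially;
# a non-discriminating prototypical feature (LR = 1) should not move it at all.
print(post_test_probability(0.20, 4.0))  # discriminating finding
print(post_test_probability(0.20, 1.0))  # non-discriminating finding
```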

  19. Comparing a recursive digital filter with the moving-average and sequential probability-ratio detection methods for SNM portal monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1993-01-01

    The author compared a recursive digital filter, proposed as a detection method for French special nuclear material monitors, with the author's own detection methods, which employ a moving-average scaler or a sequential probability-ratio test. Nine test subjects each repeatedly carried a test source through a walk-through portal monitor that had the same nuisance-alarm rate with each method. He found that the average detection probability for the test source is also the same for each method. However, the recursive digital filter may have one drawback: its exponentially decreasing response to past radiation intensity prolongs the impact of any interference from radiation sources or radiation-producing machinery. He also examined the influence of each test subject on the monitor's operation by measuring individual attenuation factors for background and source radiation, then ranked the subjects' attenuation factors against their individual probabilities for detecting the test source. The one inconsistent ranking was probably caused by that subject's unusually long stride when passing through the portal
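The sequential probability-ratio method mentioned in the record is, in essence, a Wald SPRT applied to successive count intervals. The sketch below applies it to Poisson counts; the background and source rates and the error tolerances are hypothetical, not values from the monitor study.

```python
import math

def sprt_poisson(counts, bg_rate, src_rate, alpha=0.01, beta=0.01):
    """Wald SPRT deciding between H0: counts ~ Poisson(bg_rate) and
    H1: counts ~ Poisson(bg_rate + src_rate) per counting interval.
    Returns 'alarm' (accept H1), 'clear' (accept H0), or 'continue'."""
    upper = math.log((1.0 - beta) / alpha)   # cross above -> alarm
    lower = math.log(beta / (1.0 - alpha))   # cross below -> clear
    lam0, lam1 = bg_rate, bg_rate + src_rate
    llr = 0.0
    for n in counts:
        # Poisson log-likelihood ratio contribution of one interval's count n
        llr += n * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "alarm"
        if llr <= lower:
            return "clear"
    return "continue"

# Hypothetical rates: background 10 counts/interval, source adds 15.
print(sprt_poisson([25, 28, 30], 10.0, 15.0))          # source-like counts
print(sprt_poisson([9, 11, 10, 8, 10, 9], 10.0, 15.0))  # background-like counts
```

Unlike the recursive filter's exponential memory, the SPRT resets after each decision, so past interference does not linger into the next test.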

  20. Preliminary Report: Design and Test Result of KSR-3 Rocket Magnetometers

    Directory of Open Access Journals (Sweden)

    Hyo-Min Kim

    2000-12-01

    Full Text Available The solar wind contributes to the formation of a unique space environment called the Earth's magnetosphere by various interactions with the Earth's magnetic field. Thus the solar-terrestrial environment affects the Earth's magnetic field, which can be observed with an instrument for magnetic field measurement, the magnetometer, usually mounted on rockets and satellites and based at ground observatories. The magnetometer is a useful instrument for spacecraft attitude control as well as for Earth's magnetic field measurements for scientific purposes. In this paper, we present the preliminary design and test results of the two onboard magnetometers of KARI's (Korea Aerospace Research Institute) sounding rocket, KSR-3, which will be launched four times during the period of 2001-02. The KSR-3 magnetometers consist of the fluxgate magnetometer, MAG/AIM (Attitude Information Magnetometer), for acquiring the rocket flight attitude information, and of the search-coil magnetometer, MAG/SIM (Scientific Investigation Magnetometer), for the observation of the Earth's magnetic field fluctuations. With the MAG/AIM, the 3-axis attitude information can be acquired by comparison of the resulting dc magnetic vector field with the IGRF (International Geomagnetic Reference Field). The Earth's magnetic field fluctuations ranging from 10 to 1,000 Hz can also be observed with the MAG/SIM measurement.

  1. Augmented Reality Cubes for Cognitive Gaming: Preliminary Usability and Game Experience Testing

    Directory of Open Access Journals (Sweden)

    Costas Boletsis

    2016-03-01

    Full Text Available Early detection is important in dementia care; however, cognitive impairment is still under-recognised and under-diagnosed. Cognitive screening and training are two important preventative treatments, which can lead to early detection of cognitive decline. In this work, the “Cognitive Augmented Reality Cubes” (CogARC) system is presented, i.e. a serious game for cognitive training and screening, utilising an interaction technique based on Augmented Reality and the manipulation of tangible, physical objects (cubes). The game is a collection of cognitive mini-games of a preventative nature and is primarily targeting elderly players (≥60 years old). A preliminary testing was conducted focusing on the game experience that CogARC offers (utilising the In-Game Experience Questionnaire), the usability of the system (using the System Usability Scale), and the specific user observations and remarks, as documented by open, semi-structured interviews. Overall, CogARC demonstrated satisfying positive responses; however, the negative reactions indicated that there are specific problems with aspects of the interaction technique and a number of mini-games. The open interviews shed more light on the specific issues of each mini-game and further interpretation of user interactions. The current study managed to provide interesting insights into the game design elements, integration of Augmented Reality, tangible interaction of the system, and how elderly players perceive and use those interaction components.

  2. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-05-01

    Acoustic telemetry is an important tool for studying the movement patterns, behaviour, and site fidelity of marine organisms; however, its application is challenging in coral reef environments, where complex topography and intense environmental noise interfere with acoustic signals and which have been less studied. It is therefore particularly critical in coral reef telemetry studies to first conduct a long-term range test, a tool that provides information on the variability and periodicity of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs in the central Red Sea. During this range test we determined the effect of the following factors on transmitter detection efficiency: distance from receiver, time of day, depth, wind, current, moon phase and temperature. The experiment showed that biological noise is likely to be responsible for a diel pattern of, on average, twice as many detections during the day as during the night. Biological noise appears to be the most important noise source in coral reefs, overwhelming the effect of wind-driven noise, which is important in other studies. Detection probability is also heavily influenced by the location of the acoustic sensor within the reef structure. Understanding the effect of environmental factors on transmitter detection probability allowed us to design a more effective receiver array for the large-scale tagging study.

  3. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  4. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  5. Measurements of transition probabilities in the range from vacuum ultraviolet to infrared

    International Nuclear Information System (INIS)

    Peraza Fernandez, M.C.

    1992-01-01

    In this thesis we describe the design, testing and calibration of different spectrometers to measure transition probabilities from the vacuum ultraviolet to the infrared spectral region. For the infrared measurements we have designed and built a phase-sensitive detection system, using an InGaAs photodiode as the detector. With this system we have determined the transition probabilities of infrared lines of KrI and XeI, for which no previous measurements were found. In the vacuum ultraviolet spectral region we have designed a 3 m normal incidence monochromator in which we have installed an optical multichannel analyzer. We have verified its accurate operation by obtaining the absorption spectrum of KrI. In the visible region we have obtained the emission spectrum of Al using different spectral sources: a hollow-cathode lamp and an Nd:YAG laser-produced Al plasma. With these spectra we have determined different atomic parameters such as transition probabilities and electron temperatures.(author). 83 refs

  6. Probability of coincidental similarity among the orbits of small bodies - I. Pairing

    Science.gov (United States)

    Jopek, Tadeusz Jan; Bronikowska, Małgorzata

    2017-09-01

    The probability of coincidental clustering among orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters and the size of the identified group; it is different for groups of 2, 3, 4, … members. The probability of coincidental clustering is assessed by numerical simulation; therefore, it also depends on the method used to generate the synthetic orbits. We have tested the impact of some of these factors. For a given size of the orbital sample we have assessed the probability of random pairing among several orbital populations of different sizes. We have found how these probabilities vary with the size of the orbital samples. Finally, keeping the size of the orbital sample fixed, we have shown that the probability of random pairing can be significantly different for orbital samples obtained by different observation techniques. Also, for the user's convenience, we have obtained several formulae which, for a given size of the orbital sample, can be used to calculate the similarity threshold corresponding to a small value of the probability of coincidental similarity between two orbits.
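The simulation idea can be illustrated with a deliberately simplified stand-in: synthetic "orbits" reduced to uniform scalars, with a fixed similarity threshold in place of a real orbital D-criterion. The estimate shows the birthday-problem growth of chance pairing with sample size; all numbers are illustrative.

```python
import random

def prob_coincidental_pair(sample_size, threshold, trials=2000, seed=1):
    """Monte Carlo estimate of the probability that a synthetic sample contains
    at least one pair of 'orbits' (uniform scalars here) closer than threshold."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = sorted(rng.random() for _ in range(sample_size))
        # a pair exists iff some adjacent gap in the sorted sample is small
        if any(b - a < threshold for a, b in zip(xs, xs[1:])):
            hits += 1
    return hits / trials

# The chance of a spurious pair grows quickly with sample size.
for n in (10, 50, 200):
    print(n, prob_coincidental_pair(n, 0.001))
```

In the real problem the scalar distance would be replaced by an orbital similarity function and the uniform draws by synthetic orbits matching the observed population, but the dependence on sample size carries over.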

  7. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets

  8. Everyday episodic memory in amnestic mild cognitive impairment: a preliminary investigation.

    Science.gov (United States)

    Irish, Muireann; Lawlor, Brian A; Coen, Robert F; O'Mara, Shane M

    2011-08-04

    Decline in episodic memory is one of the hallmark features of Alzheimer's disease (AD) and is also a defining feature of amnestic Mild Cognitive Impairment (MCI), which is posited as a potential prodrome of AD. While deficits in episodic memory are well documented in MCI, the nature of this impairment remains relatively under-researched, particularly for those domains with direct relevance and meaning for the patient's daily life. In order to fully explore the impact of disruption to the episodic memory system on everyday memory in MCI, we examined participants' episodic memory capacity using a battery of experimental tasks with real-world relevance. We investigated episodic acquisition and delayed recall (story-memory), associative memory (face-name pairings), spatial memory (route learning and recall), and memory for everyday mundane events in 16 amnestic MCI and 18 control participants. Furthermore, we followed MCI participants longitudinally to gain preliminary evidence regarding the possible predictive efficacy of these real-world episodic memory tasks for subsequent conversion to AD. The most discriminating tests at baseline were measures of acquisition, delayed recall, and associative memory, followed by everyday memory, and spatial memory tasks, with MCI patients scoring significantly lower than controls. At follow-up (mean time elapsed: 22.4 months), 6 MCI cases had progressed to clinically probable AD. Exploratory logistic regression analyses revealed that delayed associative memory performance at baseline was a potential predictor of subsequent conversion to AD. As a preliminary study, our findings suggest that simple associative memory paradigms with real-world relevance represent an important line of enquiry in future longitudinal studies charting MCI progression over time.

  9. Everyday episodic memory in amnestic mild cognitive impairment: a preliminary investigation

    Directory of Open Access Journals (Sweden)

    Lawlor Brian A

    2011-08-01

    Full Text Available Abstract Background Decline in episodic memory is one of the hallmark features of Alzheimer's disease (AD) and is also a defining feature of amnestic Mild Cognitive Impairment (MCI), which is posited as a potential prodrome of AD. While deficits in episodic memory are well documented in MCI, the nature of this impairment remains relatively under-researched, particularly for those domains with direct relevance and meaning for the patient's daily life. In order to fully explore the impact of disruption to the episodic memory system on everyday memory in MCI, we examined participants' episodic memory capacity using a battery of experimental tasks with real-world relevance. We investigated episodic acquisition and delayed recall (story-memory), associative memory (face-name pairings), spatial memory (route learning and recall), and memory for everyday mundane events in 16 amnestic MCI and 18 control participants. Furthermore, we followed MCI participants longitudinally to gain preliminary evidence regarding the possible predictive efficacy of these real-world episodic memory tasks for subsequent conversion to AD. Results The most discriminating tests at baseline were measures of acquisition, delayed recall, and associative memory, followed by everyday memory, and spatial memory tasks, with MCI patients scoring significantly lower than controls. At follow-up (mean time elapsed: 22.4 months), 6 MCI cases had progressed to clinically probable AD. Exploratory logistic regression analyses revealed that delayed associative memory performance at baseline was a potential predictor of subsequent conversion to AD. Conclusions As a preliminary study, our findings suggest that simple associative memory paradigms with real-world relevance represent an important line of enquiry in future longitudinal studies charting MCI progression over time.

  10. Everyday episodic memory in amnestic Mild Cognitive Impairment: a preliminary investigation

    LENUS (Irish Health Repository)

    Irish, Muireann

    2011-08-04

    Abstract Background Decline in episodic memory is one of the hallmark features of Alzheimer's disease (AD) and is also a defining feature of amnestic Mild Cognitive Impairment (MCI), which is posited as a potential prodrome of AD. While deficits in episodic memory are well documented in MCI, the nature of this impairment remains relatively under-researched, particularly for those domains with direct relevance and meaning for the patient's daily life. In order to fully explore the impact of disruption to the episodic memory system on everyday memory in MCI, we examined participants' episodic memory capacity using a battery of experimental tasks with real-world relevance. We investigated episodic acquisition and delayed recall (story-memory), associative memory (face-name pairings), spatial memory (route learning and recall), and memory for everyday mundane events in 16 amnestic MCI and 18 control participants. Furthermore, we followed MCI participants longitudinally to gain preliminary evidence regarding the possible predictive efficacy of these real-world episodic memory tasks for subsequent conversion to AD. Results The most discriminating tests at baseline were measures of acquisition, delayed recall, and associative memory, followed by everyday memory, and spatial memory tasks, with MCI patients scoring significantly lower than controls. At follow-up (mean time elapsed: 22.4 months), 6 MCI cases had progressed to clinically probable AD. Exploratory logistic regression analyses revealed that delayed associative memory performance at baseline was a potential predictor of subsequent conversion to AD. Conclusions As a preliminary study, our findings suggest that simple associative memory paradigms with real-world relevance represent an important line of enquiry in future longitudinal studies charting MCI progression over time.

  11. Cyclonic valve test: preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Monteiro, Andre Sampaio; Moraes, Carlos Alberto C.; Marins, Luiz Philipe M.; Soares, Fabricio; Oliveira, Dennis; Lima, Fabio Soares de; Airao, Vinicius [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil); Ton, Tijmen [Twister BV, Rijswijk (Netherlands)

    2012-07-01

    For many years, the petroleum industry has sought a valve that imparts less shear to the flow for a given required pressure drop, which can be achieved using the cyclonic concept. This paper presents a comparison between the performance of a cyclonic (low-shear) valve and a conventional globe valve. The aim of this work is to show the advantages of using a cyclonic low-shear valve instead of the valves commonly used in PETROBRAS's primary separation process. Tests were performed at the PETROBRAS Experimental Center (NUEX) in Aracaju/SE, varying several parameters: water cut; pressure loss (from 4 kgf/cm2 to 10 kgf/cm2); and flow rate (30 m3/h and 45 m3/h). Results indicate better performance of the cyclonic valve compared with a conventional one, and show that the difference in performance is a function of several parameters (emulsion stability, free water content, and oil properties). The cyclonic valve tested can be applied as a choke valve, as a valve between separation stages (for pressure drop), or for controlling the level of vessels. We must emphasize the importance of avoiding the high shear imposed by conventional valves, because once an emulsion is created, it becomes more difficult to break. New tests are planned for 2012, and PETROBRAS is also analyzing real cases where such applications could increase primary-process efficiency. Likewise, future installations are being designed with the cyclonic valve in mind. (author)

  12. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving...
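
The two-step structure described in this record (a geometric analysis counting collision candidates, then a causation factor for the navigator's failure to resolve the critical situation) can be sketched in a few lines. The numbers used below are illustrative assumptions, not results from the MS Dextra study.

```python
import math

def collision_rate(candidates_per_year, causation_factor):
    """Two-step model: expected collisions per year =
    (geometric collision candidates per year, i.e. ships on collision
    course if no evasive action were taken) * (causation factor, the
    probability that the navigators fail to resolve the situation)."""
    return candidates_per_year * causation_factor

def prob_at_least_one(rate_per_year, years):
    """Poisson approximation for rare, independent collision events."""
    return 1.0 - math.exp(-rate_per_year * years)

# Illustrative numbers (assumed, not from the study):
lam = collision_rate(candidates_per_year=50.0, causation_factor=2e-4)
print(lam, round(prob_at_least_one(lam, 20), 5))
```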

  13. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. This retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. A further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  14. Improved Membership Probability for Moving Groups: Bayesian and Machine Learning Approaches

    Science.gov (United States)

    Lee, Jinhee; Song, Inseok

    2018-01-01

    Gravitationally unbound loose stellar associations (i.e., young nearby moving groups; hereafter moving groups) have been intensively explored because they are important in planet and disk formation studies, exoplanet imaging, and age calibration. Among the many efforts devoted to the search for moving group members, a Bayesian approach (e.g., using the code BANYAN) has become popular recently because of the many advantages it offers. However, the resultant membership probability needs to be adopted carefully because of its sensitive dependence on input models. In this study, we have developed an improved membership calculation tool focusing on the beta-Pic moving group. We made three improvements to the models used in BANYAN II: (1) updating the list of accepted members by re-assessing memberships in terms of position, motion, and age; (2) investigating member distribution functions in XYZ; and (3) exploring field star distribution functions in XYZUVW. Our improved tool can change membership probabilities by up to 70%. Membership probability is critical and must be better defined: for example, our code identifies only one third of the candidate members in SIMBAD that are believed to be kinematically associated with the beta-Pic moving group. Additionally, we performed cluster analysis of young nearby stars using an unsupervised machine learning approach. As more moving groups and their members are identified, the complexity and ambiguity of moving group configurations have increased. To clarify this issue, we analyzed ~4,000 X-ray bright young stellar candidates; here, we present the preliminary results. By re-identifying moving groups with the least human intervention, we expect to understand the composition of the solar neighborhood. Moreover, better defined moving group membership will help us understand star formation and evolution in relatively low density environments, especially for the low-mass stars that will be identified in the coming Gaia release.

  15. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that the expected effect of ASMC in increasing the maximum discharge diminishes as the probability increases. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small at any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed was carried out, with the aim of testing whether rational runoff coefficient tables for the rational method can be arranged in advance, together with a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil.
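
The first component of the GABS model, Green-Ampt infiltration, can be sketched as follows: the implicit Green-Ampt equation is solved by fixed-point iteration, and the soil moisture deficit carries the antecedent-condition effect the abstract discusses. The soil parameters below are illustrative values, not those calibrated for the Sicilian watershed.

```python
import math

def green_ampt_cumulative(t, K, psi, dtheta, tol=1e-10):
    """Cumulative infiltration F(t) [cm] from the implicit Green-Ampt
    equation  F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)),
    solved by fixed-point iteration (the map is a contraction).
    K: saturated hydraulic conductivity [cm/h]
    psi: wetting-front suction head [cm]
    dtheta: soil moisture deficit (set by the antecedent condition)."""
    s = psi * dtheta
    F = K * t if K * t > 0 else 1e-9   # starting guess
    for _ in range(200):
        F_new = K * t + s * math.log(1.0 + F / s)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

def green_ampt_rate(F, K, psi, dtheta):
    """Infiltration capacity f = K * (1 + psi*dtheta/F)."""
    return K * (1.0 + psi * dtheta / F)

# Illustrative loamy-sand-like parameters (assumed, not from the paper):
K, psi = 1.0, 6.0            # cm/h, cm
for dtheta in (0.10, 0.30):  # wetter vs drier antecedent condition
    F = green_ampt_cumulative(2.0, K, psi, dtheta)
    f = green_ampt_rate(F, K, psi, dtheta)
    print(dtheta, round(F, 3), round(f, 3))
```

A drier antecedent condition (larger deficit) yields more cumulative infiltration and a higher infiltration capacity, which is the mechanism behind the ASMC sensitivity studied in the paper.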

  16. Estimation of the defect detection probability for ultrasonic tests on thick sections steel weldments. Technical report

    International Nuclear Information System (INIS)

    Johnson, D.P.; Toomay, T.L.; Davis, C.S.

    1979-02-01

    An inspection uncertainty analysis of published PVRC Specimen 201 data is reported to obtain an estimate of the probability of recording an indication as a function of imperfection height for ASME Section XI Code ultrasonic inspections of nuclear reactor vessel plate seams, and to demonstrate the advantages of inspection uncertainty analysis over conventional detection/nondetection counting analysis. This analysis found the probability of recording a significant defect with an ASME Section XI Code ultrasonic inspection to be very high, should such a defect exist in the plate seams of a nuclear reactor vessel. For a one-inch-high crack, for example, the analysis gives a best-estimate recording probability of 0.985 and a 90% lower-confidence-bound recording probability of 0.937. It is also shown that inspection uncertainty analysis gives more accurate estimates, over a much greater flaw size range, than is possible with conventional analysis. There is reason to believe that the estimation procedure used is conservative: the estimation is based on data generated several years ago, on very small defects, in an environment that differs from the actual in-service inspection environment.
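
The pairing of a best-estimate recording probability with a 90% lower confidence bound, as quoted above, can be illustrated with binomial hit/miss data per flaw-height bin. The sketch below uses the Wilson score bound and hypothetical counts; it is not the PVRC Specimen 201 data nor the paper's actual estimator.

```python
import math

def wilson_lower(successes, n, z=1.2816):
    """One-sided lower confidence bound for a binomial proportion
    (Wilson score interval; z = 1.2816 gives a 90% one-sided bound)."""
    if n == 0:
        return 0.0
    p = successes / n
    denom = 1.0 + z * z / n
    centre = p + z * z / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (centre - half) / denom

# Hypothetical hit/miss counts per flaw-height bin (illustration only):
bins = {0.25: (12, 20), 0.50: (26, 30), 1.00: (39, 40)}  # height [in]: (hits, trials)
for h, (hits, n) in sorted(bins.items()):
    p_hat = hits / n
    # best estimate alongside its 90% lower bound, per bin
    print(h, round(p_hat, 3), round(wilson_lower(hits, n), 3))
```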

  17. Snell Envelope with Small Probability Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Del Moral, Pierre, E-mail: Pierre.Del-Moral@inria.fr; Hu, Peng, E-mail: Peng.Hu@inria.fr [Universite de Bordeaux I, Centre INRIA Bordeaux et Sud-Ouest and Institut de Mathematiques de Bordeaux (France); Oudjane, Nadia, E-mail: Nadia.Oudjane@edf.fr [EDF R and D Clamart (France)

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to be optimized is associated with a small probability or a rare event. This new approach combines the stochastic mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, numerical tests confirm the practical interest of this approach.
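
For readers unfamiliar with the object being approximated: the Snell envelope is the value process of an optimal stopping problem, computable exactly by backward induction on a small tree. The sketch below is that textbook recursion for an American-style put on a binomial tree; it is NOT the stochastic-mesh/particle scheme of the paper, which targets high-dimensional or rare-event settings where such exact recursions are unavailable.

```python
def snell_envelope_binomial(S0, K_strike, u, d, p, n_steps):
    """Snell envelope U_k = max(Z_k, E[U_{k+1} | F_k]) for the payoff
    Z = max(K - S, 0) on a recombining binomial tree with up factor u,
    down factor d, and up probability p. Returns U_0, the value of the
    optimal stopping problem at time 0 (no discounting, for simplicity)."""
    # terminal payoffs
    prices = [S0 * (u ** j) * (d ** (n_steps - j)) for j in range(n_steps + 1)]
    U = [max(K_strike - s, 0.0) for s in prices]
    # backward induction: at each node, the larger of stopping now
    # and the expected envelope one step ahead
    for k in range(n_steps - 1, -1, -1):
        prices = [S0 * (u ** j) * (d ** (k - j)) for j in range(k + 1)]
        cont = [p * U[j + 1] + (1 - p) * U[j] for j in range(k + 1)]
        U = [max(max(K_strike - s, 0.0), c) for s, c in zip(prices, cont)]
    return U[0]

v = snell_envelope_binomial(S0=100.0, K_strike=100.0, u=1.1, d=0.9, p=0.5,
                            n_steps=50)
print(round(v, 4))
```

A longer horizon can only increase the value, since any stopping rule for the shorter problem remains feasible; this monotonicity is a quick sanity check on the recursion.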

  18. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  19. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
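
The headline effect, that parameter uncertainty inflates the expected failure frequency above its nominal level, is easy to reproduce by simulation. The sketch below assumes a Normal risk factor (a location-scale family, as in the paper's setting) with the threshold set at the estimated (1 - eps) quantile; the true parameters are visible only to the simulation, not to the decision-maker.

```python
import random
from statistics import NormalDist, mean, stdev

def realized_failure_probability(n_data, eps, n_trials=20000, seed=7):
    """Expected frequency of failure when the threshold is set at the
    estimated (1 - eps) quantile of a Normal risk factor fitted to
    n_data observations. True parameters are mu=0, sigma=1, but the
    decision-maker only sees the sample."""
    rng = random.Random(seed)
    z = NormalDist().inv_cdf(1.0 - eps)
    failures = 0
    for _ in range(n_trials):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n_data)]
        threshold = mean(sample) + z * stdev(sample)  # estimated quantile
        if rng.gauss(0.0, 1.0) > threshold:           # next realization
            failures += 1
    return failures / n_trials

nominal = 0.05
r = realized_failure_probability(20, nominal)
# Parameter uncertainty pushes the realized frequency above the nominal 5%.
print(nominal, r)
```

Note that, as the abstract states, the realized probability here depends only on the sample size and the nominal level, not on the unknown true parameters, since the threshold is an affine function of the sufficient statistics.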

  20. Combination of probabilities in looking for cosmic ray sources

    International Nuclear Information System (INIS)

    Goodman, M.

    1991-08-01

    The use of small chance probabilities as evidence for sources of cosmic rays is examined, with particular emphasis on the issues involved when combining results from two experiments, two analyses, or two independent tests of the same data. Examples are given in which different methods of combining results should be used.
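
One standard way to combine chance probabilities from two independent experiments or tests is Fisher's method, sketched below using the closed-form chi-square tail for an even number of degrees of freedom. (The abstract's point is that the appropriate combination method depends on the situation; this is just one of the candidates.)

```python
import math

def fisher_combine(p_values):
    """Fisher's method for combining independent p-values:
    X = -2 * sum(ln p_i) is chi-square with 2k degrees of freedom.
    For even degrees of freedom the survival function has the closed
    form  P(X > x) = exp(-x/2) * sum_{i<k} (x/2)^i / i! ."""
    k = len(p_values)
    x = -2.0 * sum(math.log(p) for p in p_values)
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

# Two experiments each reporting a marginal chance probability:
print(round(fisher_combine([0.02, 0.03]), 5))  # about 0.005
```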

  1. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan Cort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  2. Test plan for long-term, low-temperature oxidation of spent fuel, Series 1

    International Nuclear Information System (INIS)

    Einziger, R.E.

    1986-06-01

    Preliminary studies indicated the need for more spent fuel oxidation data in order to determine the probable behavior of spent fuel in a tuff repository. Long-term, low-temperature testing was recommended in a comprehensive technical approach to: (1) confirm the findings of the short-term thermogravimetric analysis scoping experiments; (2) evaluate the effects of variables such as burnup, atmospheric moisture and fuel type on the oxidation rate; and (3) extend the oxidation data base to representative repository temperatures and better define the temperature dependence of the operative oxidation mechanisms. This document presents the Series 1 test plan to study, on a large number of samples, the effects of atmospheric moisture and temperature on oxidation rate and phase formation. Tests will run for up to two years, use characterized fragmented and pulverized fuel samples, cover a temperature range of 110 °C to 175 °C, and be conducted with an atmospheric moisture content ranging from a 0 °C to an approximately 80 °C dew point. After testing, the samples will be examined and made available for leach testing.

  3. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-01-01

    of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs

  4. Probability of acoustic transmitter detections by receiver lines in Lake Huron: results of multi-year field tests and simulations

    Science.gov (United States)

    Hayden, Todd A.; Holbrook, Christopher M.; Binder, Thomas; Dettmers, John M.; Cooke, Steven J.; Vandergoot, Christopher S.; Krueger, Charles C.

    2016-01-01

    Background: Advances in acoustic telemetry technology have led to an improved understanding of the spatial ecology of many freshwater and marine fish species. Understanding the performance of acoustic receivers is necessary to distinguish between tagged fish that may have been present but not detected and those fish that were absent from the area. In this study, two stationary acoustic transmitters were deployed 250 m apart within each of four acoustic receiver lines each containing at least 10 receivers (i.e., eight acoustic transmitters) located in Saginaw Bay and central Lake Huron for nearly 2 years to determine whether the probability of detecting an acoustic transmission varied as a function of time (i.e., season), location, and distance between acoustic transmitter and receiver. Distances between acoustic transmitters and receivers ranged from 200 m to >10 km in each line. The daily observed probability of detecting an acoustic transmission was used in simulation models to estimate the probability of detecting a moving acoustic transmitter on a line of receivers. Results: The probability of detecting an acoustic transmitter on a receiver 1000 m away differed by month for different receiver lines in Lake Huron and Saginaw Bay but was similar for paired acoustic transmitters deployed 250 m apart within the same line. Mean probability of detecting an acoustic transmitter at 1000 m calculated over the study period varied among acoustic transmitters 250 m apart within a line and differed among receiver lines in Lake Huron and Saginaw Bay. The simulated probability of detecting a moving acoustic transmitter on a receiver line was characterized by short periods of time with decreased detection. Although increased receiver spacing and higher fish movement rates decreased simulated detection probability, the location of the simulated receiver line in Lake Huron had the strongest effect on simulated detection probability. Conclusions: Performance of receiver
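
The simulation described in this abstract can be caricatured in a few lines: a transmitter moves perpendicularly through a line of receivers, emitting at fixed intervals, with a per-transmission detection probability that decays with range. The logistic range curve, spacing, and speeds below are assumptions for illustration, not values fitted to the Lake Huron data.

```python
import math
import random

def detection_prob(distance, midpoint=1000.0, scale=150.0):
    """Hypothetical logistic range curve: probability that one
    transmission at `distance` metres is detected (midpoint is the
    50% detection range; not fitted to the study's data)."""
    return 1.0 / (1.0 + math.exp((distance - midpoint) / scale))

def prob_fish_detected(speed_m_s, spacing_m, delay_s=60.0,
                       half_width_m=3000.0, n_trials=2000, seed=3):
    """Probability that a transmitter crossing a receiver line
    perpendicularly (receivers spacing_m apart) is detected at least
    once before leaving the monitored band."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        x = rng.uniform(0.0, spacing_m)  # crossing point along the line
        detected = False
        y = -half_width_m
        while y < half_width_m and not detected:
            # check the two nearest receivers on the line
            for rx in (x, x - spacing_m):
                d = math.hypot(rx, y)
                if rng.random() < detection_prob(d):
                    detected = True
                    break
            y += speed_m_s * delay_s  # position at the next transmission
        if detected:
            hits += 1
    return hits / n_trials

slow = prob_fish_detected(0.3, 3000.0)
fast = prob_fish_detected(1.2, 3000.0)
# Faster movement means fewer transmissions within range, so lower
# line-level detection probability, matching the reported trend.
print(slow, fast)
```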

  5. Estimated probability of stroke among medical outpatients in Enugu ...

    African Journals Online (AJOL)

    Risk factors for stroke were evaluated using a series of laboratory tests, medical history and physical examinations. The 10‑year probability of stroke was determined by applying the Framingham stroke risk equation. Statistical analysis was performed with the use of the SPSS 17.0 software package (SPSS Inc., Chicago, IL, ...

  6. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  7. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  8. C.O.D. toughness testing of medium strength steel as a preliminary development for single specimen J integral toughness tests of SA533-B steel

    International Nuclear Information System (INIS)

    Dean, P.; Tait, R.B.; Garrett, G.G.

    1981-10-01

    The primary purpose of this project is to set up a test facility and to develop the necessary expertise to enable reliable elasto-plastic fracture toughness tests to be performed. Initially, tests are to be conducted on material similar to that used in the Koeberg pressure vessel walls, with the ultimate goal of performing single specimen J integral tests on the pressure vessel steel itself to determine through-thickness toughness variations. The project comprises a number of stages, each one necessary for the development of the techniques used in J integral testing. These include: (i) development of an appropriate specimen design, of suitable size and shape, applicable to both crack opening displacement (C.O.D.) and J integral tests; (ii) development, testing and calibration of the necessary associated mechanical and electrical equipment (e.g. clip gauge, amplifiers, interface unit, etc.), with (iii) an estimation of the probable errors and noise levels with a view to their elimination, leading to (iv) perfection of the sensitivity and reproducibility of, firstly, the multiple specimen C.O.D. technique and, secondly, the multiple specimen J integral technique; and (v) based on the above techniques, development of the single specimen J integral test method, incorporating a computerised testing procedure. All of the above procedures are to be conducted on similar, but non-Koeberg, pressure vessel material ('ROQ Tough'). (vi) Finally, development and testing of both multiple specimen and single specimen J integral tests on actual SA533B material, and an investigation of the through-thickness toughness and fatigue crack propagation behaviour

  9. A preliminary evaluation of influence of body mass index on in vitro fertilization outcome in non-obese endometriosis patients.

    Science.gov (United States)

    Garalejic, Eliana; Arsic, Biljana; Radakovic, Jovana; Bojovic Jovic, Dragana; Lekic, Dragana; Macanovic, Biljana; Soldatovic, Ivan; Perovic, Milan

    2017-11-16

    Obese and overweight women experience a lower probability of pregnancy after IVF. However, despite the increasing prevalence of obesity, the large majority of infertile women are non-obese. One of the most common indications for IVF is endometriosis. A thought-provoking inverse correlation has been established between BMI and endometriosis: lower BMI is a risk factor for the development of endometriosis and a predictive factor for severe endometriosis. Since severe endometriosis carries lower reproductive chances, even after IVF, we preliminarily tested the hypothesis that higher BMI among non-obese endometriosis patients improves IVF outcomes. A preliminary retrospective observational cross-sectional study was performed in women with endometriosis as the sole cause of infertility who underwent IVF. During the analyzed period we performed 2782 IVF procedures. In order to achieve a highly homogeneous study sample and to eliminate almost all confounding factors that could lead to bias, we implemented strict study criteria. The number of eligible subjects was 156, and they were divided into underweight, normal-weight and overweight groups. Primary outcomes were the number of retrieved oocytes, good-quality oocytes and embryos, and the rates of biochemical, clinical and ongoing pregnancies. For group comparisons we used a parametric test (analysis of variance) and non-parametric tests (Kruskal-Wallis test, Chi-square test). Logistic regression and a general linear model were used to assess the correlation between BMI and the dependent variables (outcome and stimulation duration) when adjusted for age. Endometriosis as a single infertility factor among IVF couples had a prevalence of 5.61%. Underweight women accounted for 10.26%, normal-weight for 71.15% and overweight for 18.59% of the study population. Significant differences were not found in the number of retrieved oocytes (p = 0.880), good-quality oocytes (p = 0.476), obtained embryos (p = 0.706), or biochemical (p = 0.298), clinical (p = 0.770) and ongoing (p = 0

  10. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  11. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  12. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.

  14. Danish Anaesthesia Allergy Centre - preliminary results

    DEFF Research Database (Denmark)

    Garvey, L H; Roed-Petersen, J; Menné, T

    2001-01-01

    BACKGROUND: Anaphylactoid reactions in anaesthesia are rare and should ideally be investigated in specialist centres. At Gentofte University Hospital, we established such a centre in 1998 as a joint venture between the Departments of Anaesthesiology and Dermatology. We present the methodology...... of in vitro testing and skin testing. Blood samples for tryptase analysis are taken at the time of reaction and a control sample is taken together with samples for specific IgE analysis 2-4 weeks after the reaction. Subsequent skin testing comprises both prick tests and intradermal tests in most cases...... for chlorhexidine. Only one patient has tested positive to a neuromuscular blocking drug (NMBD) so far. DISCUSSION: Our preliminary results appear to differ in two ways from results usually found in this field. Firstly, only one patient has tested positive for a NMBD and secondly, we have had four patients...

  15. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically have only the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
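    The Poisson-Gamma conjugate update described above can be sketched in a few lines. The prior parameters and event counts below are hypothetical stand-ins, not the paper's data; the posterior-predictive formula follows from marginalizing the Poisson rate over its Gamma posterior.

```python
import math

def posterior_rate(alpha, beta, n_events, t_obs):
    """Conjugate update: Gamma(alpha, beta) prior on the Poisson rate
    lambda, with n_events observed over t_obs years."""
    a = alpha + n_events
    b = beta + t_obs
    return a, b  # posterior is Gamma(a, b); posterior mean = a / b

def prob_at_least_one(a, b, t_future):
    """Posterior-predictive probability of >= 1 landslide in t_future
    years (negative-binomial predictive from the Gamma posterior)."""
    return 1.0 - (b / (b + t_future)) ** a

# Hypothetical record: 4 dated landslides over a 10,000-year interval,
# with a vague Gamma(0.5, ~0) prior (tiny beta keeps it proper).
a, b = posterior_rate(0.5, 1e-6, 4, 10_000.0)
mean_rate = a / b
p100 = prob_at_least_one(a, b, 100.0)  # chance of an event in 100 yr
```

With these invented numbers the posterior mean rate is about 4.5 events per 10,000 years, and the 100-year event probability comes out a few percent.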

  16. Application of miniaturized disk bend test technique for selection of optimum composition of candidate materials for fusion reactors

    International Nuclear Information System (INIS)

    Tsepelev, A.B.; Poymenov, I.L.

    1992-01-01

    An analysis of the potential of the miniaturized disk bend test (MDBT) technique for estimating the mechanical-property behaviour of irradiated steels indicates promise in selecting candidate materials for nuclear applications. The advantages of the method are most clearly demonstrated when a large series of tests is needed. The tiny specimen size gives an additional advantage from the point of view of radiation materials science. As an example of the MDBT potential, preliminary results of electron irradiation effects on Cr-Mn-W austenitic and Cr-W ferritic carbon and nitrogen steels are presented. It is shown that electron irradiation changes the shape of the MDBT loading curves of the steels, most probably in connection with radiation-induced structure-phase transformations in the steels. (orig.)

  17. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P (sub c)) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D P (sub c)” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D P (sub c)” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R (sub c). For close-proximity satellites, such as those orbiting in formations or clusters, R (sub c) variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R (sub c) analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D P (sub c)” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P (sub c) less than 10 (sup -10)), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P (sub c) greater than or equal to 10 (sup -5)). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P (sub c) screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
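    For contrast with the 3D formulation, the traditional 2D P(sub c) reduces to integrating the relative-position Gaussian over a hard-body circle in the encounter plane. A minimal numerical sketch, assuming a diagonal combined covariance; the function name and inputs are illustrative, not CARA's API:

```python
import math

def pc_2d(miss_x, miss_y, sx, sy, radius, n=400):
    """Approximate 2D collision probability: integral of a Gaussian
    (mean = miss distance, diagonal covariance sx, sy) over a circle
    of hard-body radius `radius`. Midpoint rule in x; the y-integral
    over the chord is done analytically with erf."""
    h = 2.0 * radius / n
    total = 0.0
    for i in range(n):
        x = -radius + (i + 0.5) * h
        half = math.sqrt(radius * radius - x * x)  # chord half-length
        gx = math.exp(-0.5 * ((x - miss_x) / sx) ** 2) / (sx * math.sqrt(2 * math.pi))
        fy = 0.5 * (math.erf((half - miss_y) / (sy * math.sqrt(2)))
                    - math.erf((-half - miss_y) / (sy * math.sqrt(2))))
        total += gx * fy * h
    return total

# Zero miss distance, 1 km sigmas, 10 m hard-body radius:
p_head_on = pc_2d(0.0, 0.0, 1000.0, 1000.0, 10.0)
```

When the covariance is much larger than the hard body, the result is close to the peak Gaussian density times the circle area, which is a useful sanity check for the integral.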

  18. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  19. Probability encoding of hydrologic parameters for basalt: Elicitation of expert opinions from a panel of five consulting hydrologists

    International Nuclear Information System (INIS)

    Davis, J.D.

    1984-01-01

    The Columbia River Basalts underlying the Hanford Site in Washington State are being considered as a possible location for a geologic repository for high-level nuclear waste. To investigate the feasibility of a repository at this site, the hydrologic parameters of the site must be evaluated. Among hydrologic parameters of particular interest are the effective porosity of the Cohassett flow top and flow interior and the vertical-to-horizontal hydraulic conductivity, or anisotropy ratio, of the Cohassett flow interior. Site-specific data for these hydrologic parameters are currently inadequate. To obtain credible, auditable, and independently derived estimates of the specified hydrologic parameters for the purpose of preliminary assessment of candidate repository performance, a panel of five nationally recognized hydrologists was assembled. Their expert judgments were quantified during two rounds of the Delphi process by means of a probability encoding method developed to estimate the probability distributions of the selected hydrologic variables. 210 refs., 12 figs., 5 tabs

  20. A soft wearable robot for the shoulder: Design, characterization, and preliminary testing.

    Science.gov (United States)

    O'Neill, Ciaran T; Phipps, Nathan S; Cappello, Leonardo; Paganoni, Sabrina; Walsh, Conor J

    2017-07-01

    In this paper, we present a soft wearable robot for the shoulder which has the potential to assist individuals suffering from a range of neuromuscular conditions affecting the shoulder to perform activities of daily living. This wearable robot combines two types of soft textile pneumatic actuators which were custom developed for this particular application to support the upper arm through shoulder abduction and horizontal flexion/extension. The advantage of a textile-based approach is that the robot can be lightweight, low-profile, comfortable and non-restrictive to the wearer, and easy to don like an item of clothing. The actuator's ability to fold flat when not in use allows the robot to be almost invisible under clothing, potentially allowing the user to avoid any stigma associated with using assistive devices in public. To abduct the arm, a textile-based pneumatic actuator was developed to fit within the axilla to push the arm upwards, while a pair of smaller actuators pivot the abduction actuator to allow for horizontal extension and flexion. The individual textile actuators were experimentally evaluated before being integrated into a wearable garment. Human subject testing was performed to evaluate the ability of the robot to assist the arm by monitoring changes in biological muscle activity when comparing the robot powered on and off. Preliminary results show large reductions in muscular effort in targeted muscles, demonstrating the feasibility and promise of such a soft wearable robot for the shoulder.

  1. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
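    As a toy illustration of the maximum-entropy assignment under a mean constraint (not taken from the paper): maximizing entropy subject to a fixed average of some quantity yields the Gibbs form p_i ∝ exp(-λ e_i), with the multiplier λ found numerically. The energy levels and targets below are invented.

```python
import math

def maxent_dist(energies, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `energies` subject to a fixed
    mean: p_i ∝ exp(-lam * e_i); lam is found by bisection (the mean
    is a decreasing function of lam)."""
    def mean_for(lam):
        w = [math.exp(-lam * e) for e in energies]
        z = sum(w)
        return sum(e * wi for e, wi in zip(energies, w)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid  # need a larger lam to pull the mean down
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

# Three levels; constraining the mean to the unconstrained average
# recovers the uniform distribution (lam -> 0).
p_uniform = maxent_dist([0.0, 1.0, 2.0], 1.0)
p_cold = maxent_dist([0.0, 1.0, 2.0], 0.5)  # low mean favors low levels
```

With no informative constraint the maximum-entropy answer is uniform; tightening the mean constraint tilts the weights exponentially, which is the Maxwell-Boltzmann shape mentioned in the abstract.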

  2. Infrasonic Detection of a Large Bolide over South Sulawesi, Indonesia on October 8, 2009: Preliminary Results

    Science.gov (United States)

    Silber, E. A.; Brown, P. G.; Le Pinchon, A.

    2011-01-01

    In the morning hours of October 8, 2009, a bright object entered Earth's atmosphere over South Sulawesi, Indonesia. This bolide disintegrated above the ground, generating stratospheric infrasound returns that were detected by infrasonic stations of the global International Monitoring System (IMS) Network of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) at distances up to 17 500 km. Here we present instrumental recordings and preliminary results of this extraordinary event. Using the infrasonic period-yield relations, originally derived for atmospheric nuclear detonations, we find the most probable source energy for this bolide to be 70+/-20 kt TNT equivalent explosive yield. A unique aspect of this event is the fact that it was apparently detected by infrasound only. Global events of such magnitude are expected only once per decade and can be utilized to calibrate infrasonic location and propagation tools on a global scale and to evaluate energy yield formulas and event timing.

  3. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  4. Spatial probability of soil water repellency in an abandoned agricultural field in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Misiūnė, Ieva

    2015-04-01

    Water repellency is a natural soil property with implications on infiltration, erosion and plant growth. It depends on soil texture, type and amount of organic matter, fungi, microorganisms, and vegetation cover (Doerr et al., 2000). Human activities as agriculture can have implications on soil water repellency (SWR) due tillage and addition of organic compounds and fertilizers (Blanco-Canqui and Lal, 2009; Gonzalez-Penaloza et al., 2012). It is also assumed that SWR has a high small-scale variability (Doerr et al., 2000). The aim of this work is to study the spatial probability of SWR in an abandoned field testing several geostatistical methods, Organic Kriging (OK), Simple Kriging (SK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area it is located near Vilnius urban area at (54 49' N, 25 22', 104 masl) in Lithuania (Pereira and Oliva, 2013). It was designed a experimental plot with 21 m2 (07x03 m). Inside this area it was measured SWR was measured every 50 cm using the water drop penetration time (WDPT) (Wessel, 1998). A total of 105 points were measured. The probability of SWR was classified in 0 (No probability) to 1 (High probability). The methods accuracy was assessed with the cross validation method. The best interpolation method was the one with the lowest Root Mean Square Error (RMSE). The results showed that the most accurate probability method was SK (RMSE=0.436), followed by DK (RMSE=0.437), IK (RMSE=0.448), PK (RMSE=0.452) and OK (RMSE=0.537). Significant differences were identified among probability tests (Kruskal-Wallis test =199.7597 ptested technique. Simple Kriging, DK, IK and PK methods identified the high SWR probabilities in the northeast and central part of the plot, while OK observed mainly in the south-western part of the plot. In conclusion, before predict the spatial probability of SWR it is important to test several methods in order to identify the most accurate. 
Acknowledgments COST action ES
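    The model-selection step above (rank interpolators by leave-one-out cross-validation RMSE and keep the lowest) can be sketched generically. For brevity the kriging variants are replaced here by inverse-distance weighting with different power parameters, and the toy 0/1 repellency field is invented, so this illustrates only the selection procedure, not the paper's results.

```python
import math

def idw(known, x, y, power):
    """Inverse-distance-weighted prediction at (x, y) from known
    (xi, yi, value) triples; a stand-in for a kriging predictor."""
    num = den = 0.0
    for xi, yi, v in known:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exact hit on a sample point
        w = d2 ** (-power / 2.0)  # = distance ** (-power)
        num += w * v
        den += w
    return num / den

def loo_rmse(points, power):
    """Leave-one-out cross-validation RMSE: the selection criterion
    used to rank the interpolation methods."""
    sq = 0.0
    for i, (x, y, v) in enumerate(points):
        rest = points[:i] + points[i + 1:]
        sq += (idw(rest, x, y, power) - v) ** 2
    return math.sqrt(sq / len(points))

# Invented 0/1 "repellency probability" field on a 7 x 4 grid (0.5 m step)
pts = [(x * 0.5, y * 0.5, 1.0 if x + y > 5 else 0.0)
       for x in range(7) for y in range(4)]
scores = {p: loo_rmse(pts, p) for p in (1.0, 2.0, 4.0)}
best = min(scores, key=scores.get)  # method with lowest RMSE wins
```

The same loop structure works with any predictor plugged in, which is exactly how competing geostatistical methods can be compared on one dataset.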

  5. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
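    A permutation test of the kind recommended above (shuffle the outcome labels, recompute the performance metric, and compare against the observed value) can be sketched with AUC as the metric. This is a generic illustration with invented scores, not the authors' code or data.

```python
import random

def auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) identity: fraction of
    positive/negative pairs ranked correctly (ties count 0.5)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def permutation_pvalue(scores, labels, n_perm=2000, seed=1):
    """P-value for H0 'performance is chance-level': shuffle labels
    and count permuted AUCs >= the observed AUC (add-one smoothing)."""
    rng = random.Random(seed)
    observed = auc(scores, labels)
    lab = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(lab)
        if auc(scores, lab) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)

# Invented model scores with perfect separation of 4 cases / 4 controls
obs_auc, p_val = permutation_pvalue(
    [0.9, 0.8, 0.85, 0.7, 0.2, 0.1, 0.3, 0.15],
    [1, 1, 1, 1, 0, 0, 0, 0])
```

The same wrapper applies unchanged to a likelihood-based metric, which is the other criterion the study uses for NTCP model assessment.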

  6. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  7. Control Surface Fault Diagnosis with Specified Detection Probability - Real Event Experiences

    DEFF Research Database (Denmark)

    Hansen, Søren; Blanke, Mogens

    2013-01-01

    desired levels of false alarms and detection probabilities. Self-tuning residual generators are employed for diagnosis and are combined with statistical change detection to form a setup for robust fault diagnosis. On-line estimation of test statistics is used to obtain a detection threshold and a desired...... false alarm probability. A data based method is used to determine the validity of the methods proposed. Verification is achieved using real data and shows that the presented diagnosis method is efficient and could have avoided incidents where faults led to loss of aircraft....
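    A generic sketch of the setup described, statistical change detection on a residual with a detection threshold tuned to a desired false-alarm probability. The CUSUM form, the N(0,1) noise model, and all numbers are illustrative assumptions, not the aircraft system's actual residual generators.

```python
import random

def cusum(residuals, drift):
    """One-sided CUSUM trace for detecting a positive mean shift."""
    g, trace = 0.0, []
    for r in residuals:
        g = max(0.0, g + r - drift)
        trace.append(g)
    return trace

def threshold_for_far(n, drift, far=0.01, trials=2000, seed=7):
    """Monte-Carlo threshold: the (1 - far) quantile of the CUSUM
    maximum over length-n N(0, 1) noise, so that fault-free data
    raises an alarm with probability ~ far."""
    rng = random.Random(seed)
    maxima = sorted(
        max(cusum([rng.gauss(0.0, 1.0) for _ in range(n)], drift))
        for _ in range(trials))
    return maxima[int((1.0 - far) * trials) - 1]

h = threshold_for_far(200, drift=0.5)

# Simulated residual: fault-free for 100 samples, then a mean jump of 1.5
rng = random.Random(3)
residuals = [rng.gauss(0.0, 1.0) for _ in range(100)] + \
            [1.5 + rng.gauss(0.0, 1.0) for _ in range(100)]
alarm = max(cusum(residuals, drift=0.5)) > h
```

Choosing the threshold from the fault-free distribution of the test statistic is what lets the false-alarm rate be specified up front, mirroring the design goal in the abstract.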

  8. Energy Efficient Engine: Control system preliminary definition report

    Science.gov (United States)

    Howe, David C.

    1986-01-01

    The object of the Control Preliminary Definition Program was to define a preliminary control system concept as a part of the Energy Efficient Engine program. The program was limited to a conceptual definition of a full authority digital electronic control system. System requirements were determined and a control system was conceptually defined to these requirements. Areas requiring technological development were identified and a plan was established for implementing the identified technological features, including a control technology demonstration. A significant element of this program was a study of the potential benefits of closed-loop active clearance control, along with laboratory tests of candidate clearance sensor elements for a closed loop system.

  9. Preliminary design package for prototype solar heating system

    Energy Technology Data Exchange (ETDEWEB)

    1978-12-01

    A summary is given of the preliminary analysis and design activity on solar heating systems. The analysis was made without site-specific data other than weather; therefore, the results indicate performance expected under these special conditions. Major items in this report include system candidates, design approaches, trade studies and other special data required to evaluate the preliminary analysis and design. The program calls for the development and delivery of eight prototype solar heating and cooling systems for installation and operational test. Two heating and six heating-and-cooling units will be delivered for Single Family Residences (SFR), Multi-Family Residences (MFR) and commercial applications.

  10. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  11. Preliminary design for a maglev development facility

    Energy Technology Data Exchange (ETDEWEB)

    Coffey, H.T.; He, J.L.; Chang, S.L.; Bouillard, J.X.; Chen, S.S.; Cai, Y.; Hoppie, L.O.; Lottes, S.A.; Rote, D.M. (Argonne National Lab., IL (United States)); Zhang, Z.Y. (Polytechnic Univ., Brooklyn, NY (United States)); Myers, G.; Cvercko, A. (Sterling Engineering, Westchester, IL (United States)); Williams, J.R. (Alfred Benesch and Co., Chicago, IL (United States))

    1992-04-01

    A preliminary design was made of a national user facility for evaluating magnetic-levitation (maglev) technologies in sizes intermediate between laboratory experiments and full-scale systems. A technical advisory committee was established and a conference was held to obtain advice on the potential requirements of operational systems and how the facility might best be configured to test these requirements. The effort included studies of multiple concepts for levitating, guiding, and propelling maglev vehicles, as well as the controls, communications, and data-acquisition and -reduction equipment that would be required in operating the facility. Preliminary designs for versatile, dual 2-MVA power supplies capable of powering attractive or repulsive systems were developed. Facility site requirements were identified. Test vehicles would be about 7.4 m (25 ft) long, would weigh from 3 to 7 metric tons, and would operate at speeds up to 67 m/s (150 mph) on a 3.3-km (2.05-mi) elevated guideway. The facility would utilize modular vehicles and guideways, permitting the substitution of levitation, propulsion, and guideway components of different designs and materials for evaluation. The vehicle would provide a test cell in which individual suspension or propulsion components or subsystems could be tested under realistic conditions. The system would allow economical evaluation of integrated systems under varying weather conditions and in realistic geometries.

  12. Estimated probability of postwildfire debris flows in the 2012 Whitewater-Baldy Fire burn area, southwestern New Mexico

    Science.gov (United States)

    Tillery, Anne C.; Matherne, Anne Marie; Verdin, Kristine L.

    2012-01-01

    In May and June 2012, the Whitewater-Baldy Fire burned approximately 1,200 square kilometers (300,000 acres) of the Gila National Forest, in southwestern New Mexico. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from 128 basins burned by the Whitewater-Baldy Fire. A pair of empirical hazard-assessment models developed by using data from recently burned basins throughout the intermountain Western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows along the burned area drainage network and for selected drainage basins within the burned area. The models incorporate measures of areal burned extent and severity, topography, soils, and storm rainfall intensity to estimate the probability and volume of debris flows following the fire. In response to the 2-year-recurrence, 30-minute-duration rainfall, modeling indicated that four basins have high probabilities of debris-flow occurrence (greater than or equal to 80 percent). For the 10-year-recurrence, 30-minute-duration rainfall, an additional 14 basins are included, and for the 25-year-recurrence, 30-minute-duration rainfall, an additional eight basins, 20 percent of the total, have high probabilities of debris-flow occurrence. In addition, probability analysis along the stream segments can identify specific reaches of greatest concern for debris flows within a basin. Basins with a high probability of debris-flow occurrence were concentrated in the west and central parts of the burned area, including tributaries to Whitewater Creek, Mineral Creek, and Willow Creek. Estimated debris-flow volumes ranged from about 3,000-4,000 cubic meters (m3) to greater than 500,000 m3 for all design storms modeled. 
Drainage basins with estimated volumes greater than 500,000 m3 included tributaries to Whitewater Creek, Willow
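    The empirical models referenced above use a logistic link to convert a linear combination of burn-severity, soil, topography and storm-rainfall variables into a debris-flow probability. A schematic version follows; the coefficients and variable choices are invented placeholders, not the fitted values of the published model.

```python
import math

def debris_flow_probability(x):
    """Logistic link used by empirical hazard models: P = e^x / (1 + e^x)."""
    return math.exp(x) / (1.0 + math.exp(x))

def linear_predictor(pct_burned_high, gradient_pct, clay_pct, intensity_mm_h,
                     b=(-1.0, 0.03, 0.02, -0.04, 0.05)):
    """Hypothetical linear predictor: intercept plus weighted burn,
    slope, soil, and rainfall-intensity terms (all coefficients are
    illustrative, not calibrated)."""
    b0, b1, b2, b3, b4 = b
    return (b0 + b1 * pct_burned_high + b2 * gradient_pct
            + b3 * clay_pct + b4 * intensity_mm_h)

# A heavily burned, steep basin under an intense design storm
p = debris_flow_probability(linear_predictor(60.0, 25.0, 10.0, 30.0))
```

The "high probability" basins in the report are those where this kind of predictor, driven by a given design-storm intensity, pushes P to 0.8 or above.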

  13. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out......-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  14. Preliminary Examination of Particles Recovered from the Surface of the Asteroid Itokawa by the Hayabusa Mission

    Science.gov (United States)

    Tsuchiyama, A.; Ebihara, M.; Kimura, M.; Kitajima, F.; Kotsugi, M.; Ito, S.; Nagao, K.; Nakamura, T.; Naraoka, H.; Noguchi, T.; hide

    2011-01-01

    The Hayabusa spacecraft arrived at the S-type Asteroid 25143 Itokawa in September 2005 and revealed astounding features of the small asteroid (535 x 294 x 209 m). The near-infrared spectral shape indicates that the surface of this body has an olivine-rich mineral assemblage potentially similar to that of LL5 or LL6 chondrites with different degrees of space weathering. Based on the surface morphological features observed in high-resolution images of Itokawa's surface, two major types of boulders were distinguished: rounded and angular. Rounded boulders seem to be breccias, while angular boulders seem to have an impact origin. Although the sample collection was not made by normal operations, it was considered that some amount of sample, probably small regolith particles, was collected from the MUSES-C Regio on Itokawa's surface. The sample capsule was successfully recovered on Earth on June 13, 2010, and was opened at the curation facility of JAXA (Japan Aerospace Exploration Agency), Sagamihara, Japan. A large number of small particles were found in the sample container. Preliminary analysis with SEM/EDX at the curation facility showed that more than 1500 grains were identified as rocky particles, and most of them were judged to be of extraterrestrial origin, and definitely from Asteroid Itokawa. The minerals present (olivine, low-Ca pyroxene, high-Ca pyroxene, plagioclase, Fe sulfide, Fe-Ni metal, chromite, Ca phosphate), a roughly estimated modal abundance of the minerals, and rough measurements of the chemical compositions of the silicates show that these particles are roughly similar to LL chondrites. Although their sizes are mostly less than 10 μm, some larger particles of about 100 μm or larger were also identified. A part of the sample (probably several tens of particles) will be selected by the Hayabusa sample curation team and examined preliminarily in Japan within one year after the sample recovery, prior to the detailed analysis phase. Hayabusa Asteroidal Sample Preliminary

  15. Risk Preferences, Probability Weighting, and Strategy Tradeoffs in Wildfire Management.

    Science.gov (United States)

    Hand, Michael S; Wibbenmeyer, Matthew J; Calkin, David E; Thompson, Matthew P

    2015-10-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to federal wildfire managers. Respondents were presented with a multiattribute lottery-choice experiment where each lottery is defined by three outcome attributes: expenditures for fire suppression, damage to private property, and exposure of firefighters to the risk of aviation-related fatalities. Respondents choose one of two strategies, each of which includes "good" (low cost/low damage) and "bad" (high cost/high damage) outcomes that occur with varying probabilities. The choice task also incorporates an information framing experiment to test whether information about fatality risk to firefighters alters managers' responses to risk. Results suggest that managers exhibit risk aversion and nonlinear probability weighting, which can result in choices that do not minimize expected expenditures, property damage, or firefighter exposure. Information framing tends to result in choices that reduce the risk of aviation fatalities, but exacerbates nonlinear probability weighting. © 2015 Society for Risk Analysis.

  16. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
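A minimal sketch of the kind of Bayesian update described above: a gamma prior on a CCF rate is updated with effective event counts and observation times, and the posterior rate is converted into a basic-event probability for a periodically tested standby component. The conjugate gamma–Poisson form and the q ≈ λT/2 standby approximation are standard, but all parameter values here are illustrative assumptions, not the paper's recommended numbers.

```python
# Hedged sketch: gamma-Poisson Bayesian update of a common cause failure
# (CCF) rate, then conversion to a basic-event probability. The prior,
# effective counts, and test interval are illustrative assumptions.

def posterior_ccf_rate(a0, b0, n_eff, t_eff):
    """Gamma(a0, b0) prior on the CCF rate, updated with n_eff effective
    events over t_eff effective years; returns the posterior mean rate."""
    return (a0 + n_eff) / (b0 + t_eff)   # events per year

def basic_event_probability(rate, test_interval):
    """Time-average unavailability of a standby component failing at
    'rate' and tested every 'test_interval' years: q ~ rate * T / 2."""
    return rate * test_interval / 2.0

rate = posterior_ccf_rate(a0=0.5, b0=10.0, n_eff=2, t_eff=90.0)
q = basic_event_probability(rate, test_interval=0.25)   # quarterly tests
print(rate, q)
```

The same posterior-mean rate would feed each CCF basic event of the fault tree; the paper's methodology additionally adjusts the effective counts for assessment uncertainty and test scheduling.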

  17. Development of e-Juba, a preliminary proof of concept unmanned ...

    African Journals Online (AJOL)

    Development of e-Juba, a preliminary proof of concept unmanned aerial vehicle designed to facilitate the transportation of microbiological test samples from remote rural clinics to National Health Laboratory Service laboratories.

  18. Sensitivity and bias in decision-making under risk: evaluating the perception of reward, its probability and value.

    Directory of Open Access Journals (Sweden)

    Madeleine E Sharp

    Full Text Available BACKGROUND: There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. OBJECTIVE: We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. DESIGN/METHODS: Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. RESULTS: Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a 'risk premium' of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. CONCLUSIONS: This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia.
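A hedged sketch of the non-linearities that prospect theory uses to explain such a risk premium. The inverse-S weighting function and power value function below follow Tversky and Kahneman's common parameterization for gains; the parameters are their published median estimates and are illustrative here, not the values fitted in this study.

```python
# Hedged sketch: the non-linearities prospect theory posits for gains.
# gamma and alpha are Tversky & Kahneman's median estimates; they are
# illustrative here, not values fitted in this study.

def weight(p, gamma=0.61):
    """Probability weighting: overweights small p, underweights large p."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def value(x, alpha=0.88):
    """Concave value function for gains (diminishing sensitivity)."""
    return x**alpha

def prospect_value(p, x):
    """Subjective worth of the prospect 'win x with probability p'."""
    return weight(p) * value(x)

# Inverse-S shape: small probabilities overweighted, large underweighted.
print(round(weight(0.1), 3), round(weight(0.9), 3))
```

With such a weighting function, a high-probability/low-magnitude prospect can be preferred over a low-probability/high-magnitude one of equal expected value, which is the bias the study measures.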

  19. Sensitivity and Bias in Decision-Making under Risk: Evaluating the Perception of Reward, Its Probability and Value

    Science.gov (United States)

    Sharp, Madeleine E.; Viswanathan, Jayalakshmi; Lanyon, Linda J.; Barton, Jason J. S.

    2012-01-01

    Background There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. Objective We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. Design/Methods Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. Results Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a ‘risk premium’ of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. Conclusions This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia. PMID:22493669

  20. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  1. Preliminary screening of plant essential oils against larvae of Culex ...

    African Journals Online (AJOL)

    Preliminary screenings of 22 plant essential oils were tested for mortality of the mosquito larvae Culex quinquefasciatus under laboratory conditions. Percent (%) mortality of the mosquito larvae were obtained for each essential oil. At different exposure periods, viz. 1, 3, 6, 12 and 24 h among the 22 plant oils tested, eight ...

  2. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  4. Preliminary identification of problem soils for infrastructure projects

    CSIR Research Space (South Africa)

    Paige-Green, P

    2008-11-01

    Full Text Available As problem soils are those within the top 1.0 m or 1.5 m of the soil profile, a mechanism for evaluating these materials without preliminary field work and testing would be invaluable. Since 1971, the Department of Agriculture has systematically mapped the soils...

  5. Design of a Tablet Computer App for Facilitation of a Molecular Blood Culture Test in Clinical Microbiology and Preliminary Usability Evaluation.

    Science.gov (United States)

    Samson, Lasse L; Pape-Haugaard, Louise; Meltzer, Michelle C; Fuchs, Martin; Schønheyder, Henrik C; Hejlesen, Ole

    2016-03-18

    User mobility is an important aspect of the development of clinical information systems for health care professionals. Mobile phones and tablet computers have obtained widespread use by health care professionals, offering an opportunity for supporting the access to patient information through specialized applications (apps) while supporting the mobility of the users. The use of apps for mobile phones and tablet computers may support workflow of complex tasks, for example, molecular-based diagnostic tests in clinical microbiology. Multiplex Blood Culture Test (MuxBCT) is a molecular-based diagnostic test used for rapid identification of pathogens in positive blood cultures. To facilitate the workflow of the MuxBCT, a specialized tablet computer app was developed as an accessory to the diagnostic test. The app aims to reduce the complexity of the test by step-by-step guidance of microscopy and to assist users in reaching an exact bacterial or fungal diagnosis based on blood specimen observations and controls. Additionally, the app allows for entry of test results, and communication thereof to the laboratory information system (LIS). The objective of the study was to describe the design considerations of the MuxBCT app and the results of a preliminary usability evaluation. The MuxBCT tablet app was developed and set up for use in a clinical microbiology laboratory. A near-live simulation study was conducted in the clinical microbiology laboratory to evaluate the usability of the MuxBCT app. The study was designed to achieve a high degree of realism as participants carried out a scenario representing the context of use for the MuxBCT app. As the MuxBCT was under development, the scenario involved the use of molecular blood culture tests similar to the MuxBCT for identification of microorganisms from positive blood culture samples. The study participants were observed, and their interactions with the app were recorded. After the study, the participants were debriefed to

  6. An assessment of the preliminary microbiological studies on clay cores from Elstow conducted by the Universities of Leicester and Warwick

    International Nuclear Information System (INIS)

    Rushbrook, P.E.

    1987-04-01

    Two preliminary studies were conducted to establish the presence or absence of micro-organisms in samples of Oxford Clay from Elstow. Bacterial colonies were the dominant organisms found. In addition, a limited series of tests was performed to assess the tolerance of specific bacterial sub-cultures to variations in two environmental parameters; pH and temperature. It was found that micro-organisms can survive and colonise at depth in Oxford Clay strata, but at an abundance of between two and four orders of magnitude below that of typical garden soil. Therefore, in the "after closure" phase of a repository where an environment of pH greater than 9 will probably exist, some localised micro-biological action may take place. However, from the data obtained, degradation by this mechanism is likely to be slower when compared to deterioration from physical and chemical mechanisms. (author)

  7. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  8. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  9. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  10. Preliminary failure modes and effects analysis on Korean HCCR TBS to be tested in ITER

    International Nuclear Information System (INIS)

    Ahn, Mu-Young; Cho, Seungyon; Jin, Hyung Gon; Lee, Dong Won; Park, Yi-Hyun; Lee, Youngmin

    2015-01-01

    Highlights: • Postulated initiating events are identified through failure modes and effects analysis on the current HCCR TBS design. • A set of postulated initiating events is selected for consideration in deterministic analysis. • Accident evolutions for the selected postulated initiating events are qualitatively described for deterministic analysis. - Abstract: The Korean Helium cooled ceramic reflector (HCCR) Test blanket system (TBS), which comprises the Test blanket module (TBM) and ancillary systems in various locations of the ITER building, is operated at high temperature and pressure with decay heat. Therefore, safety is of utmost concern in the design process, and it is required to demonstrate that the HCCR TBS is designed to comply with the safety requirements and guidelines of ITER. Due to the complexity of the system, with many interfaces with ITER, a systematic approach is necessary for safety analysis. This paper presents a preliminary failure modes and effects analysis (FMEA) study performed for the HCCR TBS. FMEA is a systematic methodology in which failure modes for components in the system and their consequences are studied from the bottom up. Over eighty failure modes have been investigated for the HCCR TBS. The failure modes that have similar consequences are grouped as postulated initiating events (PIEs), and a total of seven reference accident scenarios are derived from the FMEA study for deterministic accident analysis. Failure modes not covered here, due to the evolving design of the HCCR TBS and uncertainty in maintenance procedures, will be studied further in the near future.

  11. Preliminary failure modes and effects analysis on Korean HCCR TBS to be tested in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Mu-Young, E-mail: myahn74@nfri.re.kr [National Fusion Research Institute, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Yi-Hyun; Lee, Youngmin [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Highlights: • Postulated initiating events are identified through failure modes and effects analysis on the current HCCR TBS design. • A set of postulated initiating events is selected for consideration in deterministic analysis. • Accident evolutions for the selected postulated initiating events are qualitatively described for deterministic analysis. - Abstract: The Korean Helium cooled ceramic reflector (HCCR) Test blanket system (TBS), which comprises the Test blanket module (TBM) and ancillary systems in various locations of the ITER building, is operated at high temperature and pressure with decay heat. Therefore, safety is of utmost concern in the design process, and it is required to demonstrate that the HCCR TBS is designed to comply with the safety requirements and guidelines of ITER. Due to the complexity of the system, with many interfaces with ITER, a systematic approach is necessary for safety analysis. This paper presents a preliminary failure modes and effects analysis (FMEA) study performed for the HCCR TBS. FMEA is a systematic methodology in which failure modes for components in the system and their consequences are studied from the bottom up. Over eighty failure modes have been investigated for the HCCR TBS. The failure modes that have similar consequences are grouped as postulated initiating events (PIEs), and a total of seven reference accident scenarios are derived from the FMEA study for deterministic accident analysis. Failure modes not covered here, due to the evolving design of the HCCR TBS and uncertainty in maintenance procedures, will be studied further in the near future.

  12. A fluctuation relation for the probability of energy backscatter

    Science.gov (United States)

    Vela-Martin, Alberto; Jimenez, Javier

    2017-11-01

    We simulate the large scales of an inviscid turbulent flow in a triply periodic box using a dynamic Smagorinsky model for the sub-grid stresses. The flow, which is forced to constant kinetic energy, is fully reversible and can develop a sustained inverse energy cascade. However, due to the large number of degrees of freedom, the probability of a spontaneous mean inverse energy flux is negligible. In order to quantify the probability of inverse energy cascades, we test a local fluctuation relation of the form log P(A) = -c(V,t) A, where P(A) = p(<Cs>_{V,t} = A) / p(<Cs>_{V,t} = -A), p denotes probability, and <Cs>_{V,t} is the average of the least-squares dynamic model coefficient over volume V and time t. This is confirmed when Cs is averaged over sufficiently large domains and long times, and c is found to depend linearly on V and t. In the limit in which V^(1/3) is of the order of the integral scale and t is of the order of the eddy-turnover time, we recover a global fluctuation relation that predicts a negligible probability of a sustained inverse energy cascade. For smaller V and t, the local fluctuation relation provides useful predictions on the occurrence of local energy backscatter. Funded by the ERC COTURB project.

  13. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computation biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." It has been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
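The probabilistic reasoning that such DNA circuits implement is ordinary conditional-probability arithmetic; a sketch in conventional code, with illustrative priors and conditionals (not the paper's example), makes the two models concrete.

```python
# Hedged sketch: the conditional- and total-probability arithmetic that
# the paper's strand displacement circuits compute, in ordinary code.
# Priors and conditionals are illustrative, not the paper's example.

def total_probability(priors, conditionals):
    """P(B) = sum_i P(A_i) * P(B | A_i) over a partition {A_i}."""
    return sum(p * c for p, c in zip(priors, conditionals))

def bayes(prior_i, cond_i, priors, conditionals):
    """P(A_i | B) = P(A_i) * P(B | A_i) / P(B)."""
    return prior_i * cond_i / total_probability(priors, conditionals)

priors = [0.5, 0.3, 0.2]   # P(A1), P(A2), P(A3): a partition of causes
conds  = [0.9, 0.5, 0.1]   # P(B | A_i): evidence probability per cause
print(total_probability(priors, conds))           # P(B)
print(bayes(priors[0], conds[0], priors, conds))  # P(A1 | B)
```

In the molecular setting, the multiplications and normalization are realized by strand-displacement reaction modules rather than arithmetic instructions, but the computed quantities are the same.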

  14. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
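The sample-reuse idea can be sketched as follows: one fixed set of Monte Carlo samples is scored against a baseline POD curve and a perturbed one, so a sensitivity of the POF to a POD region costs no additional sampling. The log-logistic POD form, the crack-growth model, and all parameters are illustrative assumptions, and the finite difference here stands in for the paper's first-order analytic derivatives.

```python
import math
import random

# Hedged sketch of the sample-reuse idea: a single fixed set of Monte
# Carlo samples is evaluated under a baseline POD curve and a perturbed
# one, so the POF sensitivity to a POD region needs no extra sampling.
# POD form, growth model, and parameters are illustrative assumptions.

random.seed(1)

def pod(a, a50=2.0, beta=0.5):
    """Probability of detecting a crack of size a (log-logistic form)."""
    return 1.0 / (1.0 + math.exp(-(math.log(a) - math.log(a50)) / beta))

N, growth, a_crit = 100_000, 4.0, 5.0
a0s = [random.lognormvariate(0.5, 0.6) for _ in range(N)]   # initial sizes
u = [random.random() for _ in range(N)]                     # shared draws

def pof(delta_tail=0.0):
    """POF = P(crack missed at inspection, then grows critical);
    delta_tail perturbs the lower-tail region (a < a50) of the POD."""
    fails = 0
    for a0, ui in zip(a0s, u):
        p = min(1.0, pod(a0) + (delta_tail if a0 < 2.0 else 0.0))
        if ui > p and a0 * growth >= a_crit:   # missed, grows critical
            fails += 1
    return fails / N

base, pert = pof(), pof(delta_tail=0.05)
sens = (pert - base) / 0.05   # finite-difference sensitivity, same samples
print(base, sens)
```

Because the detection draws are shared between the two evaluations, raising the POD in a region can only remove failures, so the estimated sensitivity is negative without additional Monte Carlo noise from resampling.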

  15. Preliminary experimental results of tungsten wire-array Z-pinches on primary test stand

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Xian-Bin; Zhou, Shao-Tong; Dan, Jia-Kun; Ren, Xiao-Dong, E-mail: amosrxd@163.com; Wang, Kun-Lun; Zhang, Si-Qun; Li, Jing; Xu, Qiang; Cai, Hong-Chun; Duan, Shu-Chao; Ouyang, Kai; Chen, Guang-Hua; Ji, Ce; Wei, Bing; Feng, Shu-Ping; Wang, Meng; Xie, Wei-Ping; Deng, Jian-Jun [Key Laboratory of Pulsed Power, Institute of Fluid Physics, China Academy of Engineering Physics, P.O. Box 919-108, Mianyang, Sichuan 621999 (China); Zhou, Xiu-Wen; Yang, Yi [Research Center of Laser Fusion, China Academy of Engineering Physics, P.O. Box 919-987, Mianyang, Sichuan 621999 (China)

    2015-07-15

    The Primary Test Stand (PTS) developed at the China Academy of Engineering Physics is a 20 TW pulsed power driver, which can deliver a ∼10 MA, 70 ns rise-time (10%–90%) current to a short-circuit load and has important applications in Z-pinch driven inertial confinement fusion and high energy density physics. Preliminary results of tungsten wire-array Z-pinch experiments on PTS are presented. The load geometries investigated include 15-mm-tall cylindrical single and nested arrays with diameter ranging from 13 mm to 30 mm, consisting of 132–300 tungsten wires with 5–10 μm in diameter. Multiple diagnostics were fielded to characterize the x-ray radiation from wire-array Z pinches. The x-ray peak power (∼50 TW) and total radiated energy (∼500 kJ) were obtained from a single 20-mm-diam array with 80-ns stagnation time. The highest x-ray peak power up to 80 TW with 2.4 ns FWHM was achieved by using a nested array with 20-mm outer diameter, and the total x-ray energy from the nested array is comparable to that of single array. Implosion velocity estimated from the time-resolved image measurement exceeds 30 cm/μs. The detailed experimental results and other findings are presented and discussed.

  16. Preliminary Studies on the Use of Natural Fibers in Sustainable Concrete

    International Nuclear Information System (INIS)

    Awad, E.; Mabsout, M.; Hamad, B.; Khatib, H.

    2011-01-01

    The paper reports on preliminary tests performed to produce a sustainable 'green' concrete material using natural fibers such as industrial hemp, palm, and banana leaf fibers. Such a material would increase the service life and reduce the life cost of the structure, and would have a positive effect on social life and the social economy. The demand for agricultural fibers for concrete production would be a major incentive for Lebanese farmers to benefit from the social impact on the habitat's level of living. In the preliminary program reported in this paper, cubes and standard flexural beams were tested to evaluate the structural and physical performance of concrete mixes prepared with different volumetric ratios of added fibers and different proportions of aggregates. Test results indicated that the use of natural fibers reduced the coarse aggregate quantity without affecting the flexural performance of the concrete. However, no clear trend was determined in the cube compressive strength test results. (author)

  17. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  18. Testing of Software Routine to Determine Deviate and Cumulative Probability: ModStandardNormal Version 1.0

    International Nuclear Information System (INIS)

    A.H. Monib

    1999-01-01

    The purpose of this calculation is to document that the software routine ModStandardNormal Version 1.0, a Visual Fortran 5.0 module, provides correct results for a normal distribution up to five significant figures (three significant figures at the function tails) for a specified range of input parameters. The software routine may be used for quality-affecting work. Two types of output are generated by ModStandardNormal: a deviate, x, given a cumulative probability, p, between 0 and 1; and a cumulative probability, p, given a deviate, x, between -8 and 8. This calculation supports Performance Assessment, under Technical Product Development Plan, TDP-EBS-MD-000006 (Attachment I, DIRS 3), and is written in accordance with the AP-3.12Q Calculations procedure (Attachment I, DIRS 4)
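For comparison, the routine's two operations can be reproduced with the Python standard library, whose statistics.NormalDist with default parameters is a standard normal; this is an independent check, not the Fortran routine itself.

```python
# Hedged sketch: reproducing the routine's two operations with the Python
# standard library as an independent check (statistics.NormalDist with
# default parameters is a standard normal). Not the Fortran routine.
from statistics import NormalDist

std_normal = NormalDist()   # mean 0, standard deviation 1

# Deviate x given a cumulative probability p in (0, 1):
x = std_normal.inv_cdf(0.975)
# Cumulative probability p given a deviate x in [-8, 8]:
p = std_normal.cdf(x)

print(round(x, 5), round(p, 5))   # round-trips: p ~ 0.975
```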

  19. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Full Text Available Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries, and data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (CI 95% = 0.67-0.78; n=3254) was obtained, and should be considered a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  20. A Preliminary Study on Cathodic Prevention in Reinforced Mortar

    NARCIS (Netherlands)

    Koleva, D.A.; Van Breugel, K.; Mol, J.M.C.; De Wit, J.H.W.

    2010-01-01

    This work presents preliminary tests on the performance of cathodic prevention (CPre) in reinforced mortar subjected to an aggressive environment (10% NaCl). Cathodic prevention is an electrochemical technique for minimizing, actually "preventing", any eventual corrosion of the steel bars in

  1. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Full Text Available Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows a better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.
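A minimal sketch of the two-stage model, assuming a simple form in which each observed animal multiplies the odds in favor of its chosen option by a fixed social reliability factor; the parameter values are illustrative, not the values fitted to the stickleback data.

```python
import math
import random

# Hedged sketch of the two-stage model: a Bayesian estimate of which of
# two options is best, followed by probability matching. The reliability
# parameter is illustrative, not the value fitted to stickleback data.

random.seed(0)

def posterior_a(private_llr, n_a, n_b, social_reliability=1.5):
    """Posterior probability that option A is best: private evidence
    (a log-likelihood ratio) plus social evidence, where each animal
    already at an option multiplies the odds by 'social_reliability'."""
    log_odds = private_llr + (n_a - n_b) * math.log(social_reliability)
    return 1.0 / (1.0 + math.exp(-log_odds))

def choose(private_llr, n_a, n_b):
    """Probability matching: pick A with probability equal to the
    estimated probability that A is the better option."""
    return "A" if random.random() < posterior_a(private_llr, n_a, n_b) else "B"

# Neutral private evidence; three animals already at A, one at B:
p = posterior_a(0.0, n_a=3, n_b=1)
print(round(p, 3))
```

Note the matching stage: the animal does not deterministically pick the option with the higher posterior, it samples in proportion to it, which is what produces the graded collective patterns the paper fits.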

  2. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  4. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  5. Preliminary consideration for research on geological disposal of high-level radioactive waste in China in the period of 2000-2040

    International Nuclear Information System (INIS)

    Xu Guoqing

    2004-01-01

    Based on overseas practical experience combined with domestic conditions, a preliminary long-range plan is proposed for research on geological disposal of high-level radioactive waste in China in the period 2000-2040. An overview of research on geological disposal of high-level radioactive waste abroad and in mainland China is first presented briefly. The discussion then centers on the preliminary long-range plan itself: the division of the research into stages, and the goal, tasks, research contents and timetable for each stage. The material presented here will probably be useful in planning future geological disposal of high-level radioactive waste in China. (author)

  6. Preliminary characterization of abandoned septic tank systems. Volume 1

    International Nuclear Information System (INIS)

    1995-12-01

    This report documents the activities and findings of the Phase I Preliminary Characterization of Abandoned Septic Tank Systems. The purpose of the preliminary characterization activity was to investigate the Tiger Team abandoned septic systems (tanks and associated leachfields) in order to identify waste streams for closure at a later date; the work performed was not intended to fully characterize or remediate the sites. The abandoned systems potentially received wastes or effluent from buildings that could have discharged non-domestic effluent, petroleum hydrocarbons, or hazardous, radioactive, and/or mixed wastes. A total of 20 sites were investigated for the preliminary characterization of identified abandoned septic systems. Of the 20 sites, 19 were located and characterized through samples collected from each tank(s) and, where applicable, associated leachfields. The abandoned septic tank systems are located in Areas 5, 12, 15, 25, and 26 on the Nevada Test Site

  7. The researcher and the consultant: from testing to probability statements.

    Science.gov (United States)

    Hamra, Ghassan B; Stang, Andreas; Poole, Charles

    2015-09-01

    In the first instalment of this series, Stang and Poole provided an overview of Fisher significance testing (ST), Neyman-Pearson null hypothesis testing (NHT), and their unfortunate and unintended offspring, null hypothesis significance testing. In addition to elucidating the distinction between the first two and the evolution of the third, the authors alluded to alternative models of statistical inference; namely, Bayesian statistics. Bayesian inference has experienced a revival in recent decades, with many researchers advocating for its use as both a complement and an alternative to NHT and ST. This article will continue in the direction of the first instalment, providing practicing researchers with an introduction to Bayesian inference. Our work will draw on the examples and discussion of the previous dialogue.

  8. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  9. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274
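    The dynamic bias signal described in this abstract can be illustrated with a toy diffusion-to-bound simulation in which the prior enters as a deterministic term that grows with elapsed decision time. All parameter values and the linear form of the bias are illustrative assumptions, not taken from the study.

    ```python
    import math
    import random

    def biased_diffusion_trial(drift, bias_rate, threshold=1.0, dt=0.01,
                               sigma=1.0, rng=random):
        """One trial of a diffusion-to-bound decision. The prior for the
        'favored' choice is implemented as a bias that increases linearly
        with elapsed decision time (a hypothetical simplification)."""
        x, t = 0.0, 0.0
        while True:
            t += dt
            # accumulate noisy sensory evidence
            x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            # add the time-growing prior bias before checking the bounds
            effective = x + bias_rate * t
            if effective >= threshold:
                return ("favored", t)
            if effective <= -threshold:
                return ("unfavored", t)
    ```

    Because the bias grows with time, it mostly affects slow, low-reliability trials, which is the qualitative signature the study reports: prior probability shifts choices and reaction times most when the stimulus is weak.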

  10. Biocontamination Control for Spacesuit Garments - A Preliminary Study

    Science.gov (United States)

    Rhodes, Richard A.; Orndoff, Evelyne; Korona, F. Adam; Poritz, Darwin; Smith, Jelanie; Wong, Wing

    2011-01-01

    This paper outlines a preliminary study that was conducted to review, test, and improve on current space suit biocontamination control. Biocontamination from crew members can cause space suit damage and objectionable odors and lead to crew member health hazards. An understanding of the level of biocontamination is necessary to mitigate its effects. A series of tests were conducted with the intent of evaluating current suit materials, ground and on-orbit disinfectants, and potential commercial off-the-shelf antimicrobial materials. Included in this paper is a discussion of the test methodology, results, and analysis method.

  11. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  12. Development and Preliminary Tests of an Open-Path Airborne Diode Laser Absorption Instrument for Carbon Dioxide

    Science.gov (United States)

    Diskin, Glenn S.; DiGangi, Joshua P.; Yang, Melissa; Slate, Thomas A.; Rana, Mario

    2015-01-01

    Carbon dioxide (CO2) is well known for its importance as an atmospheric greenhouse gas, with many sources and sinks around the globe. Understanding the fluxes of carbon into and out of the atmosphere is a complex and daunting challenge. One tool applied by scientists to measure the vertical flux of CO2 near the surface uses the eddy covariance technique, most often from towers but also from aircraft flying specific patterns over the study area. In this technique, variations of constituents of interest are correlated with fluctuations in the local vertical wind velocity. Measurement requirements are stringent, particularly with regard to precision, sensitivity to small changes, and temporal sampling rate. In addition, many aircraft have limited payload capability, so instrument size, weight, and power consumption are also important considerations. We report on the development and preliminary application of an airborne sensor for the measurement of atmospheric CO2. The instrument, modeled on the successful DLH (Diode Laser Hygrometer) series of instruments, has been tested in the laboratory and on the NASA DC-8 aircraft. Performance parameters such as accuracy, precision, sensitivity, specificity, and temporal response are discussed in the context of typical atmospheric variability and suitability for flux measurement applications. On-aircraft, in-flight data have been obtained and are discussed as well. Performance of the instrument has been promising, and continued flight testing is planned during 2016.
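    The eddy covariance idea mentioned in this abstract, correlating scalar fluctuations with fluctuations in vertical wind, reduces to a covariance of two synchronized time series. A minimal kinematic-flux sketch (density corrections, detrending, and unit conversions omitted):

    ```python
    def eddy_covariance_flux(w, c):
        """Kinematic eddy flux: the mean product of the fluctuations (primes)
        of vertical wind speed w and scalar concentration c about their means."""
        n = len(w)
        w_mean = sum(w) / n
        c_mean = sum(c) / n
        return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n
    ```

    This is why the abstract's requirements are so stringent: the flux lives entirely in small, fast fluctuations, so the sensor needs high precision and a sampling rate fast enough to resolve the turbulent eddies.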

  13. Collateral Information for Equating in Small Samples: A Preliminary Investigation

    Science.gov (United States)

    Kim, Sooyeon; Livingston, Samuel A.; Lewis, Charles

    2011-01-01

    This article describes a preliminary investigation of an empirical Bayes (EB) procedure for using collateral information to improve equating of scores on test forms taken by small numbers of examinees. Resampling studies were done on two different forms of the same test. In each study, EB and non-EB versions of two equating methods--chained linear…

  14. Role of grain refinement in hardening of structural steels at preliminary thermomechanical treatment

    International Nuclear Information System (INIS)

    Bukhvalov, A.B.; Grigor'eva, E.V.; Davydova, L.S.; Degtyarev, M.V.; Levit, V.I.; Smirnova, N.A.; Smirnov, L.V.

    1981-01-01

    The hardening mechanism of preliminary thermomechanical treatment with deformation by cold rolling or hydroextrusion is studied on structural 37KhN3M1 and 38KhN3MFA steels. Specimens were tested in static tension, impact strength and fracture toughness. It is shown that using hydroextrusion instead of rolling does not change the hardening effect of preliminary thermomechanical treatment (PTMT). It is established that increasing the degree of preliminary deformation and using accelerated short-term hardening heating provide better grain refinement and increase the PTMT hardening effect [ru

  15. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
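    The core of the approach described in this abstract, Monte Carlo sampling of uncertain soil parameters fed through an infinite-slope factor-of-safety calculation, can be sketched as follows. The equation is a simplified textbook infinite-slope form, and the cohesion and friction-angle ranges, unit weights, and depths are assumed for illustration, not values from the paper.

    ```python
    import math
    import random

    def factor_of_safety(c, phi_deg, slope_deg, depth, m,
                         gamma=18.0, gamma_w=9.81):
        """Simplified infinite-slope factor of safety.
        c: cohesion (kPa), phi_deg: friction angle, depth: soil depth (m),
        m: relative saturated depth 0..1 (the hydrologic forcing),
        gamma/gamma_w: soil and water unit weights (kN/m^3)."""
        theta = math.radians(slope_deg)
        phi = math.radians(phi_deg)
        cohesion_term = c / (gamma * depth * math.sin(theta) * math.cos(theta))
        friction_term = (1.0 - m * gamma_w / gamma) * math.tan(phi) / math.tan(theta)
        return cohesion_term + friction_term

    def landslide_probability(slope_deg, depth, m, n=10_000, seed=0):
        """Monte Carlo probability of initiation: the fraction of random
        parameter draws for which the factor of safety falls below 1."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(n):
            c = rng.uniform(2.0, 8.0)      # cohesion range, kPa (assumed)
            phi = rng.uniform(28.0, 40.0)  # friction angle range, deg (assumed)
            if factor_of_safety(c, phi, slope_deg, depth, m) < 1.0:
                failures += 1
        return failures / n
    ```

    A steep, fully saturated slope yields a substantial failure probability, while a gentle, nearly dry slope is stable across the sampled parameter ranges; the published model performs this computation per grid cell with spatially distributed parameter distributions.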

  16. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  17. Probability of criminal acts of violence: a test of jury predictive accuracy.

    Science.gov (United States)

    Reidy, Thomas J; Sorensen, Jon R; Cunningham, Mark D

    2013-01-01

    The ability of capital juries to accurately predict future prison violence at the sentencing phase of aggravated murder trials was examined through retrospective review of the disciplinary records of 115 male inmates sentenced to either life (n = 65) or death (n = 50) in Oregon from 1985 through 2008, with a mean post-conviction time at risk of 15.3 years. Violent prison behavior was completely unrelated to predictions made by capital jurors, with bidirectional accuracy simply reflecting the base rate of assaultive misconduct in the group. Rejection of the special issue predicting future violence enjoyed 90% accuracy. Conversely, predictions that future violence was probable had 90% error rates. More than 90% of the assaultive rule violations committed by these offenders resulted in no harm or only minor injuries. Copyright © 2013 John Wiley & Sons, Ltd.
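    The finding that jury predictions were unrelated to outcomes has a simple probabilistic reading: when a binary prediction is statistically independent of the outcome, its accuracy is set by the base rate alone. The sketch below is a hypothetical illustration; the ~10% base rate is chosen to reproduce the reported 90% accuracy / 90% error figures, not taken from the study's tables.

    ```python
    def accuracy_under_independence(base_rate):
        """If predictions are independent of actual violence, a 'no violence'
        prediction is correct with probability 1 - base_rate, and a 'violence'
        prediction is wrong with that same probability."""
        return {
            "no_violence_prediction_accuracy": 1.0 - base_rate,
            "violence_prediction_error_rate": 1.0 - base_rate,
        }
    ```

    At a 10% base rate of assaultive misconduct, predicting "no violence" is right 90% of the time and predicting "violence" is wrong 90% of the time, exactly the bidirectional pattern the abstract describes.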

  18. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  19. Bremsstrahlung emission probability in the α decay of 210Po

    International Nuclear Information System (INIS)

    Boie, Hans-Hermann

    2009-01-01

    A high-statistics measurement of the bremsstrahlung emitted in the α decay of 210 Po has been performed. The measured differential emission probabilities, which could be followed up to γ-energies of ≈ 500 keV, allow for the first time a serious test of various model calculations of bremsstrahlung-accompanied α decay. It is shown that corrections to the α-γ angular correlation due to the interference between the electric dipole and quadrupole amplitudes and due to the relativistic character of the process have to be taken into account. With the experimentally derived angular correlation, the measured energy-differential bremsstrahlung emission probabilities show excellent agreement with the fully quantum mechanical calculation. (orig.)

  20. Geoscientific long-term prognosis. Preliminary safety analysis for the site Gorleben

    International Nuclear Information System (INIS)

    Mrugalla, Sabine

    2011-07-01

    The preliminary safety analysis of the site Gorleben includes the following chapters: (1) Introduction; (2) Aim and content of the geoscientific long-term prognosis for the site Gorleben; (3) Boundary conditions at the site Gorleben: climate; geomorphology; overlying rocks and adjoining rocks; hydrogeology; salt deposit Gorleben. (4) Probable future geological developments at the site Gorleben: supraregional developments with effects on the site Gorleben; glacial period developments; developments of the geomorphology, overlying and adjoining rocks; future developments of the hydrological systems at the site Gorleben; future saliniferous specific developments of the salt deposit Gorleben. (5) Commentary on the unlikely or excludable developments of the site Gorleben.

  1. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  2. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool, providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on directly observed occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection-system technologies, so multi-parameter curve fitting or model-optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
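    The 90/95 criterion mentioned above is often demonstrated with a one-sided binomial argument: if the true POD were only 90%, the probability of passing the observed test must be below 5%. The sketch below shows that point-estimate demonstration, not DOEPOD's full methodology.

    ```python
    from math import comb

    def confidence_pod_exceeds(pod, n, misses=0):
        """Confidence that the true POD exceeds `pod`, given n hit/miss trials
        with at most `misses` misses: one minus the binomial probability of
        doing this well (or better) if the POD were exactly `pod`."""
        p_pass = sum(comb(n, k) * (1.0 - pod) ** k * pod ** (n - k)
                     for k in range(misses + 1))
        return 1.0 - p_pass
    ```

    For a miss-free demonstration, 29 consecutive hits is the smallest sample meeting 90/95, since 0.9**29 ≈ 0.047 < 0.05 while 0.9**28 ≈ 0.052 is not.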

  3. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  4. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
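    The paper's truth-table derivation is not reproduced here, but the conditional-probability identities it manipulates reduce to the standard form of Bayes' rule, which can be stated directly:

    ```python
    def bayes_posterior(prior, p_e_given_h, p_e_given_not_h):
        """P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
        num = p_e_given_h * prior
        return num / (num + p_e_given_not_h * (1.0 - prior))
    ```

    For example, with a flat prior of 0.5 and evidence four times as likely under H as under ~H (0.8 vs. 0.2), the posterior is 0.8, mirroring how the truth-functional approach propagates the probabilities of compound statements.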

  5. Preliminary Sensorimotor and Cardiovascular Results from the Joint Russian/U.S. Pilot Field Test in Preparation for the Full Field Test

    Science.gov (United States)

    Reschke, M. F.; Kozlovskaya, I. B.; Tomilovskaya, E. S.; Bloomberg, J. J.; Platts, S. H.; Rukavishnikov, I. V.; Fomina, E. V.; Stenger, M. B.; Lee, S. M. C.; Wood, S. J.; hide

    2014-01-01

    Ongoing collaborative research efforts between NASA's Neuroscience and Cardiovascular Laboratories and the Institute of Biomedical Problems' (IBMP) Sensory-Motor and Countermeasures Laboratories have been measuring functional sensorimotor, cardiovascular and strength responses following bed rest, dry immersion, short-duration (Space Shuttle) and long-duration (Mir and International Space Station [ISS]) space flights. While the unloading paradigms associated with dry immersion and bed rest do serve as acceptable flight analogs, testing of crew responses following long-duration flights has previously not been possible until a minimum of 24 hours after landing. As a result, it has not been possible to estimate the nonlinear trend of the early recovery without testing at the landing site. By joint agreement, this research effort has been identified as the functional Field Test (FT). For practical reasons the FT has been divided into two phases: the full FT and a preliminary pilot version (PFT) of the FT that is reduced in both length and scope. The primary goal of this research is to determine functional abilities in long-duration space-flight crews beginning as soon after landing as possible, with functional testing performed in conjunction with postural ataxia testing (quiet stance sway) as well as cardiovascular measurements during the other functional tasks. In addition to the immediate post-landing collection of data for the full FT, postflight data will be acquired between one and three more times within the 24 hours after landing and will continue over the subsequent weeks until functional sensorimotor and cardiovascular responses have returned to preflight normative values. The PFT represents a single trial run comprised of a jointly agreed-upon subset of tests from the full FT and relies heavily on IBMP's Sensory-Motor and Countermeasures Laboratories for content and implementation. The PFT has been collected on several ISS missions. Testing included: (1) a sit-to-stand test, (2) recovery from a fall

  6. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  7. SP-100 Test Site

    International Nuclear Information System (INIS)

    Cox, C.M.; Mahaffey, M.K.; Miller, W.C.

    1988-01-01

    Preparatory activities are well under way at Hanford to convert the 309 Containment Building and its associated service wing to a 2.5 MWt nuclear test facility for the SP-100 Ground Engineering System (GES) test. Preliminary design is complete, encompassing facility modifications, a secondary heat transport system, a large vacuum system to enclose the high temperature reactor, a test assembly cell and handling system, control and data processing systems, and safety and auxiliary systems. The design makes extensive use of existing equipment to minimize technical risk and cost. Refurbishment of this equipment is 75% complete. The facility has been cleared of obstructing equipment from its earlier reactor test. Current activities are focusing on definitive design and preparation of the Preliminary Safety Analysis Report (PSAR) aimed at procurement and construction approvals and schedules to achieve reactor criticality by January 1992. 6 refs

  8. Assessment of clinical utility of 18F-FDG PET in patients with head and neck cancer: a probability analysis

    International Nuclear Information System (INIS)

    Goerres, Gerhard W.; Mosna-Firlejczyk, Katarzyna; Schulthess, Gustav K. von; Steurer, Johann; Bachmann, Lucas M.

    2003-01-01

    The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the
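    The post-test probabilities reported in this abstract follow the standard odds conversion: multiply the pre-test odds by the likelihood ratio, then convert back to a probability. In the check below, the pre-test probability of 0.20 is an assumption approximating the registry prevalence of 19.6%.

    ```python
    def post_test_probability(pre_test_prob, likelihood_ratio):
        """Post-test probability via odds:
        post-test odds = pre-test odds * likelihood ratio."""
        odds = pre_test_prob / (1.0 - pre_test_prob) * likelihood_ratio
        return odds / (1.0 + odds)
    ```

    With a pre-test probability near 0.20, a positive LR of 3.9 gives about 0.49 and a negative LR of 0.24 gives about 0.057, close to the 49.4% and 5.7% disease probabilities reported for the primary assessment.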

  9. Preliminary nuclear design for test MOX Fuel rods

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Hyung Kook; Kim, Taek Kyum; Jeong, Hyung Guk; Noh, Jae Man; Cho, Jin Young; Kim, Young Il; Kim, Young Jin; Sohn, Dong Seong

    1997-10-01

    As part of activities for the future fuel development project, test MOX fuel rods are to be loaded and irradiated in the Halden reactor core under KAERI's joint international program with the Paul Scherrer Institute (PSI). PSI will fabricate the test MOX rods with an attrition mill device developed by KAERI. The test fuel assembly rig contains three MOX rods and three inert matrix rods. One of the three MOX rods will be fabricated by BNFL; the other two will be manufactured jointly by KAERI and PSI. The three inert matrix fuel rods will be fabricated from Zr-Y-Er-Pu oxide. A preliminary neutronic evaluation was performed for the test fuel assembly suggested by PSI. The power distribution of the test fuel rods was analyzed for various rod positions in the assembly, and the depletion characteristic curve for the test fuel was also determined. The rod positions in the test fuel assembly do not affect the rod power distribution, and the arrangement of test fuel rods proposed by PSI proved feasible. (author). 2 refs., 13 tabs., 16 figs.

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  12. Sample Size Determination for Rasch Model Tests

    Science.gov (United States)

    Draxler, Clemens

    2010-01-01

    This paper is concerned with supplementing statistical tests for the Rasch model so that, in addition to the probability of the error of the first kind (Type I probability), the probability of the error of the second kind (Type II probability) can be controlled at a predetermined level by basing the test on the appropriate number of observations.…
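    The idea described above, choosing the number of observations so that both the Type I and Type II error probabilities stay at preset levels, can be illustrated with a standard normal-approximation sample-size formula for comparing two proportions. This is a generic textbook sketch, not the paper's Rasch-specific procedure, and the proportions, α, and β below are illustrative assumptions:

    ```python
    import math
    from statistics import NormalDist

    def sample_size_two_proportions(p1: float, p2: float,
                                    alpha: float = 0.05, beta: float = 0.20) -> int:
        """Per-group sample size so that a two-sided level-alpha test of
        p1 vs p2 has Type II error at most beta (normal approximation)."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for Type I control
        z_b = NormalDist().inv_cdf(1 - beta)       # quantile for Type II control
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        n = (z_a + z_b) ** 2 * variance / (p1 - p2) ** 2
        return math.ceil(n)

    # Detect a shift from 50% to 60% at alpha = 0.05 with 80% power.
    n = sample_size_two_proportions(0.5, 0.6)
    print(n)
    ```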

  13. A preliminary bending fatigue spectrum for steel monostrand cables

    DEFF Research Database (Denmark)

    Winkler, Jan; Fischer, Gregor; Georgakis, Christos T.

    2011-01-01

    This paper presents the results of an experimental study on the bending fatigue resistance of high-strength steel monostrand cables. From the conducted fatigue tests in the high-stress, low-cycle region, a preliminary bending fatigue spectrum is derived for the estimation of monostrand cable service life expectancy. Such a bending fatigue spectrum for high-strength monostrands is currently unavailable in the published literature. The presented results provide relevant information on the bending mechanism and fatigue characteristics of monostrand steel cables in tension and flexure, and show that localized cable bending has a pronounced influence on the fatigue resistance of cables under dynamic excitations.

  14. Soda-Anthraquinone Durian (Durio Zibethinus Murr.) Rind Linerboard and Corrugated Medium Paper: A Preliminary Test

    Science.gov (United States)

    Rizal Masrol, Shaiful; Irwan Ibrahim, Mohd Halim; Adnan, Sharmiza; Mubarak Sa'adon, Amir; Ika Sukarno, Khairil; Fadrol Hisham Yusoff, Mohd

    2017-08-01

    A preliminary test was conducted to investigate the characteristics of linerboard and corrugated medium paper made from durian rind waste. Naturally dried durian rinds were pulped according to the Soda-Anthraquinone (Soda-AQ) pulping process with a condition of 20% active alkali, 0.1% AQ, 7:1 liquor-to-material ratio, 120 minutes cooking time and 170°C cooking temperature. Linerboard and corrugated medium paper with a basis weight of 120 gsm were prepared and evaluated according to Malaysian International Organization for Standardization (MS ISO) and Technical Association of the Pulp and Paper Industry (TAPPI) standards. The results indicate that the characteristics of durian rind linerboard are comparable with other wood- or non-wood-based papers and current commercial paper. However, the low CMT value for the corrugated medium and the water absorptiveness of the linerboard could be improved in future. Based on the bulk density (0.672 g/cm3), burst index (3.12 kPa.m2/g) and RCT (2.00 N.m2/g), durian rind has shown good potential and is suitable as an alternative raw material source for the linerboard industry.

  15. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    International Nuclear Information System (INIS)

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

    In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high-temperature, high-purity water. The following conclusions were drawn: (1) Based on the CBB test results, the initiation process of intergranular stress corrosion cracking can be approximated by a Poisson stochastic process. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)
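    The link asserted above, that a Poisson initiation process implies an exponentially distributed cracking life, can be sketched as follows. The failure-time data here are hypothetical illustrative values, not measurements from the paper:

    ```python
    import math

    # If crack initiation is a Poisson process with rate lam, the time T to
    # first initiation is exponential: P(T > t) = exp(-lam * t) (memoryless).
    def survival(lam: float, t: float) -> float:
        return math.exp(-lam * t)

    # Maximum-likelihood fit of the exponential rate from observed lives:
    # lam_hat = 1 / mean(lives).  (Hypothetical failure times, in hours.)
    lives = [120.0, 340.0, 95.0, 610.0, 210.0]
    lam_hat = len(lives) / sum(lives)

    # Memoryless property: P(T > s + t | T > s) equals P(T > t).
    s, t = 100.0, 50.0
    conditional = survival(lam_hat, s + t) / survival(lam_hat, s)
    print(lam_hat, conditional, survival(lam_hat, t))
    ```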

  16. Effect of preliminary plastic deformation on low temperature strength of carbon steels

    International Nuclear Information System (INIS)

    Gur'ev, A.V.; Alkhimenkov, T.B.

    1979-01-01

    Considered is the effect of preliminary plastic deformation at room temperature on the subsequent low-temperature strength (at -196 deg C) of structural carbon steels. The study of the regularities of microheterogeneous deformation among the alloy's structural elements at room and low temperatures shows that plastic deformation under subsequent low-temperature loading inherits the general mechanism of the preliminary deformation; in this sense the metal exhibits a "memory" of its loading history. It is established that the physical strengthening (cold hardening) acquired by the metal during preliminary loading at room temperature is superimposed on the strengthening associated solely with the decrease in test temperature

  17. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  18. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  19. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  20. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.