WorldWideScience

Sample records for stratified probability sample

  1. Gambling problems in the family – A stratified probability sample study of prevalence and reported consequences

    Directory of Open Access Journals (Sweden)

    Øren Anita

    2008-12-01

    Background: Prior studies on the impact of problem gambling in the family mainly include help-seeking populations with small numbers of participants. The objective of the present stratified probability sample study was to explore the epidemiology of problem gambling in the family in the general population. Methods: Men and women 16–74 years old, randomly selected from the Norwegian national population database, received an invitation to participate in this postal questionnaire study. The response rate was 36.1% (3,483/9,638). Given the lack of validated criteria, two survey questions ("Have you ever noticed that a close relative spent more and more money on gambling?" and "Have you ever experienced that a close relative lied to you about how much he/she gambles?") were extrapolated from the Lie/Bet Screen for pathological gambling. Respondents answering "yes" to both questions were defined as Concerned Significant Others (CSOs). Results: Overall, 2.0% of the study population was defined as CSOs. Young age, female gender, and divorced marital status were factors positively associated with being a CSO. CSOs often reported having experienced conflicts in the family related to gambling, worsening of the family's financial situation, and impaired mental and physical health. Conclusion: Problematic gambling behaviour not only affects the gambling individual but also has a strong impact on the quality of life of family members.

  2. Bayesian stratified sampling to assess corpus utility

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

    This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
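    The estimator family described here is easy to sketch. Below is a minimal, hypothetical Python illustration (the strata, sizes, and hit counts are invented, not the paper's Federal Register data): each stratum gets a Beta(1 + hits, 1 + misses) posterior for its proportion of "real" documents, and posterior draws are combined with the known stratum weights to yield a posterior for the corpus-wide proportion.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical strata with known sizes, the number of documents read
      # in each, and how many of those were judged "real" (all invented).
      stratum_size = np.array([30000, 10000, 5820])
      n_evaluated  = np.array([100, 60, 40])
      n_real       = np.array([90, 30, 10])

      weights = stratum_size / stratum_size.sum()

      # Beta(1, 1) prior per stratum; posterior is Beta(1 + hits, 1 + misses).
      draws = rng.beta(1 + n_real[:, None],
                       1 + (n_evaluated - n_real)[:, None],
                       size=(3, 100_000))

      # Posterior for the population-wide proportion of "real" documents.
      overall = weights @ draws
      print(f"posterior mean: {overall.mean():.3f}")
      print(f"95% credible interval: {np.percentile(overall, [2.5, 97.5])}")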

  3. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

    The objective was to explore classification rules, derived with data mining methodologies, for defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics, then constructed decision trees on the cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single-specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and the population density of the provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and the number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of the variance, respectively. This study demonstrated that data mining methods can be used to design efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
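    The two-step recipe (cluster, then extract readable rules from a shallow tree) is straightforward to sketch. The following is a hedged Python illustration with synthetic provider profiles; the feature names echo the paper's stratification variables, but the data and cut-points are invented.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(1)

      # Hypothetical provider profiles (all invented).
      X = rng.gamma(shape=2.0, scale=[50.0, 20.0, 500.0], size=(1000, 3))

      # Step 1: group providers with similar characteristics.
      clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

      # Step 2: a shallow tree fit on the cluster labels yields readable
      # stratification rules (cut-points on the stratification variables).
      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, clusters)
      print(export_text(tree, feature_names=[
          "inpatients_per_specialist", "beds", "pop_density"]))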

  4. Monte Carlo stratified source-sampling

    International Nuclear Information System (INIS)

    Blomquist, R.N.; Gelbard, E.M.

    1997-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "eigenvalue of the world" problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. The original test problem was treated by a special code designed specifically for that purpose. Recently ANL started work on a method for dealing with more realistic "eigenvalue of the world" configurations, and has been incorporating this method into VIM. The original method has been modified to take into account real-world statistical noise sources not included in the model problem. This paper constitutes a status report on work still in progress.

  5. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.

  6. Distribution-Preserving Stratified Sampling for Learning Problems.

    Science.gov (United States)

    Cervellera, Cristiano; Maccio, Danilo

    2017-06-09

    The need for extracting a small sample from a large amount of real data, possibly streaming, arises routinely in learning problems, e.g., for storage, to cope with computational limitations, to obtain good training/test/validation sets, and to select minibatches for stochastic gradient neural network training. Unless we have reasons to select the samples in an active way dictated by the specific task and/or model at hand, it is important that the distribution of the selected points is as similar as possible to that of the original data. This is obvious for unsupervised learning problems, where the goal is to gain insights into the distribution of the data, but it is also relevant for supervised problems, where the theory explains how the training set distribution influences the generalization error. In this paper, we analyze the technique of stratified sampling from the point of view of distances between probabilities. This allows us to introduce an algorithm, based on recursive binary partition of the input space, aimed at obtaining samples that are distributed as similarly as possible to the original data. A theoretical analysis is proposed, proving the (greedy) optimality of the procedure together with explicit error bounds. An adaptive version of the algorithm is also introduced to cope with streaming data. Simulation tests on various data sets and different learning tasks are also provided.
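    The recursive-binary-partition idea lends itself to a compact sketch. The following Python is a simplified rendering under stated assumptions (split the widest dimension at its median, allocate the budget proportionally, take one representative per leaf); it is not the authors' exact algorithm and carries none of their error bounds.

      import numpy as np

      def stratified_subsample(X, m, rng):
          """Draw roughly m points whose empirical distribution tracks X's,
          via recursive binary partition of the input space (a simplified
          sketch, not the authors' exact algorithm)."""
          def recurse(idx, k):
              if k <= 1 or len(idx) <= 1:
                  return [rng.choice(idx)]             # one point per cell
              d = int(np.argmax(X[idx].max(0) - X[idx].min(0)))  # widest axis
              med = np.median(X[idx, d])
              left, right = idx[X[idx, d] <= med], idx[X[idx, d] > med]
              if len(left) == 0 or len(right) == 0:    # degenerate split
                  return list(rng.choice(idx, size=min(k, len(idx)),
                                         replace=False))
              kl = min(max(1, round(k * len(left) / len(idx))), k - 1)
              return recurse(left, kl) + recurse(right, k - kl)
          return X[recurse(np.arange(len(X)), m)]

      rng = np.random.default_rng(2)
      X = rng.normal(size=(10_000, 2)) ** 3            # skewed toy data
      sample = stratified_subsample(X, 256, rng)
      print(sample.shape, X.mean(0), sample.mean(0))   # means should agree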

  7. Design and simulation of stratified probability digital receiver with application to the multipath communication

    Science.gov (United States)

    Deal, J. H.

    1975-01-01

    One approach to simplifying complex nonlinear filtering algorithms is to use stratified probability approximations, in which the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in the paper and used to simplify the filtering algorithms derived for the optimum receiver for signals corrupted by both additive and multiplicative noise.

  8. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1,179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
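    The sample size quoted in this abstract is consistent with the standard formula for estimating a proportion with a finite-population correction; a quick check in Python (assuming z = 1.96 for the 95% confidence level) reproduces n = 290.

      import math

      N, z, d, p = 1179, 1.96, 0.05, 0.5

      n0 = z**2 * p * (1 - p) / d**2      # infinite-population size (~384.2)
      n = n0 / (1 + (n0 - 1) / N)         # finite-population correction
      print(math.ceil(n))                 # -> 290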

  9. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  10. Temporally stratified sampling programs for estimation of fish impingement

    International Nuclear Information System (INIS)

    Kumar, K.D.; Griffith, J.S.

    1977-01-01

    Impingement monitoring programs often expend valuable and limited resources and fail to provide a dependable estimate of either total annual impingement or those biological and physicochemical factors affecting impingement. In situations where initial monitoring has identified "problem" fish species and the periodicity of their impingement, intensive sampling during periods of high impingement will maximize information obtained. We use data gathered at two nuclear generating facilities in the southeastern United States to discuss techniques of designing such temporally stratified monitoring programs and their benefits and drawbacks. Of the possible temporal patterns in environmental factors within a calendar year, differences among seasons are most influential in the impingement of freshwater fishes in the Southeast. Data on the threadfin shad (Dorosoma petenense) and the role of seasonal temperature changes are utilized as an example to demonstrate ways of most efficiently and accurately estimating impingement of the species.

  11. Sampling, Probability Models and Statistical Reasoning - Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  12. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)


  13. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  14. Prototypic Features of Loneliness in a Stratified Sample of Adolescents

    Directory of Open Access Journals (Sweden)

    Mathias Lasgaard

    2009-06-01

    Dominant theoretical approaches in loneliness research emphasize the value of personality characteristics in explaining loneliness. The present study examines whether dysfunctional social strategies and attributions in lonely adolescents can be explained by personality characteristics. A questionnaire survey was conducted with 379 Danish Grade 8 students (M = 14.1 years, SD = 0.4) from 22 geographically stratified and randomly selected schools. Hierarchical linear regression analysis showed that network orientation, success expectation and avoidance in affiliative situations predicted loneliness independent of personality characteristics, demographics and social desirability. The study indicates that dysfunctional strategies and attributions in affiliative situations are directly related to loneliness in adolescence. These strategies and attributions may preclude lonely adolescents from receiving guidance and intervention. Thus, professionals need to be knowledgeable about prototypic features of loneliness, in addition to employing a pro-active approach, when assisting adolescents who display such features.

  15. Estimation of Finite Population Mean in Multivariate Stratified Sampling under Cost Function Using Goal Programming

    Directory of Open Access Journals (Sweden)

    Atta Ullah

    2014-01-01

    In practical use of the stratified random sampling scheme, the investigator faces the problem of selecting a sample that maximizes the precision of a finite population mean under a cost constraint. Allocation of the sample size becomes complicated when more than one characteristic is observed from each selected unit in a sample. In many real-life situations, a linear cost function of the sample size n_h is not a good approximation to the actual cost of the sample survey when the traveling cost between selected units in a stratum is significant. In this paper, the sample allocation problem in multivariate stratified random sampling with the proposed cost function is formulated as an integer nonlinear multiobjective mathematical programming problem. A solution procedure is proposed using an extended lexicographic goal programming approach. A numerical example is presented to illustrate the computational details and to compare the efficiency of the proposed compromise allocation.
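    As a point of reference for what the goal-programming formulation improves upon, the sketch below computes per-characteristic Neyman allocations and a naive averaged compromise. All stratum figures are invented, and neither the paper's travel-cost term nor its extended lexicographic goal programming is modeled here.

      import numpy as np

      # Illustrative strata for two characteristics (all numbers invented).
      N_h = np.array([400, 300, 300])          # stratum sizes
      S = np.array([[8.0, 3.0, 5.0],           # stratum SDs, characteristic 1
                    [2.0, 6.0, 4.0]])          # stratum SDs, characteristic 2
      n_total = 120

      # Neyman allocation for each characteristic separately ...
      neyman = (N_h * S) / (N_h * S).sum(axis=1, keepdims=True) * n_total

      # ... then a naive compromise: average the allocations and round,
      # keeping at least 2 units per stratum so variances are estimable.
      n_h = np.maximum(2, np.rint(neyman.mean(axis=0))).astype(int)
      print(dict(zip(["stratum 1", "stratum 2", "stratum 3"], n_h)))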

  16. Unit Stratified Sampling as a Tool for Approximation of Stochastic Optimization Problems

    Czech Academy of Sciences Publication Activity Database

    Šmíd, Martin

    2012-01-01

    Roč. 19, č. 30 (2012), s. 153-169 ISSN 1212-074X R&D Projects: GA ČR GAP402/11/0150; GA ČR GAP402/10/0956; GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : Stochastic programming * approximation * stratified sampling Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/E/smid-unit stratified sampling as a tool for approximation of stochastic optimization problems.pdf

  17. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the π-estimator that uses the probability sample can be

  18. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither the magnitude nor the direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimation are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs.

  19. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

    Objective: To optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods: A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which takes plant abundance as an auxiliary variable, was explored in an experimental study in a 50 m × 50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; secondly, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA was performed. Results: The method proposed in this study (SOPA) had the minimal absolute error, 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion: The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.

  20. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact the statistical outcome. Power analysis based on the study results indicates that optimum power is achieved when unnecessary stratification is avoided. It is emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.

  1. Data splitting for artificial neural networks using SOM-based stratified sampling.

    Science.gov (United States)

    May, R J; Maier, H R; Dandy, G C

    2010-03-01

    Data splitting is an important consideration during artificial neural network (ANN) development, where hold-out cross-validation is commonly employed to ensure generalization. Even for a moderate sample size, the sampling methodology used for data splitting can have a significant effect on the quality of the subsets used for training, testing and validating an ANN. Poor data splitting can result in inaccurate and highly variable model performance; however, the choice of sampling methodology is rarely given due consideration by ANN modellers. Increased confidence in the sampling is of paramount importance, since the hold-out sampling is generally performed only once during ANN development. This paper considers the variability in the quality of subsets that are obtained using different data splitting approaches. A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, with several guidelines identified for setting the SOM size and sample allocation in order to minimize the bias and variance in the datasets. Using an example ANN function approximation task, the SOM-based approach is evaluated in comparison to random sampling, DUPLEX, systematic stratified sampling, and trial-and-error sampling to minimize the statistical differences between data sets. Of these approaches, DUPLEX is found to provide benchmark performance, with good model performance and no variability. The results show that the SOM-based approach also reliably generates high-quality samples and can therefore be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability to perform data splitting on large datasets. Copyright 2009 Elsevier Ltd. All rights reserved.
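    The overall recipe (partition the data, then allocate samples to cells by the Neyman principle, i.e., proportionally to cell size times within-cell spread) can be sketched as follows. KMeans stands in for the SOM, and the function name and parameters are hypothetical; this is not the authors' exact procedure.

      import numpy as np
      from sklearn.cluster import KMeans

      def stratified_split(X, n_sample, n_cells=16, seed=0):
          """Draw ~n_sample indices by partitioning X into cells and
          sampling each cell proportionally to size * spread (Neyman).
          KMeans is a stand-in for the SOM used in the paper."""
          rng = np.random.default_rng(seed)
          labels = KMeans(n_clusters=n_cells, n_init=10,
                          random_state=seed).fit_predict(X)
          sizes = np.bincount(labels, minlength=n_cells)
          spreads = np.array([X[labels == c].std() if sizes[c] else 0.0
                              for c in range(n_cells)])
          alloc = sizes * spreads
          alloc = np.rint(alloc / alloc.sum() * n_sample).astype(int)
          chosen = []
          for c in range(n_cells):
              idx = np.flatnonzero(labels == c)
              chosen.extend(rng.choice(idx, size=min(alloc[c], len(idx)),
                                       replace=False))
          return np.array(chosen)

      X = np.random.default_rng(3).normal(size=(5000, 4))
      test_idx = stratified_split(X, n_sample=500)
      print(len(test_idx))          # roughly 500, up to rounding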

  2. Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling

    Science.gov (United States)

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based,…

  3. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.

  4. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  5. Bounds for Tail Probabilities of the Sample Variance

    Directory of Open Access Journals (Sweden)

    Van Zuijlen M

    2009-01-01

    We provide bounds for tail probabilities of the sample variance. The bounds are expressed in terms of Hoeffding functions and are the sharpest known. They are designed with applications in mind in auditing as well as in processing data related to the environment.

  6. Brachytherapy dose-volume histogram computations using optimized stratified sampling methods

    International Nuclear Information System (INIS)

    Karouzakis, K.; Lahanas, M.; Milickovic, N.; Giannouli, S.; Baltas, D.; Zamboglou, N.

    2002-01-01

    A stratified sampling method for the efficient repeated computation of dose-volume histograms (DVHs) in brachytherapy is presented, as used for anatomy-based brachytherapy optimization methods. The aim of the method is to reduce the number of sampling points required for the calculation of DVHs for the body and the PTV. From the DVHs, quantities such as the conformity index (COIN) and COIN integrals are derived. This is achieved by using partially uniformly distributed sampling points, with a density in each region obtained from a survey of the gradients or the variance of the dose distribution in that region. The shape of the sampling regions is adapted to the patient anatomy and the shape and size of the implant. For the application of this method a single preprocessing step is necessary, which requires only a few seconds. Ten clinical implants were used to study the appropriate number of sampling points, given a required accuracy for quantities such as cumulative DVHs, COIN indices and COIN integrals. We found that DVHs of very large tissue volumes surrounding the PTV, and also COIN distributions, can be obtained using 5-10 times fewer sampling points than with uniformly distributed points.

  7. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters - topographic wetness index and potential incoming solar radiation - derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. The selection of sample point locations has been done using

  8. Optimal sample size for probability of detection curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2013-01-01

    Highlights: • We investigate sample size requirements to develop probability of detection curves. • We develop simulations to determine effective inspection target sizes, number and distribution. • We summarize these findings and provide guidelines for the NDE practitioner. -- Abstract: The use of probability of detection curves to quantify the reliability of non-destructive examination (NDE) systems is common in the aeronautical industry, but relatively less so in the nuclear industry, at least in European countries. Due to the nature of the components being inspected, sample sizes tend to be much lower. This makes the manufacturing of test pieces with representative flaws, in sufficient numbers so as to draw statistical conclusions on the reliability of the NDT system under investigation, quite costly. The European Network for Inspection and Qualification (ENIQ) has developed an inspection qualification methodology, referred to as the ENIQ Methodology. It has become widely used in many European countries and provides assurance on the reliability of NDE systems, but only qualitatively. The need to quantify the output of inspection qualification has become more important as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. A measure of the NDE reliability is necessary to quantify risk reduction after inspection, and probability of detection (POD) curves provide such a metric. The Joint Research Centre, Petten, The Netherlands supported ENIQ by investigating the question of the sample size required to determine a reliable POD curve. As mentioned earlier, manufacturing of test pieces with defects that are typically found in nuclear power plants (NPPs) is usually quite expensive. Thus there is a tendency to reduce sample sizes, which in turn increases the uncertainty associated with the resulting POD curve. The main question in conjunction with POD curves is the appropriate sample size. Not
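    The sample-size question can be explored by simulation: fit a logistic POD curve to hit/miss data and watch the spread of a derived quantity such as a90 (the flaw size detected with 90% probability) shrink as the number of inspected targets grows. The true curve parameters below are invented for illustration.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)

      def estimated_a90(n):
          """One simulated inspection experiment of n targets; returns the
          fitted a90 (flaw size with 90% detection probability)."""
          a = rng.uniform(0.5, 10.0, n)                  # flaw sizes, mm
          p_true = 1 / (1 + np.exp(-(a - 4.0)))          # invented true POD
          hit = rng.random(n) < p_true
          m = LogisticRegression(C=1e6).fit(a[:, None], hit)
          b0, b1 = m.intercept_[0], m.coef_[0, 0]
          return (np.log(9) - b0) / b1                   # POD(a90) = 0.9

      # The spread of the estimate shrinks with sample size -- the
      # trade-off against test-piece cost that the report quantifies.
      for n in (30, 100, 300):
          est = [estimated_a90(n) for _ in range(200)]
          print(f"n={n:4d}: a90 mean {np.mean(est):5.2f}, sd {np.std(est):.2f}")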

  9. Employment status, inflation and suicidal behaviour: an analysis of a stratified sample in Italy.

    Science.gov (United States)

    Solano, Paola; Pizzorno, Enrico; Gallina, Anna M; Mattei, Chiara; Gabrielli, Filippo; Kayman, Joshua

    2012-09-01

    There is abundant empirical evidence of a surplus risk of suicide among the unemployed, although few studies have investigated the influence of economic downturns on suicidal behaviours in an employment-status-stratified sample. We investigated how economic inflation affected suicidal behaviours according to employment status in Italy from 2001 to 2008. Data concerning economically active people were provided by the Italian Institute for Statistical Analysis and by the International Monetary Fund. The association between inflation and completed versus attempted suicide with respect to employment status was investigated in every year and quarter-year of the study time frame. We considered three occupational categories: the employed, the unemployed who were previously employed (PE) and the unemployed who had never worked (NE). The unemployed are at higher suicide risk than the employed. Among the PE, a significant association between inflation and suicide attempt was found, whereas no association was reported concerning completed suicides. No association with inflation was found for completed and attempted suicides among the employed and the NE. Completed suicide in females is significantly associated with unemployment in every quarter-year. The reported vulnerability to suicidal behaviours among the PE as inflation rises underlines the need for effective support strategies for both genders in times of economic downturn.

  10. Effect of optimum stratification on sampling with varying probabilities under proportional allocation

    Directory of Open Access Journals (Sweden)

    Syed Ejaz Husain Rizvi

    2007-10-01

    The problem of optimum stratification on an auxiliary variable when the units from different strata are selected with probability proportional to the value of the auxiliary variable (PPSWR) was considered by Singh (1975) for the univariate case. In this paper we extend the same problem, for proportional allocation, to the case when two variates are under study. A cum. ∛R3(x) rule for obtaining approximately optimum strata boundaries has been provided. It has been shown, theoretically as well as empirically, that the use of stratification has an inverse effect on the relative efficiency of PPSWR as compared to unstratified PPSWR when the proportional method of allocation is envisaged. Further comparison showed that, with an increase in the number of strata, stratified simple random sampling becomes as efficient as PPSWR.
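    The cumulative-root family of rules is straightforward to implement. The sketch below is a generic cumulative-root-of-frequency boundary finder in the spirit of the classical Dalenius-Hodges cum √f rule, with root = 3 as a nod to the cube-root rule above; it is an approximation for illustration, not Singh's exact derivation.

      import numpy as np

      def cum_rule_boundaries(x, n_strata, root=3.0, bins=50):
          """Approximate optimum strata boundaries on an auxiliary variable
          via a cumulative root-of-frequency rule (Dalenius-Hodges style)."""
          freq, edges = np.histogram(x, bins=bins)
          cum = np.cumsum(freq ** (1.0 / root))
          targets = cum[-1] * np.arange(1, n_strata) / n_strata
          return edges[1:][np.searchsorted(cum, targets)]

      x = np.random.default_rng(5).lognormal(mean=0.0, sigma=0.75, size=20_000)
      print(cum_rule_boundaries(x, n_strata=4))    # three interior boundaries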

  11. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.

  12. Improving Precision and Reducing Runtime of Microscopic Traffic Simulators through Stratified Sampling

    Directory of Open Access Journals (Sweden)

    Khewal Bhupendra Kesur

    2013-01-01

    This paper examines the application of Latin Hypercube Sampling (LHS) and Antithetic Variables (AV) to reduce the variance of estimated performance measures from microscopic traffic simulators. LHS and AV allow for a more representative coverage of input probability distributions through stratification, reducing the standard error of simulation outputs. Two methods of implementation are examined, one where stratification is applied to headways and routing decisions of individual vehicles and another where vehicle counts and entry times are more evenly sampled. The proposed methods have wider applicability in general queuing systems. LHS is found to outperform AV, and reductions of up to 71% in the standard error of estimates of traffic network performance relative to independent sampling are obtained. LHS allows for a reduction in the execution time of computationally expensive microscopic traffic simulators, as fewer simulations are required to achieve a fixed level of precision, with reductions of up to 84% in computing time noted on the test cases considered. The benefits of LHS are amplified for more congested networks and as the required level of precision increases.
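    The variance reduction LHS delivers is visible even in one dimension. The toy "simulator" below is an invented nonlinear response, not a traffic model; it only demonstrates how drawing one uniform variate per stratum shrinks the standard error of an output mean relative to independent sampling.

      import numpy as np

      rng = np.random.default_rng(6)

      def mean_response(u):
          # Toy stand-in for a simulator: a nonlinear response to uniform
          # input variates (e.g., scaled vehicle headways).
          return np.exp(2.0 * u).mean()

      n, reps = 100, 2000
      plain, lhs = [], []
      for _ in range(reps):
          plain.append(mean_response(rng.random(n)))
          # 1-D LHS: one uniform draw inside each of n equal-width strata.
          strata = (np.arange(n) + rng.random(n)) / n
          lhs.append(mean_response(strata))

      print(f"std, independent sampling: {np.std(plain):.4f}")
      print(f"std, Latin hypercube:      {np.std(lhs):.4f}")   # much smaller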

  13. Optimal Sample Size for Probability of Detection Curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2012-01-01

    The use of Probability of Detection (POD) curves to quantify NDT reliability is common in the aeronautical industry, but relatively less so in the nuclear industry. The European Network for Inspection Qualification's (ENIQ) Inspection Qualification Methodology is based on the concept of Technical Justification, a document assembling all the evidence to assure that the NDT system in focus is indeed capable of finding the flaws for which it was designed. This methodology has become widely used in many countries, but the assurance it provides is usually of a qualitative nature. The need to quantify the output of inspection qualification has become more important, especially as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. To credit the inspections in structural reliability evaluations, a measure of the NDT reliability is necessary. A POD curve provides such a metric. In 2010 ENIQ developed a technical report on POD curves, reviewing the statistical models used to quantify inspection reliability. Further work was subsequently carried out to investigate the issue of optimal sample size for deriving a POD curve, so that adequate guidance could be given to the practitioners of inspection reliability. Manufacturing of test pieces with cracks that are representative of real defects found in nuclear power plants (NPP) can be very expensive. Thus there is a tendency to reduce sample sizes and in turn reduce the conservatism associated with the POD curve derived. Not much guidance on the correct sample size can be found in the published literature, where often qualitative statements are given with no further justification. The aim of this paper is to summarise the findings of such work. (author)

  14. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  15. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    Science.gov (United States)

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or the group is concerned that making its population public would bring social stigma, we say the population is hidden. It is difficult to survey this kind of population because the response rate is low and members are not entirely honest in their responses when probability sampling is used. The only alternative known to address the problems of earlier methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results, the influence of the initial chain-referral seeds in RDS tends to diminish as the sample gets bigger, and the sample becomes stabilized as the waves progress. This shows that the final sample information can be completely independent of the initial seeds if a certain sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that can improve upon both key informant sampling and ethnographic surveys, and it needs to be utilized for various cases domestically as well.

  16. Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures

    Science.gov (United States)

    Prodromou, Theodosia

    2016-01-01

    In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…

  17. Sampling informative/complex a priori probability distributions using Gibbs sampling assisted by sequential simulation

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    2010-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....
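    The accept/reject core that such samplers build on fits in a few lines. This is a generic random-walk Metropolis sketch with an invented two-dimensional posterior; the paper's Gibbs-plus-sequential-simulation machinery for complex priors is far richer than this.

      import numpy as np

      rng = np.random.default_rng(8)

      def log_post(m):
          # Invented 2-D posterior: standard normal prior times a tight
          # Gaussian "likelihood" on the sum of the two parameters.
          return -0.5 * (m @ m) - 0.5 * ((m[0] + m[1] - 1.0) / 0.3) ** 2

      m, chain = np.zeros(2), []
      for _ in range(20_000):
          proposal = m + 0.3 * rng.normal(size=2)      # random-walk step
          if np.log(rng.random()) < log_post(proposal) - log_post(m):
              m = proposal                             # accept
          chain.append(m)
      print(np.mean(chain, axis=0))                    # posterior mean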

  18. Stratified Sampling to Define Levels of Petrographic Variation in Coal Beds: Examples from Indonesia and New Zealand

    Directory of Open Access Journals (Sweden)

    Tim A. Moore

    2016-01-01

    DOI: 10.17014/ijog.3.1.29-51. Stratified sampling of coal seams for petrographic analysis using block samples is a viable alternative to standard methods of channel sampling and particulate pellet mounts. Although petrographic analysis of particulate pellets is employed widely, it is both time consuming and does not allow variation within sampling units to be assessed - an important measure in any study, whether for paleoenvironmental reconstruction or for obtaining estimates of industrial attributes. Also, samples taken as intact blocks provide additional information, such as texture and botanical affinity, that cannot be gained using particulate pellets. Stratified sampling can be employed on both 'fine'- and 'coarse'-grained coal units. Fine-grained coals are defined as those coal intervals that do not contain vitrain bands greater than approximately 1 mm in thickness (as measured perpendicular to bedding). In fine-grained coal seams, a reasonably sized block sample (with a polished surface area of ~3 cm²) can be taken that encapsulates the macroscopic variability. However, for coarse-grained coals (vitrain bands >1 mm), a different system has to be employed in order to accurately account for the larger particles. Macroscopic point counting of vitrain bands can accurately account for those particles >1 mm within a coal interval. This point counting is conducted using something as simple as string on a coal face with marked intervals greater than the largest particle expected to be encountered (although new technologies are being developed to capture this type of information digitally). Comparative analyses of particulate pellets and blocks on the same interval show less than 6% variation between the two sample types when blocks are recalculated to include macroscopic counts of vitrain. Therefore, even in coarse-grained coals, stratified sampling can be used effectively and representatively.

  19. Interactive Fuzzy Goal Programming approach in multi-response stratified sample surveys

    Directory of Open Access Journals (Sweden)

    Gupta Neha

    2016-01-01

    In this paper, we applied an Interactive Fuzzy Goal Programming (IFGP) approach with linear, exponential and hyperbolic membership functions, which focuses on maximizing the minimum membership values, to determine the preferred compromise solution for the multi-response stratified surveys problem, formulated as a Multi-Objective Non-Linear Programming Problem (MONLPP); by linearizing the nonlinear objective functions at their individual optimum solutions, the problem is approximated by an Integer Linear Programming Problem (ILPP). A numerical example based on real data is given, and a comparison with some existing allocations, viz. Cochran's compromise allocation, Chatterjee's compromise allocation and Khowaja's compromise allocation, is made to demonstrate the utility of the approach.

  20. Discrete element method (DEM) simulations of stratified sampling during solid dosage form manufacturing.

    Science.gov (United States)

    Hancock, Bruno C; Ketterhagen, William R

    2011-10-14

    Discrete element model (DEM) simulations of the discharge of powders from hoppers under gravity were analyzed to provide estimates of dosage form content uniformity during the manufacture of solid dosage forms (tablets and capsules). For a system that exhibits moderate segregation the effects of sample size, number, and location within the batch were determined. The various sampling approaches were compared to current best-practices for sampling described in the Product Quality Research Institute (PQRI) Blend Uniformity Working Group (BUWG) guidelines. Sampling uniformly across the discharge process gave the most accurate results with respect to identifying segregation trends. Sigmoidal sampling (as recommended in the PQRI BUWG guidelines) tended to overestimate potential segregation issues, whereas truncated sampling (common in industrial practice) tended to underestimate them. The size of the sample had a major effect on the absolute potency RSD. The number of sampling locations (10 vs. 20) had very little effect on the trends in the data, and the number of samples analyzed at each location (1 vs. 3 vs. 7) had only a small effect for the sampling conditions examined. The results of this work provide greater understanding of the effect of different sampling approaches on the measured content uniformity of real dosage forms, and can help to guide the choice of appropriate sampling protocols. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Probability Sampling - A Guideline for Quantitative Health Care ...

    African Journals Online (AJOL)


  2. A census-weighted, spatially-stratified household sampling strategy for urban malaria epidemiology

    Directory of Open Access Journals (Sweden)

    Slutsker Laurence

    2008-02-01

    Background: Urban malaria is likely to become increasingly important as a consequence of the growing proportion of Africans living in cities. A novel sampling strategy was developed for urban areas to generate a sample simultaneously representative of population and inhabited environments. Such a strategy should facilitate analysis of important epidemiological relationships in this ecological context. Methods: Census maps and summary data for Kisumu, Kenya, were used to create a pseudo-sampling frame using the geographic coordinates of census-sampled structures. For every enumeration area (EA) designated as urban by the census (n = 535), a sample of structures equal to one-tenth the number of households was selected. In EAs designated as rural (n = 32), a geographically random sample totalling one-tenth the number of households was selected from a grid of points at 100 m intervals. The selected samples were cross-referenced to a geographic information system, and coordinates transferred to handheld global positioning units. Interviewers found the closest eligible household to the sampling point and interviewed the caregiver of an age-eligible child. Results: 4,336 interviews were completed in 473 of the 567 study area EAs from June 2002 through February 2003. EAs without completed interviews were randomly distributed, and non-response was approximately 2%. Mean distance from the assigned sampling point to the completed interview was 74.6 m, and was significantly less in urban than rural EAs, even when controlling for number of households. The selected sample had significantly more children and females of childbearing age than the general population, and fewer older individuals. Conclusion: This method selected a sample that was simultaneously population-representative and inclusive of important environmental variation. The use of a pseudo-sampling frame and pre-programmed handheld GPS units is more efficient and may yield a more complete sample than

  3. Data-driven probability concentration and sampling on manifold

    Energy Technology Data Exchange (ETDEWEB)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)

    2016-09-15

    A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.

  4. Importance sampling for failure probabilities in computing and data transmission

    DEFF Research Database (Denmark)

    Asmussen, Søren

    2009-01-01

    In this paper we study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where a good importance distribution is identified via an asymptotic description of the conditional distribution of T given X > x. If T ≡ t is constant, the problem reduces to the efficient simulation of geometric sums, and a standard algorithm involving a Cramér-type root, γ(t), is available. However, we also discuss an algorithm that avoids finding the root. If T is random, particular attention is given to T having either a gamma-like tail or a regularly varying tail, and to failures at Poisson times. Different types of conditional limit occur, in particular exponentially tilted Gumbel distributions and Pareto distributions. The algorithms based upon importance distributions for T using
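    Exponential tilting, the device behind the Cramér-root algorithm mentioned here, can be demonstrated in a toy setting: estimating P(S > x) for a sum of n unit exponentials (not the paper's restart model). The tilt θ is chosen so that the tilted mean n/(1 − θ) matches x.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)

      n, x = 10, 30.0
      theta = 1 - n / x          # tilt putting the tilted mean at x (= 2/3)
      m = 100_000

      # Under the tilt, each Exp(1) variable becomes Exp(rate 1 - theta).
      S = rng.exponential(1.0 / (1.0 - theta), size=(m, n)).sum(axis=1)
      lr = (1.0 - theta) ** (-n) * np.exp(-theta * S)  # likelihood ratio
      est = np.where(S > x, lr, 0.0)

      print(f"IS estimate: {est.mean():.3e}")
      print(f"exact value: {stats.gamma.sf(x, a=n):.3e}")  # Gamma(n,1) tail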

  5. Sampling high-altitude and stratified mating flights of red imported fire ant.

    Science.gov (United States)

    Fritz, Gary N; Fritz, Ann H; Vander Meer, Robert K

    2011-05-01

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens and males during mating flights at altitudinal intervals reaching as high as approximately 140 m. Our trapping system uses an electric winch and a 1.2-m spindle bolted to a swiveling platform. The winch dispenses up to 183 m of Kevlar-core nylon rope, and the spindle stores 10 panels (0.9 by 4.6 m each) of nylon tulle impregnated with Tangle-Trap. The panels can be attached to the rope at various intervals and hoisted into the air using a 3-m-diameter, helium-filled balloon. Raising or lowering all 10 panels takes approximately 15-20 min. This trap should also be useful for altitudinal sampling of other insects of medical importance.

  6. Importance Sampling for Failure Probabilities in Computing and Data Transmission

    DEFF Research Database (Denmark)

    Asmussen, Søren

    We study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where one tries to identify a good importance distribution via an asymptotic description of the conditional distribution of T given X > x. If T ≡ t is constant, the problem reduces to the efficient simulation of geometric sums, and a standard algorithm involving a Cramér-type root γ(t) is available. However, we also discuss an algorithm avoiding the root-finding. If T is random, particular attention is given to T having either a gamma-like tail or a regularly varying tail, and to failures at Poisson times. Different types of conditional limits occur, in particular exponentially tilted Gumbel distributions and Pareto distributions. The algorithms based upon importance distributions for T using

  7. A direct observation method for auditing large urban centers using stratified sampling, mobile GIS technology and virtual environments.

    Science.gov (United States)

    Lafontaine, Sean J V; Sawada, M; Kristjansson, Elizabeth

    2017-02-16

    With the expansion and growth of research on neighbourhood characteristics, there is an increased need for direct observational field audits. Herein, we introduce a novel direct observational audit method and systematic social observation instrument (SSOI) for efficiently assessing neighbourhood aesthetics over large urban areas. Our audit method uses spatial random sampling stratified by residential zoning and incorporates both mobile geographic information systems technology and virtual environments. The reliability of our method was tested in two ways: first, in 15 Ottawa neighbourhoods, we compared results at audited locations over two subsequent years; and second, we audited every residential block (167 blocks) in one neighbourhood and compared the distribution of SSOI aesthetics index scores with results from the randomly audited locations. Finally, we present interrater reliability and consistency results on all observed items. The observed neighbourhood average aesthetics index score estimated from four or five stratified random audit locations is sufficient to characterize the average neighbourhood aesthetics. The SSOI was internally consistent and demonstrated good to excellent interrater reliability. At the neighbourhood level, aesthetics is positively related to SES and physical activity and negatively correlated with BMI. The proposed approach to direct neighbourhood auditing performs sufficiently well and has the advantage of financial and temporal efficiency when auditing a large city.

  8. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

    The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information on the exposure distribution of all the miners at the same time. Three stratification variables are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected without bias. Following implementation of the plan and analysis of collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and the collective exposures.
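
    As a concrete illustration of the plan's three-way stratification, the sketch below (with invented strata names) enumerates every location-occupation-time cell and draws a random, unbiased measurement schedule from it.

    ```python
    import itertools
    import random

    random.seed(42)

    # Hypothetical strata; a real plan would enumerate the mine's own
    # locations, occupations and sampling times.
    locations = ["stope A", "stope B", "haulage drift", "crusher station"]
    occupations = ["driller", "mucker", "timberman", "hoist operator"]
    times = ["shift start", "mid-shift", "shift end"]

    # One cell per (location, occupation, time) combination; sampling
    # cells at random gives every combination an equal chance of selection.
    cells = list(itertools.product(locations, occupations, times))
    schedule = random.sample(cells, k=12)   # 12 dust measurements this cycle
    for loc, occ, t in schedule:
        print(f"sample respirable dust: {occ} at {loc}, {t}")
    ```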

  9. Stratifying empiric risk of schizophrenia among first degree relatives using multiple predictors in two independent Indian samples.

    Science.gov (United States)

    Bhatia, Triptish; Gettig, Elizabeth A; Gottesman, Irving I; Berliner, Jonathan; Mishra, N N; Nimgaonkar, Vishwajit L; Deshpande, Smita N

    2016-12-01

    Schizophrenia (SZ) has an estimated heritability of 64-88%, with the higher values based on twin studies. Conventionally, family history of psychosis is the best individual-level predictor of risk, but reliable risk estimates are unavailable for Indian populations. Genetic, environmental, and epigenetic factors are equally important and should be considered when predicting risk in 'at risk' individuals. We aimed to estimate risk based on an Indian schizophrenia participant's family history combined with selected demographic factors. To stratify risk, we constructed a regression equation that included demographic variables in addition to family history. The equation was tested in two independent Indian samples: (i) an initial sample of SZ participants (N=128) with one sibling or offspring; (ii) a second, independent sample consisting of multiply affected families (N=138 families, with two or more sibs/offspring affected with SZ). The overall estimated risk was 4.31±0.27 (mean±standard deviation). In the initial sample, there were 19 (14.8%) individuals in the high risk group, 75 (58.6%) in the moderate risk group and 34 (26.6%) in the above average risk group. In the validation sample, risks were distributed as: high (45%), moderate (38%) and above average (17%). Consistent risk estimates were obtained from both samples using the regression equation. Familial risk can be combined with demographic factors to estimate risk for SZ in India. If replicated, the proposed stratification of risk may be easier and more realistic for family members. Copyright © 2016. Published by Elsevier B.V.
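
    The abstract does not reproduce the published regression equation, so the sketch below only illustrates the mechanics being described: a linear predictor combining family history with demographic covariates, cut into the study's three risk bands. All coefficients and cut points are placeholders.

    ```python
    import numpy as np

    # Placeholder coefficients; the published equation is not given in
    # the abstract, so these values are purely illustrative.
    b0, b_affected, b_age, b_sex = 1.5, 1.2, -0.02, 0.3

    def risk_score(n_affected_relatives, proband_age, sex_male):
        """Linear predictor combining family history with demographics."""
        return (b0 + b_affected * n_affected_relatives
                + b_age * proband_age + b_sex * sex_male)

    # Stratify the predicted risk into the three bands used in the study.
    bands = ["above average", "moderate", "high"]
    cuts = np.array([3.0, 5.0])          # assumed band boundaries
    score = risk_score(2, 30, 1)
    print(bands[np.searchsorted(cuts, score)])
    ```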

  10. Development and enrolee satisfaction with basic medical insurance in China: A systematic review and stratified cluster sampling survey.

    Science.gov (United States)

    Jing, Limei; Chen, Ru; Jing, Lisa; Qiao, Yun; Lou, Jiquan; Xu, Jing; Wang, Junwei; Chen, Wen; Sun, Xiaoming

    2017-07-01

    Basic Medical Insurance (BMI) has changed remarkably over time in China because of health reforms that aim to achieve universal coverage and better health care through increased subsidies, reimbursement, and benefits. In this paper, we present the development of BMI, including financing and operation, with a systematic review. Pudong New Area in Shanghai was chosen as a typical BMI sample for its coverage and management; a stratified cluster sampling survey together with an ordinary logistic regression model was used for the analysis. Enrolee satisfaction and the factors associated with enrolee satisfaction with BMI were analysed. We found that the re-enrolment rate superficially improved BMI coverage and nearly achieved universal coverage. However, BMI funds still faced the dual problems of fund deficits and under-compensation of the insured, and a long-term strategy is needed to realize the integration of BMI schemes with more homogeneous coverage and benefits. Moreover, Urban Resident Basic Medical Insurance participants reported a higher rate of dissatisfaction than other participants. The key predictors of enrolee satisfaction were awareness of the premium and compensation, affordability of out-of-pocket costs, and the proportion of reimbursement. These results highlight the importance of the Chinese government taking measures, such as strengthening BMI fund management, exploring mixed payment methods, and regulating sequential medical orders, to develop an integrated medical insurance system of universal coverage and vertical equity while simultaneously improving enrolee satisfaction. Copyright © 2017 John Wiley & Sons, Ltd.

  11. On the assessment of extremely low breakdown probabilities by an inverse sampling procedure [gaseous insulation

    DEFF Research Database (Denmark)

    Thyregod, Poul; Vibholm, Svend

    1991-01-01

    First breakdown voltages obtained under the inverse sampling procedure assuming a double exponential flashover probability function are discussed. An inverse sampling procedure commences the voltage application at a very low level, followed by applications at stepwise increased levels until breakdown occurs. The authors derive the relation between the flashover probability function and the corresponding distribution of first breakdown voltages under the inverse sampling procedure, and show how this relation may be utilized to assess the single-shot flashover probability corresponding to the observed average first breakdown voltage. Since the procedure is based on voltage applications in the neighbourhood of the quantile under investigation, it is found to be insensitive to the underlying distributional assumptions.
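
    A small simulation makes the procedure concrete. Assuming a Gumbel-type (double exponential) single-shot flashover probability with invented parameters, the sketch applies shots at stepwise increased voltages and records the first-breakdown voltage, whose average the paper relates back to a single-shot probability.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def p_flashover(u, a=100.0, b=2.0):
        """Assumed double-exponential (Gumbel-type) single-shot
        flashover probability at voltage u."""
        return 1.0 - np.exp(-np.exp((u - a) / b))

    def first_breakdown_voltage(u0=80.0, step=0.5):
        """Inverse sampling: apply shots at stepwise increased voltage
        levels, starting well below the breakdown region, and return
        the level at which the first breakdown occurs."""
        u = u0
        while rng.random() >= p_flashover(u):
            u += step
        return u

    v = np.array([first_breakdown_voltage() for _ in range(2000)])
    # The paper's point: this average can be mapped back to a
    # single-shot flashover probability near the quantile of interest.
    print(f"mean first-breakdown voltage: {v.mean():.2f}")
    ```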

  12. PREDOMINANTLY LOW METALLICITIES MEASURED IN A STRATIFIED SAMPLE OF LYMAN LIMIT SYSTEMS AT Z = 3.7

    Energy Technology Data Exchange (ETDEWEB)

    Glidden, Ana; Cooper, Thomas J.; Simcoe, Robert A. [Massachusetts Institute of Technology, 77 Massachusetts Ave, Cambridge, MA 02139 (United States); Cooksey, Kathy L. [Department of Physics and Astronomy, University of Hawai‘i at Hilo, 200 West Kāwili Street, Hilo, HI 96720 (United States); O’Meara, John M., E-mail: aglidden@mit.edu, E-mail: tjcooper@mit.edu, E-mail: simcoe@space.mit.edu, E-mail: kcooksey@hawaii.edu, E-mail: jomeara@smcvt.edu [Department of Physics, Saint Michael’s College, One Winooski Park, Colchester, VT 05439 (United States)

    2016-12-20

    We measured metallicities for 33 z = 3.4–4.2 absorption line systems drawn from a sample of H I-selected Lyman limit systems (LLSs) identified in Sloan Digital Sky Survey (SDSS) quasar spectra and stratified based on metal line features. We obtained higher-resolution spectra with the Keck Echellette Spectrograph and Imager, selecting targets according to our stratification scheme in an effort to fully sample the LLS population metallicity distribution. We established a plausible range of H I column densities and measured column densities (or limits) for ions of carbon, silicon, and aluminum, finding ionization-corrected metallicities or upper limits. Interestingly, our ionization models were better constrained with enhanced α-to-aluminum abundances, with a median abundance ratio of [α/Al] = 0.3. Measured metallicities were generally low, ranging from [M/H] = −3 to −1.68, with even lower metallicities likely for some systems with upper limits. Using survival statistics to incorporate limits, we constructed the cumulative distribution function (CDF) for LLS metallicities. Recent models of galaxy evolution propose that galaxies replenish their gas from the low-metallicity intergalactic medium (IGM) via high-density H I “flows” and eject enriched interstellar gas via outflows. Thus, there has been some expectation that LLSs at the peak of cosmic star formation (z ≈ 3) might have a bimodal metallicity distribution. We modeled our CDF as a mix of two Gaussian distributions, one reflecting the metallicity of the IGM and the other representative of the interstellar medium of star-forming galaxies. This bimodal distribution yielded a poor fit. A single Gaussian distribution better represented the sample with a low mean metallicity of [M/H] ≈ −2.5.
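
    The bimodality test can be reproduced in outline with a mixture-model fit. The sketch below compares one- and two-component Gaussian fits by BIC on hypothetical [M/H] values; unlike the paper, it ignores the upper limits that the authors handled with survival statistics.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Hypothetical [M/H] detections standing in for the 33 measured
    # systems; the real analysis also incorporates censored limits.
    mh = rng.normal(-2.5, 0.4, size=33).reshape(-1, 1)

    for k in (1, 2):
        gm = GaussianMixture(n_components=k, random_state=0).fit(mh)
        print(f"{k} component(s): BIC = {gm.bic(mh):.1f}")
    # A lower BIC for k=1 would mirror the paper's finding that a
    # single Gaussian represents the metallicity distribution better.
    ```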

  13. Approximation of rejective sampling inclusion probabilities and application to high order correlations

    NARCIS (Netherlands)

    Boistard, H.; Lopuhää, H.P.; Ruiz-Gazen, A.

    2012-01-01

    This paper is devoted to rejective sampling. We provide an expansion of joint inclusion probabilities of any order in terms of the inclusion probabilities of order one, extending previous results by Hájek (1964) and Hájek (1981) and making the remainder term more precise. Following Hájek (1981), the...
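
    Rejective sampling is conditional Poisson sampling: Poisson draws are kept only when they hit the target sample size. For a small population, the first- and second-order inclusion probabilities that the paper expands can simply be estimated by Monte Carlo, as in this sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    p = np.array([0.8, 0.5, 0.4, 0.2, 0.1])  # Poisson inclusion probabilities
    n = 2                                    # fixed size to condition on

    def rejective_sample():
        """Draw Poisson samples until one has exactly n units."""
        while True:
            s = rng.random(p.size) < p
            if s.sum() == n:
                return s

    draws = np.array([rejective_sample() for _ in range(50_000)], dtype=float)
    pi_i = draws.mean(axis=0)             # first-order probabilities
    pi_ij = draws.T @ draws / len(draws)  # joint (second-order) matrix
    print(np.round(pi_i, 3))
    print(np.round(pi_ij, 3))
    ```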

  14. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information on the variables is extracted by pre-sampling points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. The probability of functional failure is then estimated with a combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
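
    The adaptive scheme can be sketched generically: pre-sample to find failure points, fit an importance density to them, then reweight. The limit state below is a cheap stand-in for the thermal-hydraulic model, and the Gaussian proposal is only one possible choice.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def g(x):
        """Hypothetical limit state standing in for the T-H model:
        failure when g(x) < 0."""
        return 4.0 - x[:, 0] - x[:, 1]

    # Step 1: pre-sample to locate points in the failure region.
    pre = rng.standard_normal((200_000, 2))
    fail = pre[g(pre) < 0]

    # Step 2: build an importance density from the failure-region sample.
    q = stats.multivariate_normal(fail.mean(axis=0), np.cov(fail.T))

    # Step 3: importance sampling with the fitted density.
    x = q.rvs(size=20_000, random_state=2)
    w = stats.multivariate_normal([0, 0], np.eye(2)).pdf(x) / q.pdf(x)
    pf = np.mean((g(x) < 0) * w)
    print(f"estimated failure probability: {pf:.2e}")  # exact: Phi(-4/sqrt(2))
    ```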

  15. Executive control resources and frequency of fatty food consumption: findings from an age-stratified community sample.

    Science.gov (United States)

    Hall, Peter A

    2012-03-01

    Fatty foods are regarded as highly appetitive, and self-control is often required to resist consumption. Executive control resources (ECRs) are potentially facilitative of self-control efforts, and therefore could predict success in the domain of dietary self-restraint. It is not currently known whether stronger ECRs facilitate resistance to fatty food consumption, and moreover, it is unknown whether such an effect would be stronger in some age groups than others. The purpose of the present study was to examine the association between ECRs and consumption of fatty foods among healthy community-dwelling adults across the adult life span. An age-stratified sample of individuals between 18 and 89 years of age attended two laboratory sessions. During the first session they completed two computer-administered tests of ECRs (Stroop and Go-NoGo) and a test of general cognitive function (Wechsler Abbreviated Scale of Intelligence); participants completed two consecutive 1-week recall measures to assess frequency of fatty and nonfatty food consumption. Regression analyses revealed that stronger ECRs were associated with lower frequency of fatty food consumption over the 2-week interval. This association was observed for both measures of ECR and a composite measure. The effect remained significant after adjustment for demographic variables (age, gender, socioeconomic status), general cognitive function, and body mass index. The observed effect of ECRs on fatty food consumption frequency was invariant across age group, and did not generalize to nonfatty food consumption. ECRs may be potentially important, though understudied, determinants of dietary behavior in adults across the life span.

  16. Prevalence and Risk Factors of Dengue Infection in Khanh Hoa Province, Viet Nam: A Stratified Cluster Sampling Survey.

    Science.gov (United States)

    Mai, Vien Quang; Mai, Trịnh Thị Xuan; Tam, Ngo Le Minh; Nghia, Le Trung; Komada, Kenichi; Murakami, Hitoshi

    2018-05-19

    Dengue is a clinically important arthropod-borne viral disease with increasing global incidence. Here we aimed to estimate the prevalence of dengue infections in Khanh Hoa Province, central Viet Nam, and to identify risk factors for infection. We performed a stratified cluster sampling survey including residents of 3-60 years of age in Nha Trang City, Ninh Hoa District and Dien Khanh District, Khanh Hoa Province, in October 2011. Immunoglobulin G (IgG) and immunoglobulin M (IgM) against dengue were analyzed using a rapid test kit. Participants completed a questionnaire exploring clinical dengue incidence, socio-economic status, and individual behavior. A household checklist was used to examine environment, mosquito larvae presence, and exposure to public health interventions. IgG positivity was 20.5% (urban, 16.3%; rural, 23.0%), IgM positivity was 6.7% (urban, 6.4%; rural, 6.9%), and incidence of clinically compatible dengue during the prior 3 months was 2.8 per 1,000 persons (urban, 1.7; rural, 3.4). For IgG positivity, the adjusted odds ratio (AOR) was 2.68 (95% confidence interval [CI], 1.24-5.81) for mosquito larvae presence in water pooled in old tires and was 3.09 (95% CI, 1.75-5.46) for proximity to a densely inhabited area. For IgM positivity, the AOR was 3.06 (95% CI, 1.50-6.23) for proximity to a densely inhabited area. Our results indicated rural penetration of dengue infections. Control measures should target densely inhabited areas, and may include clean-up of discarded tires and water-collecting waste.

  17. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...
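
    In practice the down-sampling being evaluated can be set up in a few lines with scikit-learn, either by explicitly thinning the abundant class or by letting each tree draw a class-balanced bootstrap. The data here are invented.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Hypothetical plot data: 5% presence of a rare tree species.
    X = rng.normal(size=(5000, 6))              # environmental predictors
    y = (rng.random(5000) < 0.05).astype(int)   # rare species presence

    # Down-sample the majority class to match the rare class ...
    pres, abs_ = np.where(y == 1)[0], np.where(y == 0)[0]
    keep = np.concatenate([pres,
                           rng.choice(abs_, size=len(pres), replace=False)])
    rf_down = RandomForestClassifier(n_estimators=200, random_state=0)
    rf_down.fit(X[keep], y[keep])

    # ... or let each tree draw a class-balanced bootstrap instead.
    rf_bal = RandomForestClassifier(n_estimators=200, random_state=0,
                                    class_weight="balanced_subsample")
    rf_bal.fit(X, y)
    ```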

  18. Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.

    Science.gov (United States)

    Ojeda, Mario Miguel; Sahai, Hardeo

    2002-01-01

    Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…

  19. Calculation of parameter failure probability of thermodynamic system by response surface and importance sampling method

    International Nuclear Information System (INIS)

    Shang Yanlong; Cai Qi; Chen Lisheng; Zhang Yangwei

    2012-01-01

    In this paper, the combined method of response surface and importance sampling was applied to the calculation of the parameter failure probability of a thermodynamic system. A mathematical model was presented for parameter failure of the physical process in the thermodynamic system, from which the combined response surface and importance sampling scheme was established; the performance degradation model of the components and the simulation process of parameter failure in the physical process of the thermodynamic system were also presented. The parameter failure probability of the purification water system in a nuclear reactor was obtained by the combined method. The results show that the combined method is effective for calculating the parameter failure probability of a thermodynamic system with high dimensionality and non-linear characteristics, achieving satisfactory precision with less computing time than the direct sampling method while avoiding the drawbacks of the response surface method. (authors)

  20. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole-body overexposure to external gamma radiation. A detailed description is provided of the stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment.

  1. An optimized Line Sampling method for the estimation of the failure probability of nuclear passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Pedroni, N.

    2010-01-01

    The quantitative reliability assessment of a thermal-hydraulic (T-H) passive safety system of a nuclear power plant can be obtained by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. In this work, Line Sampling (LS) is adopted for efficient MC sampling. In the LS method, an 'important direction' pointing towards the failure domain of interest is determined and a number of conditional one-dimensional problems are solved along such direction; this allows for a significant reduction of the variance of the failure probability estimator, with respect, for example, to standard random sampling. Two issues are still open with respect to LS: first, the method relies on the determination of the 'important direction', which requires additional runs of the T-H code; second, although the method has been shown to improve the computational efficiency by reducing the variance of the failure probability estimator, no evidence has been given yet that accurate and precise failure probability estimates can be obtained with a number of samples reduced to below a few hundreds, which may be required in case of long-running models. The work presented in this paper addresses the first issue by (i) quantitatively comparing the efficiency of the methods proposed in the literature to determine the LS important direction; (ii) employing artificial neural network (ANN) regression models as fast-running surrogates of the original, long-running T-H code to reduce the computational cost associated to the...
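
    The core of the method fits in a few lines once an important direction is fixed. The sketch below uses a cheap linear limit state in standard normal space, so the important direction alpha is known exactly and the estimate should approach Φ(−3); for a real T-H model, g would be the code run and alpha would itself have to be estimated.

    ```python
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import brentq

    rng = np.random.default_rng(0)
    d = 2

    def g(x):
        """Toy limit state (failure when g < 0); stands in for the T-H code."""
        return 3.0 - x.sum() / np.sqrt(d)

    alpha = np.ones(d) / np.sqrt(d)   # assumed important direction

    vals = []
    for _ in range(200):
        x = rng.standard_normal(d)
        x_perp = x - (x @ alpha) * alpha   # project onto the hyperplane
        # Solve the 1-D problem along the important direction.
        c = brentq(lambda t: g(x_perp + t * alpha), -10.0, 10.0)
        vals.append(norm.sf(c))            # conditional failure probability
    print(f"pf = {np.mean(vals):.2e}  (exact: {norm.sf(3):.2e})")
    ```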

  2. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunggyu [Korea Aerospace Research Institute, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

    The kernel density was determined based on sampling points obtained in a Markov chain simulation and was assumed to be an important sampling function. A Kriging metamodel was constructed in more detail in the vicinity of a limit state. The failure probability was calculated based on importance sampling, which was performed for the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for a kernel density in the vicinity of a limit state. A stable numerical method was proposed to find a parameter of the kernel density. To assess the completeness of the Kriging metamodel, the possibility of changes in the calculated failure probability due to the uncertainty of the Kriging metamodel was calculated.

  3. A country-wide probability sample of public attitudes toward stuttering in Portugal.

    Science.gov (United States)

    Valente, Ana Rita S; St Louis, Kenneth O; Leahy, Margaret; Hall, Andreia; Jesus, Luis M T

    2017-06-01

    Negative public attitudes toward stuttering have been widely reported, although differences among countries and regions exist. Clear reasons for these differences remain obscure. Published research is unavailable on public attitudes toward stuttering in Portugal as well as a representative sample that explores stuttering attitudes in an entire country. This study sought to (a) determine the feasibility of a country-wide probability sampling scheme to measure public stuttering attitudes in Portugal using a standard instrument (the Public Opinion Survey of Human Attributes-Stuttering [POSHA-S]) and (b) identify demographic variables that predict Portuguese attitudes. The POSHA-S was translated to European Portuguese through a five-step process. Thereafter, a local administrative office-based, three-stage, cluster, probability sampling scheme was carried out to obtain 311 adult respondents who filled out the questionnaire. The Portuguese population held stuttering attitudes that were generally within the average range of those observed from numerous previous POSHA-S samples. Demographic variables that predicted more versus less positive stuttering attitudes were respondents' age, region of the country, years of school completed, working situation, and number of languages spoken. Non-predicting variables were respondents' sex, marital status, and parental status. A local administrative office-based, probability sampling scheme generated a respondent profile similar to census data and indicated that Portuguese attitudes are generally typical. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

    Importance sampling (IS) is a useful simulation technique for estimating critical probabilities with better accuracy than Monte Carlo methods. It consists in generating random weighted samples from an auxiliary distribution rather than the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, which must be able to generate the rare random events of interest more frequently. The optimisation of this auxiliary distribution is often very difficult in practice. In this article, we propose to approach the IS optimal auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of spatial launcher impact position, which has become an increasingly important issue in the field of aeronautics.
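
    The NAIS idea (iteratively refitting a kernel-density auxiliary PDF on weighted samples that hit the rare event) can be sketched on a one-dimensional toy problem, here P(X > 4) for standard normal X, with a deliberately broad Gaussian start.

    ```python
    import numpy as np
    from scipy import stats

    target = stats.norm(0, 1)      # nominal distribution of interest
    threshold = 4.0                # rare event: x > threshold

    # Deliberately broad parametric auxiliary density to start from.
    x = stats.norm(0, 3).rvs(size=20_000, random_state=0)
    q_pdf = stats.norm(0, 3).pdf

    for it in range(4):
        w = target.pdf(x) / q_pdf(x)
        hit = x > threshold
        print(f"iteration {it}: p_hat = {np.mean(w * hit):.2e}")
        # NAIS step: refit a kernel density on the weighted rare-event
        # samples and use it as the next auxiliary density.
        kde = stats.gaussian_kde(x[hit], weights=w[hit])
        x = kde.resample(20_000, seed=it + 1)[0]
        q_pdf = kde.evaluate
    ```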

  5. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    Science.gov (United States)

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples and VQT samples, respectively. Overall, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
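
    The conditional LHS machinery is involved, but its building block, a Latin hypercube that stratifies every margin, is available directly in SciPy. The sketch below draws LHS pixel locations for a hypothetical 512 x 512 NDVI image; the paper's scLHS additionally conditions on the covariate distributions within variance-quadtree strata.

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Latin hypercube sample of pixel coordinates: each row/column
    # stratum is represented exactly once, unlike simple random sampling.
    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit = sampler.random(n=30)                    # 30 points in [0,1)^2
    rows, cols = (unit * np.array([512, 512])).astype(int).T
    print(list(zip(rows, cols))[:5])
    ```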

  6. The relationships between sixteen perfluorinated compound concentrations in blood serum and food, and other parameters, in the general population of South Korea with proportionate stratified sampling method.

    Science.gov (United States)

    Kim, Hee-Young; Kim, Seung-Kyu; Kang, Dong-Mug; Hwang, Yong-Sik; Oh, Jeong-Eun

    2014-02-01

    Serum samples were collected from volunteers of various ages and both genders using a proportionate stratified sampling method, to assess the exposure of the general population in Busan, South Korea to perfluorinated compounds (PFCs). 16 PFCs were investigated in serum samples from 306 adults (124 males and 182 females) and one-day composite diet samples (breakfast, lunch, and dinner) from 20 of the serum donors, to investigate the relationship between food and serum PFC concentrations. Perfluorooctanoic acid and perfluorooctanesulfonic acid were the dominant PFCs in the serum samples, with mean concentrations of 8.4 and 13 ng/mL, respectively. Perfluorotridecanoic acid was the dominant PFC in the composite food samples. We confirmed from the relationships between questionnaire results and the PFC concentrations in the serum samples that food is one of the important contributing factors to human exposure to PFCs. However, there were no correlations between the PFC concentrations in the one-day composite diet samples and the serum samples, because a one-day composite diet sample is not necessarily representative of a person's long-term diet and because of the small number of samples taken. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    The Monte Carlo method is commonly used to observe the overall distribution and to determine the lower or upper bound value in a statistical approach when direct analytical calculation is unavailable. However, this method is not efficient when the tail area of a distribution is of concern. A new method entitled 'Two Step Tail Area Sampling' is developed, which uses the assumption of a discrete probability distribution and samples only the tail area without distorting the overall distribution. This method uses a two-step sampling procedure: first, sampling at points separated by large intervals is done, and second, sampling at points separated by small intervals is done around check points determined in the first step. Comparison with the Monte Carlo method shows that the results obtained from the new method converge to the analytic value faster than the Monte Carlo method for the same number of calculations. This new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of pressurized light water nuclear reactors.
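
    A loose sketch of the two-step idea on an assumed discretised distribution: a coarse scan first brackets where the tail begins, and only that region is then evaluated at full resolution.

    ```python
    import numpy as np
    from scipy import stats

    # Discretised distribution standing in for the statistic of interest;
    # the actual application used plant-specific DNBR distributions.
    grid = np.linspace(-6, 6, 1201)
    pmf = stats.norm.pdf(grid)
    pmf /= pmf.sum()

    def tail_mass_two_step(threshold, coarse=50):
        """Step 1: scan the grid at a large stride to locate where the
        tail beyond `threshold` begins; step 2: sum the pmf at full
        resolution, but only over that region."""
        coarse_idx = np.arange(0, grid.size, coarse)
        start = coarse_idx[np.searchsorted(grid[coarse_idx], threshold) - 1]
        fine = np.arange(start, grid.size)
        sel = grid[fine] > threshold
        return pmf[fine][sel].sum()

    print(tail_mass_two_step(3.0))   # ~= 1 - Phi(3)
    ```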

  8. How serious a problem is subsoil compaction in the Netherlands? A survey based on probability sampling

    Science.gov (United States)

    Brus, Dick J.; van den Akker, Jan J. H.

    2018-02-01

    Although soil compaction is widely recognized as a threat to soil resources, reliable estimates of the acreage of overcompacted soil and of the level of soil compaction parameters are not available. In the Netherlands, data on subsoil compaction were collected at 128 locations selected by stratified random sampling. A map showing the risk of subsoil compaction in five classes was used for stratification. Measurements of bulk density, porosity, clay content and organic matter content were used to compute the relative bulk density and relative porosity, both expressed as a fraction of a threshold value. A subsoil was classified as overcompacted if either the relative bulk density exceeded 1 or the relative porosity was below 1. The sample data were used to estimate the means of the two subsoil compaction parameters and the overcompacted areal fraction. The estimated global means of relative bulk density and relative porosity were 0.946 and 1.090, respectively. The estimated areal fraction of the Netherlands with overcompacted subsoils was 43 %. The estimates per risk map unit showed two groups of map units: a low-risk group (units 1 and 2, covering only 4.6 % of the total area) and a high-risk group (units 3, 4 and 5). The estimated areal fraction of overcompacted subsoil was 0 % in the low-risk group and 47 % in the high-risk group. The map contains no information about where overcompacted subsoils occur. This was caused by the poor association of risk map units 3, 4 and 5 with the subsoil compaction parameters and subsoil overcompaction, which can be explained by the lack of time for recuperation.
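
    With the risk-map strata in hand, the design-based estimators are one-liners. The numbers below are invented stand-ins for the per-stratum results; only the formulas (stratified mean, its variance, and the weighted areal fraction) carry over.

    ```python
    import numpy as np

    # Hypothetical per-stratum results for relative bulk density.
    W = np.array([0.02, 0.026, 0.40, 0.35, 0.204])     # areal weights
    ybar = np.array([0.80, 0.82, 0.95, 0.96, 0.97])    # stratum means
    s2 = np.array([0.002, 0.002, 0.003, 0.003, 0.004]) # stratum variances
    n = np.array([10, 10, 40, 40, 28])                 # sample sizes
    over = np.array([0.0, 0.0, 0.45, 0.50, 0.46])      # overcompacted fractions

    mean_st = np.sum(W * ybar)        # stratified estimate of the mean
    var_st = np.sum(W**2 * s2 / n)    # its design-based variance
    frac_over = np.sum(W * over)      # overall overcompacted areal fraction
    print(f"mean = {mean_st:.3f} +/- {np.sqrt(var_st):.3f}; "
          f"overcompacted fraction = {frac_over:.2f}")
    ```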

  9. A probable risk factor of female breast cancer: study on benign and malignant breast tissue samples.

    Science.gov (United States)

    Rehman, Sohaila; Husnain, Syed M

    2014-01-01

    The study reports enhanced Fe, Cu, and Zn contents in breast tissues as a probable risk factor of breast cancer in females. Forty-one formalin-fixed breast tissues were analyzed using atomic absorption spectrophotometry: twenty malignant, six adjacent to malignant and 15 benign tissue samples were investigated. The malignant tissue samples were of grade II and of type invasive ductal carcinoma. The quantitative comparison between the elemental levels measured in the two types of specimen (benign and malignant, removed after surgery) suggests significant elevation of these metals (Fe, Cu, and Zn) in the malignant tissue. The specimens were collected just after mastectomy of women aged 19 to 59 years from the hospitals of Islamabad and Rawalpindi, Pakistan. Most of the patients belonged to urban areas of Pakistan. The findings indicate that these elements have a promising role in the initiation and development of carcinoma, as a consistent pattern of elevation for Fe, Cu, and Zn was observed. The results showed excessive accumulation of Fe (229 ± 121 mg/L) in the malignant breast tissue samples, supporting its role as a probable risk factor of breast cancer. In order to validate our method of analysis, the certified reference material muscle tissue lyophilized (IAEA) MA-M-2/TM was analyzed for the metals studied. Determined concentrations were in good agreement with certified levels. An asymmetric concentration distribution for Fe, Cu, and Zn was observed in both malignant and benign tissue samples.

  10. Measuring public opinion on alcohol policy: a factor analytic study of a US probability sample.

    Science.gov (United States)

    Latimer, William W; Harwood, Eileen M; Newcomb, Michael D; Wagenaar, Alexander C

    2003-03-01

    Public opinion has been one factor affecting change in policies designed to reduce underage alcohol use. Extant research, however, has been criticized for using single survey items of unknown reliability to define adult attitudes on alcohol policy issues. The present investigation addresses a critical gap in the literature by deriving scales on public attitudes, knowledge, and concerns pertinent to alcohol policies designed to reduce underage drinking using a US probability sample survey of 7021 adults. Five attitudinal scales were derived from exploratory and confirmatory factor analyses addressing policies to: (1) regulate alcohol marketing, (2) regulate alcohol consumption in public places, (3) regulate alcohol distribution, (4) increase alcohol taxes, and (5) regulate youth access. The scales exhibited acceptable psychometric properties and were largely consistent with a rational framework which guided the survey construction.

  11. Risk and protective factors of dissocial behavior in a probability sample.

    Science.gov (United States)

    Moral de la Rubia, José; Ortiz Morales, Humberto

    2012-07-01

    The aims of this study were to identify risk and protective factors for dissocial behavior, taking into account that self-reports of dissocial behavior are biased by impression management. A probability sample of adolescents living in two neighborhoods with high rates of gangs and offenses (112 men and 86 women) was collected. The 27-item Dissocial Behavior Scale (ECODI27; Pacheco & Moral, 2010), Balanced Inventory of Desirable Responding, version 6 (BIDR-6; Paulhus, 1991), Sensation Seeking Scale, form V (SSS-V; Zuckerman, Eysenck, & Eysenck, 1978), Parent-Adolescent Communication Scale (PACS; Barnes & Olson, 1982), 30-item Rathus Assertiveness Schedule (RAS; Rathus, 1973), Interpersonal Reactivity Index (IRI; Davis, 1983) and a social relationship questionnaire (SRQ) were applied. Binary logistic regression was used for the data analysis. A third of the participants showed dissocial behavior. Belonging to a gang in school (schooled adolescents) or to a gang outside school and work (total sample) and disinhibition were risk factors; being female, perspective taking and open communication with the father were protective factors. School-leaving was a differential aspect. We emphasize the need for intervention on these variables.

  12. BMR in a Brazilian adult probability sample: the Nutrition, Physical Activity and Health Survey.

    Science.gov (United States)

    Anjos, Luiz A; Wahrlich, Vivian; Vasconcellos, Mauricio Tl

    2014-04-01

    To measure BMR in a probability sample of adults from an urban city of Brazil and to compare indirectly measured BMR (BMRi) with BMR predicted from different equations. BMR data were obtained by indirect calorimetry and estimated by different predictive equations (Schofield; Harris and Benedict; Henry and Rees). Anthropometric and body composition measures were also obtained. The Nutrition, Physical Activity and Health Survey (PNAFS), a household survey, was conducted in Niterói, Rio de Janeiro state, Brazil, on a representative sample of 529 adults (aged ≥20 years; 339 females) living in Niterói. Mean BMRi values were 5839.7 (se 73.9) kJ/d and 4758.1 (se 39.5) kJ/d for men and women, respectively. BMR predicted by all equations was significantly higher than BMRi (the 95% CI of the difference between means did not include zero) in both men and women of all ages. Overall bias in BMR (predicted BMR minus BMRi) using the Schofield equations (overestimation of about 20%) was higher than when using the Henry and Rees equations (13% and 16% overestimation for males and females, respectively). The percentage of individuals whose BMR predicted by the Schofield equations fell within 10% of BMRi was very low (7.8% and 14.1% of males and females, respectively). Currently available predictive equations of BMR are not adequate to estimate BMR in Brazilians living in Niterói, Rio de Janeiro, Brazil.

  13. Epidemiology of undiagnosed trichomoniasis in a probability sample of urban young adults.

    Directory of Open Access Journals (Sweden)

    Susan M Rogers

    T. vaginalis infection (trichomoniasis) is the most common curable sexually transmitted infection (STI) in the U.S. It is associated with increased HIV risk and adverse pregnancy outcomes. Trichomoniasis surveillance data do not exist for either national or local populations. The Monitoring STIs Survey Program (MSSP) collected survey data and specimens which were tested using nucleic acid amplification tests to monitor trichomoniasis and other STIs in 2006-09 among a probability sample of young adults (N = 2,936) in Baltimore, Maryland--an urban area with high rates of reported STIs. The estimated prevalence of trichomoniasis was 7.5% (95% CI 6.3, 9.1) in the overall population and 16.1% (95% CI 13.0, 19.8) among Black women. The overwhelming majority of infected men (98.5%) and women (73.3%) were asymptomatic. Infections were more common in both women (OR = 3.6, 95% CI 1.6, 8.2) and men (OR = 9.0, 95% CI 1.8, 44.3) with concurrent chlamydial infection. Trichomoniasis did not vary significantly by age for either men or women. Women with two or more partners in the past year and women with a history of personal or partner incarceration were more likely to have an infection. Overall, these results suggest that routine T. vaginalis screening in populations at elevated risk of infection should be considered.

  14. Romantic Relationship Characteristics and Adolescent Relationship Abuse in a Probability-Based Sample of Youth.

    Science.gov (United States)

    Taylor, Bruce; Joseph, Hannah; Mumford, Elizabeth

    2017-09-01

    This study examines the longitudinal association between baseline adolescent romantic relationship characteristics and later adolescent relationship abuse (ARA). Data are from the first two waves of the National Survey on Teen Relationships and Intimate Violence (STRiV). Girls and boys ages 10 to 18 were recruited randomly from the children of adults participating in a larger national household probability sample panel. About three quarters of the sample identified as White, non-Hispanic. Controlling behavior by a romantic partner consistently predicted later ARA. Higher levels of controlling behavior in the relationship were associated with higher rates of sexual and/or physical ARA victimization and higher rates for similar acts of perpetration. More controlling behavior by the partner was also associated with higher rates of psychological ARA victimization (and higher rates for psychological ARA perpetration). Our results suggest that ARA prevention programs should include explicit discussions of the deleterious effects of controlling behavior with adolescents. Respondents reporting higher feelings of passionate love were also at higher risk of experiencing sexual and/or physical ARA victimization. This finding will need to be considered by clinicians and prevention specialists in their work with youth, as a potential risk marker for ARA. Baseline reports of at least one form of ARA were predictive of 1-year follow-up rates of ARA in all of our models, underscoring a long line of research showing that past aggressive or violent behavior is one of the strongest predictors of current aggressive or violent behavior. We also observed that female respondents were twice as likely to be perpetrators of physical and/or sexual ARA as male respondents. Prevention messaging is often focused on girls as ARA victims, and our results imply that messaging should also be directed toward girls as perpetrators.

  15. Marital disruption is associated with shorter salivary telomere length in a probability sample of older adults.

    Science.gov (United States)

    Whisman, Mark A; Robustelli, Briana L; Sbarra, David A

    2016-05-01

    Marital disruption (i.e., marital separation, divorce) is associated with a wide range of poor mental and physical health outcomes, including increased risk for all-cause mortality. One biological intermediary that may help explain the association between marital disruption and poor health is accelerated cellular aging. This study examines the association between marital disruption and salivary telomere length in a United States probability sample of adults ≥50 years of age. Participants were 3526 individuals who participated in the 2008 wave of the Health and Retirement Study. Telomere length assays were performed using quantitative real-time polymerase chain reaction (qPCR) on DNA extracted from saliva samples. Health and lifestyle factors, traumatic and stressful life events, and neuroticism were assessed via self-report. Linear regression analyses were conducted to examine the associations between predictor variables and salivary telomere length. Based on their marital status data in the 2006 wave, people who were separated or divorced had shorter salivary telomeres than people who were continuously married or had never been married, and the association between marital disruption and salivary telomere length was not moderated by gender or neuroticism. Furthermore, the association between marital disruption and salivary telomere length remained statistically significant after adjusting for demographic and socioeconomic variables, neuroticism, cigarette use, body mass, traumatic life events, and other stressful life events. Additionally, results revealed that currently married adults with a history of divorce evidenced shorter salivary telomeres than people who were continuously married or never married. Accelerated cellular aging, as indexed by telomere shortening, may be one pathway through which marital disruption is associated with morbidity and mortality. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Male Circumcision and STI Acquisition in Britain: Evidence from a National Probability Sample Survey.

    Directory of Open Access Journals (Sweden)

    Virginia Homfray

    It is well-established that male circumcision reduces acquisition of HIV, herpes simplex virus 2, chancroid, and syphilis. However, the effect on the acquisition of non-ulcerative sexually transmitted infections (STIs) remains unclear. We examined the relationship between circumcision and biological measures of three STIs: human papillomavirus (HPV), Chlamydia trachomatis and Mycoplasma genitalium. A probability sample survey of 15,162 men and women aged 16-74 years (including 4,060 men aged 16-44 years) was carried out in Britain between 2010 and 2012. Participants completed a computer-assisted personal interview, including a computer-assisted self-interview, which asked about experience of STI diagnoses and circumcision. Additionally, 1,850 urine samples from sexually-experienced men aged 16-44 years were collected and tested for STIs. Multivariable logistic regression was used to calculate adjusted odds ratios (AORs) to quantify associations between circumcision and (i) self-reporting any STI diagnosis and (ii) presence of STIs in urine, in men aged 16-44 years, adjusting for key socio-demographic and sexual behavioural factors. The prevalence of circumcision in sexually-experienced men aged 16-44 years was 17.4% (95% CI 16.0-19.0). There was no association between circumcision and reporting any previous STI diagnoses, specifically previous chlamydia or genital warts. However, circumcised men were less likely to have any HPV type (AOR 0.26, 95% confidence interval (CI) 0.13-0.50), including high-risk HPV types (HPV-16, 18, 31, 33, 35, 39, 45, 51, 52, 56, 58, 59 and/or 68; AOR 0.14, 95% CI 0.05-0.40), detected in urine. Circumcised men had reduced odds of HPV detection in urine. These findings have implications for improving the precision of models of STI transmission in populations with different circumcision prevalence and in designing interventions to reduce STI acquisition.

  17. Stratified Entomological Sampling in Preparation for an Area-Wide Integrated Pest Management Program: The Example of Glossina palpalis gambiensis (Diptera: Glossinidae) in the Niayes of Senegal

    International Nuclear Information System (INIS)

    Bouyer, Jeremy; Seck, Momar Talla; Guerrini, Laure; Sall, Baba; Ndiaye, Elhadji Youssou; Vreysen, Marc J.B.

    2010-01-01

    The riverine tsetse species Glossina palpalis gambiensis Vanderplank 1949 (Diptera: Glossinidae) inhabits riparian forests along river systems in West Africa. The government of Senegal has embarked on a project to eliminate this tsetse species, and African animal trypanosomoses, from the Niayes area using an area-wide integrated pest management approach. A stratified entomological sampling strategy was therefore developed using spatial analytical tools and mathematical modeling. A preliminary phytosociological census identified eight types of suitable habitat, which could be discriminated from LandSat 7 ETM satellite images and were denominated wet areas. At the end of March 2009, 683 unbaited Vavoua traps had been deployed, and the observed infested area in the Niayes was 525 km2. In the remaining area, a mathematical model was used to assess the risk that flies were present despite a sequence of zero catches. The analysis showed that this risk was above 0.05 in 19% of this area, which will be considered as infested during the control operations. The remote sensing analysis that identified the wet areas allowed a restriction of the area to be surveyed to 4% of the total surface area (7,150 km2), whereas the mathematical model provided an efficient method to improve the accuracy and the robustness of the sampling protocol. The final size of the control area will be decided based on the entomological collection data. This entomological sampling procedure might be used for other vector or pest control scenarios. (Authors)
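
    The abstract does not give the model's form, but a simple Bayesian formulation captures the "risk despite zero catches" computation. This is a sketch under the assumption that each trap independently detects flies with some probability when they are present; it is not the authors' published model.

    ```python
    def presence_risk(prior, p_catch, n_zero_traps):
        """Posterior probability that flies are present given that
        n_zero_traps traps all caught nothing, assuming each trap
        catches at least one fly with probability p_catch when flies
        are present (assumed values, for illustration only)."""
        no_catch = (1.0 - p_catch) ** n_zero_traps
        return prior * no_catch / (prior * no_catch + (1.0 - prior))

    # With an assumed 30% prior and 20% per-trap detection, how many
    # empty traps drive the residual risk below 0.05?
    for n in range(0, 25, 4):
        print(n, round(presence_risk(0.3, 0.2, n), 3))
    ```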

  18. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    Science.gov (United States)

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process are dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small samples. This method allows researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
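
    Under a normality assumption the quantity in the title has a closed form: P(|x̄ − μ| ≤ fσ) = 2Φ(f√n) − 1. The snippet below tabulates it for a few small n; the published method may differ in detail.

    ```python
    from math import sqrt
    from scipy.stats import norm

    def prob_within(f, n):
        """P(|sample mean - true mean| <= f * sigma) for a normal
        population with known sigma: 2 * Phi(f * sqrt(n)) - 1."""
        return 2.0 * norm.cdf(f * sqrt(n)) - 1.0

    for n in (5, 10, 20):
        print(n, round(prob_within(0.5, n), 3))
    ```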

  19. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
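
    The binomial logic behind such a design can be illustrated with the standard detection calculation: if each tracer independently ends up in the collected sample with probability p, then n tracers give detection probability 1 − (1 − p)^n, which can be inverted for the required n. The numbers are illustrative, not the paper's.

    ```python
    from math import ceil, log

    def tracers_needed(p_in_sample, confidence=0.95):
        """Smallest number of tracers in a lot such that at least one
        appears in the sample with the desired confidence, when each
        tracer independently lands in the sample with probability
        p_in_sample: solve 1 - (1 - p)**n >= confidence for n."""
        return ceil(log(1.0 - confidence) / log(1.0 - p_in_sample))

    print(tracers_needed(0.01))   # ~299 tracers for 95% confidence
    ```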

  20. Design of dry sand soil stratified sampler

    Science.gov (United States)

    Li, Erkang; Chen, Wei; Feng, Xiao; Liao, Hongbo; Liang, Xiaodong

    2018-04-01

    This paper presents a design of a stratified sampler for dry sand soil, which can be used for stratified sampling of loose sand under certain conditions. Our group designed the mechanical structure of a portable, single-person, dry sandy soil stratified sampler. We have set up a mathematical model for the sampler, which lays the foundation for further design research and development.

  1. Prevalence of probable Attention-Deficit/Hyperactivity Disorder symptoms: result from a Spanish sample of children.

    Science.gov (United States)

    Cerrillo-Urbina, Alberto José; García-Hermoso, Antonio; Martínez-Vizcaíno, Vicente; Pardo-Guijarro, María Jesús; Ruiz-Hermosa, Abel; Sánchez-López, Mairena

    2018-03-15

    The aims of our study were to: (i) determine the prevalence of children aged 4 to 6 years with probable Attention-Deficit/Hyperactivity Disorder (ADHD) symptoms in the Spanish population; and (ii) analyse the association of probable ADHD symptoms with sex, age, type of school, origin (native or foreign) and socio-economic status in these children. This cross-sectional study included 1189 children (4 to 6 years old) from 21 primary schools in 19 towns of the Ciudad Real and Cuenca provinces, Castilla-La Mancha region, Spain. The ADHD Rating Scale IV for parents and teachers was administered to determine the probability of ADHD. The 90th percentile cut-off was used to establish the prevalence of the inattention, hyperactivity/impulsivity and combined subtypes. The prevalence of children with probable ADHD symptoms was 5.4% (2.6% inattention subtype symptoms, 1.5% hyperactivity/impulsivity subtype symptoms, and 1.3% combined subtype symptoms). Children aged 4 to 5 years showed a higher prevalence of probable ADHD for the inattention subtype, and for all subtypes combined, than children aged 6 years, and children with low socio-economic status reported a higher prevalence of probable ADHD symptoms (each subtype and all subtypes combined) than those with medium and high socio-economic status. Early diagnosis and an understanding of the predictors of probable ADHD are needed to direct appropriate identification and intervention efforts. These screening efforts should be especially addressed to vulnerable groups, particularly low socio-economic status families and younger children.

  2. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    Science.gov (United States)

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...

  3. Loaded dice in Monte Carlo : importance sampling in phase space integration and probability distributions for discrepancies

    NARCIS (Netherlands)

    Hameren, Andreas Ferdinand Willem van

    2001-01-01

    Discrepancies play an important role in the study of uniformity properties of point sets. Their probability distributions aid the analysis of the efficiency of the Quasi Monte Carlo method of numerical integration, which uses point sets that are distributed more uniformly than sets of...

  4. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    Science.gov (United States)

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

    Temporal and spatial variability in streams result in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions with similar effort as used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.
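
    The adjustment the authors describe is simple in form: dividing catch by the modelled capture probability yields an absolute abundance estimate, as in this minimal, hypothetical example.

    ```python
    # Catch adjustment by estimated capture probability: with catch C
    # and capture probability p_hat, the reach-scale absolute abundance
    # estimate is C / p_hat. Values below are invented.
    catch = 42      # fish identified in a reach
    p_hat = 0.35    # modelled capture probability for those conditions
    print(f"estimated absolute abundance: {catch / p_hat:.0f}")
    ```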

  5. Effect of optimum stratification on sampling with varying probabilities under proportional allocation

    OpenAIRE

    Syed Ejaz Husain Rizvi; Jaj P. Gupta; Manoj Bhargava

    2007-01-01

    The problem of optimum stratification on an auxiliary variable when the units from different strata are selected with probability proportional to the value of the auxiliary variable (PPSWR) was considered by Singh (1975) for the univariate case. In this paper we have extended the same problem, for proportional allocation, to the case when two variates are under study. A cum. ∛R3(x) rule for obtaining approximately optimum strata boundaries has been provided. It has been shown theoretically as well as empirically...

  6. Sampling little fish in big rivers: Larval fish detection probabilities in two Lake Erie tributaries and implications for sampling effort and abundance indices

    Science.gov (United States)

    Pritt, Jeremy J.; DuFour, Mark R.; Mayer, Christine M.; Roseman, Edward F.; DeBruyne, Robin L.

    2014-01-01

    Larval fish are frequently sampled in coastal tributaries to determine factors affecting recruitment, evaluate spawning success, and estimate production from spawning habitats. Imperfect detection of larvae is common, because larval fish are small and unevenly distributed in space and time, and coastal tributaries are often large and heterogeneous. We estimated detection probabilities of larval fish from several taxa in the Maumee and Detroit rivers, the two largest tributaries of Lake Erie. We then demonstrated how accounting for imperfect detection influenced (1) the probability of observing taxa as present relative to sampling effort and (2) abundance indices for larval fish of two Detroit River species. We found that detection probabilities ranged from 0.09 to 0.91 but were always less than 1.0, indicating that imperfect detection is common among taxa and between systems. In general, taxa with high fecundities, small larval length at hatching, and no nesting behaviors had the highest detection probabilities. Also, detection probabilities were higher in the Maumee River than in the Detroit River. Accounting for imperfect detection produced up to fourfold increases in abundance indices for Lake Whitefish Coregonus clupeaformis and Gizzard Shad Dorosoma cepedianum. The effect of accounting for imperfect detection in abundance indices was greatest during periods of low abundance for both species. Detection information can be used to determine the appropriate level of sampling effort for larval fishes and may improve management and conservation decisions based on larval fish data.

  7. Stochastic approximation Monte Carlo importance sampling for approximating exact conditional probabilities

    KAUST Repository

    Cheon, Sooyoung; Liang, Faming; Chen, Yuguo; Yu, Kai

    2013-02-16

    Importance sampling and Markov chain Monte Carlo methods have been used in exact inference for contingency tables for a long time; however, their performances are not always very satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: It can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for the tables with high degrees of freedom. © 2013 Springer Science+Business Media New York.

  9. Assessing representativeness of sampling methods for reaching men who have sex with men: a direct comparison of results obtained from convenience and probability samples.

    Science.gov (United States)

    Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy

    2007-07-01

    Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random digit dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample, but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.

  10. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    International Nuclear Information System (INIS)

    Echard, B.; Gayton, N.; Lemaire, M.; Relun, N.

    2013-01-01

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is fortunately designed with codified rules leading to a large safety margin, which means that failure is a small-probability event. Such a probability level is difficult to assess efficiently. Second, the structure's mechanical behaviour is modelled numerically in an attempt to reproduce the real response, and the numerical model tends to become more and more time-demanding as its complexity is increased to improve accuracy and to capture particular mechanical behaviour. As a consequence, performing a large number of model computations is not feasible for assessing the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS, for active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145–54]. It associates the Kriging metamodel and its advantageous stochastic property with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only very few mechanical model computations. The efficiency of the method is first demonstrated on two academic applications. It is then applied to assess the reliability of a challenging aerospace case study subjected to fatigue.
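
The Kriging half of AK-IS is hard to compress into a few lines, but the importance-sampling half can be sketched. Below is a minimal, hedged illustration in standard normal space: samples are drawn from a normal density recentred at a FORM-style design point u_star and reweighted by the likelihood ratio. The limit state g and the design point are toy assumptions, not the paper's aerospace application.

```python
import numpy as np

def is_failure_probability(g, u_star, n=10_000, dim=2, seed=1):
    """Importance-sampling estimate of P[g(U) <= 0] for U ~ N(0, I),
    using a standard normal centered at the design point u_star
    (e.g. from FORM) as the instrumental density."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal((n, dim)) + u_star
    # Likelihood ratio phi(u) / phi(u - u_star), computed in log form.
    log_w = -0.5 * np.sum(u**2, axis=1) + 0.5 * np.sum((u - u_star)**2, axis=1)
    fail = np.array([g(ui) <= 0.0 for ui in u])
    return np.mean(fail * np.exp(log_w))

# Toy limit state: failure when u1 + u2 exceeds 5 (true P ~ 2e-4).
g = lambda u: 5.0 - (u[0] + u[1])
u_star = np.array([2.5, 2.5])  # design point for this toy g
print(is_failure_probability(g, u_star))
```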

  11. Statistical model for degraded DNA samples and adjusted probabilities for allelic drop-out

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2012-01-01

    Abstract DNA samples found at a scene of crime or obtained from the debris of a mass disaster accident are often subject to degradation. When using the STR DNA technology, the DNA profile is observed via a so-called electropherogram (EPG), where the alleles are identified as signal peaks above...... data from degraded DNA, where cases with varying amounts of DNA and levels of degradation are investigated....

  12. Statistical model for degraded DNA samples and adjusted probabilities for allelic drop-out

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2012-01-01

    DNA samples found at a scene of crime or obtained from the debris of a mass disaster accident are often subject to degradation. When using the STR DNA technology, the DNA profile is observed via a so-called electropherogram (EPG), where the alleles are identified as signal peaks above a certain...... data from degraded DNA, where cases with varying amounts of DNA and levels of degradation are investigated....

  13. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    OpenAIRE

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective: To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods: We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European...

  14. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data.These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  15. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys.

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-09-01

    To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. We compared 148 MSM aged 18-64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010-2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%-95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  16. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. Results MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%–95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. Conclusions National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. PMID:26965869

  17. Propagating Mixed Uncertainties in Cyber Attacker Payoffs: Exploration of Two-Phase Monte Carlo Sampling and Probability Bounds Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.; Halappanavar, Mahantesh

    2016-09-16

    Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches that model the actions of strategic decision-makers are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker's payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques including two-phase Monte Carlo sampling and probability bounds analysis.
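
A minimal sketch of the two-phase (nested) Monte Carlo idea mentioned above, under invented numbers: the outer loop samples an epistemic quantity (an attacker payoff mean known only to lie in an interval), the inner loop samples aleatory variability around it, and the result is a band of probabilities rather than a single point estimate. The threshold and distributions are hypothetical placeholders, not the paper's model.

```python
import numpy as np

def two_phase_mc(n_outer=200, n_inner=2000, seed=42):
    """Two-phase Monte Carlo: outer loop over epistemic parameters,
    inner loop over aleatory variability, yielding a distribution over
    the probability that the payoff exceeds a threshold."""
    rng = np.random.default_rng(seed)
    threshold = 10.0                            # hypothetical payoff threshold
    exceed_probs = []
    for _ in range(n_outer):
        mu = rng.uniform(6.0, 9.0)              # epistemic: interval-valued mean
        payoffs = rng.normal(mu, 2.0, n_inner)  # aleatory scatter around it
        exceed_probs.append(np.mean(payoffs > threshold))
    lo, hi = np.percentile(exceed_probs, [5, 95])
    return lo, hi

print(two_phase_mc())  # rough 90% band on P(payoff > threshold)
```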

  18. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    ... the desired number of fields are sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates, including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation being positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noise in possibly realistic ranges. In all cases examined, the proportionator...
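
The core sampling-and-estimation loop can be illustrated compactly. The sketch below is a hedged, simplified stand-in (with-replacement PPS and a Hansen-Hurwitz-type estimator, not the proportionator's exact design): fields are drawn with probability proportional to an arbitrary, possibly biased weight, and the weighting inside the estimator removes that bias, exactly the property the abstract emphasizes.

```python
import numpy as np

def pps_ht_total(values, weights, n, seed=0):
    """Draw n fields with probability proportional to a weight (with
    replacement, for simplicity) and return the Hansen-Hurwitz-style
    unbiased estimate of the total of `values`."""
    weights = np.asarray(weights, dtype=float)
    values = np.asarray(values, dtype=float)
    p = weights / weights.sum()          # per-field draw probabilities
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(values), size=n, p=p)
    return np.mean(values[idx] / p[idx])

# Toy example: weight loosely (and imperfectly) related to the counts.
counts = np.array([0, 2, 5, 9, 1, 7, 3])
weight = counts + 1.0                    # crude, biased weight is fine
print(pps_ht_total(counts, weight, n=4), counts.sum())
```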

  19. Attitudes toward Bisexual Men and Women among a Nationally Representative Probability Sample of Adults in the United States

    Science.gov (United States)

    Herbenick, Debby; Friedman, M. Reuel; Schick, Vanessa; Fu, Tsung-Chieh (Jane); Bostwick, Wendy; Bartelt, Elizabeth; Muñoz-Laboy, Miguel; Pletta, David; Reece, Michael; Sandfort, Theo G. M.

    2016-01-01

    As bisexual individuals in the United States (U.S.) face significant health disparities, researchers have posited that these differences may be fueled, at least in part, by negative attitudes, prejudice, stigma, and discrimination toward bisexual individuals from heterosexual and gay/lesbian individuals. Previous studies of individual and social attitudes toward bisexual men and women have been conducted almost exclusively with convenience samples, with limited generalizability to the broader U.S. population. Our study provides an assessment of attitudes toward bisexual men and women among a nationally representative probability sample of heterosexual, gay, lesbian, and other-identified adults in the U.S. Data were collected from the 2015 National Survey of Sexual Health and Behavior (NSSHB), via an online questionnaire with a probability sample of adults (18 years and over) from throughout the U.S. We included two modified 5-item versions of the Bisexualities: Indiana Attitudes Scale (BIAS), validated sub-scales that were developed to measure attitudes toward bisexual men and women. Data were analyzed using descriptive statistics, gamma regression, and paired t-tests. Gender, sexual identity, age, race/ethnicity, income, and educational attainment were all significantly associated with participants' attitudes toward bisexual individuals. In terms of responses to individual scale items, participants were most likely to “neither agree nor disagree” with all attitudinal statements. Across sexual identities, self-identified other participants reported the most positive attitudes, while heterosexual male participants reported the least positive attitudes. As in previous research on convenience samples, we found that a wide range of demographic characteristics were related to attitudes toward bisexual individuals in our nationally representative study of heterosexual, gay/lesbian, and other-identified adults in the U.S. In particular, gender emerged as a significant...

  20. Attitudes toward Bisexual Men and Women among a Nationally Representative Probability Sample of Adults in the United States.

    Science.gov (United States)

    Dodge, Brian; Herbenick, Debby; Friedman, M Reuel; Schick, Vanessa; Fu, Tsung-Chieh Jane; Bostwick, Wendy; Bartelt, Elizabeth; Muñoz-Laboy, Miguel; Pletta, David; Reece, Michael; Sandfort, Theo G M

    2016-01-01

    As bisexual individuals in the United States (U.S.) face significant health disparities, researchers have posited that these differences may be fueled, at least in part, by negative attitudes, prejudice, stigma, and discrimination toward bisexual individuals from heterosexual and gay/lesbian individuals. Previous studies of individual and social attitudes toward bisexual men and women have been conducted almost exclusively with convenience samples, with limited generalizability to the broader U.S. population. Our study provides an assessment of attitudes toward bisexual men and women among a nationally representative probability sample of heterosexual, gay, lesbian, and other-identified adults in the U.S. Data were collected from the 2015 National Survey of Sexual Health and Behavior (NSSHB), via an online questionnaire with a probability sample of adults (18 years and over) from throughout the U.S. We included two modified 5-item versions of the Bisexualities: Indiana Attitudes Scale (BIAS), validated sub-scales that were developed to measure attitudes toward bisexual men and women. Data were analyzed using descriptive statistics, gamma regression, and paired t-tests. Gender, sexual identity, age, race/ethnicity, income, and educational attainment were all significantly associated with participants' attitudes toward bisexual individuals. In terms of responses to individual scale items, participants were most likely to "neither agree nor disagree" with all attitudinal statements. Across sexual identities, self-identified other participants reported the most positive attitudes, while heterosexual male participants reported the least positive attitudes. As in previous research on convenience samples, we found that a wide range of demographic characteristics were related to attitudes toward bisexual individuals in our nationally representative study of heterosexual, gay/lesbian, and other-identified adults in the U.S. In particular, gender emerged as a significant characteristic...

  1. Sexual behaviors, relationships, and perceived health status among adult women in the United States: results from a national probability sample.

    Science.gov (United States)

    Herbenick, Debby; Reece, Michael; Schick, Vanessa; Sanders, Stephanie A; Dodge, Brian; Fortenberry, J Dennis

    2010-10-01

    Past surveys of sexual behavior have demonstrated that female sexual behavior is influenced by medical and sociocultural changes. To be most attentive to women and their sexual lives, it is important to have an understanding of the continually evolving sexual behaviors of contemporary women in the United States. The purpose of this study, the National Survey of Sexual Health and Behavior (NSSHB), was to assess, in a national probability survey of women ages 18-92, the proportion of women in various age cohorts who had engaged in solo and partnered sexual activities in the past 90 days, and to explore associations between participants' sexual behavior and their relationship and perceived health status. Past-year frequencies of masturbation, vaginal intercourse, and anal intercourse were also assessed. A national probability sample of 2,523 women ages 18 to 92 completed a cross-sectional internet-based survey about their sexual behavior. Relationship status; perceived health status; experience of solo masturbation, partnered masturbation, giving oral sex, receiving oral sex, vaginal intercourse, and anal intercourse in the past 90 days; frequency of solo masturbation, vaginal intercourse, and anal intercourse in the past year. Recent solo masturbation, oral sex, and vaginal intercourse were prevalent among women, decreased with age, and varied in their associations with relationship and perceived health status. Recent anal sex and same-sex oral sex were uncommonly reported. Solo masturbation was most frequent among women ages 18 to 39, vaginal intercourse was most frequent among women ages 18 to 29, and anal sex was infrequently reported. Contemporary women in the United States engage in a diverse range of solo and partnered sexual activities, though sexual behavior is less common and more infrequent among older age cohorts. © 2010 International Society for Sexual Medicine.

  2. Sexual behaviors, relationships, and perceived health among adult men in the United States: results from a national probability sample.

    Science.gov (United States)

    Reece, Michael; Herbenick, Debby; Schick, Vanessa; Sanders, Stephanie A; Dodge, Brian; Fortenberry, J Dennis

    2010-10-01

    Population-based data describing men's sexual behaviors and their correlates are needed to provide a foundation for those who deliver sexual health services and programs to men in the United States. The purpose of this study was to assess, in a national probability survey of men ages 18-94 years, the occurrence and frequency of sexual behaviors and their associations with relationship status and health status. A national probability sample of 2,522 men aged 18 to 94 completed a cross-sectional survey about their sexual behaviors, relationship status, and health. Relationship status; health status; experience of solo masturbation, partnered masturbation, giving oral sex, receiving oral sex, vaginal intercourse, and anal intercourse in the past 90 days; frequency of solo masturbation, vaginal intercourse, and anal intercourse in the past year. Masturbation, oral intercourse, and vaginal intercourse are prevalent among men throughout most of their adult life, with both occurrence and frequency varying with age and as functions of relationship type and physical health status. Masturbation is prevalent and frequent across various stages of life and for both those with and without a relational partner, with fewer men in fair to poor health reporting recent masturbation. Patterns of giving oral sex to a female partner were similar to those for receiving oral sex. Vaginal intercourse in the past 90 days was more prevalent among men in their late 20s and 30s than in the other age groups, though it was still reported by approximately 50% of men in the sixth and seventh decades of life. Anal intercourse and sexual interactions with other men were less common than all other sexual behaviors. Contemporary men in the United States engage in diverse solo and partnered sexual activities; however, sexual behavior is less common and more infrequent among older age cohorts. © 2010 International Society for Sexual Medicine.

  3. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.

  4. Comparing rapid methods for detecting Listeria in seafood and environmental samples using the most probable number (MPN) technique.

    Science.gov (United States)

    Cruz, Cristina D; Win, Jessicah K; Chantarachoti, Jiraporn; Mutukumira, Anthony N; Fletcher, Graham C

    2012-02-15

    The standard Bacteriological Analytical Manual (BAM) protocol for detecting Listeria in food and on environmental surfaces takes about 96 h. Some studies indicate that rapid methods, which produce results within 48 h, may be as sensitive and accurate as the culture protocol. As they only give presence/absence results, it can be difficult to compare the accuracy of results generated. We used the Most Probable Number (MPN) technique to evaluate the performance and detection limits of six rapid kits for detecting Listeria in seafood and on an environmental surface compared with the standard protocol. Three seafood products and an environmental surface were inoculated with similar known cell concentrations of Listeria and analyzed according to the manufacturers' instructions. The MPN was estimated using the MPN-BAM spreadsheet. For the seafood products no differences were observed among the rapid kits and efficiency was similar to the BAM method. On the environmental surface the BAM protocol had a higher recovery rate (sensitivity) than any of the rapid kits tested. Clearview™, Reveal®, TECRA® and VIDAS® LDUO detected the cells but only at high concentrations (>10² CFU/10 cm²). Two kits (VIP™ and Petrifilm™) failed to detect 10⁴ CFU/10 cm². The MPN method was a useful tool for comparing the results generated by these presence/absence test kits. There remains a need to develop a rapid and sensitive method for detecting Listeria in environmental samples that performs as well as the BAM protocol, since none of the rapid tests used in this study achieved a satisfactory result. Copyright © 2011 Elsevier B.V. All rights reserved.
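
For readers unfamiliar with the MPN technique used here as the comparison yardstick, the sketch below computes a maximum-likelihood MPN from a dilution series under the standard assumptions (perfectly mixed inoculum, one viable cell suffices to make a tube positive). It is a generic illustration, not the MPN-BAM spreadsheet the authors used.

```python
import math

def mpn(dilution_volumes, n_tubes, n_positive):
    """Maximum-likelihood most probable number (organisms per unit
    amount) for a dilution series: a tube receiving amount v is
    positive with probability 1 - exp(-c * v). Solves the score
    equation for c by bisection on a log scale."""
    def score(c):
        s = 0.0
        for v, n, x in zip(dilution_volumes, n_tubes, n_positive):
            p = 1.0 - math.exp(-c * v)
            s += x * v * math.exp(-c * v) / p - (n - x) * v
        return s
    lo, hi = 1e-6, 1e6
    for _ in range(200):                 # geometric bisection on the score
        mid = math.sqrt(lo * hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic 3-tube series at 0.1, 0.01, 0.001 g with 3, 2, 0 positives:
# the ML answer is close to the tabulated MPN of ~93/g.
print(mpn([0.1, 0.01, 0.001], [3, 3, 3], [3, 2, 0]))
```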

  5. Condom use during most recent vaginal intercourse event among a probability sample of adults in the United States.

    Science.gov (United States)

    Sanders, Stephanie A; Reece, Michael; Herbenick, Debby; Schick, Vanessa; Dodge, Brian; Fortenberry, J Dennis

    2010-10-01

    Correct and consistent condom use remains the most effective way to reduce sexually transmissible infection/HIV transmission during sex and is a highly effective contraceptive method. Understanding correlates of condom use is vital to public health programs. To explore sociodemographic, event characteristics, and experiential correlates of condom use at last penile-vaginal intercourse (PVI). Survey data were collected from a nationally representative probability sample of adults in the United States as part of the National Survey of Sexual Health and Behavior. Condom use/non-use at most recent PVI was the main outcome. Logistic regression analyses predicted condom use from sociodemographic variables (i.e., age, education, race/ethnicity, sexual orientation, health status, type of relationship with sexual partner) and event characteristics (i.e., location of sexual encounter, prior intercourse experience with partner, whether partner had other sex partners in the 6 months prior to sex with the participant; other contraceptive use, alcohol use, marijuana use, and for men, erection medication use). Logistic regression analyses examined evaluations of the sexual aspects of the experience (pleasure, arousal, erection/lubrication difficulty, participant orgasm, partner orgasm) in light of condom use. Condom-protected PVI was significantly greater among younger people, blacks and Hispanics, and those having PVI with a nonrelationship partner. Statistically adjusting for these differences, condom use was significantly associated with fewer previous intercourse experiences with the partner and not using other forms of contraception. The sexual aspects of experience were evaluated similarly regardless of whether or not a condom was used. Public health programs among youths and minorities may underlie higher condom use rates among these groups. Condom use may be further improved by continuing such programs and also expanding outreach to older persons and whites, suggesting

  6. Sexual behavior in the United States: results from a national probability sample of men and women ages 14-94.

    Science.gov (United States)

    Herbenick, Debby; Reece, Michael; Schick, Vanessa; Sanders, Stephanie A; Dodge, Brian; Fortenberry, J Dennis

    2010-10-01

    Despite a demonstrated relationship between sexual behaviors and health, including clinical risks, little is known about contemporary sexual behavior. To assess the rates of sexual behavior among adolescents and adults in the United States. We report the recent (past month, past year) and lifetime prevalence of sexual behaviors in a nationally representative probability sample of 5,865 men and women ages 14 to 94 in the United States (2,936 men, 2,929 women). Behaviors assessed included solo masturbation, partnered masturbation, giving and receiving oral sex, vaginal intercourse, and anal intercourse. Masturbation was common throughout the lifespan and more common than partnered sexual activities during adolescence and older age (70+). Although uncommon among 14- to 15-year-olds, in the past year 18.3% of 16- to 17-year-old males and 22.4% of 16- to 17-year-old females performed oral sex with an other-sex partner. Also in the past year, more than half of women and men ages 18 to 49 engaged in oral sex. The proportion of adults who reported vaginal sex in the past year was highest among men ages 25-39 and for women ages 20-29, then progressively declined among older age groups. More than 20% of men ages 25-49 and women ages 20-39 reported anal sex in the past year. Same-sex sexual behaviors occurring in the past year were uncommonly reported. Men and women engage in a diverse range of solo and partnered sexual behaviors throughout the life course. The rates of contemporary sexual behavior provided in this report will be valuable to those who develop, implement, and evaluate programs that seek to improve societal knowledge related to the prevalence of sexual behaviors and to sexual health clinicians whose work to improve sexual health among the population often requires such rates of behavior. © 2010 International Society for Sexual Medicine.

  7. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  8. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    Science.gov (United States)

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…

  9. The Work Sample Verification and the Calculation of the Statistical, Mathematical and Economical Probability for the Risks of the Direct Procurement

    Directory of Open Access Journals (Sweden)

    Lazăr Cristiana Daniela

    2017-01-01

    Full Text Available Each organization has, among the multiple secondary objectives subordinated to a central objective, that of avoiding contingencies. Direct procurement is carried out on the market in SEAP (Electronic System of Public Procurement), and performant management in a public institution also rests on risk management. The risks may be investigated by econometric simulation, which is performed by using the calculus of probability and a work sample for determining the relevance of these probabilities.

  10. Prevalence and Risk Factors of Overweight and Obesity among Children Aged 6–59 Months in Cameroon: A Multistage, Stratified Cluster Sampling Nationwide Survey

    Science.gov (United States)

    Tchoubi, Sébastien; Sobngwi-Tambekou, Joëlle; Noubiap, Jean Jacques N.; Asangbeh, Serra Lem; Nkoum, Benjamin Alexandre; Sobngwi, Eugene

    2015-01-01

    Background Childhood obesity is one of the most serious public health challenges of the 21st century. This study assessed the prevalence and risk factors of overweight and obesity among children aged 6 months to 5 years in Cameroon in 2011. Methods Four thousand five hundred and eighteen children (2205 boys and 2313 girls) aged 6 to 59 months were sampled from the 2011 Demographic Health Survey (DHS) database. Body Mass Index (BMI) z-scores based on the WHO 2006 reference population were used to estimate overweight (BMI z-score > 2) and obesity (BMI z-score > 3). Regression analyses were performed to investigate risk factors of overweight/obesity. Results The prevalence of overweight and obesity was 8% (1.7% for obesity alone). Boys were more affected by overweight than girls, with prevalences of 9.7% and 6.4%, respectively. The highest prevalence of overweight was observed in the Grassfield area (including people living in the West and North-West regions) (15.3%). Factors independently associated with overweight and obesity included: having an overweight mother (adjusted odds ratio (aOR) = 1.51; 95% CI 1.15 to 1.97) or an obese mother (aOR = 2.19; 95% CI 1.55 to 3.07), compared to having a normal-weight mother; high birth weight (aOR = 1.69; 95% CI 1.24 to 2.28) compared to normal birth weight; male gender (aOR = 1.56; 95% CI 1.24 to 1.95); low birth rank (aOR = 1.35; 95% CI 1.06 to 1.72); being aged 13–24 months (aOR = 1.81; 95% CI 1.21 to 2.66) or 25–36 months (aOR = 2.79; 95% CI 1.93 to 4.13) compared to being aged 45 to 49 months; and living in the Grassfield area (aOR = 2.65; 95% CI 1.87 to 3.79) compared to living in the Forest area. Muslim religion appeared as a protective factor (aOR = 0.67; 95% CI 0.46 to 0.95) compared to Christian religion. Conclusion This study underlines a high prevalence of early childhood overweight with significant disparities between ecological areas of Cameroon. Risk factors of overweight included high maternal BMI, high...

  11. Prevalence and Risk Factors of Overweight and Obesity among Children Aged 6-59 Months in Cameroon: A Multistage, Stratified Cluster Sampling Nationwide Survey.

    Science.gov (United States)

    Tchoubi, Sébastien; Sobngwi-Tambekou, Joëlle; Noubiap, Jean Jacques N; Asangbeh, Serra Lem; Nkoum, Benjamin Alexandre; Sobngwi, Eugene

    2015-01-01

    Childhood obesity is one of the most serious public health challenges of the 21st century. This study assessed the prevalence and risk factors of overweight and obesity among children aged 6 months to 5 years in Cameroon in 2011. Four thousand five hundred and eighteen children (2205 boys and 2313 girls) aged 6 to 59 months were sampled from the 2011 Demographic Health Survey (DHS) database. Body Mass Index (BMI) z-scores based on the WHO 2006 reference population were used to estimate overweight (BMI z-score > 2) and obesity (BMI z-score > 3). Regression analyses were performed to investigate risk factors of overweight/obesity. The prevalence of overweight and obesity was 8% (1.7% for obesity alone). Boys were more affected by overweight than girls, with prevalences of 9.7% and 6.4%, respectively. The highest prevalence of overweight was observed in the Grassfield area (including people living in the West and North-West regions) (15.3%). Factors independently associated with overweight and obesity included: having an overweight mother (adjusted odds ratio (aOR) = 1.51; 95% CI 1.15 to 1.97) or an obese mother (aOR = 2.19; 95% CI 1.55 to 3.07), compared to having a normal-weight mother; high birth weight (aOR = 1.69; 95% CI 1.24 to 2.28) compared to normal birth weight; male gender (aOR = 1.56; 95% CI 1.24 to 1.95); low birth rank (aOR = 1.35; 95% CI 1.06 to 1.72); being aged 13-24 months (aOR = 1.81; 95% CI 1.21 to 2.66) or 25-36 months (aOR = 2.79; 95% CI 1.93 to 4.13) compared to being aged 45 to 49 months; and living in the Grassfield area (aOR = 2.65; 95% CI 1.87 to 3.79) compared to living in the Forest area. Muslim religion appeared as a protective factor (aOR = 0.67; 95% CI 0.46 to 0.95) compared to Christian religion. This study underlines a high prevalence of early childhood overweight with significant disparities between ecological areas of Cameroon. Risk factors of overweight included high maternal BMI, high birth weight, male gender, low birth rank...

  12. Transgender Population Size in the United States: a Meta-Regression of Population-Based Probability Samples

    Science.gov (United States)

    Sevelius, Jae M.

    2017-01-01

    Background. Transgender individuals have a gender identity that differs from the sex they were assigned at birth. The population size of transgender individuals in the United States is not well-known, in part because official records, including the US Census, do not include data on gender identity. Population surveys today more often collect transgender-inclusive gender-identity data, and secular trends in culture and the media have created a somewhat more favorable environment for transgender people. Objectives. To estimate the current population size of transgender individuals in the United States and evaluate any trend over time. Search methods. In June and July 2016, we searched PubMed, Cumulative Index to Nursing and Allied Health Literature, and Web of Science for national surveys, as well as “gray” literature, through an Internet search. We limited the search to 2006 through 2016. Selection criteria. We selected population-based surveys that used probability sampling and included self-reported transgender-identity data. Data collection and analysis. We used random-effects meta-analysis to pool eligible surveys and used meta-regression to address our hypothesis that the transgender population size estimate would increase over time. We used subsample and leave-one-out analysis to assess for bias. Main results. Our meta-regression model, based on 12 surveys covering 2007 to 2015, explained 62.5% of model heterogeneity, with a significant effect for each unit increase in survey year (F = 17.122; df = 1,10; b = 0.026%; P = .002). Extrapolating these results to 2016 suggested a current US population size of 390 adults per 100 000, or almost 1 million adults nationally. This estimate may be more indicative for younger adults, who represented more than 50% of the respondents in our analysis. Authors’ conclusions. Future national surveys are likely to observe higher numbers of transgender people. The large variety in questions used to ask
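
As a hedged illustration of the meta-regression step (the paper fits a random-effects model, which additionally estimates between-survey heterogeneity; the sketch below uses the simpler fixed-effect, inverse-variance weighted version), per-survey prevalence estimates are regressed on survey year. The estimates and variances are hypothetical placeholders, not the twelve surveys analyzed in the paper.

```python
import numpy as np

def meta_regression(estimates, variances, years):
    """Inverse-variance weighted (fixed-effect) meta-regression of
    prevalence estimates on survey year. Returns [intercept, slope]."""
    w = 1.0 / np.asarray(variances, dtype=float)
    X = np.column_stack([np.ones(len(years)), np.asarray(years, dtype=float)])
    W = np.diag(w)
    y = np.asarray(estimates, dtype=float)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Hypothetical per-survey transgender prevalence (%) and variances.
years = [2007, 2009, 2011, 2013, 2015]
est = [0.30, 0.32, 0.38, 0.41, 0.48]
var = [0.004, 0.003, 0.003, 0.002, 0.002]
print(meta_regression(est, var, years))  # slope ~ increase per year
```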

  13. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: concepts of population, sample, and sampling. Initial Ramifications: sampling design and sampling scheme; random numbers and their uses in simple random sampling (SRS); drawing simple random samples with and without replacement; estimation of mean, total, and ratio of totals/means; variance and variance estimation; determination of sample sizes; appendix covering more on equal probability sampling, the Horvitz-Thompson estimator, sufficiency, likelihood, and a non-existence theorem. More Intricacies: unequal probability sampling strategies; PPS sampling. Exploring Improved Ways: stratified sampling; cluster sampling; multi-stage sampling; multi-phase sampling; ratio and regression estimation; controlled sampling. Modeling: super-population modeling; prediction approach; model-assisted approach; Bayesian methods; spatial smoothing; sampling on successive occasions; panel rotation; non-response and not-at-homes; weighting adj...

  14. The stratified Boycott effect

    Science.gov (United States)

    Peacock, Tom; Blanchette, Francois; Bush, John W. M.

    2005-04-01

    We present the results of an experimental investigation of the flows generated by monodisperse particles settling at low Reynolds number in a stably stratified ambient with an inclined sidewall. In this configuration, upwelling beneath the inclined wall associated with the Boycott effect is opposed by the ambient density stratification. The evolution of the system is determined by the relative magnitudes of the container depth, h, and the neutral buoyancy height, hn = c0(ρp − ρf)/|dρ/dz|, where c0 is the particle concentration, ρp the particle density, ρf the mean fluid density, and dρ/dz < 0 the ambient density gradient. For weak stratification, h < hn, the Boycott layer transports dense fluid from the bottom to the top of the system; subsequently, the upper clear layer of dense saline fluid is mixed by convection. For sufficiently strong stratification, h > hn, layering occurs. The lowermost layer is created by clear fluid transported from the base to its neutral buoyancy height, and has a vertical extent hn; subsequently, smaller overlying layers develop. Within each layer, convection erodes the initially linear density gradient, generating a step-like density profile throughout the system that persists after all the particles have settled. Particles are transported across the discrete density jumps between layers by plumes of particle-laden fluid.
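
The regime criterion above reduces to one line of arithmetic; the sketch below evaluates it for invented parameter values (all numbers are illustrative, not taken from the experiments).

```python
def neutral_buoyancy_height(c0, rho_p, rho_f, drho_dz):
    """Neutral buoyancy height h_n = c0 * (rho_p - rho_f) / |d(rho)/dz|
    from the abstract. Comparing h_n with the container depth h
    separates the regimes: layering is expected when h > h_n."""
    return c0 * (rho_p - rho_f) / abs(drho_dz)

# Hypothetical SI values: 2% volume fraction of dense particles in a
# salt-stratified tank with gradient -50 kg/m^4.
h_n = neutral_buoyancy_height(c0=0.02, rho_p=2500.0, rho_f=1020.0, drho_dz=-50.0)
print(f"h_n = {h_n:.2f} m")  # ~0.59 m; a deeper tank would form layers
```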

  15. Development and application of a most probable number-PCR assay to quantify flagellate populations in soil samples

    DEFF Research Database (Denmark)

    Fredslund, Line; Ekelund, Flemming; Jacobsen, Carsten Suhr

    2001-01-01

    This paper reports on the first successful molecular detection and quantification of soil protozoa. Quantification of heterotrophic flagellates and naked amoebae in soil has traditionally relied on dilution culturing techniques, followed by most-probable-number (MPN) calculations. Such methods are biased by differences in the culturability of soil protozoa and are unable to quantify specific taxonomic groups, and the results are highly dependent on the choice of media and the skills of the microscopists. Successful detection of protozoa in soil by DNA techniques requires (i) the development...

  16. Sexual diversity in the United States: Results from a nationally representative probability sample of adult women and men.

    Directory of Open Access Journals (Sweden)

    Debby Herbenick

    Full Text Available In 2015, we conducted a cross-sectional, Internet-based, U.S. nationally representative probability survey of 2,021 adults (975 men, 1,046 women) focused on a broad range of sexual behaviors. Individuals invited to participate were from the GfK KnowledgePanel®. The survey was titled the 2015 Sexual Exploration in America Study and survey completion took about 12 to 15 minutes. The survey was confidential and the researchers never had access to respondents' identifiers. Respondents reported on demographic items, lifetime and recent sexual behaviors, and the appeal of 50+ sexual behaviors. Most (>80%) reported lifetime masturbation, vaginal sex, and oral sex. Lifetime anal sex was reported by 43% of men (insertive) and 37% of women (receptive). Common lifetime sexual behaviors included wearing sexy lingerie/underwear (75% women, 26% men), sending/receiving digital nude/semi-nude photos (54% women, 65% men), reading erotic stories (57% of participants), public sex (≥43%), role-playing (≥22%), tying/being tied up (≥20%), spanking (≥30%), and watching sexually explicit videos/DVDs (60% women, 82% men). Having engaged in threesomes (10% women, 18% men) and playful whipping (≥13%) were less common. Lifetime group sex, sex parties, taking a sexuality class/workshop, and going to BDSM parties were uncommon (each <8%). More Americans identified behaviors as "appealing" than had engaged in them. Romantic/affectionate behaviors were among those most commonly identified as appealing for both men and women. The appeal of particular behaviors was associated with greater odds that the individual had ever engaged in the behavior. This study contributes to our understanding of more diverse adult sexual behaviors than has previously been captured in U.S. nationally representative probability surveys. Implications for sexuality educators, clinicians, and individuals in the general population are discussed.

  17. Sexual diversity in the United States: Results from a nationally representative probability sample of adult women and men

    Science.gov (United States)

    Herbenick, Debby; Bowling, Jessamyn; Fu, Tsung-Chieh (Jane); Guerra-Reyes, Lucia; Sanders, Stephanie

    2017-01-01

    In 2015, we conducted a cross-sectional, Internet-based, U.S. nationally representative probability survey of 2,021 adults (975 men, 1,046 women) focused on a broad range of sexual behaviors. Individuals invited to participate were from the GfK KnowledgePanel®. The survey was titled the 2015 Sexual Exploration in America Study and survey completion took about 12 to 15 minutes. The survey was confidential and the researchers never had access to respondents’ identifiers. Respondents reported on demographic items, lifetime and recent sexual behaviors, and the appeal of 50+ sexual behaviors. Most (>80%) reported lifetime masturbation, vaginal sex, and oral sex. Lifetime anal sex was reported by 43% of men (insertive) and 37% of women (receptive). Common lifetime sexual behaviors included wearing sexy lingerie/underwear (75% women, 26% men), sending/receiving digital nude/semi-nude photos (54% women, 65% men), reading erotic stories (57% of participants), public sex (≥43%), role-playing (≥22%), tying/being tied up (≥20%), spanking (≥30%), and watching sexually explicit videos/DVDs (60% women, 82% men). Having engaged in threesomes (10% women, 18% men) and playful whipping (≥13%) were less common. Lifetime group sex, sex parties, taking a sexuality class/workshop, and going to BDSM parties were uncommon (each <8%). More Americans identified behaviors as “appealing” than had engaged in them. Romantic/affectionate behaviors were among those most commonly identified as appealing for both men and women. The appeal of particular behaviors was associated with greater odds that the individual had ever engaged in the behavior. This study contributes to our understanding of more diverse adult sexual behaviors than has previously been captured in U.S. nationally representative probability surveys. Implications for sexuality educators, clinicians, and individuals in the general population are discussed. PMID:28727762

  18. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    Directory of Open Access Journals (Sweden)

    Simon van Mourik

    2014-06-01

    Full Text Available Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and for the subsequent uncertainty analysis using such a sample is supplied as Supplemental Information.
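
A minimal sketch of the recommended workflow, assuming a toy exponential-decay model and a plain random-walk Metropolis sampler in place of the Differential Evolution Markov Chain sampler used in the paper: generate an ensemble from the parameter posterior, then propagate every ensemble member through the prediction of interest.

```python
import numpy as np

def metropolis(logpost, x0, n_steps=5000, step=0.1, seed=0):
    """Random-walk Metropolis sampler returning a parameter ensemble."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    lp = logpost(x)
    chain = []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(x.shape)
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        chain.append(x.copy())
    return np.array(chain)

# Toy model y = a * exp(-b t) with noisy synthetic data.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.02, 0.58, 0.36, 0.20])
def logpost(theta):
    a, b = theta
    if a <= 0 or b <= 0:
        return -np.inf                 # flat prior on positive params
    resid = y - a * np.exp(-b * t)
    return -0.5 * np.sum(resid**2) / 0.05**2

sample = metropolis(logpost, x0=[1.0, 0.5])[1000:]  # drop burn-in
pred = sample[:, 0] * np.exp(-sample[:, 1] * 5.0)   # per-draw prediction at t=5
print(pred.mean(), pred.std())                      # per-prediction uncertainty
```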

  19. Gender Expression, Violence, and Bullying Victimization: Findings from Probability Samples of High School Students in 4 US School Districts

    Science.gov (United States)

    Gordon, Allegra R.; Conron, Kerith J.; Calzo, Jerel P.; White, Matthew T.; Reisner, Sari L.; Austin, S. Bryn

    2018-01-01

    Background: Young people may experience school-based violence and bullying victimization related to their gender expression, independent of sexual orientation identity. However, the associations between gender expression and bullying and violence have not been examined in racially and ethnically diverse population-based samples of high school…

  20. Probability estimation of rare extreme events in the case of small samples: Technique and examples of analysis of earthquake catalogs

    Science.gov (United States)

    Pisarenko, V. F.; Rodkin, M. V.; Rukavishnikova, T. A.

    2017-11-01

    The most general approach to studying the recurrence law in the area of the rare largest events is associated with the use of limit theorems from the theory of extreme values. In this paper, we use the Generalized Pareto Distribution (GPD). The unknown GPD parameters are typically determined by the method of maximum likelihood (ML). However, ML estimation is only optimal for fairly large samples (>200-300), whereas in many practically important cases there are only dozens of large events. It is shown that in the case of a small number of events, the highest accuracy with the GPD is provided by the method of quantiles (MQ). To illustrate these methodological results, we have formed compiled data sets characterizing the tails of the distributions for typical subduction zones, regions of intracontinental seismicity, and the zones of midoceanic (MO) ridges. This approach paves the way for designing a new method for seismic risk assessment. Here, instead of the unstable characteristic, the uppermost possible magnitude Mmax, it is recommended to use the quantiles of the distribution of random maxima for a future time interval. The results of calculating such quantiles are presented.
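
A hedged sketch of one simple variant of the method of quantiles for the GPD (the paper's exact estimator may differ): match two empirical quantiles to their GPD expressions and solve the resulting equation for the shape parameter by bisection.

```python
import numpy as np

def gpd_quantile_fit(x, p1=0.5, p2=0.9):
    """Method-of-quantiles fit of a GPD with cdf
    F(x) = 1 - (1 + xi*x/sigma)^(-1/xi): match the empirical p1- and
    p2-quantiles; the quantile ratio depends only on xi and increases
    with it, so xi is found by bisection, then sigma follows."""
    q1, q2 = np.quantile(x, [p1, p2])
    target = q2 / q1
    def ratio(xi):
        if abs(xi) < 1e-9:                     # xi -> 0: exponential limit
            return np.log(1 - p2) / np.log(1 - p1)
        return ((1 - p2) ** -xi - 1) / ((1 - p1) ** -xi - 1)
    lo, hi = -0.9, 5.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    xi = 0.5 * (lo + hi)
    sigma = (xi * q1 / ((1 - p1) ** -xi - 1)
             if abs(xi) > 1e-9 else -q1 / np.log(1 - p1))
    return xi, sigma

rng = np.random.default_rng(3)
data = rng.pareto(3.0, size=60)   # small, heavy-tailed sample
print(gpd_quantile_fit(data))     # xi should come out roughly near 1/3
```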

  1. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  2. Electromagnetic waves in stratified media

    CERN Document Server

    Wait, James R; Fock, V A; Wait, J R

    2013-01-01

    International Series of Monographs in Electromagnetic Waves, Volume 3: Electromagnetic Waves in Stratified Media provides information pertinent to electromagnetic waves in media whose properties differ in one particular direction. This book discusses the important feature of the waves that enables communications at global distances. Organized into 13 chapters, this volume begins with an overview of the general analysis of the electromagnetic response of a plane stratified medium comprising any number of parallel homogeneous layers. This text then explains the reflection of electromagnetic...

  3. A scenario tree model for the Canadian Notifiable Avian Influenza Surveillance System and its application to estimation of probability of freedom and sample size determination.

    Science.gov (United States)

    Christensen, Jette; Stryhn, Henrik; Vallières, André; El Allaki, Farouk

    2011-05-01

    In 2008, Canada designed and implemented the Canadian Notifiable Avian Influenza Surveillance System (CanNAISS) with six surveillance activities in a phased-in approach. CanNAISS was a surveillance system because it had more than one surveillance activity or component in 2008: passive surveillance; pre-slaughter surveillance; and voluntary enhanced notifiable avian influenza surveillance. Our objectives were to give a short overview of two active surveillance components in CanNAISS and to describe the CanNAISS scenario tree model and its application to the estimation of the probability of populations being free of NAI virus infection and to sample size determination. Our data from the pre-slaughter surveillance component included diagnostic test results from 6296 serum samples representing 601 commercial chicken and turkey farms collected from 25 August 2008 to 29 January 2009. In addition, we included data from a sub-population of farms with high biosecurity standards: 36,164 samples from 55 farms sampled repeatedly over the 24-month study period from January 2007 to December 2008. All submissions were negative for Notifiable Avian Influenza (NAI) virus infection. We developed the CanNAISS scenario tree model so that it estimates the surveillance component sensitivity and the probability of a population being free of NAI at the 0.01 farm-level and 0.3 within-farm-level design prevalences. We propose that a general model, such as the CanNAISS scenario tree model, may have a broader application than more detailed models that require disease-specific input parameters, such as relative risk estimates. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
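
The probability-of-freedom arithmetic can be sketched compactly. The snippet below is a simplified binomial stand-in for a full scenario tree: it combines the number of farms tested, an assumed farm-level test sensitivity (the 0.9 here is a hypothetical placeholder, not a CanNAISS parameter), and the 0.01 farm-level design prevalence into a component sensitivity, then updates a prior probability of infection after uniformly negative results.

```python
def component_sensitivity(n_farms_tested, farm_sensitivity, farm_prevalence):
    """Probability that at least one infected farm is detected if the
    population is infected at the design prevalence (simple binomial
    approximation of a scenario tree)."""
    p_detect_per_farm = farm_prevalence * farm_sensitivity
    return 1.0 - (1.0 - p_detect_per_farm) ** n_farms_tested

def prob_freedom(prior_infection, component_se):
    """Posterior probability the population is free of infection after
    all surveillance results come back negative:
    (1 - p) / (1 - p * CSe)."""
    p = prior_infection
    return (1.0 - p) / (1.0 - p * component_se)

cse = component_sensitivity(601, farm_sensitivity=0.9, farm_prevalence=0.01)
print(cse, prob_freedom(prior_infection=0.05, component_se=cse))
```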

  4. Stratified medicine and reimbursement issues

    NARCIS (Netherlands)

    Fugel, Hans-Joerg; Nuijten, Mark; Postma, Maarten

    2012-01-01

    Stratified Medicine (SM) has the potential to target patient populations who will most benefit from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation to

  5. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  6. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  7. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  8. Sexual risk as an outcome of social oppression: data from a probability sample of Latino gay men in three U.S. cities.

    Science.gov (United States)

    Díaz, Rafael M; Ayala, George; Bein, Edward

    2004-08-01

    Based on data from a probability sample of 912 Latino gay men in 3 U.S. cities, a multivariate model of sexual risk was tested, including experiences of homophobia, racism, and poverty as predictors. Participants reported multiple instances of verbal and physical abuse, rude mistreatment, and discrimination on account of their sexual orientation and their race or ethnicity. Many reported experiences of poverty, such as inability to pay for basic necessities of food or shelter. Men who reported more instances of social discrimination and financial hardship were more psychologically distressed and more likely to participate in "difficult" sexual situations, as predicted. Participation in difficult sexual situations mediates the effects of social oppression and psychological distress on sexual risk behavior. (c) 2004 APA

  9. A technique of evaluating most probable stochastic variables from a small number of samples and their accuracies and degrees of confidence

    Energy Technology Data Exchange (ETDEWEB)

    Katoh, K [Ibaraki Pref. Univ. Health Sci. (Japan)]

    1997-12-31

    A problem of estimating the stochastic characteristics of a population from a small number of samples is solved as an inverse problem, from the viewpoint of information theory and with Bayesian statistics. For both Poisson and Bernoulli processes, the most probable values of the characteristics of the mother population, together with their accuracies and degrees of confidence, are successfully obtained. Mathematical expressions are given for the general case, where a limited amount of information and/or knowledge about the stochastic characteristics is available, and for a special case, where no a priori information or knowledge is available. Mathematical properties of the solutions obtained and their practical application to problems in radiation measurement are also discussed.

  10. The Stratified Legitimacy of Abortions.

    Science.gov (United States)

    Kimport, Katrina; Weitz, Tracy A; Freedman, Lori

    2016-12-01

    Roe v. Wade was heralded as an end to unequal access to abortion care in the United States. However, today, despite being common and safe, abortion is performed only selectively in hospitals and private practices. Drawing on 61 interviews with obstetrician-gynecologists in these settings, we examine how they determine which abortions to perform. We find that they distinguish between more and less legitimate abortions, producing a narrative of stratified legitimacy that privileges abortions for intended pregnancies, when the fetus is unhealthy, and when women perform normative gendered sexuality, including distress about the abortion, guilt about failure to contracept, and desire for motherhood. This stratified legitimacy can perpetuate socially-inflected inequality of access and normative gendered sexuality. Additionally, we argue that the practice by physicians of distinguishing among abortions can legitimate legislative practices that regulate and restrict some kinds of abortion, further constraining abortion access. © American Sociological Association 2016.

  11. RADIAL STABILITY IN STRATIFIED STARS

    International Nuclear Information System (INIS)

    Pereira, Jonas P.; Rueda, Jorge A.

    2015-01-01

    We formulate within a generalized distributional approach the treatment of the stability against radial perturbations for both neutral and charged stratified stars in Newtonian and Einstein's gravity. We obtain from this approach the boundary conditions connecting any two phases within a star and underline its relevance for realistic models of compact stars with phase transitions, owing to the modification of the star's set of eigenmodes with respect to the continuous case.

  12. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  13. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities.
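
    The likelihood described here can be written down directly: a detected value contributes the lognormal density weighted by the probability of a nonzero observation, while each censored observation is either a true zero or a positive value below the detection limit. A minimal sketch under those assumptions follows; the variable names and toy data are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

def neg_log_likelihood(params, detected, n_censored, limit):
    """Negative log-likelihood for a lognormal with a point mass p0 at zero,
    left-censored at `limit` (a sketch; names are illustrative)."""
    mu, log_sigma, logit_p0 = params
    sigma = np.exp(log_sigma)                 # keep sigma positive
    p0 = 1.0 / (1.0 + np.exp(-logit_p0))      # keep p0 in (0, 1)
    dist = lognorm(s=sigma, scale=np.exp(mu))
    ll = np.sum(np.log1p(-p0) + dist.logpdf(detected))             # detected values
    ll += n_censored * np.log(p0 + (1.0 - p0) * dist.cdf(limit))   # below-limit values
    return -ll

detected = np.array([1.2, 0.8, 2.5, 1.1, 3.0])    # toy measurements above the limit
fit = minimize(neg_log_likelihood, x0=[0.0, 0.0, 0.0],
               args=(detected, 7, 0.5), method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
p0_hat = 1.0 / (1.0 + np.exp(-fit.x[2]))          # estimated proportion of true zeros
```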

  14. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think … analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
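
    The phenomenon motivating this work is easy to reproduce numerically: for subexponential (e.g., Pareto) summands, the tail of the sum is dominated by a single large term, so P(S_n > x) is approximately n·P(X_1 > x) far in the tail, while crude Monte Carlo needs very large sample sizes there. A small illustration, not one of the dissertation's estimators:

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_prob_crude_mc(alpha, n_terms, x, n_sim=10**6):
    # Crude Monte Carlo estimate of P(X_1 + ... + X_n > x) for classical
    # Pareto terms with survival function P(X > t) = t**(-alpha), t >= 1.
    sums = (1.0 + rng.pareto(alpha, size=(n_sim, n_terms))).sum(axis=1)
    return (sums > x).mean()

alpha, n_terms, x = 1.5, 5, 100.0
print(tail_prob_crude_mc(alpha, n_terms, x))   # simulation estimate
print(n_terms * x ** (-alpha))                 # subexponential asymptotic, same order
```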

  15. The development and validation of the Male Genital Self-Image Scale: results from a nationally representative probability sample of men in the United States.

    Science.gov (United States)

    Herbenick, Debby; Schick, Vanessa; Reece, Michael; Sanders, Stephanie A; Fortenberry, J Dennis

    2013-06-01

    Numerous factors may affect men's sexual experiences, including their health status, past trauma or abuse, medication use, relationships, mood, anxiety, and body image. Little research has assessed the influence of men's genital self-image on their sexual function or behaviors, and none has done so in a nationally representative sample. The purpose of this study was to assess, in a nationally representative probability sample of men ages 18 to 60, the reliability and validity of the Male Genital Self-Image Scale (MGSIS), and to examine the relationship between scores on the MGSIS and men's scores on the International Index of Erectile Function (IIEF). The MGSIS was developed in two stages. Phase One involved a review of the literature and an analysis of cross-sectional survey data. Phase Two involved an administration of the scale items to a nationally representative sample of men in the United States ages 18 to 60. Measures include demographic items, the IIEF, and the MGSIS. Overall, most men felt positively about their genitals. However, 24.6% of men expressed some discomfort letting a healthcare provider examine their genitals and about 20% reported dissatisfaction with their genital size. The MGSIS was found to be reliable and valid, with the MGSIS-5 (consisting of five items) being the best fit to the data. The MGSIS was found to be a reliable and valid measure. In addition, men's scores on the MGSIS-5 were found to be positively related to men's scores on the IIEF. © 2013 International Society for Sexual Medicine.

  16. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  17. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  18. Disparities in adverse childhood experiences among sexual minority and heterosexual adults: results from a multi-state probability-based sample.

    Directory of Open Access Journals (Sweden)

    Judith P Andersen

    Adverse childhood experiences (e.g., physical, sexual and emotional abuse, neglect, exposure to domestic violence, parental discord, familial mental illness, incarceration and substance abuse) constitute a major public health problem in the United States. The Adverse Childhood Experiences (ACE) scale is a standardized measure that captures multiple developmental risk factors beyond sexual, physical and emotional abuse. Lesbian, gay, and bisexual (i.e., sexual minority) individuals may experience disproportionately higher prevalence of adverse childhood experiences. Objective: To examine, using the ACE scale, prevalence of childhood physical, emotional, and sexual abuse and childhood household dysfunction among sexual minority and heterosexual adults. Methods: Analyses were conducted using a probability-based sample of data pooled from three U.S. states' Behavioral Risk Factor Surveillance System (BRFSS) surveys (Maine, Washington, Wisconsin) that administered the ACE scale and collected information on sexual identity (n = 22,071). Results: Compared with heterosexual respondents, gay/lesbian and bisexual individuals experienced increased odds of six of eight and seven of eight adverse childhood experiences, respectively. Sexual minority persons had higher rates of adverse childhood experiences (IRR = 1.66 gay/lesbian; 1.58 bisexual) compared to their heterosexual peers. Conclusions: Sexual minority individuals have increased exposure to multiple developmental risk factors beyond physical, sexual and emotional abuse. We recommend the use of the Adverse Childhood Experiences scale in future research examining health disparities among this minority population.

  19. Free Falling in Stratified Fluids

    Science.gov (United States)

    Lam, Try; Vincent, Lionel; Kanso, Eva

    2017-11-01

    Leaves falling in air and discs falling in water are examples of unsteady descents due to complex interaction between gravitational and aerodynamic forces. Understanding these descent modes is relevant to many branches of engineering and science, from estimating the behavior of re-entry space vehicles to studying the biomechanics of seed dispersion. For regularly shaped objects falling in homogeneous fluids, the motion is relatively well understood. However, less is known about how density stratification of the fluid medium affects the falling behavior. Here, we experimentally investigate the descent of discs in both pure water and in stable linearly stratified fluids for Froude numbers Fr 1 and Reynolds numbers Re between 1000–2000. We found that stable stratification (1) enhances the radial dispersion of the disc at landing, (2) increases the descent time, (3) decreases the inclination (or nutation) angle, and (4) decreases the fluttering amplitude while falling. We conclude by commenting on how the corresponding information can be used as a predictive model for objects free falling in stratified fluids.

  20. Stratified Medicine and Reimbursement Issues

    Directory of Open Access Journals (Sweden)

    Hans-Joerg eFugel

    2012-10-01

    Stratified Medicine (SM) has the potential to target patient populations who will most benefit from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation to strengthen the value proposition to pricing and reimbursement (P&R) authorities. However, the introduction of SM challenges current reimbursement schemes in many EU countries and the US, as different P&R policies have been adopted for drugs and diagnostics. Also, there is a lack of a consistent process for value assessment of more complex diagnostics in these markets. New, innovative approaches and more flexible P&R systems are needed to reflect the added value of diagnostic tests and to stimulate investments in new technologies. Yet, the framework for access of diagnostic-based therapies still requires further development, while setting the right incentives and appropriately aligning stakeholders' interests when realizing long-term patient benefits. This article addresses the reimbursement challenges of SM approaches in several EU countries and the US, outlining some options to overcome existing reimbursement barriers for stratified medicine.

  1. HIV-related risk behaviours and the correlates among rickshaw pullers of Kamrangirchar, Dhaka, Bangladesh: a cross-sectional study using probability sampling

    Directory of Open Access Journals (Sweden)

    Ravari Shahrzad

    2009-03-01

    Background: National HIV serological and behavioural surveillance of Bangladesh repeatedly demonstrated a very high proportion of rickshaw pullers in Dhaka city having sex with female sex workers (FSWs) and using illicit substances. However, no study has been conducted to identify the correlates of having sex with FSWs among this population. This study aimed to describe the behavioural profile of rickshaw pullers in Dhaka city using probability samples and to identify the correlates of having sex with FSWs in order to focus HIV prevention intervention. Methods: Six hundred rickshaw pullers were randomly selected from rickshaw garages in the Kamrangirchar area, the single largest slum cluster of Dhaka, Bangladesh, during March–April 2008 using the Probability Proportional to Size method. Participants were interviewed, with a response rate of 99.2% (n = 595), using a structured questionnaire and asked about illicit substance use, sexual behaviour and risk perception for HIV and sexually transmitted diseases. Independent predictors of having sex with FSWs were analysed by multivariate analysis. A qualitative study was subsequently conducted with 30 rickshaw pullers to supplement the findings of the initial survey. Results: The proportions of survey respondents who had sex with FSWs and who used illicit substances in the previous 12-month period were 7.9% and 24.9%, respectively, much lower than the results of the 2003–04 behavioural surveillance (72.8% and 89.9%, respectively). Multivariate analysis revealed that younger age, being never married, living alone with family remaining in other districts, and using illicit substances in the previous 12 months were significantly associated with having sex with FSWs. Conclusion: HIV-related risk behaviour of our study population of rickshaw pullers was lower than what has been suggested by the results of behavioural surveillance. While this discrepancy should be

  2. Information content of household-stratified epidemics

    Directory of Open Access Journals (Sweden)

    T.M. Kinyanjui

    2016-09-01

    Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs.

  3. Information content of household-stratified epidemics.

    Science.gov (United States)

    Kinyanjui, T M; Pellis, L; House, T

    2016-09-01

    Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
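
    The information criterion used in these two records can be sketched concretely: approximate the posterior under a candidate design by Monte Carlo draws (e.g., from MCMC or ABC), estimate its Shannon (differential) entropy, and prefer designs with lower posterior entropy. A minimal histogram-based version, with illustrative names and not the authors' actual code:

```python
import numpy as np

def posterior_entropy(draws, bins=50):
    """Histogram estimate of the differential entropy (in nats) of a
    one-dimensional posterior represented by Monte Carlo draws."""
    density, edges = np.histogram(draws, bins=bins, density=True)
    widths = np.diff(edges)
    mass = density * widths            # probability mass per bin
    nz = mass > 0
    return -np.sum(mass[nz] * np.log(density[nz]))

# Designs whose simulated posteriors have lower entropy are more informative;
# e.g., compare entropy under a cross-sectional vs. a cohort sampling scheme.
rng = np.random.default_rng(1)
print(posterior_entropy(rng.normal(0.0, 1.0, 10**5)))  # roughly 1.42 for N(0, 1)
```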

  4. Suppression of stratified explosive interactions

    Energy Technology Data Exchange (ETDEWEB)

    Meeks, M.K.; Shamoun, B.I.; Bonazza, R.; Corradini, M.L. [Wisconsin Univ., Madison, WI (United States). Dept. of Nuclear Engineering and Engineering Physics]

    1998-01-01

    Stratified Fuel-Coolant Interaction (FCI) experiments with Refrigerant-134a and water were performed in a large-scale system. Air was uniformly injected into the coolant pool to establish a pre-existing void which could suppress the explosion. Two competing effects due to the variation of the air flow rate seem to influence the intensity of the explosion in this geometrical configuration. At low flow rates, although the injected air increases the void fraction, the concurrent agitation and mixing increases the intensity of the interaction. At higher flow rates, the increase in void fraction tends to attenuate the propagated pressure wave generated by the explosion. Experimental results show a complete suppression of the vapor explosion at high rates of air injection, corresponding to an average void fraction of larger than 30%. (author)

  5. Substance use among women receiving post-rape medical care, associated post-assault concerns and current substance abuse: results from a national telephone household probability sample.

    Science.gov (United States)

    McCauley, Jenna L; Kilpatrick, Dean G; Walsh, Kate; Resnick, Heidi S

    2013-04-01

    To examine post-rape substance use, associated post-rape medical and social concern variables, and past year substance abuse among women reporting having received medical care following a most recent or only lifetime incident of rape. Using a subsample of women who received post-rape medical care following a most recent or only rape incident (n=104) drawn from a national household probability sample of U.S. women, the current study described the extent of peritraumatic substance use, past year substance misuse behaviors, post-rape HIV and pregnancy concerns, and lifetime mental health service utilization as a function of substance use at time of incident. One-third (33%) of women seeking post-rape medical attention reported consuming alcohol or drugs at the time of their rape incident. Nearly one in four (24.7%) and one in seven (15%) women seeking medical attention following their most recent rape incident endorsed drug (marijuana, illicit, non-medical use of prescription drugs, or club drug) use or met substance abuse criteria, respectively, in the past year. One in twelve (8.4%) women reported at least monthly binge drinking in the past year. Approximately two-thirds of women reported seeking services for mental health needs in their lifetime. Post-rape concerns among women reporting peritraumatic substance use were not significantly different from those of women not reporting such use. Substance use was reported by approximately one-third of women and past year substance abuse was common among those seeking post-rape medical care. Implications for service delivery, intervention implementation, and future research are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. HLA-Cw Allele Frequency in Definite Meniere’s Disease Compared to Probable Meniere’s Disease and Healthy Controls in an Iranian Sample

    Directory of Open Access Journals (Sweden)

    Sasan Dabiri

    2016-05-01

    Introduction: Several lines of evidence support the contribution of autoimmune mechanisms to the pathogenesis of Meniere’s disease. The aim of this study was to determine the association between HLA-Cw alleles in patients with definite Meniere’s disease, patients with probable Meniere’s disease, and a control group. Materials and Methods: HLA-Cw genotyping was performed in 23 patients with definite Meniere’s disease, 24 with probable Meniere’s disease, and 91 healthy normal subjects, using the sequence-specific primers polymerase chain reaction technique. The statistical analysis was performed using Stata 8 software. Results: There was a significant association between HLA-Cw*04 and HLA-Cw*16 in both definite and probable Meniere’s disease compared to normal healthy controls. We observed a significant difference in HLA-Cw*12 frequencies between patients with definite Meniere’s disease compared to patients with probable Meniere’s disease (P=0.04). The frequency of HLA-Cw*18 is significantly higher in healthy controls (P=0.002). Conclusion: Our findings support the role of HLA-Cw alleles in both definite and probable Meniere’s disease. In addition, differences in HLA-Cw*12 frequency in definite and probable Meniere’s disease in our study’s population might indicate distinct immune and inflammatory mechanisms involved in each condition.

  7. Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison With a Probability Sample Interview Survey

    Science.gov (United States)

    Burkill, Sarah; Couper, Mick P; Conrad, Frederick; Clifton, Soazig; Tanton, Clare; Phelps, Andrew; Datta, Jessica; Mercer, Catherine H; Sonnenberg, Pam; Prah, Philip; Mitchell, Kirstin R; Wellings, Kaye; Johnson, Anne M; Copas, Andrew J

    2014-01-01

    Background: Nonprobability Web surveys using volunteer panels can provide a relatively cheap and quick alternative to traditional health and epidemiological surveys. However, concerns have been raised about their representativeness. Objective: The aim was to compare results from different Web panels with a population-based probability sample survey (n=8969 aged 18-44 years) that used computer-assisted self-interview (CASI) for sensitive behaviors, the third British National Survey of Sexual Attitudes and Lifestyles (Natsal-3). Methods: Natsal-3 questions were included on 4 nonprobability Web panel surveys (n=2000 to 2099), 2 using basic quotas based on age and sex, and 2 using modified quotas based on additional variables related to key estimates. Results for sociodemographic characteristics were compared with external benchmarks and for sexual behaviors and opinions with Natsal-3. Odds ratios (ORs) were used to express differences between the benchmark data and each survey for each variable of interest. A summary measure of survey performance was the average absolute OR across variables. Another summary measure was the number of key estimates for which the survey differed significantly (at the 5% level) from the benchmarks. Results: For sociodemographic variables, the Web surveys were less representative of the general population than Natsal-3. For example, for men, the average absolute OR for Natsal-3 was 1.14, whereas for the Web surveys the average absolute ORs ranged from 1.86 to 2.30. For all Web surveys, approximately two-thirds of the key estimates of sexual behaviors were different from Natsal-3 and the average absolute ORs ranged from 1.32 to 1.98. Differences were appreciable even for questions asked by CASI in Natsal-3. No single Web survey performed consistently better than any other did. Modified quotas slightly improved results for men, but not for women. Conclusions: Consistent with studies from other countries on less sensitive topics, volunteer Web

  8. PHOTOSPHERIC EMISSION FROM STRATIFIED JETS

    International Nuclear Information System (INIS)

    Ito, Hirotaka; Nagataki, Shigehiro; Ono, Masaomi; Lee, Shiu-Hang; Mao, Jirong; Yamada, Shoichi; Pe'er, Asaf; Mizuta, Akira; Harikae, Seiji

    2013-01-01

    We explore photospheric emissions from stratified two-component jets, wherein a highly relativistic spine outflow is surrounded by a wider and less relativistic sheath outflow. Thermal photons are injected in regions of high optical depth and propagated until the photons escape at the photosphere. Because of the presence of shear in velocity (Lorentz factor) at the boundary of the spine and sheath region, a fraction of the injected photons are accelerated using a Fermi-like acceleration mechanism such that a high-energy power-law tail is formed in the resultant spectrum. We show, in particular, that if a velocity shear with a considerable variance in the bulk Lorentz factor is present, the high-energy part of observed gamma-ray bursts (GRBs) photon spectrum can be explained by this photon acceleration mechanism. We also show that the accelerated photons might also account for the origin of the extra-hard power-law component above the bump of the thermal-like peak seen in some peculiar bursts (e.g., GRB 090510, 090902B, 090926A). We demonstrate that time-integrated spectra can also reproduce the low-energy spectrum of GRBs consistently using a multi-temperature effect when time evolution of the outflow is considered. Last, we show that the empirical E_p–L_p relation can be explained by differences in the outflow properties of individual sources.

  9. Prevalence of masturbation and associated factors in a British national probability survey

    OpenAIRE

    Gerressu, Makeda; Mercer, Catherine H.; Graham, Cynthia A.; Wellings, Kaye; Johnson, Anne M.

    2008-01-01

    A stratified probability sample survey of the British general population, aged 16 to 44 years, was conducted from 1999 to 2001 (N = 11,161) using face-to-face interviewing and computer-assisted self-interviewing. We used these data to estimate the population prevalence of masturbation, and to identify sociodemographic, sexual behavioral, and attitudinal factors associated with repo...

  10. Designing Wood Supply Scenarios from Forest Inventories with Stratified Predictions

    Directory of Open Access Journals (Sweden)

    Philipp Kilham

    2018-02-01

    Forest growth and wood supply projections are increasingly used to estimate the future availability of woody biomass and the correlated effects on forests and climate. This research parameterizes an inventory-based business-as-usual wood supply scenario, with a focus on southwest Germany and the period 2002–2012, with a stratified prediction. First, the Classification and Regression Trees algorithm groups the inventory plots into strata with corresponding harvest probabilities. Second, Random Forest algorithms generate individual harvest probabilities for the plots of each stratum. Third, the plots with the highest individual probabilities are selected as harvested until the harvest probability of the stratum is fulfilled. Fourth, the harvested volume of these plots is predicted with a linear regression model trained on harvested plots only. To illustrate the pros and cons of this method, it is compared to a direct harvested volume prediction with linear regression, and a combination of logistic regression and linear regression. Direct harvested volume regression predicts comparable volume figures, but generates these volumes in a way that differs from business-as-usual. The logistic model achieves higher overall classification accuracies, but results in underestimations or overestimations of harvest shares for several subsets of the data. The stratified prediction method balances this shortcoming, and can be of general use for forest growth and timber supply projections from large-scale forest inventories.
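
    Steps two to four of this pipeline can be compressed into a short sketch for a single stratum; the CART stratification of step one is assumed to have already assigned plots to strata, and the function, feature, and share names below are hypothetical, not from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression

def simulate_stratum_harvest(X, harvested, volume, stratum_harvest_share):
    """Sketch of steps 2-4 for one stratum (illustrative names throughout):
    rank plots by individual harvest probability, select the top plots until
    the stratum's harvest share is met, then predict their harvested volume
    with a regression trained on harvested plots only."""
    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    clf.fit(X, harvested)
    p_harvest = clf.predict_proba(X)[:, 1]               # step 2
    n_select = int(round(stratum_harvest_share * len(X)))
    selected = np.argsort(p_harvest)[::-1][:n_select]    # step 3
    reg = LinearRegression().fit(X[harvested == 1], volume[harvested == 1])
    return selected, reg.predict(X[selected])            # step 4
```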

  11. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability: The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes: The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables: The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  12. Classification of archaeologically stratified pumice by INAA

    International Nuclear Information System (INIS)

    Peltz, C.; Bichler, M.

    2001-01-01

    In the framework of the research program 'Synchronization of Civilization in the Eastern Mediterranean Region in the 2nd Millenium B.C.' instrumental neutron activation analysis (INAA) was used to determine 30 elements in pumice from archaeological excavations to reveal their specific volcanic origin. The widespread pumiceous products of several eruptions in the Aegean region were used as abrasive tools and were therefore popular trade objects. A remarkable quantity of pumice and pumiceous tephra (several km³) was produced by the 'Minoan eruption' of Thera (Santorini), which is assumed to have happened between 1450 and 1650 B.C. Thus the discovery of the primary fallout of 'Minoan' tephra in archaeologically stratified locations can be used as a relative time mark. Additionally, pumice lumps used as abrasives can serve for dating by first appearance. Essential to an identification of the primary volcanic source is the knowledge that pumices from the Aegean region can easily be distinguished by their trace element distribution patterns, as previous work has shown. The elements Al, Ba, Ca, Ce, Co, Cr, Cs, Dy, Eu, Fe, Hf, K, La, Lu, Mn, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, Ti, U, V, Yb, Zn and Zr were determined in 16 samples of pumice lumps from excavations in Tell-el-Dab'a and Tell-el-Herr (Egypt). Two irradiation cycles and five measurement runs were applied. A reliable identification of the samples is achieved by comparing these results to the database compiled in previous studies. (author)

  13. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice, from a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
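
    As a concrete illustration of the distinction drawn in this record, a proportionate stratified random sample can be drawn in a few lines; the stratum key and sampling fraction below are arbitrary examples, not from the article.

```python
import random

def stratified_random_sample(population, stratum_of, fraction, seed=0):
    """Proportionate stratified random sampling: draw the same fraction,
    at random, from every stratum (function and argument names are
    illustrative)."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(stratum_of(unit), []).append(unit)
    sample = []
    for units in strata.values():
        k = max(1, round(fraction * len(units)))   # proportionate allocation
        sample.extend(rng.sample(units, k))
    return sample

# e.g., draw 10% of patients within each clinic:
# cohort = stratified_random_sample(patients, lambda p: p["clinic"], 0.10)
```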

  14. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: (1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and (2) non-probability sampling – based on the researcher's choice, from a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  15. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions. Convergence of Random Variables: The Law of Large Numbers; Conditional Expectation. Generating Functions; Branching Processes; Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  16. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice, from a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  17. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data. Sampling provides an up-to-date treat

  18. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  19. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  20. Prototypic Features of Loneliness in a Stratified Sample of Adolescents

    DEFF Research Database (Denmark)

    Lasgaard, Mathias; Elklit, Ask

    2009-01-01

    Dominant theoretical approaches in loneliness research emphasize the value of personality characteristics in explaining loneliness. The present study examines whether dysfunctional social strategies and attributions in lonely adolescents can be explained by personality characteristics … guidance and intervention. Thus, professionals need to be knowledgeable about prototypic features of loneliness in addition to employing a pro-active approach when assisting adolescents who display prototypic features.

  1. Table of sample sizes needed to detect at least one defective with 100(1-α)% probability (α = 0.01, 0.05)

    International Nuclear Information System (INIS)

    Stewart, K.B.

    1972-01-01

    Tables are presented which give the random sample size needed in order to be 95 percent (99 percent) certain of detecting at least one defective item when there are k defective items in a population of n items. The application of the tables to certain safeguards problems is discussed. The range of the tables is as follows: r = 0(1)25, n = r(1)r + 999. (U.S.)
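
    The tabulated quantity follows from the hypergeometric probability that a random sample of size r misses all k defectives, C(n-k, r)/C(n, r); the smallest r pushing this at or below α reproduces a table entry. A short sketch of that calculation:

```python
from math import comb

def min_sample_size(n, k, alpha=0.05):
    """Smallest sample size r such that
    P(at least one of k defectives is drawn) = 1 - C(n-k, r)/C(n, r) >= 1 - alpha."""
    for r in range(n + 1):
        # exact integer-safe comparison: C(n-k, r) <= alpha * C(n, r)
        if comb(n - k, r) <= alpha * comb(n, r):
            return r
    return n

# e.g., population of 1000 items, 10 of them defective, 95% detection probability:
print(min_sample_size(1000, 10))   # -> 258
```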

  2. Support for targeted sampling of red fox (Vulpes vulpes) feces in Sweden: a method to improve the probability of finding Echinococcus multilocularis.

    Science.gov (United States)

    Miller, Andrea L; Olsson, Gert E; Sollenberg, Sofia; Skarin, Moa; Wahlström, Helene; Höglund, Johan

    2016-11-29

    Localized concentrations of Echinococcus multilocularis eggs from feces of infected red fox (Vulpes vulpes) can create areas of higher transmission risk for rodent hosts and possibly also for humans; therefore, identification of these areas is important. However, in a low prevalence environment, such as Sweden, these areas could be easily overlooked. As part of a project investigating the role of different rodents in the epidemiology of E. multilocularis in Sweden, fox feces were collected seasonally from rodent trapping sites in two regions with known parasite status and in two regions with unknown parasite status, 2013-2015. The aim was to evaluate background contamination in rodent trapping sites from parasite eggs in these regions. To maximize the likelihood of finding fox feces positive for the parasite, fecal collection was focused in habitats with the assumed presence of suitable rodent intermediate hosts (i.e. targeted sampling). Parasite eggs were isolated from feces through sieving-flotation, and parasite species were then confirmed using PCR and sequencing. Most samples were collected in the late winter/early spring and in open fields where both Arvicola amphibius and Microtus agrestis were captured. Fox feces positive for E. multilocularis (41/714) were found within 1-3 field collection sites within each of the four regions. The overall proportion of positive samples was low (≤5.4%) in three regions, but was significantly higher in one region (22.5%, P < 0.001). There was not a significant difference between seasons or years. Compared to previous national screenings, our sampling strategy identified multiple E. multilocularis positive feces in all four regions, including the two regions with previously unknown parasite status. These results further suggest that the distribution of E. multilocularis is highly aggregated in the environment and provide support for further development of a targeted sampling strategy. Our results show that it was

  3. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  4. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, confidence level, expected proportion of the outcome variable (for categorical variables) or standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) from the study. The more the precision required, the greater is the required sample size. Sampling Techniques: The probability sampling techniques applied for health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are more recommended than the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
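
    The factors listed here enter the standard closed-form sample size for estimating a proportion, n = z²·p(1-p)/d². A small helper implementing that textbook formula (generic, not code from the article):

```python
from math import ceil

def sample_size_for_proportion(expected_p, margin, z=1.96):
    # n = z^2 * p * (1 - p) / d^2: sample size for estimating a proportion
    # with absolute precision `margin` at ~95% confidence (z = 1.96)
    return ceil(z * z * expected_p * (1.0 - expected_p) / (margin * margin))

# e.g., expected prevalence 20%, +/-5% precision, 95% confidence:
print(sample_size_for_proportion(0.20, 0.05))   # -> 246 participants
```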

  5. Grain distinct stratified nanolayers in aluminium alloys

    Energy Technology Data Exchange (ETDEWEB)

    Donatus, U., E-mail: uyimedonatus@yahoo.com [School of Materials, The University of Manchester, Manchester, M13 9PL, England (United Kingdom); Thompson, G.E.; Zhou, X.; Alias, J. [School of Materials, The University of Manchester, Manchester, M13 9PL, England (United Kingdom); Tsai, I.-L. [Oxford Instruments NanoAnalysis, HP12 2SE, High Wycombe (United Kingdom)]

    2017-02-15

    The grains of aluminium alloys have stratified nanolayers which determine their mechanical and chemical responses. In this study, the nanolayers were revealed in the grains of AA6082 (T6 and T7 conditions), AA5083-O and AA2024-T3 alloys by etching the alloys in a solution comprising 20 g Cr₂O₃ + 30 ml HPO₃ in 1 L H₂O. Microstructural examination was conducted on selected grains of interest using scanning electron microscopy and the electron backscatter diffraction technique. It was observed that the nanolayers are orientation dependent and are parallel to the {100} planes. They have ordered and repeated tunnel squares that are flawed at the sides, which are aligned in the <100> directions. These flawed tunnel squares dictate the tunnelling corrosion morphology and appear to have an effect on the arrangement and sizes of the precipitation-hardening particles. The inclination of the stratified nanolayers, their interspacing, and the groove sizes have a significant influence on the corrosion behaviour and a seeming influence on the strengthening mechanism of the investigated aluminium alloys. - Highlights: • Stratified nanolayers in aluminium alloy grains. • Relationship of the stratified nanolayers with grain orientation. • Influence of the inclinations of the stratified nanolayers on corrosion. • Influence of the nanolayers' interspacing and groove sizes on hardness and corrosion.

  6. Exploring optimum cut-off scores to screen for probable posttraumatic stress disorder within a sample of UK treatment-seeking veterans

    Science.gov (United States)

    Murphy, Dominic; Ross, Jana; Ashwick, Rachel; Armour, Cherie; Busuttil, Walter

    2017-01-01

    Background: Previous research exploring the psychometric properties of the scores of measures of posttraumatic stress disorder (PTSD) suggests there is variation in their functioning depending on the target population. To date, there has been little study of these properties within UK veteran populations. Objective: This study aimed to determine optimally efficient cut-off values for the Impact of Event Scale-Revised (IES-R) and the PTSD Checklist for DSM-5 (PCL-5) that can be used to assess for differential diagnosis of presumptive PTSD. Methods: Data from a sample of 242 UK veterans assessed for mental health difficulties were analysed. The criterion-related validity of the PCL-5 and IES-R were evaluated against the Clinician-Administered PTSD Scale for DSM-5 (CAPS-5). Kappa statistics were used to assess the level of agreement between the DSM-IV and DSM-5 classification systems. Results: The optimal cut-off scores observed within this sample were 34 or above on the PCL-5 and 46 or above on the IES-R. The PCL-5 cut-off is similar to the previously reported values, but the IES-R cut-off identified in this study is higher than has previously been recommended. Overall, a moderate level of agreement was found between participants screened positive using the DSM-IV and DSM-5 classification systems of PTSD. Conclusions: Our findings suggest that the PCL-5 and IES-R can be used as brief measures within veteran populations presenting at secondary care to assess for PTSD. The use of a higher cut-off for the IES-R may be helpful for differentiating between veterans who present with PTSD and those who may have some symptoms of PTSD but are sub-threshold for meeting a diagnosis. Further, the use of more accurate optimal cut-offs may aid clinicians to better monitor changes in PTSD symptoms during and after treatment. PMID:29435200

  7. Exploring optimum cut-off scores to screen for probable posttraumatic stress disorder within a sample of UK treatment-seeking veterans.

    Science.gov (United States)

    Murphy, Dominic; Ross, Jana; Ashwick, Rachel; Armour, Cherie; Busuttil, Walter

    2017-01-01

    Background: Previous research exploring the psychometric properties of the scores of measures of posttraumatic stress disorder (PTSD) suggests there is variation in their functioning depending on the target population. To date, there has been little study of these properties within UK veteran populations. Objective: This study aimed to determine optimally efficient cut-off values for the Impact of Event Scale-Revised (IES-R) and the PTSD Checklist for DSM-5 (PCL-5) that can be used to assess for differential diagnosis of presumptive PTSD. Methods: Data from a sample of 242 UK veterans assessed for mental health difficulties were analysed. The criterion-related validity of the PCL-5 and IES-R were evaluated against the Clinician-Administered PTSD Scale for DSM-5 (CAPS-5). Kappa statistics were used to assess the level of agreement between the DSM-IV and DSM-5 classification systems. Results: The optimal cut-off scores observed within this sample were 34 or above on the PCL-5 and 46 or above on the IES-R. The PCL-5 cut-off is similar to the previously reported values, but the IES-R cut-off identified in this study is higher than has previously been recommended. Overall, a moderate level of agreement was found between participants screened positive using the DSM-IV and DSM-5 classification systems of PTSD. Conclusions: Our findings suggest that the PCL-5 and IES-R can be used as brief measures within veteran populations presenting at secondary care to assess for PTSD. The use of a higher cut-off for the IES-R may be helpful for differentiating between veterans who present with PTSD and those who may have some symptoms of PTSD but are sub-threshold for meeting a diagnosis. Further, the use of more accurate optimal cut-offs may aid clinicians to better monitor changes in PTSD symptoms during and after treatment.
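
    The two records above do not state the optimality criterion; one common choice for an "optimally efficient" screening cut-off is Youden's J (sensitivity + specificity - 1), computed against a reference diagnosis such as the CAPS-5. A sketch with hypothetical arrays, not the authors' analysis code:

```python
import numpy as np

def youden_optimal_cutoff(scores, reference):
    """Scan all observed scores and return the cut-off maximizing
    Youden's J = sensitivity + specificity - 1 (one common, but not the
    only, criterion for an 'optimal' screening cut-off)."""
    scores = np.asarray(scores)
    reference = np.asarray(reference).astype(bool)   # e.g., CAPS-5 diagnosis
    best_cut, best_j = None, -np.inf
    for cut in np.unique(scores):
        screen_positive = scores >= cut
        sensitivity = screen_positive[reference].mean()
        specificity = (~screen_positive[~reference]).mean()
        j = sensitivity + specificity - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j
```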

  8. Stratified charge rotary engine for general aviation

    Science.gov (United States)

    Mount, R. E.; Parente, A. M.; Hady, W. F.

    1986-01-01

    A development history, a current development status assessment, and a design feature and performance capabilities account are given for stratified-charge rotary engines applicable to aircraft propulsion. Such engines are capable of operating on Jet-A fuel with substantial cost savings, improved altitude capability, and lower fuel consumption by comparison with gas turbine powerplants. Attention is given to the current development program of a 400-hp engine scheduled for initial operations in early 1990. Stratified charge rotary engines are also applicable to ground power units, airborne APUs, shipboard generators, and vehicular engines.

  9. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from the cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.

  10. Nitrogen transformations in stratified aquatic microbial ecosystems

    DEFF Research Database (Denmark)

    Revsbech, N. P.; Risgaard-Petersen, N.; Schramm, A.

    2006-01-01

    Abstract  New analytical methods such as advanced molecular techniques and microsensors have resulted in new insights about how nitrogen transformations in stratified microbial systems such as sediments and biofilms are regulated at a µm-mm scale. A large and ever-expanding knowledge base about n...

  11. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
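
    A hedged numerical illustration (ours, not the paper's): a Gaussian regression model for a quantity that evidence E says must be positive leaks probability onto impossible negative values, and that leaked mass can be computed directly.

```python
from scipy.stats import norm

# Suppose evidence E says the response y (e.g., a weight) must be positive,
# but model M says y ~ Normal(mu, sigma), with hypothetical fitted values:
mu, sigma = 2.0, 1.5
leakage = norm.cdf(0.0, loc=mu, scale=sigma)  # P_M(y <= 0), impossible given E
print(f"probability leakage: {leakage:.4f}")  # about 0.09
```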

  12. The Fokker-Planck equation for ray dispersion in gyrotropic stratified media

    NARCIS (Netherlands)

    Golynski, S.M.

    1984-01-01

    The Hamilton equations of geometrical optics determine the rays of the relevant wave field in the short-wavelength limit. We give a systematic derivation of the Fokker-Planck equation for the joint probability density of the position and unit direction vector of rays propagating in a gyrotropic stratified media.

  13. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
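
    A standard instance of the phenomenon (our illustration; not necessarily one of the article's three problems) is the matching problem: the probability that a random permutation of n items fixes no item is the derangement probability, which converges to 1/e:

```latex
P(\text{no match}) \;=\; \sum_{k=0}^{n} \frac{(-1)^{k}}{k!}
\;\xrightarrow[\,n\to\infty\,]{}\; e^{-1} \approx 0.3679 .
```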

  14. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
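
    A minimal sketch of the post-processing step described above, assuming an ensemble of equally likely simulated contamination maps is already in hand (the array shape, stand-in data, and threshold below are hypothetical):

```python
import numpy as np

# sims: geostatistical realizations, shape (n_sims, ny, nx); each slice is one
# equally likely image of contaminant concentration over the site grid.
rng = np.random.default_rng(0)
sims = rng.lognormal(mean=3.0, sigma=1.0, size=(500, 40, 40))  # stand-in data

threshold = 35.0  # hypothetical clean-up threshold
# Probability map: per-cell fraction of realizations exceeding the threshold.
prob_exceed = (sims > threshold).mean(axis=0)
print(prob_exceed.shape, float(prob_exceed.max()))
```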

  15. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  16. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  17. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  18. The Risk-Stratified Osteoporosis Strategy Evaluation study (ROSE)

    DEFF Research Database (Denmark)

    Rubin, Katrine Hass; Holmberg, Teresa; Rothmann, Mette Juel

    2015-01-01

    The risk-stratified osteoporosis strategy evaluation study (ROSE) is a randomized prospective population-based study investigating the effectiveness of a two-step screening program for osteoporosis in women. This paper reports the study design and baseline characteristics of the study population. 35,000 women aged 65-80 years were selected at random from the population in the Region of Southern Denmark and, before inclusion, randomized to either a screening group or a control group. As the first step, a self-administered questionnaire regarding risk factors for osteoporosis based on FRAX® was issued to both groups. As the second step, subjects in the screening group with a 10-year probability of major osteoporotic fractures ≥15% were offered a DXA scan. Patients diagnosed with osteoporosis from the DXA scan were advised to see their GP and discuss pharmaceutical treatment according to Danish...

  19. MC3D modelling of stratified explosion

    International Nuclear Information System (INIS)

    Picchi, S.; Berthoud, G.

    1999-01-01

    It is known that a steam explosion can occur in a stratified geometry and that the observed yields are lower than in the case of explosion in a premixture configuration. However, very few models are available to quantify the amount of melt which can be involved and the pressure peak that can be developed. In the stratified application of the MC3D code, mixing and fragmentation of the melt are explained by the growth of Kelvin Helmholtz instabilities due to the shear flow of the two phase coolant above the melt. Such a model is then used to recalculate the Frost-Ciccarelli tin-water experiment. Pressure peak, speed of propagation, bubble shape and erosion height are well reproduced as well as the influence of the inertial constraint (height of the water pool). (author)

  20. MC3D modelling of stratified explosion

    Energy Technology Data Exchange (ETDEWEB)

    Picchi, S.; Berthoud, G. [DTP/SMTH/LM2, CEA, 38 - Grenoble (France)

    1999-07-01

    It is known that a steam explosion can occur in a stratified geometry and that the observed yields are lower than in the case of explosion in a premixture configuration. However, very few models are available to quantify the amount of melt which can be involved and the pressure peak that can be developed. In the stratified application of the MC3D code, mixing and fragmentation of the melt are explained by the growth of Kelvin Helmholtz instabilities due to the shear flow of the two phase coolant above the melt. Such a model is then used to recalculate the Frost-Ciccarelli tin-water experiment. Pressure peak, speed of propagation, bubble shape and erosion height are well reproduced as well as the influence of the inertial constraint (height of the water pool). (author)

  1. Equipment for extracting and conveying stratified minerals

    Energy Technology Data Exchange (ETDEWEB)

    Blumenthal, G.; Kunzer, H.; Plaga, K.

    1991-08-14

    This invention relates to equipment for extracting stratified minerals and conveying the said minerals along the working face, comprising a trough-shaped conveyor run assembled from lengths, a troughed extraction run in lengths matching the lengths of conveyor troughing, which is linked to the top edge of the working-face side of the conveyor troughing with freedom to swivel vertically, and a positively guided chain carrying extraction tools and scrapers along the conveyor and extraction runs.

  2. Inviscid incompressible limits of strongly stratified fluids

    Czech Academy of Sciences Publication Activity Database

    Feireisl, Eduard; Jin, B.J.; Novotný, A.

    2014-01-01

    Roč. 89, 3-4 (2014), s. 307-329 ISSN 0921-7134 R&D Projects: GA ČR GA201/09/0917 Institutional support: RVO:67985840 Keywords : compressible Navier-Stokes system * anelastic approximation * stratified fluid Subject RIV: BA - General Mathematics Impact factor: 0.528, year: 2014 http://iospress.metapress.com/content/d71255745tl50125/?p=969b60ae82634854ab8bd25505ce1f71&pi=3

  3. Nitrogen transformations in stratified aquatic microbial ecosystems

    DEFF Research Database (Denmark)

    Revsbech, Niels Peter; Risgaard-Petersen, N.; Schramm, Andreas

    2006-01-01

    Abstract  New analytical methods such as advanced molecular techniques and microsensors have resulted in new insights about how nitrogen transformations in stratified microbial systems such as sediments and biofilms are regulated at a µm-mm scale. A large and ever-expanding knowledge base about nitrogen fixation, nitrification, denitrification, and dissimilatory reduction of nitrate to ammonium, and about the microorganisms performing these processes, has been produced by use of these techniques. During the last decade the discovery of anammox bacteria and migrating, nitrate-accumulating bacteria performing dissimilatory reduction of nitrate to ammonium have given new dimensions to the understanding of nitrogen cycling in nature, and the occurrence of these organisms and processes in stratified microbial communities will be described in detail.

  4. Large eddy simulation of stably stratified turbulence

    International Nuclear Information System (INIS)

    Shen Zhi; Zhang Zhaoshun; Cui Guixiang; Xu Chunxiao

    2011-01-01

    Stably stratified turbulence is a common phenomenon in the atmosphere and ocean. In this paper large eddy simulation is utilized to investigate homogeneous stably stratified turbulence numerically at Reynolds number Re = uL/ν = 10^2∼10^3 and Froude number Fr = u/(NL) = 10^-2∼10^0, in which u is the root mean square of the velocity fluctuations, L is the integral scale and N is the Brunt-Väisälä frequency. Three sets of computation cases are designed with different initial conditions, namely isotropic turbulence, Taylor-Green vortex and internal waves, to investigate the statistical properties from different origins. The computed horizontal and vertical energy spectra are consistent with observations in the atmosphere and ocean when the composite parameter ReFr^2 is greater than O(1). It has also been found that stratified turbulence can develop under different initial velocity conditions and that internal wave energy dominates in developed stably stratified turbulence.

  5. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  6. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using the bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSUs). The sample PSUs are selected with probability proportional to size. Secondary Sampling Units (SSUs), i.e., households, are selected by systematic sampling with a random start. They have used a single study variable. We have compared the HIES technique with some other designs, namely: stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). Jackknife and bootstrap were used for variance replication. Simple random sampling with sample sizes 462 to 561 gave moderate variances both by jackknife and bootstrap. Applying systematic sampling, we obtained moderate variance with sample size 467. In jackknife with systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes 467 to 631; at sample size 952 the variance of the ratio estimator became greater than that of the regression estimator. The most efficient design turned out to be ranked set sampling compared with the other designs; ranked set sampling with jackknife and bootstrap gives minimum variance even with the smallest sample size (467). Two-phase sampling gave poor performance. The multi-stage sampling applied by HIES gave large variances, especially if used with a single study variable.
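
    A compact sketch of the two estimation methods compared in the study, for one stratum with income y and household size x (synthetic data; the population mean of x is assumed known, as both methods require):

```python
import numpy as np

def ratio_estimate(y, x, x_bar_pop):
    # Ratio estimator of the population mean of y: (ybar / xbar) * Xbar.
    return y.mean() / x.mean() * x_bar_pop

def regression_estimate(y, x, x_bar_pop):
    # Linear regression estimator: ybar + b * (Xbar - xbar).
    b = np.cov(x, y, ddof=1)[0, 1] / x.var(ddof=1)
    return y.mean() + b * (x_bar_pop - x.mean())

rng = np.random.default_rng(1)
x = rng.poisson(6, size=200) + 1.0        # household size
y = 5000 * x + rng.normal(0, 8000, 200)   # income, loosely related to x
print(ratio_estimate(y, x, 6.8), regression_estimate(y, x, 6.8))
```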

  7. [Causes of emergency dizziness stratified by etiology].

    Science.gov (United States)

    Qiao, Wenying; Liu, Jianguo; Zeng, Hong; Liu, Yugeng; Jia, Weihua; Wang, Honghong; Liu, Bo; Tan, Jing; Li, Changqing

    2014-06-03

    To explore the causes of emergency dizziness, stratified by etiology, in order to improve diagnostic efficiency. A total of 1,857 cases of dizziness at our emergency department were collected and their etiologies stratified by age and gender. The top three diagnoses were benign paroxysmal positional vertigo (BPPV, 31.7%), hypertension (24.0%) and posterior circulation ischemia (PCI, 20.5%). Stratified by age, the main causes of dizziness included BPPV (n = 6), migraine-associated vertigo (n = 2) and unknown cause (n = 1) for the vertigo group (14.5%), plus neurosis (7.3%), for 18-44 years; BPPV (36.8%), hypertension (22.4%) and migraine-associated vertigo (11.2%) for 45-59 years; hypertension (30.8%), PCI (29.8%) and BPPV (22.9%) for 60-74 years; and PCI (30.7%), hypertension (28.6%) and BPPV (25.5%) for 75-92 years. BPPV, migraine and neurosis were more common in females while hypertension and PCI predominated in males (all P < 0.05). BPPV, PCI, hypertension, neurosis and migraine may be the main causes of dizziness. BPPV should be considered first when vertigo is triggered repeatedly by positional change, especially in young and middle-aged women; the other common causes of dizziness are migraine-associated vertigo, neurosis and Meniere's disease. Hypertension should be screened first in middle-aged and elderly patients presenting mainly with head heaviness and stretching. In elders with dizziness, BPPV is second in constituent ratio to PCI and hypertension. In middle-aged and elderly patients with dizziness, psychological factors should also be considered and timely diagnosis and treatment offered.

  8. White dwarf stars with chemically stratified atmospheres

    Science.gov (United States)

    Muchmore, D.

    1982-01-01

    Recent observations and theory suggest that some white dwarfs may have chemically stratified atmospheres - thin layers of hydrogen lying above helium-rich envelopes. Models of such atmospheres show that a discontinuous temperature inversion can occur at the boundary between the layers. Model spectra for layered atmospheres at 30,000 K and 50,000 K tend to have smaller decrements at 912 A, 504 A, and 228 A than uniform atmospheres would have. On the basis of their continuous extreme ultraviolet spectra, it is possible to distinguish observationally between uniform and layered atmospheres for hot white dwarfs.

  9. Stratified B-trees and versioning dictionaries

    OpenAIRE

    Twigg, Andy; Byde, Andrew; Milos, Grzegorz; Moreton, Tim; Wilkes, John; Wilkie, Tom

    2011-01-01

    A classic versioned data structure in storage and computer science is the copy-on-write (CoW) B-tree -- it underlies many of today's file systems and databases, including WAFL, ZFS, Btrfs and more. Unfortunately, it doesn't inherit the B-tree's optimality properties; it has poor space utilization, cannot offer fast updates, and relies on random IO to scale. Yet, nothing better has been developed since. We describe the `stratified B-tree', which beats all known semi-external memory versioned B...

  10. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  11. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  12. Soil mixing of stratified contaminated sands.

    Science.gov (United States)

    Al-Tabba, A; Ayotamuno, M J; Martin, R J

    2000-02-01

    Validation of soil mixing for the treatment of contaminated ground is needed in a wide range of site conditions to widen the application of the technology and to understand the mechanisms involved. Since very limited work has been carried out in heterogeneous ground conditions, this paper investigates the effectiveness of soil mixing in stratified sands using laboratory-scale augers. This enabled a low cost investigation of factors such as grout type and form, auger design, installation procedure, mixing mode, curing period, thickness of soil layers and natural moisture content on the unconfined compressive strength, leachability and leachate pH of the soil-grout mixes. The results showed that the auger design plays a very important part in the mixing process in heterogeneous sands. The variability of the properties measured in the stratified soils and the measurable variations caused by the various factors considered, highlighted the importance of duplicating appropriate in situ conditions, the usefulness of laboratory-scale modelling of in situ conditions and the importance of modelling soil and contaminant heterogeneities at the treatability study stage.

  13. Stratified coastal ocean interactions with tropical cyclones

    Science.gov (United States)

    Glenn, S. M.; Miles, T. N.; Seroka, G. N.; Xu, Y.; Forney, R. K.; Yu, F.; Roarty, H.; Schofield, O.; Kohut, J.

    2016-01-01

    Hurricane-intensity forecast improvements currently lag the progress achieved for hurricane tracks. Integrated ocean observations and simulations during hurricane Irene (2011) reveal that the wind-forced two-layer circulation of the stratified coastal ocean, and resultant shear-induced mixing, led to significant and rapid ahead-of-eye-centre cooling (at least 6 °C and up to 11 °C) over a wide swath of the continental shelf. Atmospheric simulations establish this cooling as the missing contribution required to reproduce Irene's accelerated intensity reduction. Historical buoys from 1985 to 2015 show that ahead-of-eye-centre cooling occurred beneath all 11 tropical cyclones that traversed the Mid-Atlantic Bight continental shelf during stratified summer conditions. A Yellow Sea buoy similarly revealed significant and rapid ahead-of-eye-centre cooling during Typhoon Muifa (2011). These findings establish that including realistic coastal baroclinic processes in forecasts of storm intensity and impacts will be increasingly critical to mid-latitude population centres as sea levels rise and tropical cyclone maximum intensities migrate poleward. PMID:26953963

  14. Stratified Simulations of Collisionless Accretion Disks

    Energy Technology Data Exchange (ETDEWEB)

    Hirabayashi, Kota; Hoshino, Masahiro, E-mail: hirabayashi-k@eps.s.u-tokyo.ac.jp [Department of Earth and Planetary Science, The University of Tokyo, Tokyo, 113-0033 (Japan)

    2017-06-10

    This paper presents a series of stratified-shearing-box simulations of collisionless accretion disks in the recently developed framework of kinetic magnetohydrodynamics (MHD), which can handle finite non-gyrotropy of a pressure tensor. Although a fully kinetic simulation predicted a more efficient angular-momentum transport in collisionless disks than in the standard MHD regime, the enhanced transport has not been observed in past kinetic-MHD approaches to gyrotropic pressure anisotropy. For the purpose of investigating this missing link between the fully kinetic and MHD treatments, this paper explores the role of non-gyrotropic pressure and makes the first attempt to incorporate certain collisionless effects into disk-scale, stratified disk simulations. When the timescale of gyrotropization was longer than, or comparable to, the disk-rotation frequency of the orbit, we found that the finite non-gyrotropy selectively remaining in the vicinity of current sheets contributes to suppressing magnetic reconnection in the shearing-box system. This leads to increases both in the saturated amplitude of the MHD turbulence driven by magnetorotational instabilities and in the resultant efficiency of angular-momentum transport. Our results seem to favor the fast advection of magnetic fields toward the rotation axis of a central object, which is required to launch an ultra-relativistic jet from a black hole accretion system in, for example, a magnetically arrested disk state.

  15. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
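
    The R packages the authors mention are not reproduced here; as a hedged Python analogue, a random forest already behaves as a probability machine by averaging class frequencies across its trees (the data set below is a stand-in, not the clinical data from the paper):

```python
from sklearn.datasets import load_breast_cancer   # stand-in for the clinical data sets
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The forest's averaged terminal-node class frequencies give individual
# probability estimates rather than bare classifications.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print(rf.predict_proba(X_te[:5]))  # per-subject [P(class 0), P(class 1)]
```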

  16. Health-related quality of life predictors during medical residency in a random, stratified sample of residents

    Directory of Open Access Journals (Sweden)

    Paula Costa Mosca Macedo

    2009-06-01

    OBJECTIVE: To evaluate the quality of life during the first three years of training and identify its association with sociodemographic-occupational characteristics, leisure time and health habits. METHOD: A cross-sectional study with a random sample of 128 residents stratified by year of training was conducted. The Medical Outcome Study Short Form-36 was administered. Mann-Whitney tests were carried out to compare percentile distributions of the eight quality of life domains according to sociodemographic variables, and a multiple linear regression analysis was performed, followed by validity checking of the resulting models. RESULTS: The physical component presented higher quality of life medians than the mental component. Comparisons between the three years showed that in almost all domains the quality of life scores of the second-year residents were higher than those of the first-year residents (p < 0.01); for the mental component, scores were higher in the third year than in the other two years (p < 0.01). Predictors of higher quality of life were: being in the second or...

  17. Ecosystem metabolism in a stratified lake

    DEFF Research Database (Denmark)

    Stæhr, Peter Anton; Christensen, Jesper Philip Aagaard; Batt, Ryan D.

    2012-01-01

    Metabolic rates were derived with an approach that integrates rates across the entire depth profile and includes DO exchange between depth layers driven by mixed-layer deepening and eddy diffusivity. During full mixing, NEP was close to zero throughout the water column, and GPP and R were reduced 2-10 times compared to stratified periods. During stratification, daily variability in epilimnetic DO was dominated by metabolism (46%) and air-water gas exchange (44%), while fluxes related to mixed-layer deepening dominated in meta- and hypolimnetic waters (49% and 64%) and eddy diffusion (1% and 14%) was less important. Although air-water gas exchange rates differed among the three formulations of gas-transfer velocity, this had no significant effect on metabolic rates.

  18. Stratified growth in Pseudomonas aeruginosa biofilms

    DEFF Research Database (Denmark)

    Werner, E.; Roe, F.; Bugnicourt, A.

    2004-01-01

    In this study, stratified patterns of protein synthesis and growth were demonstrated in Pseudomonas aeruginosa biofilms. Spatial patterns of protein synthetic activity inside biofilms were characterized by the use of two green fluorescent protein (GFP) reporter gene constructs. One construct ... Protein synthesis was restricted to a narrow band in the part of the biofilm adjacent to the source of oxygen. The zone of active GFP expression was approximately 60 µm wide in colony biofilms and 30 µm wide in flow cell biofilms. The region of the biofilm in which cells were capable of elongation was mapped by treating colony biofilms with carbenicillin, which blocks cell division, and then measuring individual cell lengths by transmission electron microscopy. Cell elongation was localized at the air interface of the biofilm. The heterogeneous anabolic patterns measured inside these biofilms were likely a result...

  19. Thermal instability in a stratified plasma

    International Nuclear Information System (INIS)

    Hermanns, D.F.M.; Priest, E.R.

    1989-01-01

    The thermal instability mechanism has been studied in connection with observed coronal features such as prominences or cool cores in loops. Although these features show a lot of structure, most studies concern the thermal instability in a uniform medium. In this paper, we investigate the thermal instability and the interaction between thermal modes and the slow magneto-acoustic subspectrum for a stratified plasma slab. We formulate the relevant system of equations and give some straightforward properties of the linear spectrum of a non-uniform plasma slab, i.e. the existence of continuous parts in the spectrum. We present a numerical scheme with which we can investigate the linear spectrum for equilibrium states with stratification. The slow and thermal subspectra of a crude coronal model are given as a preliminary result. (author). 6 refs.; 1 fig

  20. Stratified charge rotary engine combustion studies

    Science.gov (United States)

    Shock, H.; Hamady, F.; Somerton, C.; Stuecken, T.; Chouinard, E.; Rachal, T.; Kosterman, J.; Lambeth, M.; Olbrich, C.

    1989-07-01

    Analytical and experimental studies of the combustion process in a stratified charge rotary engine (SCRE) have continued to be the subject of active research in recent years. Specifically, to meet the demand for more sophisticated products, a detailed understanding of the engine system of interest is warranted. With this in mind, the objective of this work is to develop an understanding of the controlling factors that affect the SCRE combustion process so that an efficient, power-dense rotary engine can be designed. The influence of the induction-exhaust systems and the rotor geometry is believed to have a significant effect on combustion chamber flow characteristics. In this report, emphasis is centered on Laser Doppler Velocimetry (LDV) measurements and on qualitative flow visualizations in the combustion chamber of the motored rotary engine assembly. This will provide a basic understanding of the flow process in the RCE and serve as a data base for verification of numerical simulations. Understanding fuel injection provisions is also important to the successful operation of the stratified charge rotary engine. Toward this end, flow visualizations depicting the development of high speed, high pressure fuel jets are described. Friction is an important consideration in an engine from the standpoint of lost work, durability and reliability. MSU Engine Research Laboratory efforts in assessing the frictional losses associated with the rotary engine are described. This includes work which describes losses in bearing, seal and auxiliary components. Finally, a computer controlled mapping system under development is described. This system can be used to map shapes such as combustion chambers, intake manifolds or turbine blades accurately.

  1. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  2. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  3. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  4. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  5. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
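
    As a hedged illustration in our own notation (assuming G is homogeneous of degree one, as in McFadden's multivariate-extreme-value family), the gradient relation reads as follows, and the simplest choice of G recovers the multinomial logit:

```latex
P_i \;=\; \frac{e^{V_i}\,\partial G/\partial y_i\big(e^{V_1},\dots,e^{V_J}\big)}
               {G\big(e^{V_1},\dots,e^{V_J}\big)},
\qquad
G(y)=\sum_{j=1}^{J} y_j \;\Longrightarrow\;
P_i=\frac{e^{V_i}}{\sum_{j=1}^{J} e^{V_j}} .
```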

  6. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  7. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  8. Correlates of reasons for not reporting rape to police: results from a national telephone household probability sample of women with forcible or drug-or-alcohol facilitated/incapacitated rape.

    Science.gov (United States)

    Cohn, Amy M; Zinzow, Heidi M; Resnick, Heidi S; Kilpatrick, Dean G

    2013-02-01

    Rape tactics, rape incident characteristics, and mental health problems (lifetime depression, PTSD, and substance abuse) were investigated as correlates of eight different reasons for not reporting a rape to police among women who had experienced but did not report a rape to police (n = 441) within a national telephone household probability sample. Rape tactics (nonmutually exclusive) included drug or alcohol-facilitated or incapacitated rape (DAFR/IR; n = 119) and forcible rape (FR; n = 376). Principal Components Analysis (PCA) was conducted to extract a dominant set of patterns among the eight reasons for not reporting, and to reduce the set of dependent variables. PCA results indicated three unique factors: Not Wanting Others to Know, Nonacknowledgment of Rape, and Criminal Justice Concerns. Hierarchical regression analyses showed DAFR/IR and FR were both positively and significantly associated with Criminal Justice Concerns, whereas DAFR/IR, but not FR, was associated with Nonacknowledgment as a reason for not reporting to police. Neither DAFR/IR nor FR emerged as significant predictors of Others Knowing after controlling for fear of death or injury at the time of the incident. Correlations among variables showed that the Criminal Justice Concerns factor was positively related to lifetime depression and PTSD and the Nonacknowledgement factor was negatively related to lifetime PTSD. Findings suggest prevention programs should educate women about the definition of rape, which may include incapacitation due to alcohol or drugs, to increase acknowledgement and decrease barriers to police reporting.
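
    A hedged sketch of the extraction step (synthetic responses; the survey items and loadings below are placeholders, not the study's data):

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical response matrix: 441 respondents x 8 binary "reason" items.
rng = np.random.default_rng(0)
reasons = rng.integers(0, 2, size=(441, 8)).astype(float)

# Extract the dominant patterns among the eight reasons and keep the
# three leading components, mirroring the study's three-factor solution.
pca = PCA(n_components=3).fit(reasons)
print(pca.explained_variance_ratio_)  # variance share per factor
print(pca.components_.round(2))       # loading of each reason on each factor
```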

  9. The optimism trap: Migrants' educational choices in stratified education systems.

    Science.gov (United States)

    Tjaden, Jasper Dag; Hunkler, Christian

    2017-09-01

    Immigrant children's ambitious educational choices have often been linked to their families' high level of optimism and motivation for upward mobility. However, previous research has mostly neglected alternative explanations such as information asymmetries or anticipated discrimination. Moreover, immigrant children's higher dropout rates at the higher secondary and university level suggest that low-performing migrant students could have benefitted more from pursuing less ambitious tracks, especially in countries that offer viable vocational alternatives. We examine ethnic minorities' educational choices using a sample of academically low-performing, lower secondary school students in Germany's highly stratified education system. We find that their families' optimism diverts migrant students from viable vocational alternatives. Information asymmetries and anticipated discrimination do not explain their high educational ambitions. While our findings further support the immigrant optimism hypothesis, we discuss how its effect may have different implications depending on the education system. Copyright © 2017. Published by Elsevier Inc.

  10. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  11. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  12. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  13. The effect of surfactant on stratified and stratifying gas-liquid flows

    Science.gov (United States)

    Heiles, Baptiste; Zadrazil, Ivan; Matar, Omar

    2013-11-01

    We consider the dynamics of a stratified/stratifying gas-liquid flow in horizontal tubes. This flow regime is characterised by the thin liquid films that drain under gravity along the pipe interior, forming a pool at the bottom of the tube, and the formation of large-amplitude waves at the gas-liquid interface. This regime is also accompanied by the detachment of droplets from the interface and their entrainment into the gas phase. We carry out an experimental study involving axial- and radial-view photography of the flow, in the presence and absence of surfactant. We show that the effect of surfactant is to reduce significantly the average diameter of the entrained droplets, through a tip-streaming mechanism. We also highlight the influence of surfactant on the characteristics of the interfacial waves, and the pressure gradient that drives the flow. EPSRC Programme Grant EP/K003976/1.

  14. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
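
    A Monte Carlo sketch of the simultaneous-interval idea (our construction under simplifying assumptions, not the authors' exact method): simulate many standard-normal samples and widen pointwise envelopes on the order statistics until all n points fall inside jointly with probability about 1-α.

```python
import numpy as np

def simultaneous_bands(n, alpha=0.05, reps=20000, seed=0):
    rng = np.random.default_rng(seed)
    sims = np.sort(rng.standard_normal((reps, n)), axis=1)  # ordered samples
    # Shrink the pointwise tail level q until joint coverage reaches 1 - alpha.
    for q in np.linspace(alpha / 2, alpha / 200, 100):
        lo = np.quantile(sims, q, axis=0)
        hi = np.quantile(sims, 1 - q, axis=0)
        coverage = ((sims >= lo) & (sims <= hi)).all(axis=1).mean()
        if coverage >= 1 - alpha:
            break
    return lo, hi

lo, hi = simultaneous_bands(20)
sample = np.sort(np.random.default_rng(1).standard_normal(20))
print(bool(((sample >= lo) & (sample <= hi)).all()))  # True for ~95% of normal samples
```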

  15. Improved patient selection by stratified surgical intervention

    DEFF Research Database (Denmark)

    Wang, Miao; Bünger, Cody E; Li, Haisheng

    2015-01-01

    BACKGROUND CONTEXT: Choosing the best surgical treatment for patients with spinal metastases remains a significant challenge for spine surgeons. There is currently no gold standard for surgical treatments. The Aarhus Spinal Metastases Algorithm (ASMA) was established to help surgeons choose the most appropriate surgical intervention for patients with spinal metastases. PURPOSE: The purpose of this study was to evaluate the clinical outcome of stratified surgical interventions based on the ASMA, which combines life expectancy and the anatomical classification of patients with spinal metastases. ... The survival times in the five surgical groups determined by the ASMA were 2.1 (TS 0-4, TC 1-7), 5.1 (TS 5-8, TC 1-7), 12.1 (TS 9-11, TC 1-7 or TS 12-15, TC 7), 26.0 (TS 12-15, TC 4-6), and 36.0 (TS 12-15, TC 1-3) months. The 30-day mortality rate was 7.5%. Postoperative neurological function was maintained...

  16. Experimental study of unsteady thermally stratified flow

    International Nuclear Information System (INIS)

    Lee, Sang Jun; Chung, Myung Kyoon

    1985-01-01

    Unsteady thermally stratified flow caused by two-dimensional surface discharge of warm water into an oblong channel was investigated. The experimental study was focused on the rapidly developing thermal diffusion at small Richardson number. The basic objectives were to study the interfacial mixing between a flowing layer of warm water and an underlying body of cold water and to accumulate experimental data to test computational turbulence models. Mean velocity field measurements were carried out by using NMR-CT (Nuclear Magnetic Resonance-Computerized Tomography), which detects a quantitative flow image of any desired section in any direction of flow in a short time. Results show that at small Richardson number the warm layer rapidly penetrates into the cold layer because of strong turbulent mixing and instability between the two layers. It is found that the transfer of heat across the interface is more vigorous than that of momentum. It is also proved that the NMR-CT technique is a very valuable tool for measuring unsteady three-dimensional flow fields. (Author)

  17. Turbulent fluxes in stably stratified boundary layers

    International Nuclear Information System (INIS)

    L'vov, Victor S; Procaccia, Itamar; Rudenko, Oleksii

    2008-01-01

    We present here an extended version of an invited talk we gave at the international conference 'Turbulent Mixing and Beyond'. The dynamical and statistical description of stably stratified turbulent boundary layers, with the important example of the stable atmospheric boundary layer in mind, is addressed. Traditional approaches to this problem, based on the profiles of mean quantities, velocity second-order correlations and dimensional estimates of the turbulent thermal flux, run into a well-known difficulty, predicting the suppression of turbulence at a small critical value of the Richardson number, in contradiction to observations. Phenomenological attempts to overcome this problem suffer from various theoretical inconsistencies. Here, we present an approach taking into full account all the second-order statistics, which allows us to respect the conservation of total mechanical energy. The analysis culminates in an analytic solution of the profiles of all mean quantities and all second-order correlations, removing the unphysical predictions of previous theories. We propose that the approach taken here is sufficient to describe the lower parts of the atmospheric boundary layer, as long as the Richardson number does not exceed an order of unity. For much higher Richardson numbers, the physics may change qualitatively, requiring careful consideration of the potential Kelvin-Helmholtz waves and their interaction with the vortical turbulence.

  18. Local properties of countercurrent stratified steam-water flow

    International Nuclear Information System (INIS)

    Kim, H.J.

    1985-10-01

    A study of steam condensation in countercurrent stratified flow of steam and subcooled water has been carried out in a rectangular channel/flat plate geometry over a wide range of inclination angles (4°-87°) at several aspect ratios. Variables were inlet water and steam flow rates, and inlet water temperature. Local condensation rates and pressure gradients were measured, and local condensation heat transfer coefficients and interfacial shear stress were calculated. Contact probe traverses of the surface waves were made, which allowed a statistical analysis of the wave properties. The local condensation Nusselt number was correlated in terms of local water and steam Reynolds or Froude numbers, as well as the liquid Prandtl number. A turbulence-centered model developed by Theofanous, et al., principally for gas absorption in several geometries, was modified. A correlation for the interfacial shear stress and the pressure gradient agreed with measured values. Mean water layer thicknesses were calculated. Interfacial wave parameters, such as the mean water layer thickness, liquid fraction probability distribution, wave amplitude and wave frequency, are analyzed

  19. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as the pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
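
    The pretest-to-post-test arithmetic that these strategies rely on is Bayes' theorem in odds form; a small hedged sketch (the likelihood ratio below is illustrative, not a value from the studies cited):

```python
def post_test_probability(pretest_p, likelihood_ratio):
    # Bayes' theorem in odds form: post-odds = pre-odds * LR.
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical: intermediate clinical probability of PE (38%) followed by
# a positive objective test with likelihood ratio 8.
print(round(post_test_probability(0.38, 8.0), 2))  # ~0.83
```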

  20. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  1. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  2. Analysis of Turbulent Combustion in Simplified Stratified Charge Conditions

    Science.gov (United States)

    Moriyoshi, Yasuo; Morikawa, Hideaki; Komatsu, Eiji

    The stratified charge combustion system has been widely studied due to its significant potential for a low fuel consumption rate and low exhaust gas emissions. The fuel-air mixture formation process in a direct-injection stratified charge engine is influenced by various parameters, such as atomization, evaporation, and in-cylinder gas motion at high temperature and high pressure conditions. It is difficult to observe the in-cylinder phenomena in such conditions and also challenging to analyze the subsequent stratified charge combustion. Combustion phenomena were therefore examined under simplified stratified charge conditions, with the aim of analyzing fundamental stratified charge combustion. That is, an experimental apparatus which can control the mixture distribution and the gas motion at ignition timing was developed, and the effects of turbulence intensity, mixture concentration distribution, and mixture composition on stratified charge combustion were examined. As a result, the effects of fuel, charge stratification, and turbulence on combustion characteristics were clarified.

  3. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  4. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  5. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  6. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  7. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  8. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  9. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  10. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  11. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  12. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  13. Analysing stratified medicine business models and value systems: innovation-regulation interactions.

    Science.gov (United States)

    Mittra, James; Tait, Joyce

    2012-09-15

    Stratified medicine offers both opportunities and challenges to the conventional business models that drive pharmaceutical R&D. Given the increasingly unsustainable blockbuster model of drug development, due in part to maturing product pipelines, alongside increasing demands from regulators, healthcare providers and patients for higher standards of safety, efficacy and cost-effectiveness of new therapies, stratified medicine promises a range of benefits to pharmaceutical and diagnostic firms as well as healthcare providers and patients. However, the transition from 'blockbusters' to what might now be termed 'niche-busters' will require the adoption of new, innovative business models, the identification of different and perhaps novel types of value along the R&D pathway, and a smarter approach to regulation to facilitate innovation in this area. In this paper we apply the Innogen Centre's interdisciplinary ALSIS methodology, which we have developed for the analysis of life science innovation systems in contexts where the value creation process is lengthy, expensive and highly uncertain, to this emerging field of stratified medicine. In doing so, we consider the complex collaboration, timing, coordination and regulatory interactions that shape business models, value chains and value systems relevant to stratified medicine. More specifically, we explore in some depth two convergence models for co-development of a therapy and diagnostic before market authorisation, highlighting the regulatory requirements and policy initiatives within the broader value system environment that have a key role in determining the probable success and sustainability of these models.

  14. Modelling of vapour explosion in stratified geometry

    International Nuclear Information System (INIS)

    Picchi, St.

    1999-01-01

    When a hot liquid comes into contact with a colder volatile liquid, under some conditions an explosive vaporization, called a vapour explosion, can occur, with potentially serious consequences for neighbouring structures. Such an explosion requires intimate mixing and fine fragmentation of the two liquids. In a stratified vapour explosion, the two liquids are initially superposed and separated by a vapour film. Triggering the explosion can induce a propagation along the film. A review of experimental results and existing models led to the following main points: - the explosion propagation is due to a pressure wave propagating through the medium; - the mixing is due to the development of Kelvin-Helmholtz instabilities induced by the shear velocity between the two liquids behind the pressure wave. The presence of vapour in the volatile liquid explains the observed propagation velocity and the velocity difference between the two fluids at the passage of the pressure wave. A first model was proposed by Brayer in 1994 to describe the fragmentation and mixing of the two fluids, but his results do not show explosion propagation. We have therefore built a new mixing-fragmentation model based on the atomization phenomenon that develops during the passage of the pressure wave. We have also taken into account the transient nature of the heat transfer between fuel drops and the volatile liquid, and elaborated a model of transient heat transfer. These two models have been introduced into MC3D, a multi-component thermal-hydraulic code. Calculation results show qualitative and quantitative agreement with experimental results and confirm the basic options of the model. (author)

  15. Experimental CFD grade data for stratified two-phase flows

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, Christophe, E-mail: c.vallee@fzd.d [Forschungszentrum Dresden-Rossendorf e.V., Institute of Safety Research, D-01314 Dresden (Germany); Lucas, Dirk; Beyer, Matthias; Pietruske, Heiko; Schuetz, Peter; Carl, Helmar [Forschungszentrum Dresden-Rossendorf e.V., Institute of Safety Research, D-01314 Dresden (Germany)

    2010-09-15

    Stratified two-phase flows were investigated at two test facilities with horizontal test-sections. For both, rectangular channel cross-sections were chosen to provide optimal observation possibilities for the application of optical measurement techniques. In order to show the local flow structure, high-speed video observation was applied, which delivers the high resolution in space and time needed for CFD code validation. The first investigations were performed in the Horizontal Air/Water Channel (HAWAC), which is made of acrylic glass and allows the investigation of air/water co-current flows at atmospheric pressure and room temperature. At the channel inlet, a special device was designed for well-defined and adjustable inlet boundary conditions. For the quantitative analysis of the optical measurements performed at the HAWAC, an algorithm was developed to recognise the stratified interface in the camera frames. This allows statistical treatment of the data for comparison with CFD calculation results. As an example, the unstable wave growth leading to slug flow is shown from the test-section inlet. Moreover, the hydraulic jump, as the quasi-stationary discontinuous transition between super- and subcritical flow, was investigated in this closed channel. The structure of the hydraulic jump over time is revealed by the calculation of the probability density of the water level. A series of experiments shows that the hydraulic jump profile and its position from the inlet vary substantially with the inlet boundary conditions due to the momentum exchange between the phases. The second channel is built in the pressure chamber of the TOPFLOW test facility, which is used to perform air/water and steam/water experiments at pressures of up to 5.0 MPa and temperatures of up to 264 °C, but under pressure equilibrium with the vessel inside. In the present experiment, the test-section represents a flat model of the hot leg of the German Konvoi pressurised water reactor scaled at…

  16. Experimental CFD grade data for stratified two-phase flows

    International Nuclear Information System (INIS)

    Vallee, Christophe; Lucas, Dirk; Beyer, Matthias; Pietruske, Heiko; Schuetz, Peter; Carl, Helmar

    2010-01-01

    Stratified two-phase flows were investigated at two test facilities with horizontal test-sections. For both, rectangular channel cross-sections were chosen to provide optimal observation possibilities for the application of optical measurement techniques. In order to show the local flow structure, high-speed video observation was applied, which delivers the high resolution in space and time needed for CFD code validation. The first investigations were performed in the Horizontal Air/Water Channel (HAWAC), which is made of acrylic glass and allows the investigation of air/water co-current flows at atmospheric pressure and room temperature. At the channel inlet, a special device was designed for well-defined and adjustable inlet boundary conditions. For the quantitative analysis of the optical measurements performed at the HAWAC, an algorithm was developed to recognise the stratified interface in the camera frames. This allows statistical treatment of the data for comparison with CFD calculation results. As an example, the unstable wave growth leading to slug flow is shown from the test-section inlet. Moreover, the hydraulic jump, as the quasi-stationary discontinuous transition between super- and subcritical flow, was investigated in this closed channel. The structure of the hydraulic jump over time is revealed by the calculation of the probability density of the water level. A series of experiments shows that the hydraulic jump profile and its position from the inlet vary substantially with the inlet boundary conditions due to the momentum exchange between the phases. The second channel is built in the pressure chamber of the TOPFLOW test facility, which is used to perform air/water and steam/water experiments at pressures of up to 5.0 MPa and temperatures of up to 264 °C, but under pressure equilibrium with the vessel inside. In the present experiment, the test-section represents a flat model of the hot leg of the German Konvoi pressurised water reactor scaled at 1…
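    As an illustration of the water-level statistics mentioned above, a plain histogram estimate of the probability density from a measured level time series might look like this (the bin count is arbitrary; this is not the authors' processing code):

```python
def water_level_pdf(levels, bins=50):
    """Histogram estimate of the probability density of the water level;
    returns (bin centre, density) pairs normalised to unit area."""
    lo, hi = min(levels), max(levels)
    width = (hi - lo) / bins or 1.0  # guard against a constant signal
    counts = [0] * bins
    for h in levels:
        counts[min(int((h - lo) / width), bins - 1)] += 1
    n = len(levels)
    return [(lo + (k + 0.5) * width, c / (n * width))
            for k, c in enumerate(counts)]
```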

  17. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
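    The following toy sketch (not the authors' two-parameter model) illustrates the intermittent-updating idea: the perceived probability stays put until a recent window of outcomes deviates enough from it, then steps.

```python
import random

def intermittent_estimates(outcomes, window=20, threshold=0.25):
    """Hold the current estimate fixed; step to the window frequency only
    when it deviates from the estimate by more than `threshold`."""
    estimate, recent, trace = 0.5, [], []
    for x in outcomes:
        recent = (recent + [x])[-window:]
        if len(recent) == window and abs(sum(recent) / window - estimate) > threshold:
            estimate = sum(recent) / window  # step at a perceived change point
            recent = []
        trace.append(estimate)
    return trace

# Hidden Bernoulli parameter steps from 0.2 to 0.8 halfway through.
random.seed(0)
data = [1 if random.random() < p else 0 for p in [0.2] * 200 + [0.8] * 200]
trace = intermittent_estimates(data)  # piecewise constant, stepping near trial 200
```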

  18. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  19. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  1. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability when the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability at a given probability of ...

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
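    For the multinomial logit special case the CPGF is the familiar log-sum, and the gradient property in the abstract reduces to the softmax. A small sketch:

```python
import math

def logit_cpgf(v):
    """Log-sum CPGF of the multinomial logit model (stabilised log-sum-exp)."""
    m = max(v)
    return m + math.log(sum(math.exp(x - m) for x in v))

def logit_choice_probabilities(v):
    """Gradient of the CPGF with respect to the utilities: the softmax."""
    m = max(v)
    w = [math.exp(x - m) for x in v]
    s = sum(w)
    return [x / s for x in w]

probs = logit_choice_probabilities([1.0, 0.5, -0.2])  # sums to 1
```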

  3. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds… For the causation probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving…
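    The two-step structure described above (geometric collision candidates times a causation factor) can be sketched as follows; the numbers are placeholders, not values from the report:

```python
def expected_collisions(geometric_candidates, causation_factor):
    """Expected number of ship-ship collisions on the route: the number of
    geometric collision candidates times the probability that the navigators
    fail to resolve the critical situation (the causation factor)."""
    return geometric_candidates * causation_factor

n_expected = expected_collisions(geometric_candidates=2.3, causation_factor=2e-4)
```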

  4. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    …objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation…

  5. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    …either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still…

  6. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including…

  7. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  8. Aligning the Economic Value of Companion Diagnostics and Stratified Medicines

    Directory of Open Access Journals (Sweden)

    Edward D. Blair

    2012-11-01

    The twin forces of payors seeking fair pricing and the rising costs of developing new medicines have driven a closer relationship between pharmaceutical companies and diagnostics companies, because stratified medicines, guided by companion diagnostics, offer better commercial, as well as clinical, outcomes. Stratified medicines have created clinical success and provided rapid product approvals, particularly in oncology, and indeed have changed the dynamic between drug and diagnostic developers. The commercial payback for such partnerships offered by stratified medicines has been less well articulated, but this has shifted as the benefits in risk management, pricing and value creation for all stakeholders become clearer. In this larger healthcare setting, stratified medicine provides both physicians and patients with greater insight on the disease and provides rationale for providers to understand cost-effectiveness of treatment. This article considers how the economic value of stratified medicine relationships can be recognized and translated into better outcomes for all healthcare stakeholders.

  9. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject with regard to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  10. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  11. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    The current status of advanced theoretical methods for transition probabilities of atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test of the theoretical methods

  12. Pedestrian-vehicle crashes and analytical techniques for stratified contingency tables.

    Science.gov (United States)

    Al-Ghamdi, Ali S

    2002-03-01

    In 1999 there were 450 fatalities due to road crashes in Riyadh, the capital of Saudi Arabia, of which 130 were pedestrians. Hence, every fourth person killed on the roads is a pedestrian. The aim of this study is to investigate pedestrian-vehicle crashes in this fast-growing city with two objectives in mind: to analyze pedestrian collisions with regard to their causes, characteristics, location of injury on the victim's body, and most common patterns, and to determine the potential for use of the odds ratio technique in the analysis of stratified contingency tables. Data from 638 pedestrian-vehicle crashes reported by police during the period 1997-1999 were used. A systematic sampling technique was followed in which every third record was used. The analysis showed that the pedestrian fatality rate is 2.8 per 100,000 population. The rates were relatively high within the childhood (1-9 years) and young adult (10-19 years) groups, and the old-age groups (60 to >80 years), which indicates that young as well as elderly people in this city are more likely to be involved in fatal accidents of this type than are those in other age groups. The analysis revealed that 77.1% of pedestrians were probably struck while crossing a roadway either not in a crosswalk or where no crosswalk existed. In addition, the distribution of injuries on the victims' bodies was determined from hospital records. More than one-third of the fatal injuries were located on the head and chest. An attempt was made to conduct an association analysis between crash severity (i.e. injury or fatal) and some of the study variables using chi-square and odds ratio techniques. The categorical nature of the data helped in using these analytical techniques.
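    A minimal sketch of the odds-ratio technique for stratified 2x2 tables named above, using the Mantel-Haenszel estimator with hypothetical counts:

```python
def mantel_haenszel_or(strata):
    """Common odds ratio across stratified 2x2 tables given as tuples
    (a, b, c, d): a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical strata, e.g. fatal vs injury crashes stratified by age group.
or_mh = mantel_haenszel_or([(30, 70, 20, 80), (15, 35, 10, 40)])
```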

  13. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  14. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome…

  15. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a…

  16. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
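    The core of the calculation reduces to a rate estimate from historical event counts. A minimal sketch with placeholder numbers (not the Framatome ANP figures):

```python
def event_probability(events, assemblies_moved):
    """Point estimate of the per-movement probability of a fuel-handling
    event category (misload or damage) from historical counts."""
    return events / assemblies_moved

# Placeholder counts for illustration only.
p_misload = event_probability(events=12, assemblies_moved=480_000)
p_damage = event_probability(events=35, assemblies_moved=480_000)
```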

  17. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  18. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved; however, the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  19. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  20. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  1. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  2. The stratified H-index makes scientific impact transparent

    DEFF Research Database (Denmark)

    Würtz, Morten; Schmidt, Morten

    2017-01-01

    The H-index is widely used to quantify and standardize researchers' scientific impact. However, the H-index does not account for the fact that co-authors rarely contribute equally to a paper. Accordingly, we propose the use of a stratified H-index to measure scientific impact. The stratified H-index supplements the conventional H-index with three separate H-indices: one for first authorships, one for second authorships and one for last authorships. The stratified H-index takes scientific output, quality and individual author contribution into account.
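    A minimal sketch of the proposal, assuming each paper is tagged with its citation count and the researcher's authorship position:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(ranked, start=1) if c >= i)

def stratified_h_index(papers):
    """papers: iterable of (citations, position), with position in
    {'first', 'second', 'last', 'other'}. Returns the conventional H-index
    plus one H-index per authorship stratum, as the abstract proposes."""
    papers = list(papers)
    overall = h_index([c for c, _ in papers])
    strata = {pos: h_index([c for c, p in papers if p == pos])
              for pos in ("first", "second", "last")}
    return overall, strata

overall, strata = stratified_h_index(
    [(42, "first"), (17, "last"), (9, "second"), (3, "other")])
```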

  3. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to cases where the random variate follows a normal law as well as a Bernoullian law.
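    For reference, here is a sketch of the classical Wald SPRT for Bernoulli data that the paper modifies. It uses the standard thresholds A = (1 - beta)/alpha and B = beta/(1 - alpha), not the paper's constraint on the sum alpha + beta:

```python
import math

def sprt_bernoulli(samples, p0, p1, alpha=0.05, beta=0.10):
    """Classical Wald SPRT of H0: p = p0 against H1: p = p1 on a list of
    0/1 samples; returns the decision and the sample number at stopping."""
    log_a = math.log((1 - beta) / alpha)   # accept-H1 boundary
    log_b = math.log(beta / (1 - alpha))   # accept-H0 boundary
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= log_a:
            return "accept H1", n
        if llr <= log_b:
            return "accept H0", n
    return "undecided", len(samples)
```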

  4. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  5. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  6. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  7. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.
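    A sketch under one common convention for such a one-parameter pair (the paper's notation may differ): ln_q(x) = (x^q - 1)/q with inverse exp_q(y) = (1 + qy)^(1/q), both reducing to the ordinary log/exp as q -> 0.

```python
import math

def gen_log(x, q):
    """Generalized logarithm: (x**q - 1)/q; ordinary log in the limit q -> 0."""
    return math.log(x) if q == 0 else (x**q - 1) / q

def gen_exp(y, q):
    """Inverse of gen_log: (1 + q*y)**(1/q); ordinary exp in the limit q -> 0."""
    return math.exp(y) if q == 0 else (1 + q * y) ** (1 / q)

# Round trip: exp_q(ln_q(x)) == x
assert abs(gen_exp(gen_log(2.5, 0.3), 0.3) - 2.5) < 1e-12
```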

  8. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    …precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen…

  9. Data Interpretation: Using Probability

    Science.gov (United States)

    Drummond, Gordon B.; Vowler, Sarah L.

    2011-01-01

    Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…

  10. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
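    The scoring idea can be illustrated in a few lines: push the sorted sample through a trial CDF and compare against the expected uniform order statistics i/(n+1). This is a toy version; the paper's actual scoring function is more elaborate.

```python
import random
from math import erf, sqrt

def order_statistic_score(data, trial_cdf):
    """Mean squared deviation of the transformed order statistics from their
    uniform expectations i/(n+1); smaller means the trial CDF fits better."""
    n = len(data)
    u = sorted(trial_cdf(x) for x in data)
    return sum((ui - i / (n + 1)) ** 2 for i, ui in enumerate(u, start=1)) / n

random.seed(1)
sample = [random.gauss(0, 1) for _ in range(500)]
normal_cdf = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
score = order_statistic_score(sample, normal_cdf)  # small for the correct CDF
```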

  11. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  12. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h…

  13. AUDIT-C scores as a scaled marker of mean daily drinking, alcohol use disorder severity, and probability of alcohol dependence in a U.S. general population sample of drinkers.

    Science.gov (United States)

    Rubinsky, Anna D; Dawson, Deborah A; Williams, Emily C; Kivlahan, Daniel R; Bradley, Katharine A

    2013-08-01

    Brief alcohol screening questionnaires are increasingly used to identify alcohol misuse in routine care, but clinicians also need to assess the level of consumption and the severity of misuse so that appropriate intervention can be offered. Information provided by a patient's alcohol screening score might provide a practical tool for assessing the level of consumption and severity of misuse. This post hoc analysis of data from the 2001 to 2002 National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) included 26,546 U.S. adults who reported drinking in the past year and answered additional questions about their consumption, including Alcohol Use Disorders Identification Test-Consumption questionnaire (AUDIT-C) alcohol screening. Linear or logistic regression models and postestimation methods were used to estimate mean daily drinking, the number of endorsed alcohol use disorder (AUD) criteria ("AUD severity"), and the probability of alcohol dependence associated with each individual AUDIT-C score (1 to 12), after testing for effect modification by gender and age. Among eligible past-year drinkers, mean daily drinking, AUD severity, and the probability of alcohol dependence increased exponentially across increasing AUDIT-C scores. Mean daily drinking ranged from … , and the probability of alcohol dependence ranged from … ; these estimates could be used to estimate patient-specific consumption and severity based on age, gender, and alcohol screening score. This information could be integrated into electronic decision support systems to help providers estimate and provide feedback about patient-specific risks and identify those patients most likely to benefit from further diagnostic assessment.

  14. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    …discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials…

  15. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis not being in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  16. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.
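
    The post-processing step named at the end of this abstract is simple to sketch: given an ensemble of equally likely realizations, the exceedance-probability map is the cell-wise fraction of realizations above the cleanup level. The lognormal fields and the threshold below are illustrative stand-ins, not the Fernald data.

    # Sketch: ensemble of simulated contamination maps -> probability map.
    import numpy as np

    rng = np.random.default_rng(42)
    n_real, ny, nx = 500, 50, 50
    # Stand-in for conditional geostatistical simulations of the site.
    realizations = rng.lognormal(mean=3.0, sigma=1.0, size=(n_real, ny, nx))

    threshold = 35.0                                     # illustrative action level
    prob_map = (realizations > threshold).mean(axis=0)   # P(exceedance) per cell
    expected_map = realizations.mean(axis=0)             # expected magnitude per cell
    print(prob_map.max(), expected_map.mean())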

  17. The effect of existing turbulence on stratified shear instability

    Science.gov (United States)

    Kaminski, Alexis; Smyth, William

    2017-11-01

    Ocean turbulence is an essential process governing, for example, heat uptake by the ocean. In the stably-stratified ocean interior, this turbulence occurs in discrete events driven by vertical variations of the horizontal velocity. Typically, these events have been modelled by assuming an initially laminar stratified shear flow which develops wavelike instabilities, becomes fully turbulent, and then relaminarizes into a stable state. However, in the real ocean there is always some level of turbulence left over from previous events, and it is not yet understood how this turbulence impacts the evolution of future mixing events. Here, we perform a series of direct numerical simulations of turbulent events developing in stratified shear flows that are already at least weakly turbulent. We do so by varying the amplitude of the initial perturbations, and examine the subsequent development of the instability and the impact on the resulting turbulent fluxes. This work is supported by NSF Grant OCE1537173.

  18. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions.
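
    COVAL itself performs a numerical transformation of the input distributions; the same task, finding the distribution of a function of random variables, is easy to approximate by Monte Carlo. The strength and load distributions below are hypothetical, chosen only to mirror the reliability application mentioned in the record.

    # Monte Carlo sketch of the problem COVAL solves numerically:
    # distribution of the margin g = R - S for random strength and load.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    R = rng.normal(100.0, 10.0, n)   # strength (hypothetical distribution)
    S = rng.gumbel(60.0, 8.0, n)     # random load (hypothetical distribution)
    g = R - S                        # function of the random variables

    print("P(failure) =", (g < 0).mean())                  # P(g < 0)
    hist, edges = np.histogram(g, bins=100, density=True)  # empirical pdf of g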

  19. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  20. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
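
    The loop this abstract describes, fitting a Gaussian process to evaluated points, maximizing an acquisition function, evaluating there, and repeating, can be sketched as below. The one-dimensional target is a toy stand-in for an expensive posterior, and expected improvement is used as the acquisition function; the paper's exact choices may differ.

    # Bayesian-optimization sketch for maximizing an expensive (log-)density.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def log_target(x):               # toy stand-in for an expensive posterior
        return -0.5 * ((x - 2.0) / 0.7) ** 2 + np.sin(3 * x)

    rng = np.random.default_rng(0)
    X = rng.uniform(-5, 5, 5).reshape(-1, 1)      # small initial design
    y = log_target(X).ravel()
    grid = np.linspace(-5, 5, 1000).reshape(-1, 1)

    for _ in range(20):
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                      normalize_y=True).fit(X, y)
        mu, sd = gp.predict(grid, return_std=True)
        z = (mu - y.max()) / np.maximum(sd, 1e-9)
        ei = (mu - y.max()) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
        x_next = grid[np.argmax(ei)].reshape(1, 1)            # next training point
        X = np.vstack([X, x_next])
        y = np.append(y, log_target(x_next).ravel())

    print("estimated maximizer:", X[np.argmax(y)][0])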

  1. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  2. Large eddy simulation of turbulent and stably-stratified flows

    International Nuclear Information System (INIS)

    Fallon, Benoit

    1994-01-01

    The unsteady turbulent flow over a backward-facing step is studied by means of Large Eddy Simulation with a structure-function subgrid model, both in isothermal and stably-stratified configurations. Without stratification, the flow develops highly-distorted Kelvin-Helmholtz billows which undergo helical pairing, with A-shaped vortices shed downstream. We show that forcing injected by recirculation fluctuations governs the development of these oblique-mode instabilities. The statistical results show good agreement with the experimental measurements. For stably-stratified configurations, the flow remains more two-dimensional. We show how, with increasing stratification, the growth of the shear layer is frozen by inhibition first of the pairing process and then of the Kelvin-Helmholtz instabilities themselves, and by the development of gravity waves or stable density interfaces. Eddy structures of the flow present striking analogies with the stratified mixing layer. Additional computations show the development of secondary Kelvin-Helmholtz instabilities on the vorticity layers between two primary structures. This important mechanism, based on baroclinic effects (horizontal density gradients), constitutes an additional part of the turbulent mixing process. Finally, the feasibility of Large Eddy Simulation for industrial flows is demonstrated by studying a complex stratified cavity. Temperature fluctuations are compared to experimental measurements. We also develop three-dimensional unsteady animations in order to understand and visualize turbulent interactions. (author) [fr

  3. Bacterial production, protozoan grazing, and mineralization in stratified Lake Vechten

    NARCIS (Netherlands)

    Bloem, J.

    1989-01-01

    The role of heterotrophic nanoflagellates (HNAN, size 2-20 μm) in grazing on bacteria and mineralization of organic matter in stratified Lake Vechten was studied.

    Quantitative effects of manipulation and fixation on HNAN were checked. Considerable losses were caused by

  4. The dynamics of small inertial particles in weakly stratified turbulence

    NARCIS (Netherlands)

    van Aartrijk, M.; Clercx, H.J.H.

    We present an overview of a numerical study on the small-scale dynamics and the large-scale dispersion of small inertial particles in stably stratified turbulence. Three types of particles are examined: fluid particles, light inertial particles (with particle-to-fluid density ratio 1 ≤ ρp/ρf ≤ 25) and

  5. Dispersion of (light) inertial particles in stratified turbulence

    NARCIS (Netherlands)

    van Aartrijk, M.; Clercx, H.J.H.; Armenio, Vincenzo; Geurts, Bernardus J.; Fröhlich, Jochen

    2010-01-01

    We present a brief overview of a numerical study of the dispersion of particles in stably stratified turbulence. Three types of particles are examined: fluid particles, light inertial particles ($\rho_p/\rho_f = \mathcal{O}(1)$) and heavy inertial particles ($\rho_p/\rho_f \gg 1$). Stratification

  6. Stability of Miscible Displacements Across Stratified Porous Media

    Energy Technology Data Exchange (ETDEWEB)

    Shariati, Maryam; Yortsos, Yanis C.

    2000-09-11

    This report studied macro-scale heterogeneity effects. Reflecting their importance, current simulation practices of flow and displacement in porous media are invariably based on heterogeneous permeability fields. The focus here was on a specific aspect of such problems, namely the stability of miscible displacements in stratified porous media, where the displacement is perpendicular to the direction of stratification.

  7. On Internal Waves in a Density-Stratified Estuary

    NARCIS (Netherlands)

    Kranenburg, C.

    1991-01-01

    In this article some field observations, made in recent years, of internal wave motions in a density-stratified estuary are presented. In order to facilitate the appreciation of the results, and to make some quantitative comparisons, the relevant theory is also summarized. Furthermore, the origins

  8. FDTD scattered field formulation for scatterers in stratified dispersive media.

    Science.gov (United States)

    Olkkonen, Juuso

    2010-03-01

    We introduce a simple scattered field (SF) technique that enables finite difference time domain (FDTD) modeling of light scattering from dispersive objects residing in stratified dispersive media. The introduced SF technique is verified against the total field scattered field (TFSF) technique. As an application example, we study surface plasmon polariton enhanced light transmission through a 100 nm wide slit in a silver film.

  9. Plane Stratified Flow in a Room Ventilated by Displacement Ventilation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm; Nickel, J.; Baron, D. J. G.

    2004-01-01

    The air movement in the occupied zone of a room ventilated by displacement ventilation exists as a stratified flow along the floor. This flow can be radial or plane according to the number of wall-mounted diffusers and the room geometry. The paper addresses the situations where plane flow...

  10. Dual Spark Plugs For Stratified-Charge Rotary Engine

    Science.gov (United States)

    Abraham, John; Bracco, Frediano V.

    1996-01-01

    Fuel efficiency of stratified-charge, rotary, internal-combustion engine increased by improved design featuring dual spark plugs. Second spark plug ignites fuel on upstream side of main fuel injector, enabling faster burning and more nearly complete utilization of fuel.

  11. Prognosis research strategy (PROGRESS) 4: Stratified medicine research

    NARCIS (Netherlands)

    A. Hingorani (Aroon); D.A.W.M. van der Windt (Daniëlle); R.D. Riley (Richard); D. Abrams; K.G.M. Moons (Karel); E.W. Steyerberg (Ewout); S. Schroter (Sara); W. Sauerbrei (Willi); D.G. Altman (Douglas); H. Hemingway; A. Briggs (Andrew); N. Brunner; P. Croft (Peter); J. Hayden (Jill); P.A. Kyzas (Panayiotis); N. Malats (Núria); G. Peat; P. Perel (Pablo); I. Roberts (Ian); A. Timmis (Adam)

    2013-01-01

    In patients with a particular disease or health condition, stratified medicine seeks to identify those who will have the most clinical benefit or least harm from a specific treatment. In this article, the fourth in the PROGRESS series, the authors discuss why prognosis research should form

  12. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  13. Impressions of the turbulence variability in a weakly stratified, flat-bottom deep-sea ‘boundary layer’

    NARCIS (Netherlands)

    van Haren, H.

    2015-01-01

    The character of turbulent overturns in a weakly stratified deep-sea is investigated in some detail using 144 high-resolution temperature sensors at 0.7 m intervals, starting 5 m above the bottom. A 9-day, 1 Hz sampled record from the 912 m depth flat-bottom (<0.5% bottom-slope) mooring site in the

  14. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
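
    For readers who want to experiment, the "probability distribution" reviewed here is directly available in scientific software; a minimal example for the normal distribution (with an illustrative mean and standard deviation):

    from scipy.stats import norm

    dist = norm(loc=120.0, scale=15.0)        # hypothetical mean and SD
    print(dist.pdf(120.0))                    # density at the mean
    print(dist.cdf(135.0) - dist.cdf(105.0))  # P(105 <= X <= 135), about 0.68
    print(dist.ppf(0.975))                    # 97.5th percentile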

  15. Effect of slaughterhouse and day of sample on the probability of a pig carcass being Salmonella-positive according to the Enterobacteriaceae count in the largest Brazilian pork production region

    DEFF Research Database (Denmark)

    Corbellini, Luis Gustavo; Júnior, Alfredo Bianco; de Freitas Costa, Eduardo

    2016-01-01

    Sources of contamination of carcasses during slaughter include infected pigs as well as environmentally related sources. There are many microbial indicators that can be used in the processing of food to assess food hygiene and the safety of food processing. The presence of some microbial indicators...... can be viewed as a result of direct or indirect contamination of a food with fecal material. The presence of Enterobacteriaceae is often used as a hygiene indicator, as they are found both in the environment and in the intestine of warm-blooded animals. An association between Salmonella isolation...... and Enterobacteriaceae count (EC) on pre-chill carcasses has been described, however the impact of slaughterhouse and the day of sampling on the occurrence of Salmonella has not been previously investigated. To this end, mixed logistic regressions (MLRs) with random effects and fixed slopes were performed to assess...

  16. Exploring the role of wave drag in the stable stratified oceanic and atmospheric bottom boundary layer in the cnrs-toulouse (cnrm-game) large stratified water flume

    NARCIS (Netherlands)

    Kleczek, M.; Steeneveld, G.J.; Paci, A.; Calmer, R.; Belleudy, A.; Canonici, J.C.; Murguet, F.; Valette, V.

    2014-01-01

    This paper reports on a laboratory experiment on a stably stratified boundary layer in the CNRM-GAME (Toulouse) stratified water flume, performed in order to quantify the momentum transfer due to orographically induced gravity waves generated by gently undulating hills in a boundary layer flow. In a stratified fluid, a

  17. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  18. Simulation of steam explosion in stratified melt-coolant configuration

    International Nuclear Information System (INIS)

    Leskovar, Matjaž; Centrih, Vasilij; Uršič, Mitja

    2016-01-01

    Highlights: • Strong steam explosions may develop spontaneously in stratified configurations. • Considerable melt-coolant premixed layer formed in subcooled water with hot melts. • Analysis with MC3D code provided insight into stratified steam explosion phenomenon. • Up to 25% of poured melt was mixed with water and available for steam explosion. • Better instrumented experiments needed to determine dominant mixing process. - Abstract: A steam explosion is an energetic fuel coolant interaction process, which may occur during a severe reactor accident when the molten core comes into contact with the coolant water. In nuclear reactor safety analyses steam explosions are primarily considered in melt jet-coolant pool configurations where sufficiently deep coolant pool conditions provide complete jet breakup and efficient premixture formation. Stratified melt-coolant configurations, i.e. a molten melt layer below a coolant layer, were up to now believed as being unable to generate strong explosive interactions. Based on the hypothesis that there are no interfacial instabilities in a stratified configuration it was assumed that the amount of melt in the premixture is insufficient to produce strong explosions. However, the recently performed experiments in the PULiMS and SES (KTH, Sweden) facilities with oxidic corium simulants revealed that strong steam explosions may develop spontaneously also in stratified melt-coolant configurations, where with high temperature melts and subcooled water conditions a considerable melt-coolant premixed layer is formed. In the article, the performed study of steam explosions in a stratified melt-coolant configuration in PULiMS like conditions is presented. The goal of this analytical work is to supplement the experimental activities within the PULiMS research program by addressing the key questions, especially regarding the explosivity of the formed premixed layer and the mechanisms responsible for the melt-water mixing. To

  19. Structural and Behavioral Correlates of HIV Infection among Pregnant Women in a Country with a Highly Generalized HIV Epidemic: A Cross-Sectional Study with a Probability Sample of Antenatal Care Facilities in Swaziland.

    Science.gov (United States)

    Lukhele, Bhekumusa Wellington; Techasrivichien, Teeranee; Suguimoto, S Pilar; Musumari, Patou Masika; El-Saaidi, Christina; Haumba, Samson; Tagutanazvo, Oslinah Buru; Ono-Kihara, Masako; Kihara, Masahiro

    2016-01-01

    HIV disproportionately affects women in Sub-Saharan Africa. Swaziland bears the highest HIV prevalence of 41% among pregnant women in this region. This heightened HIV-epidemic reflects the importance of context-specific interventions. Apart from routine HIV surveillance, studies that examine structural and behavioral factors associated with HIV infection among women may facilitate the revitalization of existing programs and provide insights to inform context-specific HIV prevention interventions. This cross-sectional study employed a two-stage random cluster sampling in ten antenatal health care facilities in the Hhohho region of Swaziland in August and September 2015. Participants were eligible for the study if they were 18 years or older and had tested for HIV. Self-administered tablet-based questionnaires were used to assess HIV risk factors. Of all eligible pregnant women, 827 (92.4%) participated, out of which 297 (35.9%) were self-reportedly HIV positive. Among structural factors, family function was not significantly associated with self-reported HIV positive status, while lower than high school educational attainment (AOR, 1.65; CI, 1.14-3.38; P = 0.008), and income below minimum wage (AOR, 1.81; CI, 1.09-3.01; P = 0.021) were significantly associated with self-reported HIV positive status. Behavioral factors significantly associated with reporting a positive HIV status included: ≥2 lifetime sexual partners (AOR, 3.16; CI, 2.00-5.00; P<0.001), and ever cohabited (AOR, 2.39; CI, 1.66-3.43; P = 0.00). The most cited reason for having multiple sexual partners was financial gain. HIV/AIDS-related knowledge level was high but not associated with self-reported HIV status (P = 0.319). Structural and behavioral factors showed significant association with self-reported HIV infection among pregnant women in Swaziland while HIV/AIDS-related knowledge and family function did not. This suggests that HIV interventions should be reinforced taking into consideration these findings. The findings also suggest the importance of future research sensitive to the Swazi and African sociocultural contexts, especially

  20. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  1. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  2. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  3. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  4. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  5. Structural and Behavioral Correlates of HIV Infection among Pregnant Women in a Country with a Highly Generalized HIV Epidemic: A Cross-Sectional Study with a Probability Sample of Antenatal Care Facilities in Swaziland.

    Directory of Open Access Journals (Sweden)

    Bhekumusa Wellington Lukhele

    Full Text Available HIV disproportionately affects women in Sub-Saharan Africa. Swaziland bears the highest HIV prevalence of 41% among pregnant women in this region. This heightened HIV-epidemic reflects the importance of context-specific interventions. Apart from routine HIV surveillance, studies that examine structural and behavioral factors associated with HIV infection among women may facilitate the revitalization of existing programs and provide insights to inform context-specific HIV prevention interventions. This cross-sectional study employed a two-stage random cluster sampling in ten antenatal health care facilities in the Hhohho region of Swaziland in August and September 2015. Participants were eligible for the study if they were 18 years or older and had tested for HIV. Self-administered tablet-based questionnaires were used to assess HIV risk factors. Of all eligible pregnant women, 827 (92.4%) participated, out of which 297 (35.9%) were self-reportedly HIV positive. Among structural factors, family function was not significantly associated with self-reported HIV positive status, while lower than high school educational attainment (AOR, 1.65; CI, 1.14-3.38; P = 0.008) and income below minimum wage (AOR, 1.81; CI, 1.09-3.01; P = 0.021) were significantly associated with self-reported HIV positive status. Behavioral factors significantly associated with reporting a positive HIV status included ≥2 lifetime sexual partners (AOR, 3.16; CI, 2.00-5.00; P<0.001) and ever cohabited (AOR, 2.39; CI, 1.66-3.43; P = 0.00). The most cited reason for having multiple sexual partners was financial gain. HIV/AIDS-related knowledge level was high but not associated with self-reported HIV status (P = 0.319). Structural and behavioral factors showed significant association with self-reported HIV infection among pregnant women in Swaziland while HIV/AIDS-related knowledge and family function did not. This suggests that HIV interventions should be reinforced taking into

  6. Identification of major planktonic sulfur oxidizers in stratified freshwater lake.

    Directory of Open Access Journals (Sweden)

    Hisaya Kojima

    Full Text Available Planktonic sulfur oxidizers are important constituents of ecosystems in stratified water bodies, and contribute to sulfide detoxification. In contrast to marine environments, the taxonomic identities of major planktonic sulfur oxidizers in freshwater lakes still remain largely unknown. Bacterioplankton community structure was analyzed in a stratified freshwater lake, Lake Mizugaki in Japan. In clone libraries of the 16S rRNA gene, clones very closely related to a sulfur oxidizer isolated from this lake, Sulfuritalea hydrogenivorans, were detected in deep anoxic water, and occupied up to 12.5% of each library from different water depths. Assemblages of planktonic sulfur oxidizers were specifically analyzed by constructing clone libraries of genes involved in sulfur oxidation: aprA, dsrA, soxB and sqr. In these libraries, clones related to betaproteobacteria were detected with high frequencies, including close relatives of Sulfuritalea hydrogenivorans.

  7. Study of MRI in stratified viscous plasma configuration

    Science.gov (United States)

    Carlevaro, Nakia; Montani, Giovanni; Renzi, Fabrizio

    2017-02-01

    We analyze the morphology of the magneto-rotational instability (MRI) for a stratified viscous plasma disk configuration in differential rotation, taking into account the so-called corotation theorem for the background profile. In order to select the intrinsic Alfvénic nature of MRI, we deal with an incompressible plasma and we adopt a formulation of the local perturbation analysis based on the use of the magnetic flux function as a dynamical variable. Our study outlines, as a consequence of the corotation condition, a marked asymmetry of the MRI with respect to the equatorial plane, particularly evident in a complete damping of the instability above a positive critical height over the equatorial plane. We also emphasize how such a feature is already present (although less pronounced) even in the ideal case, restoring a dependence of the MRI on the stratified morphology of the gravitational field.

  8. Mixing of stratified flow around bridge piers in steady current

    DEFF Research Database (Denmark)

    Jensen, Bjarne; Carstensen, Stefan; Christensen, Erik Damgaard

    2018-01-01

    This paper presents the results of an experimental and numerical investigation of the mixing of stratified flow around bridge pier structures. In this study, which was carried out in connection with the Fehmarnbelt Fixed Link environmental impact assessment, the mixing processes of a two-layer stratification were studied in which the lower layer had a higher salinity than the upper layer. The physical experiments investigated two different pier designs. A general study was made regarding forces on the piers in which the effect of the current angle relative to the structure was also included...

  9. Stratified charge rotary aircraft engine technology enablement program

    Science.gov (United States)

    Badgley, P. R.; Irion, C. E.; Myers, D. M.

    1985-01-01

    The multifuel stratified charge rotary engine is discussed. A single rotor, 0.7L/40 cu in displacement, research rig engine was tested. The research rig engine was designed for operation at high speeds and pressures, with combustion chamber peak pressure providing margin for speed and load excursions above the design requirement for a highly advanced aircraft engine. It is indicated that the single rotor research rig engine is capable of meeting the established design requirements of 120 kW, 8,000 RPM, 1,379 kPa BMEP. The research rig engine, when fully developed, will be a valuable tool for investigating advanced and highly advanced technology components, and will provide an understanding of the stratified charge rotary engine combustion process.

  10. Analysis of photonic band-gap structures in stratified medium

    DEFF Research Database (Denmark)

    Tong, Ming-Sze; Yinchao, Chen; Lu, Yilong

    2005-01-01

    Purpose - To demonstrate the flexibility and advantages of a non-uniform pseudo-spectral time domain (nu-PSTD) method through studies of the wave propagation characteristics on photonic band-gap (PBG) structures in stratified medium. Design/methodology/approach - A nu-PSTD method is proposed in solving the Maxwell's equations numerically. It expands the temporal derivatives using the finite differences, while it adopts the Fourier transform (FT) properties to expand the spatial derivatives in Maxwell's equations. In addition, the method makes use of the chain-rule property in calculus together... Originality/value - The method validates its values and properties through extensive studies on regular and defective 1D PBG structures in stratified medium, and it can be further extended to solving more complex problems in electromagnetic and microwave applications once the Maxwell's equations are appropriately modeled.

  11. Community genomics among stratified microbial assemblages in the ocean's interior

    DEFF Research Database (Denmark)

    DeLong, Edward F; Preston, Christina M; Mincer, Tracy

    2006-01-01

    Microbial life predominates in the ocean, yet little is known about its genomic variability, especially along the depth continuum. We report here genomic analyses of planktonic microbial communities in the North Pacific Subtropical Gyre, from the ocean's surface to near-sea floor depths. Sequence......, and host-viral interactions. Comparative genomic analyses of stratified microbial communities have the potential to provide significant insight into higher-order community organization and dynamics....

  12. Large Eddy Simulation of stratified flows over structures

    OpenAIRE

    Brechler J.; Fuka V.

    2013-01-01

    We tested the ability of the LES model CLMM (Charles University Large-Eddy Microscale Model) to model the stratified flow around three dimensional hills. We compared quantities such as the height of the dividing streamline, recirculation zone length and length of the lee waves with experiments by Hunt and Snyder[3] and numerical computations by Ding, Calhoun and Street[5]. The results mostly agreed with the references, but some important differences are present.

  13. Large Eddy Simulation of stratified flows over structures

    Directory of Open Access Journals (Sweden)

    Brechler J.

    2013-04-01

    Full Text Available We tested the ability of the LES model CLMM (Charles University Large-Eddy Microscale Model) to model the stratified flow around three dimensional hills. We compared quantities such as the height of the dividing streamline, recirculation zone length and length of the lee waves with experiments by Hunt and Snyder[3] and numerical computations by Ding, Calhoun and Street[5]. The results mostly agreed with the references, but some important differences are present.

  14. Large Eddy Simulation of stratified flows over structures

    Science.gov (United States)

    Fuka, V.; Brechler, J.

    2013-04-01

    We tested the ability of the LES model CLMM (Charles University Large-Eddy Microscale Model) to model the stratified flow around three dimensional hills. We compared quantities such as the height of the dividing streamline, recirculation zone length and length of the lee waves with experiments by Hunt and Snyder[3] and numerical computations by Ding, Calhoun and Street[5]. The results mostly agreed with the references, but some important differences are present.

  15. Propagation of acoustic waves in a stratified atmosphere, 1

    Science.gov (United States)

    Kalkofen, W.; Rossi, P.; Bodo, G.; Massaglia, S.

    1994-01-01

    This work is motivated by the chromospheric 3 minute oscillations observed in the K2v bright points. We study acoustic gravity waves in a one-dimensional, gravitationally stratified, isothermal atmosphere. The oscillations are excited either by a velocity pulse imparted to a layer in an atmosphere of infinite vertical extent, or by a piston forming the lower boundary of a semi-infinite medium. We consider both linear and non-linear waves.

  16. A statistical mechanics approach to mixing in stratified fluids

    OpenAIRE

    Venaille , Antoine; Gostiaux , Louis; Sommeria , Joël

    2016-01-01

    Accepted for the Journal of Fluid Mechanics; Predicting how much mixing occurs when a given amount of energy is injected into a Boussinesq fluid is a longstanding problem in stratified turbulence. The huge number of degrees of freedom involved in these processes renders a deterministic approach to the problem extremely difficult. Here we present a statistical mechanics approach yielding a prediction for a cumulative, global mixing efficiency as a function of a global Richardson number and th...

  17. Study on exchange flow under the unstably stratified field

    OpenAIRE

    文沢, 元雄

    2005-01-01

    This paper deals with the exchange flow under an unstably stratified field. The author developed an effective measurement system as well as a numerical analysis program. The system and the program are applied to the helium-air exchange flow in a rectangular channel with inclination. The following main features of the exchange flow were discussed based on the calculated results: (1) the time required for establishing a quasi-steady state exchange flow; (2) the relationship between the inclination an...

  18. Nitrogen Isotopes and Chemocline Depth in Stratified Basins

    Science.gov (United States)

    Fulton, J. M.; Arthur, M. A.

    2006-12-01

    Black shale samples commonly have bulk δ15N values below 0‰, previously interpreted as the result of bacterial nitrogen fixation. In this study we examine the effect of chemocline depth on the δ15N values of water column particulate matter and sediments of two meromictic basins. In particular, we produced δ15N profiles of bulk Black Sea sediments, bulk sediments from Fayetteville Green Lake (FGL), and nutrients and particulates from FGL. We also analyzed pigments from FGL samples to trace the occurrences of deep-dwelling bacteria. Our results suggest that a shallow chemocline leads to relatively 15N-depleted sediments in the absence of nitrogen fixation, probably due to increased availability of ammonium for growth near the chemocline. FGL is meromictic with a shallow chemocline at 20 meters. Ammonium released in the monimolimnion and sediments supports productivity of cyanobacteria and purple (PSB) and green sulfur bacteria near and below the chemocline. The PSB at 20m generate 15N-depleted biomass (δ15N = -3‰), compared with 0 to 3‰ for deep water ammonium. High concentrations of Bchl a extracted from particulate matter at deeper depths, where high sulfide concentrations inhibit PSB growth, suggest that sinking particulate matter contains PSB biomass, transmitting the 15N-depleted signal to the sediments. The Black Sea chemocline depth has varied over the past 7500 years. Published biomarker and pyrite framboid size data suggest that a shallow chemocline persisted through much of the past 7500 years, except for three intervals when the chemocline was deeper than 205 meters. We have measured bulk δ15N on six cores spanning depths from 205 to 2088 meters. Each of the deep chemocline intervals coincides with basin-wide sedimentary δ15N values between 2 and 4‰, compared with values near or below 0‰ for periods characterized by a shallower chemocline. The most 15N-depleted values probably result from a much shallower chemocline than that at present

  19. Background stratified Poisson regression analysis of cohort data.

    Science.gov (United States)

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.

  20. Ethanol dehydration to ethylene in a stratified autothermal millisecond reactor.

    Science.gov (United States)

    Skinner, Michael J; Michor, Edward L; Fan, Wei; Tsapatsis, Michael; Bhan, Aditya; Schmidt, Lanny D

    2011-08-22

    The concurrent decomposition and deoxygenation of ethanol was accomplished in a stratified reactor with 50-80 ms contact times. The stratified reactor comprised an upstream oxidation zone that contained Pt-coated Al(2)O(3) beads and a downstream dehydration zone consisting of H-ZSM-5 zeolite films deposited on Al(2)O(3) monoliths. Ethanol conversion, product selectivity, and reactor temperature profiles were measured for a range of fuel:oxygen ratios for two autothermal reactor configurations using two different sacrificial fuel mixtures: a parallel hydrogen-ethanol feed system and a series methane-ethanol feed system. Increasing the amount of oxygen relative to the fuel resulted in a monotonic increase in ethanol conversion in both reaction zones. The majority of the converted carbon was in the form of ethylene, where the ethanol carbon-carbon bonds stayed intact while the oxygen was removed. Over 90% yield of ethylene was achieved by using methane as a sacrificial fuel. These results demonstrate that noble metals can be successfully paired with zeolites to create a stratified autothermal reactor capable of removing oxygen from biomass model compounds in a compact, continuous flow system that can be configured to have multiple feed inputs, depending on process restrictions. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Background stratified Poisson regression analysis of cohort data

    International Nuclear Information System (INIS)

    Richardson, David B.; Langholz, Bryan

    2012-01-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models. (orig.)
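
    The benchmark these two records refer to, unconditional Poisson regression with one indicator term per background stratum plus a log-linear dose term, can be sketched as follows on synthetic person-year data; per the abstract, the paper's conditional approach yields the identical dose estimate without fitting the stratum coefficients.

    # Unconditional background-stratified Poisson regression (synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    rows = []
    for s in range(40):                            # 40 background strata
        base = rng.uniform(-8, -6)                 # stratum-specific log baseline
        for dose in (0.0, 0.1, 0.5, 1.0):          # illustrative doses (Gy)
            pyr = rng.uniform(1e3, 1e4)            # person-years at risk
            lam = np.exp(base + 0.4 * dose) * pyr  # true log rate ratio: 0.4/Gy
            rows.append({"stratum": s, "dose": dose, "pyr": pyr,
                         "cases": rng.poisson(lam)})
    df = pd.DataFrame(rows)

    fit = smf.poisson("cases ~ C(stratum) + dose", data=df,
                      offset=np.log(df["pyr"])).fit(disp=False)
    print(fit.params["dose"], fit.conf_int().loc["dose"].values)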

  2. Stratified turbulent Bunsen flames: flame surface analysis and flame surface density modelling

    Science.gov (United States)

    Ramaekers, W. J. S.; van Oijen, J. A.; de Goey, L. P. H.

    2012-12-01

    In this paper it is investigated whether the Flame Surface Density (FSD) model, developed for turbulent premixed combustion, is also applicable to stratified flames. Direct Numerical Simulations (DNS) of turbulent stratified Bunsen flames have been carried out, using the Flamelet Generated Manifold (FGM) reduction method for reaction kinetics. Before examining the suitability of the FSD model, flame surfaces are characterized in terms of thickness, curvature and stratification. All flames are in the Thin Reaction Zones regime, and the maximum equivalence ratio range covers 0.1⩽φ⩽1.3. For all flames, local flame thicknesses correspond very well to those observed in stretchless, steady premixed flamelets. Extracted curvature radii and mixing length scales are significantly larger than the flame thickness, implying that the stratified flames all burn in a premixed mode. The remaining challenge is accounting for the large variation in (subfilter) mass burning rate. In this contribution, the FSD model is proven to be applicable for Large Eddy Simulations (LES) of stratified flames for the equivalence ratio range 0.1⩽φ⩽1.3. Subfilter mass burning rate variations are taken into account by a subfilter Probability Density Function (PDF) for the mixture fraction, on which the mass burning rate directly depends. A priori analysis points out that for small stratifications (0.4⩽φ⩽1.0), the replacement of the subfilter PDF (obtained from DNS data) by the corresponding Dirac function is appropriate. Integration of the Dirac function with the mass burning rate m=m(φ) can then adequately model the filtered mass burning rate obtained from filtered DNS data. For a larger stratification (0.1⩽φ⩽1.3), and filter widths up to ten flame thicknesses, a β-function for the subfilter PDF yields substantially better predictions than a Dirac function. Finally, inclusion of a simple algebraic model for the FSD resulted only in small additional deviations from DNS data.
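
    The closure compared in this abstract, a presumed subfilter PDF for the mixture fraction integrated against the mass burning rate versus a Dirac function at the filtered mean, can be sketched numerically. The burning-rate curve and the moments below are illustrative stand-ins, not the DNS data.

    # Beta-PDF versus Dirac closure for the filtered mass burning rate.
    import numpy as np
    from scipy.stats import beta

    def m(phi):                       # toy mass burning rate m(phi)
        return np.exp(-((phi - 1.0) / 0.25) ** 2)

    phi_max = 1.3                     # map phi onto [0, phi_max]
    phi_mean, phi_var = 0.7, 0.03     # filtered mean and subfilter variance
    m1 = phi_mean / phi_max
    v = phi_var / phi_max**2
    a = m1 * (m1 * (1 - m1) / v - 1.0)       # moment-matched beta parameters
    b = (1 - m1) * (m1 * (1 - m1) / v - 1.0)

    phi = np.linspace(0.0, phi_max, 2000)
    pdf = beta.pdf(phi / phi_max, a, b) / phi_max
    dphi = phi[1] - phi[0]
    m_beta = np.sum(m(phi) * pdf) * dphi   # beta-PDF closure
    m_dirac = m(phi_mean)                  # Dirac closure (no subfilter variation)
    print(m_beta, m_dirac)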

  3. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
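
    The core observation, that sampling at least half a cycle of a sinusoid and histogramming the samples recovers a characteristic PDF, is easy to reproduce; one full cycle is used here for symmetry.

    # Histogram of sinusoid samples approximates the arcsine density
    # 1/(pi*sqrt(1 - x^2)) on [-1, 1]: large near x = +/-1, smallest near 0.
    import numpy as np

    t = np.linspace(0.0, 1.0, 10_000, endpoint=False)  # one carrier cycle
    samples = np.sin(2 * np.pi * t)
    hist, edges = np.histogram(samples, bins=50, density=True)
    print(hist[0], hist[len(hist) // 2], hist[-1])     # high, low, high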

  4. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  5. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  6. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
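
    Under the standard probit latent-variable assumption y* = bx + e with e ~ N(0,1), a correlation of the kind the authors describe follows directly from the fitted coefficient and the standard deviation of x. The sketch below uses the textbook latent-variable identity on synthetic data, not necessarily the authors' full method.

    # Latent correlation corr(y*, x) = b*sd(x)/sqrt(b^2*var(x) + 1) from probit.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n, b_true = 20_000, 0.8
    x = rng.normal(size=n)
    y = (b_true * x + rng.normal(size=n) > 0).astype(int)

    fit = sm.Probit(y, sm.add_constant(x)).fit(disp=False)
    b, sx = fit.params[1], x.std()
    corr_latent = b * sx / np.sqrt(b**2 * sx**2 + 1.0)
    print(corr_latent)   # about 0.8/sqrt(1.64) = 0.62 for the true values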

  8. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  9. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  10. Theory of sampling and its application in tissue based diagnosis

    Directory of Open Access Journals (Sweden)

    Kayser Gian

    2009-02-01

    Full Text Available Abstract Background A general theory of sampling and its application in tissue based diagnosis is presented. Sampling is defined as extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e. give the same results when applied under similar circumstances. Sampling includes two different aspects, the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on search for localization of specific compartments within the basic space, and search for presence of specific compartments. Methods When a sampling procedure is applied in diagnostic processes, two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability to detect these basic units. Sampling can be performed without or with external knowledge, such as size of searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation invariant transformation results in Krige's formula, which is widely used in search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationship. The first method is called random sampling, the second stratified sampling. Results Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction, numerical, boundary and surface densities. Stratified sampling requires the knowledge of objects (and their features) and evaluates spatial features in relation to
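
    The contrast drawn in the Results section, random sampling needing no knowledge of the reference space versus stratified sampling exploiting known compartments, can be sketched on a synthetic "tissue" with three compartments of different object densities:

    # Simple random versus stratified estimation of an object fraction.
    import numpy as np

    rng = np.random.default_rng(11)
    strata = [(0.05, 60_000), (0.20, 30_000), (0.60, 10_000)]  # (fraction, size)
    pop = np.concatenate([rng.binomial(1, p, n) for p, n in strata])

    # Simple random sampling: no knowledge of compartments required.
    srs = rng.choice(pop, size=1_000, replace=False).mean()

    # Stratified sampling: proportional allocation, stratum-weighted estimate.
    sizes = np.array([n for _, n in strata])
    weights = sizes / sizes.sum()
    alloc = np.round(weights * 1_000).astype(int)
    starts = np.cumsum([0] + [n for _, n in strata])[:-1]
    strat = sum(w * rng.choice(pop[s:s + n], size=k, replace=False).mean()
                for w, (p, n), s, k in zip(weights, strata, starts, alloc))
    print(pop.mean(), srs, strat)  # true fraction, SRS estimate, stratified estimate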

  11. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  12. Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions

    DEFF Research Database (Denmark)

    Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette

    2016-01-01

    We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next generation sequencing, gene finding and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found by asse...

  13. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
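
    The article's central claim, that thresholds set from estimated parameters yield an expected failure frequency above the nominal level, can be checked by simulation. The following is a minimal Monte Carlo sketch assuming a log-normal loss with plug-in quantile estimation; the sample size, replication count, and parameter values are arbitrary choices, not the article's.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)

    eps = 0.01             # nominal failure probability
    n = 30                 # hypothetical sample size for estimation
    reps = 20000           # Monte Carlo replications
    mu, sigma = 0.0, 1.0   # true (unknown to the decision-maker) parameters

    z = norm.ppf(1 - eps)
    realized = np.empty(reps)
    for i in range(reps):
        logs = np.log(rng.lognormal(mu, sigma, n))
        m_hat, s_hat = logs.mean(), logs.std(ddof=1)
        # Threshold set at the estimated (1 - eps) quantile of the loss.
        log_threshold = m_hat + s_hat * z
        # True probability that a new loss exceeds that threshold.
        realized[i] = 1 - norm.cdf((log_threshold - mu) / sigma)

    print(f"nominal failure probability: {eps:.4f}")
    print(f"expected realized frequency: {realized.mean():.4f}")  # exceeds eps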

  14. Psychomotor development and learning difficulties in preschool children with probable attention deficit hyperactivity disorder: An epidemiological study in Navarre and La Rioja.

    Science.gov (United States)

    Marín-Méndez, J J; Borra-Ruiz, M C; Álvarez-Gómez, M J; Soutullo Esperón, C

    2017-10-01

    ADHD symptoms begin to appear at preschool age. ADHD may have a significant negative impact on academic performance. In Spain, there are no standardized tools for detecting ADHD at preschool age, nor is there data about the incidence of this disorder. To evaluate developmental factors and learning difficulties associated with probable ADHD and to assess the impact of ADHD on school performance. We conducted a population-based study with a stratified multistage proportional cluster sample design. We found significant differences between probable ADHD and parents' perception of difficulties in expressive language, comprehension, and fine motor skills, as well as in emotions, concentration, behaviour, and relationships. Around 34% of preschool children with probable ADHD showed global learning difficulties, mainly in patients with the inattentive type. According to the multivariate analysis, learning difficulties were significantly associated with both delayed psychomotor development during the first 3 years of life (OR: 5.57) as assessed by parents, and probable ADHD (OR: 2.34). CONCLUSIONS: There is a connection between probable ADHD in preschool children and parents' perception of difficulties in several dimensions of development and learning. Early detection of ADHD at preschool ages is necessary to start prompt and effective clinical and educational interventions. Copyright © 2016 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  15. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  16. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  17. Numerical simulations of the stratified oceanic bottom boundary layer

    Science.gov (United States)

    Taylor, John R.

    Numerical simulations are used to consider several problems relevant to the turbulent oceanic bottom boundary layer. In the first study, stratified open channel flow is considered with thermal boundary conditions chosen to approximate a shallow sea. Specifically, a constant heat flux is applied at the free surface and the lower wall is assumed to be adiabatic. When the surface heat flux is strong, turbulent upwellings of low-speed fluid from near the lower wall are inhibited by the stable stratification. Subsequent studies consider a stratified bottom Ekman layer over a non-sloping lower wall. The influence of the free surface is removed by using an open boundary condition at the top of the computational domain. Particular attention is paid to the influence of the outer layer stratification on the boundary layer structure. When the density field is initialized with a linear profile, a turbulent mixed layer forms near the wall, which is separated from the outer layer by a strongly stable pycnocline. It is found that the bottom stress is not strongly affected by the outer layer stratification. However, stratification reduces turbulent transport to the outer layer and strongly limits the boundary layer height. The mean shear at the top of the boundary layer is enhanced when the outer layer is stratified, and this shear is strong enough to cause intermittent instabilities above the pycnocline. Turbulence-generated internal gravity waves are observed in the outer layer with a relatively narrow frequency range. An explanation for the frequency content of these waves is proposed, starting with an observed broad-banded turbulent spectrum and invoking linear viscous decay to explain the preferential damping of low and high frequency waves. During the course of this work, an open-source computational fluid dynamics code has been developed with a number of advanced features including scalar advection, subgrid-scale models for large-eddy simulation, and distributed memory

  18. E25 stratified torch ignition engine emissions and combustion analysis

    International Nuclear Information System (INIS)

    Rodrigues Filho, Fernando Antonio; Baêta, José Guilherme Coelho; Teixeira, Alysson Fernandes; Valle, Ramón Molina; Fonseca de Souza, José Leôncio

    2016-01-01

    Highlights: • A stratified torch ignition (STI) engine was built and tested. • The STI engine was tested over a wide range of loads and speeds. • A significant reduction in emissions was achieved by means of the STI system. • Low cyclic variability characterized the lean combustion process of the torch ignition engine. • HC emission is the main drawback of the stratified torch ignition engine. - Abstract: Vehicular emissions significantly increase atmospheric air pollution and greenhouse gases (GHG). This fact, together with the fast growth of the global vehicle fleet, calls for prompt technological solutions from the scientific community to promote a significant reduction in vehicle fuel consumption and emissions, especially from fossil fuels, in order to comply with future legislation. To meet this goal, a prototype stratified torch ignition (STI) engine was built from an existing commercial baseline engine. In this system, combustion starts in a pre-combustion chamber, where the pressure increase pushes the combustion jet flames through calibrated nozzles to be precisely targeted into the main chamber. These combustion jet flames are endowed with high thermal and kinetic energy, being able to generate a stable lean combustion process. The high kinetic and thermal energy of the combustion jet flame results from the load stratification. This is carried out through direct fuel injection in the pre-combustion chamber by means of a prototype gasoline direct injector (GDI) developed for a very low fuel flow rate. In this work the engine-out emissions of CO, NOx, HC and CO_2 of the STI engine are presented and a detailed analysis supported by the combustion parameters is conducted. The results obtained in this work show a significant decrease in the specific emissions of CO, NOx and CO_2 of the STI engine in comparison with the baseline engine. On the other hand, HC specific emissions increased due to wall wetting from fuel impinging on the pre-combustion chamber wall.

  19. Direct contact condensation induced transition from stratified to slug flow

    International Nuclear Information System (INIS)

    Strubelj, Luka; Ezsoel, Gyoergy; Tiselj, Iztok

    2010-01-01

    Selected condensation-induced water hammer experiments performed on the PMK-2 device were numerically modelled with three-dimensional two-fluid models of the computer codes NEPTUNE_CFD and CFX. The experimental setup consists of a horizontal pipe filled with hot steam that is slowly flooded with cold water. In most of the experimental cases, slow flooding of the pipe was abruptly interrupted by strong slugging and water hammer, while in the selected experimental runs performed at higher initial pressures and temperatures that are analysed in the present work, the transition from the stratified into the slug flow was not accompanied by the water hammer pressure peak. That makes these cases more suitable tests for evaluation of the various condensation models in horizontally stratified flows and puts them in the range of the available CFD (Computational Fluid Dynamics) codes. The key models for successful simulation appear to be the condensation model of the hot vapour on the cold liquid and the interfacial momentum transfer model. The surface renewal types of condensation correlations, developed for condensation in stratified flows, were used in the simulations and were applied also in the regions of slug flow. The 'large interface' model for inter-phase momentum transfer was compared to the bubble drag model. The CFD simulations quantitatively captured the main phenomena of the experiments, while the stochastic nature of the particular condensation-induced water hammer experiments did not allow detailed prediction of the time and position of the slug formation in the pipe. We have clearly shown that even the selected experiments without water hammer present a tough test for the applied CFD codes, while modelling of the water hammer pressure peaks in two-phase flow, a strongly compressible flow phenomenon, is beyond the capability of the current CFD codes.

  20. Effects of unstratified and centre-stratified randomization in multi-centre clinical trials.

    Science.gov (United States)

    Anisimov, Vladimir V

    2011-01-01

    This paper deals with the analysis of randomization effects in multi-centre clinical trials. The two randomization schemes most often used in clinical trials are considered: unstratified and centre-stratified block-permuted randomization. The prediction of the number of patients randomized to different treatment arms in different regions during the recruitment period accounting for the stochastic nature of the recruitment and effects of multiple centres is investigated. A new analytic approach using a Poisson-gamma patient recruitment model (patients arrive at different centres according to Poisson processes with rates sampled from a gamma distributed population) and its further extensions is proposed. Closed-form expressions for corresponding distributions of the predicted number of the patients randomized in different regions are derived. In the case of two treatments, the properties of the total imbalance in the number of patients on treatment arms caused by using centre-stratified randomization are investigated and for a large number of centres a normal approximation of imbalance is proved. The impact of imbalance on the power of the study is considered. It is shown that the loss of statistical power is practically negligible and can be compensated by a minor increase in sample size. The influence of patient dropout is also investigated. The impact of randomization on predicted drug supply overage is discussed. Copyright © 2010 John Wiley & Sons, Ltd.
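
    The recruitment model described above lends itself to direct simulation. The sketch below assumes a Poisson-gamma model with block-permuted randomization of block size 2, so that only an incomplete final block at a centre contributes +/-1 to the treatment imbalance; the number of centres, recruitment period, and gamma parameters are invented for illustration, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(2)

    centres = 100            # number of recruiting centres
    T = 12.0                 # recruitment period (months), illustrative
    alpha, beta = 2.0, 2.0   # gamma distribution of centre rates

    reps = 10000
    imbalance = np.empty(reps)
    for i in range(reps):
        # Poisson-gamma model: each centre's rate is gamma distributed,
        # and its recruitment count is Poisson given that rate.
        rates = rng.gamma(alpha, 1.0 / beta, centres)
        counts = rng.poisson(rates * T)
        # Centre-stratified randomization with blocks of size 2: only an
        # odd leftover patient at a centre contributes +/-1 to imbalance.
        leftovers = counts % 2
        signs = rng.choice([-1, 1], size=centres)
        imbalance[i] = np.sum(leftovers * signs)

    print(f"mean imbalance:   {imbalance.mean():+.3f}")
    print(f"std of imbalance: {imbalance.std():.2f}")  # ~normal for many centres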

  1. Technetium reduction and removal in a stratified fjord

    International Nuclear Information System (INIS)

    Keith-Roach, M.; Roos, P.

    2002-01-01

    The distribution of Tc in the water column of a stratified fjord has been measured to investigate the behaviour and fate of Tc on reaching reducing waters. Slow mixing in the water column of the fjord results in vertical transport of the dissolved Tc to the oxic/anoxic interface. Tc is reduced just below the interface and at 21 m, 60% is sorbed to particulate and colloidal material. Tc is carried to the sediments sorbed to the particulate material, where there is a current inventory of approximately 3 Bq m⁻². (LN)

  2. Stability of unstably stratified shear flow between parallel plates

    Energy Technology Data Exchange (ETDEWEB)

    Fujimura, Kaoru; Kelly, R E

    1987-09-01

    The linear stability of unstably stratified shear flows between two horizontal parallel plates was investigated. Eigenvalue problems were solved numerically by making use of the expansion method in Chebyshev polynomials, and the critical Rayleigh numbers were obtained accurately in the Reynolds number range of [0.01, 100]. It was found that the critical Rayleigh number increases with an increase of the Reynolds number. The result strongly supports previous stability analyses except for the analysis by Makino and Ishikawa (J. Jpn. Soc. Fluid Mech. 4 (1985) 148 - 158) in which a decrease of the critical Rayleigh number was obtained.

  3. Stability of unstably stratified shear flow between parallel plates

    International Nuclear Information System (INIS)

    Fujimura, Kaoru; Kelly, R.E.

    1987-01-01

    The linear stability of unstably stratified shear flows between two horizontal parallel plates was investigated. Eigenvalue problems were solved numerically by making use of the expansion method in Chebyshev polynomials, and the critical Rayleigh numbers were obtained accurately in the Reynolds number range of [0.01, 100]. It was found that the critical Rayleigh number increases with an increase of the Reynolds number. The result strongly supports previous stability analyses except for the analysis by Makino and Ishikawa [J. Jpn. Soc. Fluid Mech. 4 (1985) 148 - 158] in which a decrease of the critical Rayleigh number was obtained. (author)

  4. Stratifying patients with peripheral neuropathic pain based on sensory profiles

    DEFF Research Database (Denmark)

    Vollert, Jan; Maier, Christoph; Attal, Nadine

    2017-01-01

    In a recent cluster analysis, it has been shown that patients with peripheral neuropathic pain can be grouped into 3 sensory phenotypes based on quantitative sensory testing profiles, which are mainly characterized by either sensory loss, intact sensory function and mild thermal hyperalgesia and...... populations that need to be screened to reach a subpopulation large enough to conduct a phenotype-stratified study. The most common phenotype in diabetic polyneuropathy was sensory loss (83%), followed by mechanical hyperalgesia (75%) and thermal hyperalgesia (34%, note that percentages are overlapping...

  5. Technetium reduction and removal in a stratified fjord

    Energy Technology Data Exchange (ETDEWEB)

    Keith-Roach, M.; Roos, P. [Risoe National Lab., Roskilde (Denmark)

    2002-04-01

    The distribution of Tc in the water column of a stratified fjord has been measured to investigate the behaviour and fate of Tc on reaching reducing waters. Slow mixing in the water column of the fjord results in vertical transport of the dissolved Tc to the oxic/anoxic interface. Tc is reduced just below the interface and at 21 m, 60% is sorbed to particulate and colloidal material. Tc is carried to the sediments sorbed to the particulate material, where there is a current inventory of approximately 3 Bq m⁻². (LN)

  6. Development of a natural gas stratified charge rotary engine

    Energy Technology Data Exchange (ETDEWEB)

    Sierens, R.; Verdonck, W.

    1985-01-01

    A water model has been used to determine the positions of separate inlet ports for a natural gas, stratified charge rotary engine. The flow inside the combustion chamber (mainly during the induction period) was recorded by a film camera. From these tests the best locations of the inlet ports were obtained; a prototype of this engine was built by Audi NSU and tested in the laboratories of the University of Gent. The results of these tests, for different stratification configurations, are given. These results are comparable with the best results obtained by Audi NSU for a homogeneous natural gas rotary engine.

  7. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  9. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  10. PSA, subjective probability and decision making

    International Nuclear Information System (INIS)

    Clarotti, C.A.

    1989-01-01

    PSA is the natural way to make decisions in the face of uncertainty relative to potentially dangerous plants; subjective probability, subjective utility and Bayes statistics are the ideal tools for carrying out a PSA. To support this statement, the various stages of the PSA procedure are examined in detail, and step by step the superiority of Bayes techniques over sampling-theory machinery is demonstrated.

  11. SOLUTION OF A MULTIVARIATE STRATIFIED SAMPLING PROBLEM THROUGH CHEBYSHEV GOAL PROGRAMMING

    Directory of Open Access Journals (Sweden)

    Mohd. Vaseem Ismail

    2010-12-01

    Full Text Available In this paper, we consider the problem of minimizing the variances for the various characters with a fixed (given) budget. Each convex objective function is first linearised at its minimal point, where it meets the linear cost constraint. The resulting multiobjective linear programming problem is then solved by Chebyshev goal programming. A numerical example is given to illustrate the procedure.
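
    The linearise-then-minimax idea can be sketched in a few lines. The variance coefficients, unit costs, budget, and linearisation point below are invented for the example and do not reproduce the paper's numerical illustration; the Chebyshev goal program then reduces to a linear program minimising the worst linearised variance.

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical two-stratum, two-character problem.  The stratified
    # variance of character j is V_j(n) = sum_h A[j, h] / n_h, which is
    # convex in the stratum sample sizes n_h.
    A = np.array([[400.0, 100.0],
                  [150.0, 300.0]])
    c = np.array([4.0, 6.0])     # per-unit sampling cost in each stratum
    budget = 600.0

    # Linearise each V_j at a feasible point n0 on the cost constraint:
    # V_j(n) ~ V_j(n0) - sum_h (A[j, h] / n0_h**2) * (n_h - n0_h).
    n0 = budget / (2 * c)        # spend half the budget in each stratum
    grad = -A / n0**2            # gradient of V_j at n0
    const = (A / n0).sum(axis=1) - grad @ n0

    # Chebyshev goal programming: minimise the worst linearised variance t,
    # i.e. min t  subject to  const_j + grad_j . n <= t  and  c . n <= budget.
    cost_vec = np.r_[np.zeros(2), 1.0]           # objective: minimise t
    A_ub = np.vstack([np.c_[grad, -np.ones(2)],  # linearised variances <= t
                      np.r_[c, 0.0]])            # budget constraint
    b_ub = np.r_[-const, budget]
    res = linprog(cost_vec, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(2, None), (2, None), (None, None)])
    print("stratum allocations:", res.x[:2], " worst variance bound:", res.x[2])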

  12. Model-based estimation of finite population total in stratified sampling

    African Journals Online (AJOL)

    The work presented in this paper concerns the estimation of the finite population total under a model-based framework. A nonparametric regression approach as a method of estimating the finite population total is explored. The asymptotic properties of the estimators based on nonparametric regression are also developed under ...

  13. ENVIRONMENTALLY STRATIFIED SAMPLING DESIGN FOR THE DEVELOPMENT OF THE GREAT LAKES ENVIRONMENTAL INDICATORS

    Science.gov (United States)

    Ecological indicators must be shown to be responsive to stress. For large-scale observational studies the best way to demonstrate responsiveness is by evaluating indicators along a gradient of stress, but such gradients are often unknown for a population of sites prior to site se...

  14. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  15. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  16. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
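
    The asserted link between default probabilities and default correlations is easy to reproduce in a stylised Merton-type simulation. The sketch below assumes standard-normal asset returns with a fixed asset correlation; the parameter values are illustrative only and are not taken from the paper.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(6)

    def default_correlation(pd, rho_asset, n=1_000_000):
        """Correlation between two loans' default indicators in a Merton-style
        model: default occurs when a standard-normal asset return falls below
        the threshold implied by the default probability pd."""
        cov = [[1.0, rho_asset], [rho_asset, 1.0]]
        x = rng.multivariate_normal([0.0, 0.0], cov, n)
        d = (x < norm.ppf(pd)).astype(float)
        return np.corrcoef(d[:, 0], d[:, 1])[0, 1]

    # With a fixed asset correlation, loans with higher default probabilities
    # show higher default correlations, as the abstract asserts.
    for pd in (0.005, 0.02, 0.10):
        print(f"PD = {pd:5.3f}: default correlation = "
              f"{default_correlation(pd, 0.3):.4f}")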

  17. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the...

  18. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  19. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; and why different gambling ...

  20. Crystallization of a compositionally stratified basal magma ocean

    Science.gov (United States)

    Laneuville, Matthieu; Hernlund, John; Labrosse, Stéphane; Guttenberg, Nicholas

    2018-03-01

    Earth's ∼3.45 billion year old magnetic field is regenerated by dynamo action in its convecting liquid metal outer core. However, convection induces an isentropic thermal gradient which, coupled with a high core thermal conductivity, results in rapid conducted heat loss. In the absence of implausibly high radioactivity or alternate sources of motion to drive the geodynamo, the Earth's early core had to be significantly hotter than the melting point of the lower mantle. While the existence of a dense convecting basal magma ocean (BMO) has been proposed to account for high early core temperatures, the requisite physical and chemical properties for a BMO remain controversial. Here we relax the assumption of a well-mixed convecting BMO and instead consider a BMO that is initially gravitationally stratified owing to processes such as mixing between metals and silicates at high temperatures in the core-mantle boundary region during Earth's accretion. Using coupled models of crystallization and heat transfer through a stratified BMO, we show that very high temperatures could have been trapped inside the early core, sequestering enough heat energy to run an ancient geodynamo on cooling power alone.

  1. Dyadic Green's function of an eccentrically stratified sphere.

    Science.gov (United States)

    Moneda, Angela P; Chrissoulidis, Dimitrios P

    2014-03-01

    The electric dyadic Green's function (dGf) of an eccentrically stratified sphere is built by use of the superposition principle, dyadic algebra, and the addition theorem of vector spherical harmonics. The end result of the analytical formulation is a set of linear equations for the unknown vector wave amplitudes of the dGf. The unknowns are calculated by truncation of the infinite sums and matrix inversion. The theory is exact, as no simplifying assumptions are required in any one of the analytical steps leading to the dGf, and it is general in the sense that any number, position, size, and electrical properties can be considered for the layers of the sphere. The point source can be placed outside of or in any lossless part of the sphere. Energy conservation, reciprocity, and other checks verify that the dGf is correct. A numerical application is made to a stratified sphere made of gold and glass, which operates as a lens.

  2. Crenothrix are major methane consumers in stratified lakes.

    Science.gov (United States)

    Oswald, Kirsten; Graf, Jon S; Littmann, Sten; Tienken, Daniela; Brand, Andreas; Wehrli, Bernhard; Albertsen, Mads; Daims, Holger; Wagner, Michael; Kuypers, Marcel Mm; Schubert, Carsten J; Milucka, Jana

    2017-09-01

    Methane-oxidizing bacteria represent a major biological sink for methane and are thus Earth's natural protection against this potent greenhouse gas. Here we show that in two stratified freshwater lakes a substantial part of upward-diffusing methane was oxidized by filamentous gamma-proteobacteria related to Crenothrix polyspora. These filamentous bacteria have been known as contaminants of drinking water supplies since 1870, but their role in environmental methane removal has remained unclear. These methane-oxidizing organisms had been assigned an 'unusual' methane monooxygenase (MMO), only distantly related to the 'classical' MMO of gamma-proteobacterial methanotrophs. We now correct this assignment and show that Crenothrix encode a typical gamma-proteobacterial PmoA. Stable isotope labeling in combination with single-cell imaging mass spectrometry revealed methane-dependent growth of the lacustrine Crenothrix with oxygen as well as under oxygen-deficient conditions. Crenothrix genomes encoded pathways for the respiration of oxygen as well as for the reduction of nitrate to N2O. The observed abundance and planktonic growth of Crenothrix suggest that these methanotrophs can act as a relevant biological sink for methane in stratified lakes and should be considered in the context of environmental removal of methane.

  3. LONGITUDINAL OSCILLATIONS IN DENSITY STRATIFIED AND EXPANDING SOLAR WAVEGUIDES

    Energy Technology Data Exchange (ETDEWEB)

    Luna-Cardozo, M. [Instituto de Astronomia y Fisica del Espacio, CONICET-UBA, CC. 67, Suc. 28, 1428 Buenos Aires (Argentina); Verth, G. [School of Computing, Engineering and Information Sciences, Northumbria University, Newcastle Upon Tyne NE1 8ST (United Kingdom); Erdelyi, R., E-mail: mluna@iafe.uba.ar, E-mail: robertus@sheffield.ac.uk, E-mail: gary.verth@northumbria.ac.uk [Solar Physics and Space Plasma Research Centre (SP2RC), University of Sheffield, Hicks Building, Hounsfield Road, Sheffield S3 7RH (United Kingdom)

    2012-04-01

    Waves and oscillations can provide vital information about the internal structure of waveguides in which they propagate. Here, we analytically investigate the effects of density and magnetic stratification on linear longitudinal magnetohydrodynamic (MHD) waves. The focus of this paper is to study the eigenmodes of these oscillations. It is our specific aim to understand what happens to these MHD waves generated in flux tubes with non-constant (e.g., expanding or magnetic bottle) cross-sectional area and density variations. The governing equation of the longitudinal mode is derived and solved analytically and numerically. In particular, the limit of the thin flux tube approximation is examined. The general solution describing the slow longitudinal MHD waves in an expanding magnetic flux tube with constant density is found. Longitudinal MHD waves in density stratified loops with constant magnetic field are also analyzed. From analytical solutions, the frequency ratio of the first overtone and fundamental mode is investigated in stratified waveguides. For small expansion, a linear dependence between the frequency ratio and the expansion factor is found. From numerical calculations it was found that the frequency ratio strongly depends on the density profile chosen and, in general, the numerical results are in agreement with the analytical results. The relevance of these results for solar magneto-seismology is discussed.

  4. Random forcing of geostrophic motion in rotating stratified turbulence

    Science.gov (United States)

    Waite, Michael L.

    2017-12-01

    Random forcing of geostrophic motion is a common approach in idealized simulations of rotating stratified turbulence. Such forcing represents the injection of energy into large-scale balanced motion, and the resulting breakdown of quasi-geostrophic turbulence into inertia-gravity waves and stratified turbulence can shed light on the turbulent cascade processes of the atmospheric mesoscale. White noise forcing is commonly employed, which excites all frequencies equally, including frequencies much higher than the natural frequencies of large-scale vortices. In this paper, the effects of these high frequencies in the forcing are investigated. Geostrophic motion is randomly forced with red noise over a range of decorrelation time scales τ, from a few time steps to twice the large-scale vortex time scale. It is found that short τ (i.e., nearly white noise) results in about 46% more gravity wave energy than longer τ, despite the fact that waves are not directly forced. We argue that this effect is due to wave-vortex interactions, through which the high frequencies in the forcing are able to excite waves at their natural frequencies. It is concluded that white noise forcing should be avoided, even if it is only applied to the geostrophic motion, when a careful investigation of spontaneous wave generation is needed.
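
    Red-noise forcing with a prescribed decorrelation time is commonly generated as a first-order autoregressive (Ornstein-Uhlenbeck) process. The sketch below shows one standard construction, with the time step and τ values chosen arbitrarily rather than taken from the paper; as τ approaches the time step, the forcing approaches white noise.

    import numpy as np

    rng = np.random.default_rng(3)

    def red_noise(n_steps, dt, tau):
        """AR(1) (Ornstein-Uhlenbeck) forcing with decorrelation time tau.

        As tau -> dt the sequence approaches white noise; larger tau
        suppresses frequencies above roughly 1/tau.
        """
        a = np.exp(-dt / tau)        # autocorrelation over one time step
        sigma = np.sqrt(1.0 - a**2)  # keeps the sequence at unit variance
        f = np.empty(n_steps)
        f[0] = rng.standard_normal()
        for i in range(1, n_steps):
            f[i] = a * f[i - 1] + sigma * rng.standard_normal()
        return f

    dt = 0.01
    for tau in (0.01, 0.1, 1.0):
        f = red_noise(100_000, dt, tau)
        lag1 = np.corrcoef(f[:-1], f[1:])[0, 1]
        print(f"tau = {tau:5.2f}: lag-1 autocorrelation = {lag1:.3f}")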

  5. Improvements to TRAC models of condensing stratified flow. Pt. 1

    International Nuclear Information System (INIS)

    Zhang, Q.; Leslie, D.C.

    1991-12-01

    Direct contact condensation in stratified flow is an important phenomenon in LOCA analyses. In this report, the TRAC interfacial heat transfer model for stratified condensing flow has been assessed against the Bankoff experiments. A rectangular channel option has been added to the code to represent the experimental geometry. In almost all cases the TRAC heat transfer coefficient (HTC) over-predicts the condensation rates and in some cases it is so high that the predicted steam is sucked in from the normal outlet in order to conserve mass. Based on their cocurrent and countercurrent condensing flow experiments, Bankoff and his students (Lim 1981, Kim 1985) developed HTC models from the two cases. The replacement of the TRAC HTC with either of Bankoff's models greatly improves the predictions of condensation rates in the experiment with cocurrent condensing flow. However, the Bankoff HTC for countercurrent flow is preferable because it is based only on the local quantities rather than on the quantities averaged from the inlet. (author)

  6. Internal circle uplifts, transversality and stratified G-structures

    Energy Technology Data Exchange (ETDEWEB)

    Babalic, Elena Mirela [Department of Theoretical Physics, National Institute of Physics and Nuclear Engineering,Str. Reactorului no.30, P.O.BOX MG-6, Postcode 077125, Bucharest-Magurele (Romania); Department of Physics, University of Craiova,13 Al. I. Cuza Str., Craiova 200585 (Romania); Lazaroiu, Calin Iuliu [Center for Geometry and Physics, Institute for Basic Science,Pohang 790-784 (Korea, Republic of)

    2015-11-24

    We study stratified G-structures in N=2 compactifications of M-theory on eight-manifolds M using the uplift to the auxiliary nine-manifold M̂ = M × S^1. We show that the cosmooth generalized distribution D̂ on M̂ which arises in this formalism may have pointwise transverse or non-transverse intersection with the pull-back of the tangent bundle of M, a fact which is responsible for the subtle relation between the spinor stabilizers arising on M and M̂ and for the complicated stratified G-structure on M which we uncovered in previous work. We give a direct explanation of the latter in terms of the former and relate explicitly the defining forms of the SU(2) structure which exists on the generic locus U of M to the defining forms of the SU(3) structure which exists on an open subset Û of M̂, thus providing a dictionary between the eight- and nine-dimensional formalisms.

  7. A modified stratified model for the 3C 273 jet

    International Nuclear Information System (INIS)

    Liu Wenpo; Shen Zhiqiang

    2009-01-01

    We present a modified stratified jet model to interpret the observed spectral energy distributions of knots in the 3C 273 jet. Based on the hypothesis of the single index of the particle energy spectrum at injection and identical emission processes among all the knots, the observed difference of spectral shape among different 3C 273 knots can be understood as a manifestation of the deviation of the equivalent Doppler factor of stratified emission regions in an individual knot from a characteristic one. The summed spectral energy distributions of all ten knots in the 3C 273 jet can be well fitted by two components: a low-energy component (radio to optical) dominated by synchrotron radiation and a high-energy component (UV, X-ray and γ-ray) dominated by inverse Compton scattering of the cosmic microwave background. This gives a consistent spectral index of α = 0.88 (S_ν ∝ ν^(-α)) and a characteristic Doppler factor of 7.4. Assuming the average of the summed spectrum as the characteristic spectrum of each knot in the 3C 273 jet, we further get a distribution of Doppler factors. We discuss the possible implications of these results for the physical properties in the 3C 273 jet. Future GeV observations with GLAST could separate the γ-ray emission of 3C 273 from the large scale jet and the small scale jet (i.e. the core) through measuring the GeV spectrum.

  8. STRESS DISTRIBUTION IN THE STRATIFIED MASS CONTAINING VERTICAL ALVEOLE

    Directory of Open Access Journals (Sweden)

    Bobileva Tatiana Nikolaevna

    2017-08-01

    Full Text Available Almost all subsurface rocks used as foundations for various types of structures are stratified. Such heterogeneity may cause specific behaviour of the materials under strain. Differential equations describing the behaviour of such materials contain rapidly fluctuating coefficients; in view of this, solving such equations is time-consuming even on today's computers. The method of asymptotic averaging replaces the heterogeneous medium under study with a homogeneous one governed by averaged equations with constant coefficients. The present article is concerned with a stratified soil mass consisting of pair-wise alternating isotropic elastic layers. As a result of averaging the elastic moduli, the soil mass with horizontal rock stratification is simulated by a homogeneous transversely isotropic half-space whose plane of isotropy is perpendicular to the vertical axis. The half-space is weakened by a vertical alveole of circular cross-section, and the virgin ground is loaded by its own weight. For the horizontal parting planes of the layers, the following two types of surface conditions are set: ideal contact and backlash without cleavage. For the resulting homogeneous transversely isotropic half-space with a vertical alveole, the analytical solution of S.G. Lekhnitsky, well known in scientific papers, is used. The author gives expressions for the stress components and displacements in the soil mass for different boundary conditions on the alveole surface. Such research problems arise in the construction and maintenance of buildings and in the use of composite materials.

  9. Measuring mixing efficiency in experiments of strongly stratified turbulence

    Science.gov (United States)

    Augier, P.; Campagne, A.; Valran, T.; Calpe Linares, M.; Mohanan, A. V.; Micard, D.; Viboud, S.; Segalini, A.; Mordant, N.; Sommeria, J.; Lindborg, E.

    2017-12-01

    Oceanic and atmospheric models need better parameterization of the mixing efficiency. Therefore, we need to measure this quantity for flows representative of geophysical flows, both in terms of types of flows (with vortices and/or waves) and of dynamical regimes. In order to reach sufficiently large Reynolds numbers for strongly stratified flows, experiments in which salt is used to produce the stratification have to be carried out on a large rotating platform of at least 10-meter diameter. We present new experiments done in summer 2017 to study experimentally strongly stratified turbulence and mixing efficiency in the Coriolis platform. The flow is forced by a slow periodic movement of an array of large vertical or horizontal cylinders. The velocity field is measured by 3D-2C scanned horizontal particle image velocimetry (PIV) and 2D vertical PIV. Six density-temperature probes are used to measure vertical and horizontal profiles and signals at fixed positions. We will show how we rely heavily on open-science methods for this study. Our new results on the mixing efficiency will be presented and discussed in terms of mixing parameterization.

  10. Optimal energy growth in a stably stratified shear flow

    Science.gov (United States)

    Jose, Sharath; Roy, Anubhab; Bale, Rahul; Iyer, Krithika; Govindarajan, Rama

    2018-02-01

    Transient growth of perturbations by a linear non-modal evolution is studied here in a stably stratified bounded Couette flow. The density stratification is linear. Classical inviscid stability theory states that a parallel shear flow is stable to exponentially growing disturbances if the Richardson number (Ri) is greater than 1/4 everywhere in the flow. Experiments and numerical simulations at higher Ri show however that algebraically growing disturbances can lead to transient amplification. The complexity of a stably stratified shear flow stems from its ability to combine this transient amplification with propagating internal gravity waves (IGWs). The optimal perturbations associated with maximum energy amplification are numerically obtained at intermediate Reynolds numbers. It is shown that in this wall-bounded flow, the three-dimensional optimal perturbations are oblique, unlike in unstratified flow. A partitioning of energy into kinetic and potential helps in understanding the exchange of energies and how it modifies the transient growth. We show that the apportionment between potential and kinetic energy depends, in an interesting manner, on the Richardson number, and on time, as the transient growth proceeds from an optimal perturbation. The oft-quoted stabilizing role of stratification is also probed in the non-diffusive limit in the context of disturbance energy amplification.

  11. What Are Probability Surveys used by the National Aquatic Resource Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  12. Experimental investigation and physical description of stratified flow in horizontal channels

    International Nuclear Information System (INIS)

    Staebler, T.

    2007-05-01

    The interaction between a liquid film and turbulent gas flows plays an important role in many technical applications (e.g. in hydraulic engineering, process engineering and nuclear engineering). The local kinematic and turbulent time-averaged flow quantities for counter-current stratified flows (supercritical and subcritical flows with and without flow reversal) have been measured for the first time. To this end, the method of Particle Image Velocimetry was applied. By using fluorescent particles in combination with an optical filter it was possible to determine the flow quantities of the liquid phase up to the free surface. Additionally, the gaseous phase was investigated by using the light scattered from conventional particles. With a further measurement technique the void fraction distribution along the channel height has been determined. For this purpose, a single-tip conductivity probe was developed. Furthermore, water delivery rates and pressure losses along the test section were measured over a wide range of parameters. The measurements also revealed new details on the hysteresis effect after the occurrence of flow reversal. The experimental findings were used to develop and validate a statistical model in which the liquid phase is considered to be an agglomeration of interacting particles. The statistical consideration of the particle interactions delivers a differential equation which can be used to predict the local void fraction distribution with the local turbulent kinetic energies of the liquid phase. Beyond that, an additional statistical description is presented in which the probability density functions of the local void fraction are described by beta-functions. Both theoretical approaches can be used for numerical modelling, whereas the statistical model can be used to describe the phase interactions and the statistical description to describe the turbulent fluctuations of the local void fraction. Thus, this work has made available all necessary
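
    The beta-function description of the local void fraction mentioned above can be illustrated with a method-of-moments fit. The sketch below uses synthetic samples in place of the conductivity-probe data; the distribution parameters and sample size are invented for the example.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    # Synthetic local void-fraction samples in [0, 1]; in the experiment
    # these would come from the conductivity-probe signal at one height.
    samples = rng.beta(2.0, 5.0, 5000)

    # Method-of-moments fit of a beta density to the samples.
    m, v = samples.mean(), samples.var()
    common = m * (1.0 - m) / v - 1.0
    a_hat, b_hat = m * common, (1.0 - m) * common
    print(f"fitted beta parameters: a = {a_hat:.2f}, b = {b_hat:.2f}")

    # Kolmogorov-Smirnov check of the fitted density.
    ks = stats.kstest(samples, "beta", args=(a_hat, b_hat))
    print(f"KS statistic = {ks.statistic:.4f}")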

  13. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  14. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass stays below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of a vibration isolation system exceeding the vibration criteria is evaluated. Optimal system parameters, damping and natural frequency, are derived such that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
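
    Under the Gaussian assumption stated above, the exceedance probability depends on the system parameters only through the standard deviation of the relative displacement. The sketch below scans a toy response model over damping and natural frequency; the sigma model and the criterion value are invented stand-ins, not the article's formulas.

    import numpy as np
    from scipy.stats import norm

    def exceedance_probability(sigma, x_crit):
        """Chance a zero-mean Gaussian displacement exceeds +/- x_crit."""
        return 2.0 * (1.0 - norm.cdf(x_crit / sigma))

    x_crit = 3.0                           # vibration criterion, micrometres
    zetas = np.linspace(0.05, 0.5, 10)     # damping ratios
    freqs = np.linspace(0.5, 4.0, 10)      # natural frequencies, Hz

    best = None
    for zeta in zetas:
        for f_n in freqs:
            # Toy response model: sigma shrinks with damping and frequency.
            sigma = 1.0 / np.sqrt(zeta * f_n)
            p = exceedance_probability(sigma, x_crit)
            if best is None or p < best[0]:
                best = (p, zeta, f_n)

    p, zeta, f_n = best
    print(f"lowest exceedance probability {p:.4f} "
          f"at zeta = {zeta:.2f}, f_n = {f_n:.2f} Hz")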

  15. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  16. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work underway to address these questions looks very promising.

  17. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  18. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  19. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation, and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.
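
    A probability table represents a rapidly fluctuating cross-section by a few (weight, value) pairs, so integrals over the fine structure collapse to short quadrature sums. The sketch below illustrates the idea with an invented three-point table and a simple 1/(sigma + sigma_0) flux weighting of the kind used in sub-group self-shielding; it is a schematic of the principle, not CALENDF's algorithm.

    import numpy as np

    # Invented three-point probability table: weights and cross-section
    # values standing in for a real resonance-range table.
    weights = np.array([0.2, 0.5, 0.3])
    sigmas = np.array([1.0, 10.0, 100.0])   # barns

    # Unshielded average and a self-shielded (flux-weighted) average with
    # a 1 / (sigma + sigma_0) weighting, as in sub-group methods.
    sigma_0 = 50.0
    mean_sigma = np.sum(weights * sigmas)
    flux = weights / (sigmas + sigma_0)
    shielded = np.sum(flux * sigmas) / np.sum(flux)
    print(f"unshielded average:    {mean_sigma:.2f} b")
    print(f"self-shielded average: {shielded:.2f} b")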

  20. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  1. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  2. Reflection and transmission of electromagnetic waves in planarly stratified media

    International Nuclear Information System (INIS)

    Caviglia, G.

    1999-01-01

    Propagation of time-harmonic electromagnetic waves in planarly stratified multilayers is investigated. Each layer is allowed to be inhomogeneous and the layers are separated by interfaces. The procedure is based on the representation of the electromagnetic field in the basis of the eigenvectors of the matrix characterizing the first-order system. Hence the local reflection and transmission matrices are defined and the corresponding differential equations in the pertinent space variable are determined. The jump conditions at interfaces are also established. The present model incorporates dissipative materials and the procedure holds without any restrictions on material symmetries. Differential equations that have appeared in the literature are shown to hold in particular (one-dimensional) cases or to represent homogeneous layers only.

  3. Microstructure of Turbulence in the Stably Stratified Boundary Layer

    Science.gov (United States)

    Sorbjan, Zbigniew; Balsley, Ben B.

    2008-11-01

    The microstructure of a stably stratified boundary layer, with a significant low-level nocturnal jet, is investigated based on observations from the CASES-99 campaign in Kansas, U.S.A. The reported high-resolution vertical profiles of temperature, wind speed, wind direction, pressure, and turbulent dissipation rate were collected under nocturnal conditions on October 14, 1999, using the CIRES Tethered Lifting System. Two methods for evaluating instantaneous (1-s) background profiles are applied to the raw data. The background potential temperature is calculated using the “bubble sort” algorithm to produce a monotonically increasing potential temperature with increasing height. Other scalar quantities are smoothed using a running vertical average. The behaviour of the background flow, buoyant overturns, turbulent fluctuations, and their respective histograms are presented. Ratios of the considered length scales and the Ozmidov scale are nearly constant with height, a fact that can be applied in practice for estimating instantaneous profiles of the dissipation rate.
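
    The “bubble sort” background profile mentioned above amounts to a monotonic rearrangement of the measured potential-temperature profile (often called Thorpe sorting). A minimal sketch on synthetic data follows; the noisy profile and its parameters are invented for illustration, not taken from the CASES-99 measurements.

        # Monotonic ("bubble sort") background profile and Thorpe displacements.
        import numpy as np

        z = np.linspace(0.0, 300.0, 301)                           # height (m), synthetic
        rng = np.random.default_rng(1)
        theta = 290.0 + 0.01 * z + rng.normal(0.0, 0.05, z.size)   # noisy stable profile

        order = np.argsort(theta, kind="stable")                   # sorted arrangement
        theta_bg = theta[order]                                    # monotonic background
        thorpe = z[order] - z                                      # parcel displacements

        # RMS Thorpe displacement is one common overturn-scale diagnostic
        print("rms Thorpe displacement:", np.sqrt(np.mean(thorpe ** 2)), "m")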

  4. Hydrodynamics of stratified epithelium: Steady state and linearized dynamics

    Science.gov (United States)

    Yeh, Wei-Ting; Chen, Hsuan-Yi

    2016-05-01

    A theoretical model for stratified epithelium is presented. The viscoelastic properties of the tissue are assumed to be dependent on the spatial distribution of proliferative and differentiated cells. Based on this assumption, a hydrodynamic description of tissue dynamics at the long-wavelength, long-time limit is developed, and the analysis reveals important insights into the dynamics of an epithelium close to its steady state. When the proliferative cells occupy a thin region close to the basal membrane, the relaxation rate towards the steady state is enhanced by cell division and cell apoptosis. On the other hand, when the region where proliferative cells reside becomes sufficiently thick, a flow induced by cell apoptosis close to the apical surface enhances small perturbations. This destabilizing mechanism is general for continuous self-renewal multilayered tissues; it could be related to the origin of certain tissue morphology, tumor growth, and the development pattern.

  5. A study of stratified gas-liquid pipe flow

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, George W.

    2005-07-01

    This work includes both theoretical modelling and experimental observations which are relevant to the design of gas condensate transport lines. Multicomponent hydrocarbon gas mixtures are transported in pipes over long distances and at various inclinations. Under certain circumstances, the heavier hydrocarbon components and/or water vapour condense to form one or more liquid phases. Near the desired capacity, the liquid condensate and water are efficiently transported in the form of a stratified flow with a droplet field. During operating conditions, however, the flow rate may be reduced, allowing liquid accumulation which can create serious operational problems due to large amounts of excess liquid being expelled into the receiving facilities during production ramp-up, or even in steady production in severe cases. In particular, liquid tends to accumulate in upward-inclined sections due to insufficient drag on the liquid from the gas. To optimize the transport of gas condensates, pipe diameters should be carefully chosen to account for varying flow rates and pressure levels, which are determined through knowledge of the multiphase flow present. It is desirable to have a reliable numerical simulation tool to predict liquid accumulation for various flow rates, pipe diameters and pressure levels, which is not presently accounted for by industrial flow codes. A critical feature of the simulation code would be the ability to predict the transition from small liquid accumulation at high flow rates to large liquid accumulation at low flow rates. A semi-intermittent flow regime of roll waves alternating with a partly backward-flowing liquid film has been observed experimentally to occur for a range of gas flow rates. Most of the liquid is transported in the roll waves. The roll wave regime is not well understood and requires fundamental modelling and experimental research. The lack of reliable models for this regime leads to inaccurate prediction of the onset of…

  6. Hydromagnetic stability of rotating stratified compressible fluid flows

    Energy Technology Data Exchange (ETDEWEB)

    Srinivasan, V; Kandaswamy, P [Dept. of Mathematics, Bharathiar University, Coimbatore, Tamil Nadu, India; Debnath, L [Dept. of Mathematics, University of Central Florida, Orlando, USA

    1984-09-01

    The hydromagnetic stability of a radially stratified compressible fluid rotating between two coaxial cylinders is investigated. The stability with respect to axisymmetric disturbances is examined. The fluid system is found to be thoroughly stable to axisymmetric disturbances provided the fluid rotates very rapidly. The system is shown to be unstable to non-axisymmetric disturbances, and the slow amplifying hydromagnetic wave modes propagate against the basic rotation. The lower and upper bounds of the azimuthal phase speeds of the amplifying waves are determined. A quadrant theorem on the slow waves characteristic of a rapidly rotating fluid is derived. Special attention is given to the effects of compressibility of the fluid. Some results concerning the stability of an incompressible fluid system are obtained as special cases of the present analysis.

  7. Direct numerical simulation of homogeneous stratified rotating turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Iida, O.; Tsujimura, S.; Nagano, Y. [Nagoya Institute of Technology, Department of Mech. Eng., Nagoya (Japan)

    2005-12-01

    The effects of the Prandtl number on stratified rotating turbulence have been studied in homogeneous turbulence by using direct numerical simulations and a rapid distortion theory. Fluctuations under strong stable-density stratification can be theoretically divided into the WAVE and the potential vorticity (PV) modes. In low-Prandtl-number fluids, the WAVE mode deteriorates, while the PV mode remains. Imposing rotation on a low-Prandtl-number fluid makes turbulence two-dimensional as well as geostrophic; it is found from the instantaneous turbulent structure that the vortices merge to form a few vertically-elongated vortex columns. During the period toward two-dimensionalization, the vertical vortices become asymmetric in the sense of rotation. (orig.)

  8. Advanced stratified charge rotary aircraft engine design study

    Science.gov (United States)

    Badgley, P.; Berkowitz, M.; Jones, C.; Myers, D.; Norwood, E.; Pratt, W. B.; Ellis, D. R.; Huggins, G.; Mueller, A.; Hembrey, J. H.

    1982-01-01

    A technology base of new developments which offered potential benefits to a general aviation engine was compiled and ranked. Using design approaches selected from the ranked list, conceptual design studies were performed of an advanced and a highly advanced engine sized to provide 186/250 shaft kW/HP under cruise conditions at 7620/25,000 m/ft altitude. These are turbocharged, direct-injected stratified charge engines intended for commercial introduction in the early 1990s. The engine descriptive data includes tables, curves, and drawings depicting configuration, performance, weights and sizes, heat rejection, ignition and fuel injection system descriptions, maintenance requirements, and scaling data for varying power. An engine-airframe integration study of the resulting engines in advanced airframes was performed on a comparative basis with current production type engines. The results show airplane performance, costs, noise, and installation factors. The rotary-engined airplanes display substantial improvements over the baseline, including 30 to 35% lower fuel usage.

  9. Internal combustion engine using premixed combustion of stratified charges

    Science.gov (United States)

    Marriott, Craig D [Rochester Hills, MI; Reitz, Rolf D [Madison, WI

    2003-12-30

    During a combustion cycle, a first stoichiometrically lean fuel charge is injected well prior to top dead center, preferably during the intake stroke. This first fuel charge is substantially mixed with the combustion chamber air during subsequent motion of the piston towards top dead center. A subsequent fuel charge is then injected prior to top dead center to create a stratified, locally richer mixture (but still leaner than stoichiometric) within the combustion chamber. The locally rich region within the combustion chamber has sufficient fuel density to autoignite, and its self-ignition serves to activate ignition for the lean mixture existing within the remainder of the combustion chamber. Because the mixture within the combustion chamber is overall premixed and relatively lean, NOx and soot production are significantly diminished.

  10. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
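
    The sample-reuse idea is easy to demonstrate: because the POD enters the failure probability as a smooth weight, its parameter derivative can be averaged over the same Monte Carlo samples used for the POF itself. The crack-size model, lognormal POD form, and all parameter values below are illustrative assumptions, not the paper's fracture-mechanics formulation.

        # POF and its POD-parameter sensitivity from one set of MC samples.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        N = 1_000_000
        a = rng.lognormal(mean=0.0, sigma=0.5, size=N)   # crack size at inspection
        a_crit = 2.0                                     # failure threshold
        mu, sig = 0.3, 0.4                               # POD curve parameters

        u = (np.log(a) - mu) / sig
        pod = norm.cdf(u)                                # probability of detection
        fail = a > a_crit
        pof = np.mean(fail * (1.0 - pod))                # undetected parts can fail

        # d(POF)/d(mu): differentiate the POD weight, reuse the same samples
        dpof_dmu = np.mean(fail * norm.pdf(u) / sig)

        # Finite-difference check (needs an extra perturbed evaluation)
        h = 1e-4
        pod_h = norm.cdf((np.log(a) - (mu + h)) / sig)
        fd = (np.mean(fail * (1.0 - pod_h)) - pof) / h
        print(pof, dpof_dmu, fd)                         # sensitivity estimates agree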

  11. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  12. Visualization periodic flows in a continuously stratified fluid.

    Science.gov (United States)

    Bardakov, R.; Vasiliev, A.

    2012-04-01

    To visualize the flow pattern of a viscous continuously stratified fluid, both experimental and computational methods were developed. Computational procedures were based on exact solutions of the set of fundamental equations. Solutions of the problems of flows produced by a periodically oscillating disk (linear and torsion oscillations) were visualized at high resolution to distinguish the small-scale singular components against the background of strong internal waves. The numerical visualization algorithm can represent both scalar and vector fields, such as velocity, density, pressure, vorticity, and stream function. The effects of source size, buoyancy frequency, oscillation frequency, and kinematic viscosity of the medium were traced in 2D and 3D problem formulations. A precision schlieren instrument was used to visualize the flow pattern produced by linear and torsion oscillations of a strip and a disk in a continuously stratified fluid. Uniform stratification was created by the continuous displacement method. The buoyancy period ranged from 7.5 to 14 s. In the experiments, disks with diameters from 9 to 30 cm and a thickness of 1 mm to 10 mm were used. Three schlieren methods were used: conventional vertical slit with Foucault knife; vertical slit with filament (Maksoutov's method); and horizontal slit with horizontal grating (the natural "rainbow" schlieren method). They produce complementary flow patterns. Both internal wave beams and fine flow components were visualized near and far from the source. The intensity of high-gradient envelopes increased in proportion to the amplitude of the source. In regions where envelopes converge, isolated small-scale vortices and extended mushroom-like jets were formed. Experiments have shown that in the case of torsion oscillations the pattern of currents is more complicated than in the case of forced linear oscillations. Comparison with a known theoretical model shows that nonlinear interactions between the regular and singular flow components must be taken into account.

  13. Longevity of Compositionally Stratified Layers in Ice Giants

    Science.gov (United States)

    Friedson, A. J.

    2017-12-01

    In the hydrogen-rich atmospheres of gas giants, a decrease with radius in the mixing ratio of a heavy species (e.g. He, CH4, H2O) has the potential to produce a density stratification that is convectively stable if the heavy species is sufficiently abundant. Formation of stable layers in the interiors of these planets has important implications for their internal structure, chemical mixing, dynamics, and thermal evolution, since vertical transport of heat and constituents in such layers is greatly reduced in comparison to that in convecting layers. Various processes have been suggested for creating compositionally stratified layers. In the interiors of Jupiter and Saturn, these include phase separation of He from metallic hydrogen and dissolution of dense core material into the surrounding metallic-H envelope. Condensation of methane and water has been proposed as a mechanism for producing stable zones in the atmospheres of Saturn and the ice giants. However, if a stably stratified layer is formed adjacent to an active region of convection, it may be susceptible to progressive erosion as the convection intrudes and entrains fluid into the unstable envelope. We discuss the principal factors that control the rate of entrainment and associated erosion and present a specific example concerning the longevity of stable layers formed by condensation of methane and water in Uranus and Neptune. We also consider whether the temporal variability of such layers may engender episodic behavior in the release of the internal heat of these planets. This research is supported by a grant from the NASA Solar System Workings Program.

  14. Investigations on flow reversal in stratified horizontal flow

    International Nuclear Information System (INIS)

    Staebler, T.; Meyer, L.; Schulenberg, T.; Laurien, E.

    2005-01-01

    The phenomena of flow reversal in stratified flows are investigated in a horizontal channel with application to the Emergency Core Cooling System (ECCS) in Pressurized Water Reactors (PWR). In case of a Loss-of-Coolant-Accident (LOCA), coolant can be injected through a secondary pipe within the feeding line of the primary circuit, the so-called hot leg, counter-currently to the steam flow. It is essential that the coolant reaches the reactor core to prevent overheating. Due to high temperatures in such accident scenarios, steam is generated in the core, which escapes from the reactor vessel through the hot leg. In case of sufficiently high steam flow rates, only a reduced amount of coolant or even no coolant will be delivered to the reactor core. The WENKA test facility at the Institute for Nuclear and Energy Technologies (IKET) at Forschungszentrum Karlsruhe is capable of investigating the fluid dynamics of two-phase flows in such scenarios. Water and air flow counter-currently in a horizontal channel made of clear acrylic glass to allow full optical access. Flow rates of water and air can be varied independently within a wide range. Once flow reversal sets in, a strong hysteresis effect must be taken into account; this was quantified during the present investigations. Local experimental data are needed to expand appropriate models of flow reversal in horizontal two-phase flow and to include them in numerical codes. Investigations are carried out by means of Particle Image Velocimetry (PIV) to obtain local flow velocities without disturbing the flow. Due to the wavy character of the flow, strong reflections at the interfacial area must be taken into account. Using fluorescent particles and an optical filter allows eliminating the reflections and recording only the signals of the particles. The challenges in conducting local investigations in stratified wavy flows by applying optical measurement techniques are discussed. Results are presented and discussed, allowing…

  15. Stratified flow model for convective condensation in an inclined tube

    International Nuclear Information System (INIS)

    Lips, Stéphane; Meyer, Josua P.

    2012-01-01

    Highlights: ► Convective condensation in an inclined tube is modelled. ► The heat transfer coefficient is highest at about 20° below the horizontal. ► Capillary forces have a strong effect on the liquid–vapour interface shape. ► A good agreement between the model and the experimental results was observed. - Abstract: Experimental data are reported for condensation of R134a in an 8.38 mm inner diameter smooth tube in inclined orientations with a mass flux of 200 kg/m²s. Under these conditions, the flow is stratified and there is an optimum inclination angle, which leads to the highest heat transfer coefficient. There is a need for a model to better understand and predict the flow behaviour. In this paper, the state of the art of existing models of stratified two-phase flows in inclined tubes is presented, whereafter a new mechanistic model is proposed. The liquid–vapour distribution in the tube is determined by taking into account the gravitational and the capillary forces. The comparison between the experimental data and the model prediction showed a good agreement in terms of heat transfer coefficients and pressure drops. The effect of the interface curvature on the heat transfer coefficient has been quantified and has been found to be significant. The optimum inclination angle is due to a balance between an increase of the void fraction and an increase in the falling liquid film thickness when the tube is inclined downwards. The effect of the mass flux and the vapour quality on the optimum inclination angle has also been studied.

  16. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
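
    A standard instance of the kind of basic sampling algorithm mentioned above is inverse-transform sampling; the exponential example below is a generic illustration, not necessarily the algorithm printed in the book.

        # Inverse-transform sampling: if U ~ Uniform(0,1) and F is an
        # invertible CDF, then F^{-1}(U) is distributed according to F.
        import numpy as np

        def sample_exponential(rate, size, rng):
            u = rng.uniform(size=size)
            return -np.log1p(-u) / rate          # F^{-1}(u) = -ln(1 - u) / rate

        rng = np.random.default_rng(42)
        x = sample_exponential(rate=2.0, size=100_000, rng=rng)
        print(x.mean())                          # close to 1 / rate = 0.5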

  17. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    Science.gov (United States)

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  18. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
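
    One hedged way to realize such a decreasing forecast: treat the X-ray-peak-to-SEP-onset delay as a random variable with CDF F(t); by Bayes' rule the probability that an event is still coming, given none has started by elapsed time t, is P0 * (1 - F(t)) / (1 - P0 * F(t)). The lognormal delay distribution below is an assumed stand-in, not the NOAA-derived longitude-dependent delays used in the paper.

        # Decaying SEP event probability given no onset by elapsed time t.
        import numpy as np
        from scipy.stats import lognorm

        delay = lognorm(s=1.0, scale=6.0)            # assumed onset-delay CDF (hours)

        def dynamic_probability(p0, t_hours):
            F = delay.cdf(t_hours)                   # events that would have begun by t
            return p0 * (1.0 - F) / (1.0 - p0 * F)   # P(event | no onset yet)

        for t in (0, 6, 12, 24, 48):
            print(t, round(dynamic_probability(0.4, t), 3))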

  19. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  20. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  1. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  2. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance, and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest known earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites, an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  3. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  4. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
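
    In the spirit of the abstract (though on simulated data, with illustrative settings), a random forest can be used directly as a probability machine: fit it on a binary outcome, read off conditional probabilities, and form a counterfactual risk difference as the effect-size estimate.

        # Random forest as a conditional-probability ("risk") machine.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 5000
        x1 = rng.integers(0, 2, n)                   # binary exposure
        x2 = rng.normal(size=n)                      # continuous covariate
        logit = -1.0 + 1.2 * x1 + 0.8 * x2
        y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([x1, x2])
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=25,
                                    random_state=0).fit(X, y)

        # Counterfactual risk difference: set x1 = 1 vs x1 = 0 for everyone
        X1, X0 = X.copy(), X.copy()
        X1[:, 0], X0[:, 0] = 1, 0
        rd = rf.predict_proba(X1)[:, 1].mean() - rf.predict_proba(X0)[:, 1].mean()
        print("estimated risk difference:", round(rd, 3))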

  5. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  6. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  7. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
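
    As one concrete member of the family of transformations being compared (the ratio-scale transformation is assumed here purely for illustration; the paper compares several):

        # Ratio-scale probability-to-possibility transformation and a simple
        # normalization back; for this pair the round trip recovers p exactly.
        import numpy as np

        def prob_to_poss(p):
            p = np.asarray(p, dtype=float)
            return p / p.max()                # pi_i = p_i / max_j p_j

        def poss_to_prob(pi):
            pi = np.asarray(pi, dtype=float)
            return pi / pi.sum()              # renormalize to a probability vector

        p = np.array([0.5, 0.3, 0.2])
        pi = prob_to_poss(p)
        print(pi)                             # [1.0, 0.6, 0.4]
        print(poss_to_prob(pi))               # [0.5, 0.3, 0.2]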

  8. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while large differences appear in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward…
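
    The geometric ingredient of the third (MOID-based) method can be sketched with a brute-force minimum distance between two Keplerian ellipses; the orbital elements below are arbitrary illustrative values, and a production code would refine the coarse grid minimum numerically.

        # Toy brute-force MOID between two Keplerian orbits.
        import numpy as np

        def orbit_points(a, e, i, Omega, omega, n=1000):
            nu = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)   # true anomaly
            r = a * (1.0 - e ** 2) / (1.0 + e * np.cos(nu))
            x, y = r * np.cos(nu + omega), r * np.sin(nu + omega)   # in orbit plane
            X = x * np.cos(Omega) - y * np.cos(i) * np.sin(Omega)   # rotate by node
            Y = x * np.sin(Omega) + y * np.cos(i) * np.cos(Omega)   # and inclination
            Z = y * np.sin(i)
            return np.column_stack([X, Y, Z])

        A = orbit_points(1.0, 0.05, np.radians(2.0), 0.3, 1.0)      # "target planet"
        B = orbit_points(2.2, 0.60, np.radians(12.0), 1.1, 2.5)     # "projectile"
        d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
        print("approximate MOID (au):", d.min())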

  9. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  10. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  11. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  12. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  13. Mobilifilum chasei: morphology and ecology of a spirochete from an intertidal stratified microbial mat community

    Science.gov (United States)

    Margulis, L.; Hinkle, G.; Stolz, J.; Craft, F.; Esteve, I.; Guerrero, R.

    1990-01-01

    Spirochetes were found in the lower anoxiphototrophic layer of a stratified microbial mat (North Pond, Laguna Figueroa, Baja California, Mexico). Ultra-structural analysis of thin sections of field samples revealed spirochetes approximately 0.25 micrometer in diameter with 10 or more periplasmic flagella, leading to the interpretation that these spirochetes bear 10 flagellar insertions on each end. Morphometric study showed these free-living spirochetes greatly resemble certain symbiotic ones, i.e., Borrelia and certain termite spirochetes, the transverse sections of which are presented here. The ultrastructure of this spirochete also resembles Hollandina and Diplocalyx (spirochetes symbiotic in arthropods) more than it does Spirochaeta, the well known genus of mud-dwelling spirochetes. The new spirochete was detected in mat material collected both in 1985 and in 1987. Unique morphology (i.e., conspicuous outer coat of inner membrane, large number of periplasmic flagella) and ecology prompt us to name a new free-living spirochete.

  14. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    ...accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation.

  15. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  16. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  17. Systematic sampling for suspended sediment

    Science.gov (United States)

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...
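
    A minimal probability-sampling sketch in the sense described: systematic selection with a random start gives every unit a known inclusion probability n/N, so totals can be estimated by expansion. The synthetic sediment loads are illustrative, not field data.

        # Systematic sample with a random start; expansion estimate of a total.
        import numpy as np

        rng = np.random.default_rng(7)
        N, n = 365, 73                       # population size, sample size
        k = N // n                           # sampling interval (here 5)
        start = rng.integers(0, k)           # random start makes it a probability sample
        idx = np.arange(start, N, k)[:n]

        y = rng.gamma(2.0, 50.0, N)          # synthetic daily sediment loads
        total_hat = (N / n) * y[idx].sum()   # expansion estimator of the annual total
        print(total_hat, y.sum())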

  18. Stratified flows with variable density: mathematical modelling and numerical challenges.

    Science.gov (United States)

    Murillo, Javier; Navas-Montilla, Adrian

    2017-04-01

    Stratified flows appear in a wide variety of fundamental problems in the hydrological and geophysical sciences. They range from hyperconcentrated floods carrying sediment and causing collapse, landslides, and debris flows, to suspended material in turbidity currents where turbulence is a key process. Stratified flows also exhibit variable horizontal density. Depending on the case, density varies according to the volumetric concentration of different components or species, which can represent transported or suspended materials or soluble substances. Multilayer approaches based on the shallow water equations provide suitable models but are not free from difficulties when moving to the numerical resolution of the governing equations. Considering the variety of temporal and spatial scales, transfer of mass and energy among layers may strongly differ from one case to another. As a consequence, in order to provide accurate solutions, very high order methods of proved quality are demanded. Under these complex scenarios it is necessary to verify not only that the numerical solution provides the expected order of accuracy but also that it converges to the physically based solution, which is not an easy task. To this purpose, this work focuses on the use of energy-balanced augmented solvers, in particular the Augmented Roe Flux ADER scheme. References: J. Murillo, P. García-Navarro, Wave Riemann description of friction terms in unsteady shallow flows: Application to water and mud/debris floods. J. Comput. Phys. 231 (2012) 1963-2001. J. Murillo, B. Latorre, P. García-Navarro, A Riemann solver for unsteady computation of 2D shallow flows with variable density. J. Comput. Phys. 231 (2012) 4775-4807. A. Navas-Montilla, J. Murillo, Energy balanced numerical schemes with very high order. The Augmented Roe Flux ADER scheme. Application to the shallow water equations, J. Comput. Phys. 290 (2015) 188-218. A. Navas-Montilla, J. Murillo, Asymptotically and exactly energy balanced augmented flux…

  19. Deep silicon maxima in the stratified oligotrophic Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Y. Crombet

    2011-02-01

    Full Text Available The silicon biogeochemical cycle has been studied in the Mediterranean Sea during late summer/early autumn 1999 and summer 2008. The distribution of nutrients, particulate carbon and silicon, fucoxanthin (Fuco), and total chlorophyll-a (TChl-a) was investigated along an eastward gradient of oligotrophy during two cruises (PROSOPE and BOUM) encompassing the entire Mediterranean Sea during the stratified period. At both seasons, surface waters were depleted in nutrients, and the nutriclines gradually deepened towards the east, the phosphacline being deepest in the easternmost Levantine basin. Following the nutriclines, parallel deep maxima of biogenic silica (DSM), fucoxanthin (DFM), and TChl-a (DCM) were evidenced during both seasons, with maximal concentrations of 0.45 μmol L⁻¹ for BSi, 0.26 μg L⁻¹ for Fuco, and 1.70 μg L⁻¹ for TChl-a, all measured during summer. Contrary to the DCM, which was a persistent feature in the Mediterranean Sea, the DSM and DFMs were observed in discrete areas of the Alboran Sea, the Algero-Provencal basin, the Ionian Sea, and the Levantine basin, indicating that diatoms were able to grow at depth and dominate the DCM under specific conditions. Diatom assemblages were dominated by Chaetoceros spp., Leptocylindrus spp., and Pseudonitzschia spp., and the association between large centric diatoms (Hemiaulus hauckii and Rhizosolenia styliformis) and the cyanobacterium Richelia intracellularis was observed at nearly all sites. The diatoms' ability to grow at depth is commonly observed in other oligotrophic regions and could play a major role in ecosystem productivity and carbon export to depth. Contrary to the common view that Si and siliceous phytoplankton are not major components of Mediterranean biogeochemistry, we suggest here that diatoms, by persisting at depth during the stratified period, could contribute to a…

  20. Prevalence of masturbation and associated factors in a British national probability survey.

    Science.gov (United States)

    Gerressu, Makeda; Mercer, Catherine H; Graham, Cynthia A; Wellings, Kaye; Johnson, Anne M

    2008-04-01

    A stratified probability sample survey of the British general population, aged 16 to 44 years, was conducted from 1999 to 2001 (N = 11,161) using face-to-face interviewing and computer-assisted self-interviewing. We used these data to estimate the population prevalence of masturbation, and to identify sociodemographic, sexual behavioral, and attitudinal factors associated with reporting this behavior. Seventy-three percent of men and 36.8% of women reported masturbating in the 4 weeks prior to interview (95% confidence interval 71.5%-74.4% and 35.4%-38.2%, respectively). A number of sociodemographic and behavioral factors were associated with reporting masturbation. Among both men and women, reporting masturbation increased with higher levels of education and social class and was more common among those reporting sexual function problems. For women, masturbation was more likely among those who reported more frequent vaginal sex in the last four weeks, a greater repertoire of sexual activity (such as reporting oral and anal sex), and more sexual partners in the last year. In contrast, the prevalence of masturbation was lower among men reporting more frequent vaginal sex. Both men and women reporting same-sex partner(s) were significantly more likely to report masturbation. Masturbation is a common sexual practice with significant variations in reporting between men and women.

  1. Geospatial techniques for developing a sampling frame of watersheds across a region

    Science.gov (United States)

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was in relation to the population of watersheds. Geospatial tools provided a relatively inexpensive means to generate the information necessary to develop a statistically robust, probability-based sampling design.
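
    The design step described above (strata formed by ecoregion crossed with lithology, proportional allocation of 60 watersheds) can be sketched as follows; the frame here is a synthetic stand-in for the actual watershed database, and the stratum labels are illustrative.

        # Stratified random sample with proportional allocation.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(3)
        frame = pd.DataFrame({
            "watershed_id": np.arange(600),
            "ecoregion": rng.choice(["Coast Range", "Klamath", "Cascades"], 600),
            "lithology": rng.choice(["sedimentary", "igneous"], 600),
        })

        n_total = 60
        sizes = frame.groupby(["ecoregion", "lithology"]).size()
        alloc = (sizes / len(frame) * n_total).round().astype(int)  # may shift total by 1

        sample = pd.concat(
            frame[(frame.ecoregion == eco) & (frame.lithology == lith)]
                 .sample(n=n_k, random_state=0)
            for (eco, lith), n_k in alloc.items()
        )
        print(alloc, len(sample), sep="\n")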

  2. Plume Splitting in a Two-layer Stratified Ambient Fluid

    Science.gov (United States)

    Ma, Yongxing; Flynn, Morris; Sutherland, Bruce

    2017-11-01

    A line-source plume descending into a two-layer stratified ambient fluid in a finite sized tank is studied experimentally. Although the total volume of ambient fluid is fixed, lower- and upper-layer fluids are respectively removed and added at a constant rate mimicking marine outfall through diffusers and natural and hybrid ventilated buildings. The influence of the plume on the ambient depends on the value of λ, defined as the ratio of the plume buoyancy to the buoyancy loss of the plume as it crosses the ambient interface. Similar to classical filling-box experiments, the plume can always reach the bottom of the tank if λ > 1 . By contrast, if λ < 1 , an intermediate layer eventually forms as a result of plume splitting. Eventually all of the plume fluid spreads within the intermediate layer. The starting time, tv, and the ending time, tt, of the transition process measured from experiments correlate with the value of λ. A three-layer ambient fluid is observed after transition, and the mean value of the measured densities of the intermediate layer fluid is well predicted using plume theory. Acknowledgments: Funding for this study was provided by NSERC.

  3. Economic evaluation in stratified medicine: methodological issues and challenges

    Directory of Open Access Journals (Sweden)

    Hans-Joerg Fugel

    2016-05-01

    Full Text Available Background: Stratified Medicine (SM) is becoming a practical reality with the targeting of medicines by using a biomarker- or genetic-based diagnostic to identify the eligible patient sub-population. Like any healthcare intervention, SM interventions have costs and consequences that must be considered by reimbursement authorities with limited resources. Methodological standards and guidelines exist for economic evaluations in clinical pharmacology and are an important component of health technology assessments (HTAs) in many countries. However, these guidelines were initially developed for traditional pharmaceuticals and not for complex interventions with multiple components. This raises the issue of whether these guidelines are adequate for SM interventions or whether new specific guidance and methodology are needed to avoid inconsistencies and contradictory findings when assessing economic value in SM. Objective: This article describes specific methodological challenges when conducting health economic (HE) evaluations for SM interventions and outlines potential modifications to existing evaluation guidelines/principles that would promote consistent economic evaluations for SM. Results/Conclusions: Specific methodological aspects for SM comprise considerations on the choice of comparator, measuring effectiveness and outcomes, appropriate modelling structure, and the scope of sensitivity analyses. Although current HE methodology can be applied to SM, greater complexity requires further methodology development and modifications to the guidelines.

  4. Operations and Maintenance Cost for Stratified Buildings: A Critical Review

    Directory of Open Access Journals (Sweden)

    Che-Ghani Nor Zaimah

    2016-01-01

    Full Text Available Building maintenance is essential to preserving a building's appearance and performance. Upkeep is needed to sustain building performance and to prolong the building's value and life cycle. Malaysia still lags in managing the costs of building operation and maintenance; the cost of housing maintenance has been found to be high owing to poor maintenance practices. In order to gain a better understanding of how to manage these costs, this study reviews the contributing factors affecting the operation and maintenance cost of stratified buildings in Malaysia. The research first identifies the factors through an extensive literature review and then scrutinizes those that affect, and can minimize, operation and maintenance cost. This literature review offers insight into the building maintenance scenario in Malaysia, focusing on its issues and challenges. The study also finds that operation and maintenance cost for housing in Malaysia is still poorly managed. Interestingly, this paper reveals that operation and maintenance cost is influenced by three significant factors: tenant expectations, building characteristics, and building defects. Measures to reduce housing operation and maintenance costs are also highlighted, so that this study can be a stepping stone towards proposing efficient and effective facilities-management strategies for affordable housing in future.

  5. Stratified patterns of divorce: Earnings, education, and gender

    Directory of Open Access Journals (Sweden)

    Amit Kaplan

    2015-05-01

    Full Text Available Background: Despite evidence that divorce has become more prevalent among weaker socioeconomic groups, knowledge about the stratification aspects of divorce in Israel is lacking. Moreover, although scholarly debate recognizes the importance of stratificational positions with respect to divorce, less attention has been given to the interactions between them. Objective: Our aim is to examine the relationship between social inequality and divorce, focusing on how household income, education, employment stability, relative earnings, and the intersection between them affect the risk of divorce in Israel. Methods: The data is derived from combined census files for 1995-2008, annual administrative employment records from the National Insurance Institute and the Tax Authority, and data from the Civil Registry of Divorce. We used a series of discrete-time event-history analysis models for marital dissolution. Results: Couples in lower socioeconomic positions had a higher risk of divorce in Israel. Higher education in general, and homogamy in terms of higher education (both spouses have degrees in particular, decreased the risk of divorce. The wife's relative earnings had a differential effect on the likelihood of divorce, depending on household income: a wife who outearned her husband increased the log odds of divorce more in the upper tertiles than in the lower tertile. Conclusions: Our study shows that divorce indeed has a stratified pattern and that weaker socioeconomic groups experience the highest levels of divorce. Gender inequality within couples intersects with the household's economic and educational resources.

  6. Clinical research in small genomically stratified patient populations.

    Science.gov (United States)

    Martin-Liberal, J; Rodon, J

    2017-07-01

    The paradigm of early drug development in cancer is shifting from 'histology-oriented' to 'molecularly oriented' clinical trials. This change can be attributed to the vast amount of tumour-biology knowledge generated by large international research initiatives such as The Cancer Genome Atlas (TCGA) and the use of next-generation sequencing (NGS) techniques developed in recent years. However, targeting infrequent molecular alterations entails a series of special challenges. The optimal molecular profiling method, the lack of standardised biological thresholds, inter- and intra-tumour heterogeneity, availability of sufficient tumour material, correct clinical trial design, attrition rate, logistics, and costs are only some of the issues that need to be taken into consideration in clinical research in small genomically stratified patient populations. This article examines the most relevant challenges inherent to clinical research in these populations. Moreover, perspectives from academia are reviewed, as well as initiatives to be taken in forthcoming years. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Stratifying the Risk of Venous Thromboembolism in Otolaryngology

    Science.gov (United States)

    Shuman, Andrew G.; Hu, Hsou Mei; Pannucci, Christopher J.; Jackson, Christopher R.; Bradford, Carol R.; Bahl, Vinita

    2015-01-01

    Objective: The consequences of perioperative venous thromboembolism (VTE) are devastating; identifying patients at risk is an essential step in reducing morbidity and mortality. The utility of perioperative VTE risk assessment in otolaryngology is unknown. This study was designed to risk-stratify a diverse population of otolaryngology patients for VTE events. Study Design: Retrospective cohort study. Setting: Single-institution academic tertiary care medical center. Subjects and Methods: Adult patients presenting for otolaryngologic surgery requiring hospital admission from 2003 to 2010 who did not receive VTE chemoprophylaxis were included. The Caprini risk assessment was retrospectively scored via a validated method of electronic chart abstraction. Primary study variables were Caprini risk scores and the incidence of perioperative venous thromboembolic outcomes. Results: A total of 2016 patients were identified. The overall 30-day rate of VTE was 1.3%. The incidence of VTE in patients with a Caprini risk score of 6 or less was 0.5%. For patients with scores of 7 or 8, the incidence was 2.4%. Patients with a Caprini risk score greater than 8 had an 18.3% incidence of VTE and were significantly more likely to develop a VTE when compared to patients with a Caprini risk score less than 8. The Caprini risk assessment thus stratifies otolaryngology patients for 30-day VTE events and allows otolaryngologists to identify patient subgroups who have a higher risk of VTE in the absence of chemoprophylaxis. PMID:22261490

  8. Stratified charge rotary engine critical technology enablement. Volume 2: Appendixes

    Science.gov (United States)

    Irion, C. E.; Mount, R. E.

    1992-01-01

    This second volume of appendixes is a companion to Volume 1 of this report which summarizes results of a critical technology enablement effort with the stratified charge rotary engine (SCRE) focusing on a power section of 0.67 liters (40 cu. in.) per rotor in single and two rotor versions. The work is a continuation of prior NASA Contracts NAS3-23056 and NAS3-24628. Technical objectives are multi-fuel capability, including civil and military jet fuel and DF-2, fuel efficiency of 0.355 Lbs/BHP-Hr. at best cruise condition above 50 percent power, altitude capability of up to 10Km (33,000 ft.) cruise, 2000 hour TBO and reduced coolant heat rejection. Critical technologies for SCRE's that have the potential for competitive performance and cost in a representative light-aircraft environment were examined. Objectives were: the development and utilization of advanced analytical tools, i.e. higher speed and enhanced three dimensional combustion modeling; identification of critical technologies; development of improved instrumentation; and to isolate and quantitatively identify the contribution to performance and efficiency of critical components or subsystems. A family of four-stage third-order explicit Runge-Kutta schemes is derived that requires only two storage locations and has desirable stability characteristics. Error control is achieved by embedding a second-order scheme within the four-stage procedure. Certain schemes are identified that are as efficient and accurate as conventional embedded schemes of comparable order and require fewer storage locations.
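
    The report's low-storage coefficients are not reproduced in the abstract, so the sketch below illustrates the same construction it describes, a four-stage, third-order explicit Runge-Kutta step with an embedded second-order error estimate, using the well-known Bogacki-Shampine 3(2) pair. It is not the two-storage scheme derived in the report.

    ```python
    # Embedded RK3(2) step with simple adaptive step-size control.
    import numpy as np

    def rk32_step(f, t, y, h):
        """One Bogacki-Shampine 3(2) step: returns (3rd-order y, error estimate)."""
        k1 = f(t, y)
        k2 = f(t + 0.5 * h, y + 0.5 * h * k1)
        k3 = f(t + 0.75 * h, y + 0.75 * h * k2)
        y3 = y + h * (2 * k1 + 3 * k2 + 4 * k3) / 9.0        # third-order solution
        k4 = f(t + h, y3)                                    # FSAL stage
        y2 = y + h * (7 * k1 / 24 + k2 / 4 + k3 / 3 + k4 / 8)  # embedded 2nd order
        return y3, np.abs(y3 - y2)

    # Example: integrate y' = -2y on [0, 1], keeping the local error below tol.
    f = lambda t, y: -2.0 * y
    t, y, h, tol = 0.0, np.array([1.0]), 0.1, 1e-6
    while t < 1.0:
        y_new, err = rk32_step(f, t, y, h)
        if err.max() < tol:
            t, y = t + h, y_new                              # accept the step
        h = min(0.9 * h * (tol / max(err.max(), 1e-16)) ** (1 / 3),
                1.0 - t + 1e-12)                             # adapt, clamp to end
    print(y, np.exp(-2.0))  # numerical vs exact solution
    ```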

  9. Stratified Charge Rotary Engine Critical Technology Enablement, Volume 1

    Science.gov (United States)

    Irion, C. E.; Mount, R. E.

    1992-01-01

    This report summarizes results of a critical technology enablement effort with the stratified charge rotary engine (SCRE) focusing on a power section of 0.67 liters (40 cu. in.) per rotor in single and two rotor versions. The work is a continuation of prior NASA Contracts NAS3-23056 and NAS3-24628. Technical objectives are multi-fuel capability, including civil and military jet fuel and DF-2, fuel efficiency of 0.355 Lbs/BHP-Hr. at best cruise condition above 50 percent power, altitude capability of up to 10Km (33,000 ft.) cruise, 2000 hour TBO and reduced coolant heat rejection. Critical technologies for SCRE's that have the potential for competitive performance and cost in a representative light-aircraft environment were examined. Objectives were: the development and utilization of advanced analytical tools, i.e. higher speed and enhanced three dimensional combustion modeling; identification of critical technologies; development of improved instrumentation, and to isolate and quantitatively identify the contribution to performance and efficiency of critical components or subsystems.

  10. Layer contributions to the nonlinear acoustic radiation from stratified media.

    Science.gov (United States)

    Vander Meulen, François; Haumesser, Lionel

    2016-12-01

    This study presents a thorough investigation of the second harmonic generation scenario in a three fluid layer system. Emphasis is placed on the evaluation of the nonlinear parameter B/A in each layer from remote measurements. A theoretical approach to the propagation of a finite amplitude acoustic wave in a multilayered medium is developed. In the frame of the KZK equation, the weak nonlinearity of the media, attenuation and diffraction effects are computed for the fundamental and second harmonic waves propagating back and forth in each of the layers of the system. The model uses a Gaussian expansion to describe the beam propagation in order to quantitatively evaluate the contribution of each part of the system (layers and interfaces) to its nonlinearity. The model is validated through measurements on a water/aluminum/water system. Transmission as well as reflection configurations are studied. Good agreement is found between the theoretical results and the experimental data. The analysis of the second harmonic field sources measured by the transducers from outside the stratified medium highlights the factors that favor the cumulative effects. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  12. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design, are also discussed.

  13. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Full Text Available Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass in the tree. Crown biomass estimation is useful for different purposes including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies for estimating crown biomass and assess the effect of sample size on the estimates. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was the best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced approximately similar results to simple random sampling, but it further decreased RMSE when information on branch diameter is used in the design and estimation phases. Conclusions Use of
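
    A minimal sketch of the strategy that performed best with small samples, stratified sampling with probability-proportional-to-size (PPS) selection, on one synthetic tree. The branch data and the allometric "truth" are invented, and the Hansen-Hurwitz estimator is assumed for the with-replacement PPS draws.

    ```python
    # Stratify branches by crown position, select with probability proportional
    # to diameter^2, and estimate the crown total with the Hansen-Hurwitz form.
    import numpy as np

    rng = np.random.default_rng(1)
    strata = {"lower":  {"d": rng.uniform(2, 8, 12)},     # branch diameters (cm)
              "middle": {"d": rng.uniform(1, 6, 15)},
              "upper":  {"d": rng.uniform(0.5, 4, 18)}}
    for s in strata.values():
        s["biomass"] = 0.08 * s["d"] ** 2.3               # allometric "truth" (kg)

    def pps_estimate(stratum, n_draws):
        """Hansen-Hurwitz estimate of a stratum total, PPS with replacement."""
        size = stratum["d"] ** 2                          # size measure
        p = size / size.sum()                             # selection probabilities
        picks = rng.choice(len(p), size=n_draws, p=p)
        return np.mean(stratum["biomass"][picks] / p[picks])

    true_total = sum(s["biomass"].sum() for s in strata.values())
    est_total = sum(pps_estimate(s, n_draws=2) for s in strata.values())  # 6 branches
    print(f"true {true_total:.1f} kg, stratified PPS estimate {est_total:.1f} kg")
    ```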

  14. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
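
    The adjustment described above divides the catch by a modeled capture probability. In the sketch below the logistic coefficients are invented placeholders, not the fitted values from the study.

    ```python
    # Abundance = catch / modeled cumulative capture probability.
    import numpy as np

    def capture_probability(length_mm, n_passes, thalweg_depth_m,
                            b0=-1.2, b_len=0.004, b_pass=0.35, b_depth=-0.8):
        """Cumulative capture probability from a logistic model
        (coefficients illustrative only)."""
        logit = b0 + b_len * length_mm + b_pass * n_passes + b_depth * thalweg_depth_m
        return 1.0 / (1.0 + np.exp(-logit))

    catch = 40   # smallmouth bass captured in 3 electrofishing passes
    p_hat = capture_probability(length_mm=250, n_passes=3, thalweg_depth_m=0.6)
    print(f"p_hat = {p_hat:.2f}, abundance estimate = {catch / p_hat:.0f} fish")
    ```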

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  16. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
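
    As a small companion to the chapter topics, the sketch below implements the forward algorithm for a toy two-state hidden Markov model; all matrices are invented.

    ```python
    # Forward algorithm: likelihood of an observation sequence under an HMM.
    import numpy as np

    A = np.array([[0.9, 0.1],       # state-transition matrix
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.3],       # emission probabilities, rows = hidden states
                  [0.1, 0.9]])
    pi = np.array([0.5, 0.5])       # initial state distribution

    def forward_likelihood(obs):
        """P(observations) by the forward recursion, O(T * n_states^2)."""
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha.sum()

    print(forward_likelihood([0, 0, 1, 1, 1]))
    ```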

  17. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
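
    A schematic of the pinching idea on a toy problem: an input whose mean is known only to an interval yields bounds on a failure probability, and pinching the mean to a precise value shows how much of the output uncertainty that input drives. The brute-force interval scan below merely stands in for a real p-box calculation.

    ```python
    # "Pinching" sensitivity, illustrated with common random numbers.
    import numpy as np

    rng = np.random.default_rng(2)
    z = rng.standard_normal(100_000)          # samples reused for every evaluation

    def failure_prob(mu):                     # load ~ Normal(mu, 0.2), capacity 1.5
        return np.mean(mu + 0.2 * z > 1.5)

    mus = np.linspace(0.8, 1.2, 41)           # imprecise input: mu in [0.8, 1.2]
    lo, hi = min(map(failure_prob, mus)), max(map(failure_prob, mus))
    pinched = failure_prob(1.0)               # pinch mu to a precise value
    print(f"POF bounds {lo:.4f}..{hi:.4f}, pinched {pinched:.4f}")
    ```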

  18. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  19. Second law characterization of stratified thermal storage tanks

    Energy Technology Data Exchange (ETDEWEB)

    Fraidenraich, N [Departamento de Energia Nuclear-UFPE (Brazil)

    2000-07-01

    It is well known that fluid stratification in thermal storage tanks improves the overall performance of solar thermal systems, when compared with systems operating with uniform fluid temperature. From the point of view of the first law of thermodynamics, no difference exists between storage tanks with the same mass and average temperature, even if they have different stratified thermal structures. Nevertheless, the useful thermal energy that can be obtained from them might differ significantly. In this work, we derive an expression able to characterize the stratified configuration of the thermal fluid. Using results obtained from the thermodynamics of irreversible processes, the procedure adopted consists of calculating the maximum work that the tank's thermal layers are able to develop. We arrive, then, at a dimensionless expression, the stratification parameter (SP), which depends on the mass fraction and absolute temperature of each thermal layer as well as the thermal fluid average temperature. Numerical examples for different types of tank stratification are given and it is verified that the expression obtained is sensitive to small differences in the reservoir thermal configuration. For example, a thermal storage with temperatures equal to 74 °C, 64 °C and 54 °C, with its mass equally distributed along the tank, yields for the parameter SP a figure equal to 0.000294. On the other hand, a storage tank with the same average temperature but with different layer temperatures of 76 °C, 64 °C and 52 °C, also with uniform mass distribution, yields a larger value of SP. The parameter thus provides a quantitative evaluation of the stratification structure of thermal reservoirs.
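
    The abstract does not print the SP formula, but a dimensionless available-work (exergy-style) definition reproduces the quoted figure of 0.000294 for the 74/64/54 °C example, so the sketch below adopts it as a working assumption rather than as the paper's exact expression.

    ```python
    # Assumed SP: available work of the layers relative to the fully mixed tank,
    # per unit (m*c), made dimensionless by the mean absolute temperature.
    import numpy as np

    def stratification_parameter(temps_c, mass_fractions):
        T = np.asarray(temps_c) + 273.15       # layer temperatures, K
        f = np.asarray(mass_fractions)
        T_avg = np.sum(f * T)                  # mixed-tank (mass-average) temperature
        work = np.sum(f * ((T - T_avg) - T_avg * np.log(T / T_avg)))
        return work / T_avg                    # dimensionless SP

    print(stratification_parameter([74, 64, 54], [1/3, 1/3, 1/3]))  # ~0.000294
    print(stratification_parameter([76, 64, 52], [1/3, 1/3, 1/3]))  # larger, ~0.00042
    ```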

  20. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    Sample Survey started its operations in October 1950 under the ... and adopted random cuts for estimating the acreage under jute ... demographic factors relating to indebtedness, unemployment, ... traffic surveys, demand for currency coins and average life of .... Mahalanobis derived the optimum allocation in stratified.

  1. Internal and vorticity waves in decaying stratified flows

    Science.gov (United States)

    Matulka, A.; Cano, D.

    2009-04-01

    Most predictive models fail when forcing at the Rossby deformation radius is important and a large range of scales has to be taken into account. When mixing of reactants or pollutants has to be accounted for, the range of scales spans from hundreds of kilometers to the Batchelor or Kolmogorov sub-millimeter scales. We present some theoretical arguments to describe the flow in terms of the three-dimensional vorticity equations, using a lengthscale related to the vorticity (or enstrophy) transport. The effect of intermittent eddies and the non-homogeneity of diffusion are also key issues in the environment because both stratification and rotation body forces are important and cause anisotropy/non-homogeneity. These problems need further theoretical, numerical and observational work, and one approach is to try to maximize the relevant geometrical information in order to understand and therefore predict these complex environmental dispersive flows. The importance of the study of turbulence structure and its relevance in the diffusion of contaminants in environmental flows is clear when we see the effect of environmental disasters such as the Prestige oil spill or the spread of the Chernobyl radioactive cloud in the atmosphere. A series of experiments has been performed on a strongly stratified two-layer fluid consisting of brine at the bottom and freshwater above in a 1 square meter tank. The evolution of the vortices after the passage of a grid is video recorded and particle tracking is applied to small pliolite particles floating at the interface. The combination of internal waves and vertical vorticity produces two separate time scales that may produce resonances. The vorticity is seen to oscillate in a complex way, with the frequency decreasing over time.

  2. Tumour vasculature immaturity, oxidative damage and systemic inflammation stratify survival of colorectal cancer patients on bevacizumab treatment

    Science.gov (United States)

    Martin, Petra; Biniecka, Monika; Ó'Meachair, Shane; Maguire, Aoife; Tosetto, Miriam; Nolan, Blathnaid; Hyland, John; Sheahan, Kieran; O'Donoghue, Diarmuid; Mulcahy, Hugh; Fennelly, David; O'Sullivan, Jacintha

    2018-01-01

    Despite treatment of patients with metastatic colorectal cancer (mCRC) with bevacizumab plus chemotherapy, response rates are modest and there are no biomarkers available that will predict response. The aim of this study was to assess if markers associated with three interconnected cancer-associated biological processes, specifically angiogenesis, inflammation and oxidative damage, could stratify the survival outcome of this cohort. Levels of angiogenesis, inflammation and oxidative damage markers were assessed in pre-bevacizumab resected tumour and serum samples of mCRC patients by dual immunofluorescence, immunohistochemistry and ELISA. This study identified that specific markers of angiogenesis, inflammation and oxidative damage stratify survival of patients on this anti-angiogenic treatment. Biomarkers of immature tumour vasculature (% IMM, p=0.026, n=80), high levels of oxidative damage in the tumour epithelium (intensity of 8-oxo-dG in nuclear and cytoplasmic compartments, p=0.042 and 0.038 respectively, n=75) and lower systemic pro-inflammatory cytokines (IL6 and IL8, p=0.053 and 0.049 respectively, n=61) significantly stratify with median overall survival (OS). In summary, screening for a panel of biomarkers for high levels of immature tumour vasculature, high levels of oxidative DNA damage and low levels of systemic pro-inflammatory cytokines may be beneficial in predicting enhanced survival outcome following bevacizumab treatment for mCRC. PMID:29535825

  3. Comparison of sampling strategies for tobacco retailer inspections to maximize coverage in vulnerable areas and minimize cost.

    Science.gov (United States)

    Lee, Joseph G L; Shook-Sa, Bonnie E; Bowling, J Michael; Ribisl, Kurt M

    2017-06-23

    In the United States, tens of thousands of inspections of tobacco retailers are conducted each year. Various sampling choices can reduce travel costs, emphasize enforcement in areas with greater non-compliance, and allow for comparability between states and over time. We sought to develop a model sampling strategy for state tobacco retailer inspections. Using a 2014 list of 10,161 North Carolina tobacco retailers, we compared results from simple random sampling; stratified sampling clustered at the ZIP code level; and stratified sampling clustered at the census tract level. We conducted a simulation of repeated sampling and compared approaches for their comparative level of precision, coverage, and retailer dispersion. While maintaining an adequate design effect and statistical precision appropriate for a public health enforcement program, both stratified, clustered ZIP- and tract-based approaches were feasible. Both ZIP and tract strategies yielded improvements over simple random sampling, with relative improvements, respectively, of average distance between retailers (reduced 5.0% and 1.9%), percent Black residents in sampled neighborhoods (increased 17.2% and 32.6%), percent Hispanic residents in sampled neighborhoods (reduced 2.2% and increased 18.3%), percentage of sampled retailers located near schools (increased 61.3% and 37.5%), and poverty rate in sampled neighborhoods (increased 14.0% and 38.2%). States can make retailer inspections more efficient and targeted with stratified, clustered sampling. Use of statistically appropriate sampling strategies like these should be considered by states, researchers, and the Food and Drug Administration to improve program impact and allow for comparisons over time and across states. The authors present a model tobacco retailer sampling strategy for promoting compliance and reducing costs that could be used by U.S. states and the Food and Drug Administration (FDA). The design is feasible to implement in North Carolina. Use of
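
    A schematic of the stratified, clustered selection compared in the study: stratify the retailer list, draw ZIP-code clusters within each stratum, and inspect the retailers in the drawn clusters. The retailer list, strata and cluster counts here are synthetic, and a real design would add probability weights by stratum size.

    ```python
    # Stratified, ZIP-clustered sample of a synthetic retailer list.
    import random
    from collections import defaultdict

    random.seed(3)
    retailers = [{"id": i,
                  "zip": f"27{random.randint(500, 599)}",
                  "urban": random.random() < 0.6}
                 for i in range(10_000)]

    by_stratum = defaultdict(lambda: defaultdict(list))
    for r in retailers:                          # stratum -> ZIP cluster -> retailers
        by_stratum[r["urban"]][r["zip"]].append(r)

    sample = []
    for stratum, clusters in by_stratum.items():
        for z in random.sample(sorted(clusters), k=5):   # 5 ZIP clusters per stratum
            sample.extend(clusters[z])                   # inspect all in the cluster

    print(len(sample), "retailers selected in", 2 * 5, "ZIP clusters")
    ```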

  4. Using the Internet to Support Exercise and Diet: A Stratified Norwegian Survey.

    Science.gov (United States)

    Wangberg, Silje C; Sørensen, Tove; Andreassen, Hege K

    2015-08-26

    The Internet is used for a variety of health-related purposes. Use differs and has differential effects on health according to socioeconomic status. We investigated to what extent the Norwegian population uses the Internet to support exercise and diet, what kind of services they use, and whether there are social disparities in use. We expected to find differences according to educational attainment. In November 2013 we surveyed a stratified sample of 2196 persons drawn from a Web panel of about 50,000 Norwegians over 15 years of age. The questionnaire included questions about using the Internet, including social network sites (SNS), or mobile apps in relation to exercise or diet, as well as background information about education, body image, and health. The survey email was opened by 1187 respondents (54%). Of these, 89 did not click on the survey hyperlink (declined to participate), while another 70 did not complete the survey. The final sample size is thus 1028 (87% response rate). Compared to the Norwegian census the sample had a slight under-representation of respondents under the age of 30 and with low education. The data was weighted accordingly before analyses. Sixty-nine percent of women and 53% of men had read about exercise or diet on the Internet (χ2 = 25.6, P < .001). Efforts to promote exercise and diet online should take account of social disparities in health, and continue to monitor population use. For Internet- and mobile-based interventions to support health behaviors, this study provides information relevant to tailoring of delivery media and components to users.

  5. Zooplankton structure and vertical migration: Using acoustics and biomass to compare stratified and mixed fjord systems

    Science.gov (United States)

    Díaz-Astudillo, Macarena; Cáceres, Mario A.; Landaeta, Mauricio F.

    2017-09-01

    The patterns of abundance, composition, biomass and vertical migration of zooplankton were studied on short time scales in a stratified and a mixed fjord station. Records from an ADCP device mounted on the hull of a ship were used to obtain vertical profiles of current velocity data and intensity of the backscattered acoustic signal, which was used to study the migratory strategies and to relate the echo intensity with zooplankton biomass. Repeated vertical profiles of temperature, salinity and density were obtained with a CTD instrument to describe the density patterns during both experiments. Zooplankton were sampled every 3 h using a Bongo net to determine abundance, composition and biomass. Migrations were diel in the stratified station, semi-diel in the mixed station, and controlled by light in both locations, with large and significant differences in zooplankton abundance and biomass between day and night samples. No migration pattern associated with the effect of tides was found. The depth of maximum backscatter strength showed differences of approximately 30 m between stations and was deeper in the mixed station. The relation between mean volume backscattering strength (dB) computed from echo intensity and log10 of total dry weight (mg m-3) of zooplankton biomass was moderate but significant in both locations. Biomass estimated from biological samples was higher in the mixed station and determined by euphausiids. Copepods were the most abundant group in both stations. Acoustic methods were a useful technique to understand the detailed patterns of migratory strategies of zooplankton and to help estimate zooplankton biomass and abundance in the inner waters of southern Chile.
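
    The calibration described, mean volume backscattering strength against log10 of dry weight, is an ordinary linear fit that can then be inverted to read biomass off the ADCP record. The data points below are invented.

    ```python
    # Fit Sv (dB) vs log10(dry weight) and invert it to estimate biomass.
    import numpy as np

    log10_dw = np.array([0.3, 0.6, 0.9, 1.1, 1.4])   # log10 dry weight (mg m^-3)
    sv_db    = np.array([-82, -78, -73, -70, -65])   # mean volume backscatter (dB)

    slope, intercept = np.polyfit(log10_dw, sv_db, 1)

    def biomass_from_sv(sv):
        """Invert the linear fit: dry weight (mg m^-3) for a given Sv (dB)."""
        return 10 ** ((sv - intercept) / slope)

    print(f"Sv = {slope:.1f} * log10(DW) + {intercept:.1f};",
          f"Sv = -75 dB -> {biomass_from_sv(-75):.1f} mg m^-3")
    ```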

  6. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
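
    A schematic of the sample-reuse idea: the probability of failure (POF) is a Monte Carlo average over flaw sizes, each missed with probability 1 - POD(a), so perturbing one region of the POD curve only re-weights existing samples instead of re-running the simulation. The lognormal POD shape and all numbers are illustrative.

    ```python
    # POF sensitivity to POD-curve regions, reusing one set of Monte Carlo samples.
    import numpy as np
    from math import erf

    erf_v = np.vectorize(erf)
    rng = np.random.default_rng(4)
    a = rng.lognormal(mean=0.0, sigma=0.5, size=200_000)   # flaw sizes (mm)
    a_crit = 4.0                                           # failure if missed and a > a_crit

    def pod(a, a50=1.5, sigma=0.4):
        """Lognormal POD curve: probability a flaw of size a is detected."""
        return 0.5 * (1.0 + erf_v(np.log(a / a50) / (sigma * np.sqrt(2.0))))

    def pof(pod_values):
        """Monte Carlo POF: a large flaw exists AND inspection misses it."""
        return np.mean((a > a_crit) * (1.0 - pod_values))

    base = pod(a)
    print(f"baseline POF = {pof(base):.2e}")
    for name, region in [("lower tail (a < 1)", a < 1.0),
                         ("right tail (a > 3)", a > 3.0)]:
        perturbed = base.copy()
        perturbed[region] = np.clip(perturbed[region] - 0.05, 0.0, 1.0)
        print(f"dPOF for POD - 0.05 on {name}: {pof(perturbed) - pof(base):+.2e}")
    ```

    The lower-tail sensitivity comes out essentially zero here because small flaws cannot cause failure, which is exactly the kind of region ranking the paper's sensitivities are meant to provide.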

  7. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  8. Simulation of Molten Core-Concrete Interaction in oxide/metal stratified configuration with the TOLBIAC-ICB code

    International Nuclear Information System (INIS)

    Tourniaire, B.; Spindler, B.

    2005-01-01

    The frame of this work is the validation of the TOLBIAC-ICB code which is devoted to the simulation of Molten Core-Concrete Interaction (MCCI) for reactor safety analysis. Attention focuses here on the validation of TOLBIAC-ICB in configurations expected to be representative of the long term phase of MCCI i.e. during an interaction between an oxide/metal stratified corium melt and a concrete structure. Up to now the BETA tests performed at the Forschungszentrum Karlsruhe (FzK) are the only tests available to study this kind of interaction. The BETA tests are first described and the operating conditions are recalled. The TOLBIAC-ICB code is then briefly described, with emphasis on the models used for stratified configurations. The results of the simulations are discussed. A sensitivity study is also performed with the power generated in the oxide layer instead of the metal layer as in the test. This last calculation shows that the large axial ablation observed in the tests is probably due to the peculiar configuration of the test with input power in the bottom metal layer. Since in the reactor case the residual power would be mainly concentrated in the upper oxide layer, the conclusions of the BETA tests for the reactor applications, in terms of axial ablation, must be derived with caution. (author)

  9. Experimental investigation of stratified two-phase flows in the hot leg of a PWR for CFD validation

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, Christophe; Lucas, Dirk [Helmholtz-Zentrum Dresden-Rossendorf (HZDR) e.V., Dresden (Germany). Inst. of Fluid Dynamics; Tomiyama, Akio [Kobe Univ (Japan). Graduate School of Engineering; Murase, Michio [Institute of Nuclear Safety System, Inc. (INSS), Fukui (Japan)

    2012-12-15

    Stratified 2-phase flows were investigated in 2 different models of the hot leg of a pressurised water reactor (PWR) in order to provide experimental data for the development and validation of computational fluid dynamics (CFD) codes. Therefore, the local flow structure was visualised with a high-speed video camera. Moreover, one test section was designed with a rectangular cross-section to achieve optimal observation conditions. The phenomenon of counter-current flow limitation (CCFL) was investigated, which may affect the reflux condenser cooling mode in some accident scenarios. The experiments were conducted with air and water at room temperature and maximum pressures of 3 bar as well as with steam and saturated water at boundary conditions of up to 50 bar and 264 °C. The measured CCFL characteristics were compared with similar experimental data and correlations available in the literature. This shows that the channel height is the characteristic length to be used in the Wallis parameter for channels with rectangular cross-sections. Furthermore, the experimental results confirm that the Wallis similarity is appropriate to scale CCFL in the hot leg of a PWR over a wide range of pressure and temperature conditions. Finally, an image processing algorithm was developed to recognise the stratified interface in the camera frames. Subsequently, the interfacial structure along the hot leg was visualised by the representation of the probability distribution of the water level. (orig.)
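
    The scaling conclusion above can be made concrete with the Wallis parameter, using the channel height H as the characteristic length. The flooding constants m and C in this sketch are placeholders for the correlation such experiments would fit, not the paper's values.

    ```python
    # Wallis parameter and a generic flooding (CCFL) correlation
    # sqrt(Jg*) + m * sqrt(Jl*) = C for a rectangular channel of height H.
    import numpy as np

    def wallis_parameter(j, rho_phase, rho_l, rho_g, H, g=9.81):
        """J* = j * sqrt(rho_phase / (g * H * (rho_l - rho_g)))."""
        return j * np.sqrt(rho_phase / (g * H * (rho_l - rho_g)))

    rho_l, rho_g, H = 998.0, 1.2, 0.25        # air-water, 0.25 m channel height
    jg = 4.0                                  # superficial gas velocity, m/s
    Jg = wallis_parameter(jg, rho_g, rho_l, rho_g, H)

    m, C = 1.0, 0.7                           # illustrative CCFL constants
    Jl_max = ((C - np.sqrt(Jg)) / m) ** 2 if Jg < C ** 2 else 0.0
    print(f"Jg* = {Jg:.3f}, maximum counter-current Jl* = {Jl_max:.3f}")
    ```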

  10. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
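
    A sketch of the attribute-sampling arithmetic that connects detection probability and significant quantity: if M items would have to be falsified to divert one significant quantity from a stratum of N items, a common approximation for the required sample size is n = N(1 - (1 - DP)^(1/M)). Treat this as illustrative of the kind of calculation described, not as the IAEA's exact procedure.

    ```python
    # Approximate sample size for a target detection probability DP.
    import math

    def sample_size(N, M, DP):
        """Items to verify so that at least one of M falsified items is found
        with probability DP (sampling without replacement, approximate)."""
        return math.ceil(N * (1 - (1 - DP) ** (1 / M)))

    N = 300   # items in the stratum
    M = 12    # falsified items needed to amass one significant quantity
    print(sample_size(N, M, DP=0.9))   # -> 53 items
    ```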

  11. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.
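
    A small numerical companion to the argument: the Margenau-Hill (real Kirkwood-Dirac) quasiprobability for two noncommuting qubit observables can be negative, so it fails as a bona fide joint probability, even though its marginals reproduce the Born probabilities. The state and observables below are arbitrary choices, not those of the paper.

    ```python
    # Margenau-Hill joint quasiprobability q(a,b) = Re <a|psi><psi|b><b|a>
    # for sigma_z and sigma_x on a pure qubit state.
    import numpy as np

    z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]           # sigma_z
    x_basis = [np.array([1.0, 1.0]) / np.sqrt(2),
               np.array([1.0, -1.0]) / np.sqrt(2)]                   # sigma_x

    theta = np.pi / 8
    psi = np.array([np.cos(theta), np.sin(theta)])                   # pure state

    for i, a in enumerate(z_basis):
        for j, b in enumerate(x_basis):
            q = np.real((a @ psi) * (psi @ b) * (b @ a))
            print(f"q(z={i}, x={j}) = {q:+.4f}")
    # One entry is negative (~ -0.10) while each marginal sums to the
    # corresponding Born probability.
    ```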

  12. Exploring Middle School Students' Heuristic Thinking about Probability

    OpenAIRE

    Mistele, Jean May

    2014-01-01

    ABSTRACT This descriptive qualitative study examines six eighth-grade students' thinking while solving probability problems. This study aimed to gather direct information on students' problem solving processes informed by the heuristics and biases framework. This study used purposive sampling (Patton, 1990) to identify eighth-grade students who were knowledgeable about probability and had reached the formal operational stage of cognitive development. These criteria were necessary to redu...

  13. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing, to high numerical accuracy, the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N^2-1)-dimensional volume and (N^2-2)-dimensional hyperarea of the (separable and nonseparable) NxN density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 x 10^9 well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases.
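
    For the simpler N=4 (two-qubit) case mentioned above, the Hilbert-Schmidt separability probability can be estimated directly, since the Peres-Horodecki (PPT) criterion is exact for 4x4 states. This Monte Carlo sketch (Ginibre construction for the Hilbert-Schmidt measure) should land near the conjectured value 8/33 ~ 0.2424.

    ```python
    # Hilbert-Schmidt separability probability of two qubits via PPT sampling.
    import numpy as np

    rng = np.random.default_rng(5)

    def random_hs_density_matrix(d=4):
        """HS-distributed density matrix: rho = G G^dag / Tr(G G^dag)."""
        G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
        rho = G @ G.conj().T
        return rho / np.trace(rho).real

    def is_ppt(rho):
        """Peres-Horodecki test: partial transpose on the second qubit >= 0."""
        pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
        return np.linalg.eigvalsh(pt).min() >= -1e-12

    n = 50_000
    hits = sum(is_ppt(random_hs_density_matrix()) for _ in range(n))
    print(f"HS separability probability ~ {hits / n:.4f} (8/33 = {8/33:.4f})")
    ```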

  14. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  15. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  16. Towards Cost-efficient Sampling Methods

    OpenAIRE

    Peng, Luo; Yongli, Li; Chong, Wu

    2014-01-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper presents two new sampling methods based on the observation that a small fraction of high-degree vertices can carry most of the structural information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and...

  17. Experimental Validation of a Domestic Stratified Hot Water Tank Model in Modelica for Annual Performance Assessment

    DEFF Research Database (Denmark)

    Carmo, Carolina; Dumont, Olivier; Nielsen, Mads Pagh

    2015-01-01

    The use of stratified hot water tanks in solar energy systems - including ORC systems - as well as heat pump systems is paramount for a better performance of these systems. However, the availability of effective and reliable models to predict the annual performance of stratified hot water tanks...
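
    A minimal one-dimensional multinode tank model of the kind such annual-performance studies rely on: each node conducts to its neighbours, loses heat to ambient, and receives plug flow from above during charging. All parameters are illustrative, not those of the validated Modelica model.

    ```python
    # Explicit multinode stratified-tank energy balance (top node = index 0).
    import numpy as np

    n, dt = 10, 10.0                       # nodes, time step (s)
    m = 250.0 / n                          # water mass per node (kg)
    cp, UA, k_eff = 4186.0, 1.5 / n, 3.0   # J/(kg K), W/K per node, W/(m K)
    A, dz, T_amb = 0.25, 1.0 / n, 20.0     # cross-section (m^2), node height (m)

    T = np.full(n, 40.0)                   # initial tank temperature (C)
    mdot, T_in = 0.05, 70.0                # charge flow (kg/s) entering at the top

    for _ in range(int(3600 / dt)):        # one hour of charging
        inflow = np.empty(n)
        inflow[0] = mdot * cp * (T_in - T[0])        # hot inlet at the top node
        inflow[1:] = mdot * cp * (T[:-1] - T[1:])    # plug flow downwards
        cond = np.zeros(n)
        cond[1:-1] = k_eff * A / dz * (T[2:] - 2 * T[1:-1] + T[:-2])
        cond[0]  = k_eff * A / dz * (T[1] - T[0])
        cond[-1] = k_eff * A / dz * (T[-2] - T[-1])
        T = T + dt * (inflow + cond + UA * (T_amb - T)) / (m * cp)

    print(np.round(T, 1))                  # top nodes hot, bottom cooler: stratified
    ```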

  18. Properties of the endogenous post-stratified estimator using a random forests model

    Science.gov (United States)

    John Tipton; Jean Opsomer; Gretchen G. Moisen

    2012-01-01

    Post-stratification is used in survey statistics as a method to improve variance estimates. In traditional post-stratification methods, the variable on which the data is being stratified must be known at the population level. In many cases this is not possible, but it is possible to use a model to predict values using covariates, and then stratify on these predicted...
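
    A sketch of the endogenous post-stratified estimator on synthetic data: a random-forests model fitted to the sample predicts the survey variable from covariates known for the whole population, the predictions define the strata, and the sample means are weighted by the population stratum shares. The paper's focus, variance estimation, is omitted here.

    ```python
    # Post-stratify on random-forest predictions (synthetic population).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(6)
    N, n = 50_000, 500
    X_pop = rng.uniform(0, 1, (N, 3))                       # covariates known everywhere
    y_pop = 5 * X_pop[:, 0] + np.sin(6 * X_pop[:, 1]) + rng.normal(0, 0.5, N)

    idx = rng.choice(N, n, replace=False)                   # simple random sample
    rf = RandomForestRegressor(n_estimators=100, random_state=0)
    rf.fit(X_pop[idx], y_pop[idx])

    pred = rf.predict(X_pop)                                # predictions for everyone
    edges = np.quantile(pred, [0, .2, .4, .6, .8, 1.0])     # 5 strata on predictions
    h_pop = np.clip(np.searchsorted(edges, pred, side="right") - 1, 0, 4)
    h_smp = h_pop[idx]

    W = np.bincount(h_pop, minlength=5) / N                 # population stratum weights
    ps_mean = sum(W[h] * y_pop[idx][h_smp == h].mean() for h in range(5))
    print(f"true {y_pop.mean():.3f}, SRS {y_pop[idx].mean():.3f}, "
          f"post-stratified {ps_mean:.3f}")
    ```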

  19. Implementing content constraints in alpha-stratified adaptive testing using a shadow test approach

    NARCIS (Netherlands)

    van der Linden, Willem J.; Chang, Hua-Hua

    2001-01-01

    The methods of alpha-stratified adaptive testing and constrained adaptive testing with shadow tests are combined in this study. The advantages are twofold. First, application of the shadow test allows the researcher to implement any type of constraint on item selection in alpha-stratified adaptive

  20. Stratified turbulent Bunsen flames : flame surface analysis and flame surface density modelling

    NARCIS (Netherlands)

    Ramaekers, W.J.S.; Oijen, van J.A.; Goey, de L.P.H.

    2012-01-01

    In this paper it is investigated whether the Flame Surface Density (FSD) model, developed for turbulent premixed combustion, is also applicable to stratified flames. Direct Numerical Simulations (DNS) of turbulent stratified Bunsen flames have been carried out, using the Flamelet Generated Manifold

  1. Numerical study of the ignition behavior of a post-discharge kernel injected into a turbulent stratified cross-flow

    Science.gov (United States)

    Jaravel, Thomas; Labahn, Jeffrey; Ihme, Matthias

    2017-11-01

    The reliable initiation of flame ignition by high-energy spark kernels is critical for the operability of aviation gas turbines. The evolution of a spark kernel ejected by an igniter into a turbulent stratified environment is investigated using detailed numerical simulations with complex chemistry. At early times post ejection, comparisons of simulation results with high-speed Schlieren data show that the initial trajectory of the kernel is well reproduced, with a significant amount of air entrainment from the surrounding flow that is induced by the kernel ejection. After transiting through a non-flammable mixture, the kernel reaches a second stream of flammable methane-air mixture, where the success of the kernel ignition was found to depend on the local flow state and operating conditions. By performing parametric studies, the probability of kernel ignition was identified, and compared with experimental observations. The ignition behavior is characterized by analyzing the local chemical structure, and its stochastic variability is also investigated.

  2. Turbulence Statistics of a Buoyant Jet in a Stratified Environment

    Science.gov (United States)

    McCleney, Amy Brooke

    Using non-intrusive optical diagnostics, turbulence statistics for a round, incompressible, buoyant, and vertical jet discharging freely into a stably linear stratified environment is studied and compared to a reference case of a neutrally buoyant jet in a uniform environment. This is part of a validation campaign for computational fluid dynamics (CFD). Buoyancy forces are known to significantly affect the jet evolution in a stratified environment. Despite their ubiquity in numerous natural and man-made flows, available data on these jets are limited, which constrains our understanding of the underlying physical processes. In particular, there is a dearth of velocity field data, which makes it challenging to validate numerical codes currently used for modeling these important flows. Herein, jet near- and far-field behaviors are obtained with a combination of planar laser induced fluorescence (PLIF) and multi-scale time-resolved particle image velocimetry (TR-PIV) for Reynolds numbers up to 20,000. Deploying non-intrusive optical diagnostics in a variable density environment is challenging in liquids. The refractive index is strongly affected by the density, which introduces optical aberrations and occlusions that prevent the resolution of the flow. One solution consists of using index-matched fluids with different densities. Here a pair of water solutions - isopropanol and NaCl - is identified that satisfies these requirements. In fact, they provide a density difference of up to 5%, which is the largest reported for such fluid pairs. Additionally, by design, the kinematic viscosities of the solutions are identical. This greatly simplifies the analysis and subsequent simulations of the data. The spectral and temperature dependences of the solutions are fully characterized. In the near-field, shear layer roll-up is analyzed and characterized as a function of initial velocity profile. In the far-field, turbulence statistics are reported for two different scales, one

  3. Thermal stratification built up in hot water tank with different inlet stratifiers

    DEFF Research Database (Denmark)

    Dragsted, Janne; Furbo, Simon; Dannemand, Mark

    2017-01-01

    Thermal stratification in a water storage tank can strongly increase the thermal performance of solar heating systems. Thermal stratification can be built up in a storage tank during charge, if the heated water enters through an inlet stratifier. Experiments with a test tank have been carried out in order to elucidate how well thermal stratification is established in the tank with differently designed inlet stratifiers under different controlled laboratory conditions. The investigated inlet stratifiers are from Solvis GmbH & Co KG and EyeCular Technologies ApS. The inlet stratifier from Solvis GmbH & Co KG had a better performance at 4 l/min. In the intermediate charge test the stratifier from EyeCular Technologies ApS had a better performance in terms of maintaining the thermal stratification in the storage tank while charging with a relatively low temperature. [All rights reserved]

  4. Exploring the salivary microbiome of children stratified by the oral hygiene index

    Science.gov (United States)

    Mashima, Izumi; Theodorea, Citra F.; Thaweboon, Boonyanit; Thaweboon, Sroisiri; Scannapieco, Frank A.

    2017-01-01

    Poor oral hygiene often leads to chronic diseases such as periodontitis and dental caries, resulting in substantial economic costs and diminished quality of life not only in adults but also in children. In this study, the salivary microbiome was characterized in a group of children stratified by the Simplified Oral Hygiene Index (OHI-S). Illumina MiSeq high-throughput sequencing based on the 16S rRNA was utilized to analyze 90 salivary samples (24 Good, 31 Moderate and 35 Poor oral hygiene) from a cohort of Thai children. A total of 38,521 OTUs (Operational Taxonomic Units) with a 97% similarity were characterized in all of the salivary samples. Twenty taxonomic groups (seventeen genera, two families and one class; Streptococcus, Veillonella, Gemellaceae, Prevotella, Rothia, Porphyromonas, Granulicatella, Actinomyces, TM-7-3, Leptotrichia, Haemophilus, Selenomonas, Neisseria, Megasphaera, Capnocytophaga, Oribacterium, Abiotrophia, Lachnospiraceae, Peptostreptococcus, and Atopobium) were found in all subjects and constituted 94.5–96.5% of the microbiome. Of these twenty taxonomic groups, the proportion of Streptococcus decreased while that of Veillonella increased with poorer oral hygiene status (P < 0.05), most markedly in the Poor oral hygiene group. This is the first study demonstrating an important association between an increase of Veillonella and poor oral hygiene status in children. However, further studies are required to identify the majority of Veillonella at the species level in the salivary microbiome of the Poor oral hygiene group. PMID:28934367

  5. Characterisation of the suspended particulate matter in a stratified estuarine environment employing complementary techniques

    Science.gov (United States)

    Thomas, Luis P.; Marino, Beatriz M.; Szupiany, Ricardo N.; Gallo, Marcos N.

    2017-09-01

    The ability to predict the sediment and nutrient circulation within estuarine waters is of significant economic and ecological importance. In these complex systems, flocculation is a dynamically active process that is directly affected by the prevalent environmental conditions. Consequently, the floc properties continuously change, which greatly complicates the characterisation of the suspended particle matter (SPM). In the present study, three different techniques are combined in a stratified estuary under quiet weather conditions and with a low river discharge to search for a solution to this problem. The challenge is to obtain the concentration, size and flux of suspended elements through selected cross-sections using the method based on the simultaneous backscatter records of 1200 and 600 kHz ADCPs, isokinetic sampling data and LISST-25X measurements. The two-ADCP method is highly effective for determining the SPM size distributions in a non-intrusive way. The isokinetic sampling and the LISST-25X diffractometer offer point measurements at specific depths, which are especially useful for calibrating the ADCP backscatter intensity as a function of the SPM concentration and size, and providing complementary information on the sites where acoustic records are not available. Limitations and potentials of the techniques applied are discussed.

  6. Yield and quality of ground water from stratified-drift aquifers, Taunton River basin, Massachusetts : executive summary

    Science.gov (United States)

    Lapham, Wayne W.; Olimpio, Julio C.

    1989-01-01

    Water shortages are a chronic problem in parts of the Taunton River basin and are caused by a combination of factors. Water use in this part of the Boston metropolitan area is likely to increase during the next decade. The Massachusetts Division of Water Resources projects that about 50% of the cities and towns within and on the perimeter of the basin may have water supply deficits by 1990 if water management projects are not pursued throughout the 1980s. Estimates of the long-term yield of the 26 regional aquifers indicate that the yields of the two most productive aquifers equal or exceed 11.9 and 11.3 cu ft/sec, 90% of the time, respectively, if minimum stream discharge is maintained at 99.5% flow duration. Eighteen of the 26 aquifers were pumped for public water supply during 1983. Further analysis of the yield characteristics of these 18 aquifers indicates that the 1983 pumping rate of each of these 18 aquifers can be sustained at least 70% of the time. Selected physical properties and concentrations of major chemical constituents in groundwater from the stratified-drift aquifers at 80 sampling sites were used to characterize general water quality in aquifers throughout the basin. The pH of the groundwater ranged from 5.4 to 7.0. Natural elevated concentrations of Fe and Mn in water in the stratified-drift aquifers are present locally in the basin. Natural concentrations of these two metals commonly exceed the limits of 0.3 mg/L for Fe and 0.05 mg/L for Mn recommended for drinking water. Fifty-one analyses of selected trace metals in groundwater samples from stratified-drift aquifers throughout the basin were used to characterize trace metal concentrations in the groundwater. Of the 10 constituents sampled that have US EPA limits recommended for drinking water, only the Pb concentration in water at one site (60 micrograms/L) exceeded the recommended limit of 50 micrograms/L. Analyses of selected organic compounds in water in the stratified-drift aquifers at 74

  7. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  8. Audio feature extraction using probability distribution function

    Science.gov (United States)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field. It is also known to have recently been used for biometric and multimedia information retrieval systems. This technology stems from successive research on audio feature extraction analysis. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processing steps in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses the PDF alone as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction. Subsequently, the PDF values for each frame of the sampled voice signals obtained from a number of individuals are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
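
    A minimal reading of the proposed feature: estimate an amplitude PDF per frame (here a normalized histogram) and use the PDF values directly as the feature vector. The framing parameters and the synthetic signal are arbitrary, not the paper's setup.

    ```python
    # Per-frame amplitude PDFs as audio features.
    import numpy as np

    def pdf_features(signal, frame_len=400, n_bins=20):
        """Per-frame amplitude PDFs: shape (n_frames, n_bins), rows sum to 1."""
        n_frames = len(signal) // frame_len
        frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
        feats = np.empty((n_frames, n_bins))
        for i, frame in enumerate(frames):
            hist, _ = np.histogram(frame, bins=n_bins, range=(-1.0, 1.0))
            feats[i] = hist / frame_len              # empirical PDF values
        return feats

    rng = np.random.default_rng(7)
    fake_voice = np.clip(0.3 * rng.standard_normal(16_000), -1, 1)  # 1 s at 16 kHz
    print(pdf_features(fake_voice).shape)            # (40, 20)
    ```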

  9. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
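
    The reconstructed Prelec function evaluated numerically, showing the fixed point at 1/e and the characteristic overweighting of small and underweighting of large probabilities for 0 < α < 1.

    ```python
    # Prelec (1998) probability weighting function w(p) = exp(-(-ln p)^alpha).
    import numpy as np

    def prelec_w(p, alpha=0.65):
        p = np.asarray(p, dtype=float)
        return np.exp(-(-np.log(p)) ** alpha)

    for p in (0.01, 1 / np.e, 0.5, 0.99):
        print(f"p = {p:.3f} -> w(p) = {prelec_w(p):.3f}")
    # w(0.01) > 0.01 (overweighting), w(0.99) < 0.99 (underweighting),
    # and w(1/e) = 1/e for every alpha.
    ```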

  10. Indications for tonsillectomy stratified by the level of evidence

    Science.gov (United States)

    Windfuhr, Jochen P.

    2016-01-01

    Background: One of the most significant clinical trials, demonstrating the efficacy of tonsillectomy (TE) for recurrent throat infection in severely affected children, was published in 1984. This systematic review was undertaken to compile various indications for TE as suggested in the literature after 1984 and to stratify the papers according to the current concept of evidence-based medicine. Material and methods: A systematic Medline search was performed using the key word of “tonsillectomy“ in combination with different filters such as “systematic reviews“, “meta-analysis“, “English“, “German“, and “from 1984/01/01 to 2015/05/31“. Further searches were performed in the Cochrane Database of Systematic Reviews, National Guideline Clearinghouse, Guidelines International Network and BMJ Clinical Evidence using the same key word. Finally, the “Trip Database” was searched for “tonsillectomy” and “indication“ and “from: 1984 to: 2015“ in combination with either “systematic review“ or “meta-analysis“ or “metaanalysis”. Results: A total of 237 papers were retrieved but only 57 matched our inclusion criteria covering the following topics: peritonsillar abscess (3), guidelines (5), otitis media with effusion (5), psoriasis (3), PFAPA syndrome (6), evidence-based indications (5), renal diseases (7), sleep-related breathing disorders (11), and tonsillitis/pharyngitis (12), respectively. Conclusions: 1) The literature suggests that TE is not indicated to treat otitis media with effusion. 2) It has been shown that the PFAPA syndrome is self-limiting and responds well to steroid administration, at least in a considerable proportion of children. The indication for TE therefore appears to be imbalanced but further research is required to clarify the value of surgery. 3) Abscess tonsillectomy as a routine is not justified and is indicated only for cases not responding to other measures of treatment, evident complications

  11. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
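
    To make the probabilistic framing concrete, here is a toy sketch of drawing NS versus BH outcomes from an assumed P_BH(M_ZAMS); the logistic form and its parameters are purely hypothetical and are not taken from the paper.

```python
import numpy as np

def p_bh(m_zams, m_half=25.0, width=4.0):
    """Toy, purely illustrative P_BH(M_ZAMS): a logistic ramp in ZAMS mass.

    m_half and width are hypothetical parameters, not fitted values.
    """
    return 1.0 / (1.0 + np.exp(-(m_zams - m_half) / width))

rng = np.random.default_rng(42)
masses = rng.uniform(8.0, 100.0, size=10)        # ZAMS masses in solar masses
outcomes = np.where(rng.random(10) < p_bh(masses), "BH", "NS")
print(list(zip(np.round(masses, 1), outcomes)))
```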

  12. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  13. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  14. Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.

    Science.gov (United States)

    Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark

    2016-03-01

    Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
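
    The expected-accuracy gap between the two strategies is easy to compute. For a binary outcome whose more likely event occurs with probability p, matching yields p^2 + (1-p)^2, while maximizing yields max(p, 1-p):

```python
# Expected single-trial prediction accuracy for the two strategies:
# matching predicts each outcome in proportion to its probability,
# maximizing always predicts the more likely outcome.
def matching_accuracy(p):
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    return max(p, 1 - p)

for p in (0.6, 0.7, 0.8, 0.9):
    print(p, round(matching_accuracy(p), 3), maximizing_accuracy(p))
# e.g. at p = 0.7: matching yields 0.58 expected accuracy vs 0.7.
```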

  15. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  16. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  17. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
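
    A minimal sketch (with made-up numbers) of how the absolute target-color probability can remain at 0.5 while the conditional probability given a cue combination departs from it:

```python
import numpy as np

# Hypothetical joint distribution P(cue_pair, target_color); rows are
# two equally likely cue pairs, columns are the two target colors.
joint = np.array([[0.45, 0.05],    # cue pair A: P(color1 | A) = 0.9
                  [0.05, 0.45]])   # cue pair B: P(color1 | B) = 0.1

marginal_color = joint.sum(axis=0)                     # absolute: [0.5, 0.5]
conditional = joint / joint.sum(axis=1, keepdims=True) # rows sum to 1
print(marginal_color)
print(conditional)
```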

  18. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
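
    For background, the sketch below evaluates the exact two-flavor oscillation probability in constant-density matter; the summarized paper concerns accurate *three*-flavor approximations, and all parameter defaults here are rough, assumed values.

```python
import numpy as np

def p_two_flavor_matter(L_km, E_GeV, dm2=2.5e-3, sin2_2th=0.085,
                        rho_gcc=2.8, Ye=0.5):
    """Exact two-flavor oscillation probability in constant-density matter.

    dm2 in eV^2, L in km, E in GeV; defaults (crustal density
    2.8 g/cm^3, Ye = 0.5) are assumed, round numbers.
    """
    # Matter potential V = sqrt(2)*G_F*N_e ~ 7.63e-14 eV per (g/cm^3 of Ye*rho)
    V_eV = 7.63e-14 * rho_gcc * Ye
    A = 2.0 * (E_GeV * 1e9) * V_eV / dm2           # dimensionless matter term
    cos2th = np.sqrt(1.0 - sin2_2th)
    denom = sin2_2th + (cos2th - A) ** 2
    sin2_2th_m = sin2_2th / denom                  # effective mixing in matter
    dm2_m = dm2 * np.sqrt(denom)                   # effective mass splitting
    return sin2_2th_m * np.sin(1.267 * dm2_m * L_km / E_GeV) ** 2

print(p_two_flavor_matter(L_km=1300.0, E_GeV=2.0))
```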

  19. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.
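
    As a minimal illustration of void-probability scaling, compare the rapidity-gap (void) probability P_0 for an uncorrelated Poisson multiplicity with a clustered negative-binomial one; the negative-binomial k below is an assumed, illustrative value.

```python
import numpy as np

def p0_poisson(nbar):
    """Void probability for a Poisson multiplicity distribution."""
    return np.exp(-nbar)

def p0_negative_binomial(nbar, k=2.0):
    """Void probability for a negative-binomial multiplicity distribution."""
    return (1.0 + nbar / k) ** (-k)

# The scaling variable -ln(P0)/<N> equals 1 for Poisson and falls
# below 1 when particles cluster.
for nbar in (0.5, 1.0, 2.0, 4.0):
    print(nbar,
          round(-np.log(p0_poisson(nbar)) / nbar, 3),
          round(-np.log(p0_negative_binomial(nbar)) / nbar, 3))
```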

  20. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  1. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  2. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  3. Rationale and Design of Khuzestan Vitamin D Deficiency Screening Program in Pregnancy: A Stratified Randomized Vitamin D Supplementation Controlled Trial.

    Science.gov (United States)

    Rostami, Maryam; Ramezani Tehrani, Fahimeh; Simbar, Masoumeh; Hosseinpanah, Farhad; Alavi Majd, Hamid

    2017-04-07

    Although there have been marked improvements in our understanding of vitamin D functions in different diseases, gaps regarding its role during pregnancy remain. Due to the lack of consensus on the most accurate marker of vitamin D deficiency during pregnancy and on the optimal level of 25-hydroxyvitamin D, 25(OH)D, for its definition, vitamin D deficiency assessment during pregnancy is a complicated process. Besides, the optimal protocol for treatment of hypovitaminosis D and its effect on maternal and neonatal outcomes are still unclear. The aim of our study was to estimate the prevalence of vitamin D deficiency in the first trimester of pregnancy and to compare a vitamin D screening strategy with no screening. Also, we intended to compare the effectiveness of various treatment regimens on maternal and neonatal outcomes in the Masjed-Soleyman and Shushtar cities of Khuzestan province, Iran. This was a two-phase study. First, a population-based cross-sectional study was conducted, recruiting 1600 and 900 first-trimester pregnant women from health centers of Masjed-Soleyman and Shushtar, respectively, using stratified multistage cluster sampling with the probability proportional to size (PPS) method. Second, to assess the effect of the screening strategy on maternal and neonatal outcomes, Masjed-Soleyman participants were assigned to a screening program versus Shushtar participants, who became the non-screening arm. Within the framework of the screening regimen, an 8-arm blind randomized clinical trial was undertaken to compare the effects of various treatment protocols. A total of 800 pregnant women with vitamin D deficiency were selected using simple random sampling from the 1600 individuals of Masjed-Soleyman as the interventional groups. Serum concentrations of 25(OH)D were classified as severely deficient, moderately deficient, or sufficient (>20 ng/ml). Those with severe and moderate deficiency were randomly divided into 4 subgroups, received vitamin D3 based on protocol, and were followed until delivery. Data was analyzed
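
    A minimal sketch of systematic PPS cluster selection of the kind named here; the cluster sizes are made-up illustrative values, not the study's sampling frame.

```python
import numpy as np

# Systematic sampling with probability proportional to size (PPS):
# pick a random start in [0, step) and walk the cumulative size line.
rng = np.random.default_rng(1)
sizes = rng.integers(200, 2000, size=30)      # hypothetical centre sizes
n_clusters = 8

cum = np.cumsum(sizes)
step = cum[-1] / n_clusters
start = rng.uniform(0, step)
points = start + step * np.arange(n_clusters)
selected = np.searchsorted(cum, points)       # indices of chosen centres
print(sorted(selected))
```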

  4. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies within the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
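
    The NUREG/CR-1278 dependence levels map to conditional probabilities via the standard THERP equations, sketched below for a nominal HEP p:

```python
# Conditional HEP given failure of the preceding action, using the
# THERP dependence equations from NUREG/CR-1278.
def conditional_hep(p, level):
    formulas = {
        "ZD": lambda p: p,                  # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "HD": lambda p: (1 + p) / 2,        # high dependence
        "CD": lambda p: 1.0,                # complete dependence
    }
    return formulas[level](p)

# Example: a nominal HEP of 1E-3 under each dependence level.
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, round(conditional_hep(1e-3, level), 4))
```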

  5. Sampling and estimating recreational use.

    Science.gov (United States)

    Timothy G. Gregoire; Gregory J. Buhyoff

    1999-01-01

    Probability sampling methods applicable to estimate recreational use are presented. Both single- and multiple-access recreation sites are considered. One- and two-stage sampling methods are presented. Estimation of recreational use is presented in a series of examples.

  6. Methylmercury speciation in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique

    Energy Technology Data Exchange (ETDEWEB)

    Clarisse, Olivier [Trent University, Department of Chemistry, 1600 West Bank Drive, Peterborough, Ontario K9J 7B8 (Canada)], E-mail: olivier.clarisse@umoncton.ca; Foucher, Delphine; Hintelmann, Holger [Trent University, Department of Chemistry, 1600 West Bank Drive, Peterborough, Ontario K9J 7B8 (Canada)

    2009-03-15

    The diffusive gradient in thin film (DGT) technique was successfully used to monitor methylmercury (MeHg) speciation in the dissolved phase of a stratified boreal lake, Lake 658 of the Experimental Lakes Area (ELA) in Ontario, Canada. Water samples were conventionally analysed for MeHg, sulfides, and dissolved organic matter (DOM). MeHg accumulated by DGT devices was compared to MeHg concentration measured conventionally in water samples to establish MeHg speciation. In the epilimnion, MeHg was almost entirely bound to DOM. In the top of the hypolimnion an additional labile fraction was identified, and at the bottom of the lake a significant fraction of MeHg was potentially associated to colloidal material. As part of the METAALICUS project, isotope enriched inorganic mercury was applied to Lake 658 and its watershed for several years to establish the relationship between atmospheric Hg deposition and Hg in fish. Little or no difference in MeHg speciation in the dissolved phase was detected between ambient and spike MeHg. - Methylmercury speciation was determined in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique.

  7. Methylmercury speciation in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique

    International Nuclear Information System (INIS)

    Clarisse, Olivier; Foucher, Delphine; Hintelmann, Holger

    2009-01-01

    The diffusive gradient in thin film (DGT) technique was successfully used to monitor methylmercury (MeHg) speciation in the dissolved phase of a stratified boreal lake, Lake 658 of the Experimental Lakes Area (ELA) in Ontario, Canada. Water samples were conventionally analysed for MeHg, sulfides, and dissolved organic matter (DOM). MeHg accumulated by DGT devices was compared to MeHg concentration measured conventionally in water samples to establish MeHg speciation. In the epilimnion, MeHg was almost entirely bound to DOM. In the top of the hypolimnion an additional labile fraction was identified, and at the bottom of the lake a significant fraction of MeHg was potentially associated to colloidal material. As part of the METAALICUS project, isotope enriched inorganic mercury was applied to Lake 658 and its watershed for several years to establish the relationship between atmospheric Hg deposition and Hg in fish. Little or no difference in MeHg speciation in the dissolved phase was detected between ambient and spike MeHg. - Methylmercury speciation was determined in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique

  8. Health service accreditation as a predictor of clinical and organisational performance: a blinded, random, stratified study.

    Science.gov (United States)

    Braithwaite, Jeffrey; Greenfield, David; Westbrook, Johanna; Pawsey, Marjorie; Westbrook, Mary; Gibberd, Robert; Naylor, Justine; Nathan, Sally; Robinson, Maureen; Runciman, Bill; Jackson, Margaret; Travaglia, Joanne; Johnston, Brian; Yen, Desmond; McDonald, Heather; Low, Lena; Redman, Sally; Johnson, Betty; Corbett, Angus; Hennessy, Darlene; Clark, John; Lancaster, Judie

    2010-02-01

    Despite the widespread use of accreditation in many countries, and prevailing beliefs that accreditation is associated with variables contributing to clinical care and organisational outcomes, little systematic research has been conducted to examine its validity as a predictor of healthcare performance. To determine whether accreditation performance is associated with self-reported clinical performance and independent ratings of four aspects of organisational performance. Independent blinded assessment of these variables in a random, stratified sample of health service organisations. Acute care: large, medium and small health-service organisations in Australia. Study participants: nineteen health service organisations employing 16 448 staff treating 321 289 inpatients and 1 971 087 non-inpatient services annually, representing approximately 5% of the Australian acute care health system. Correlations of accreditation performance with organisational culture, organisational climate, consumer involvement, leadership and clinical performance. Results: Accreditation performance was significantly positively correlated with organisational culture (rho=0.618, p=0.005) and leadership (rho=0.616, p=0.005). There was a trend between accreditation and clinical performance (rho=0.450, p=0.080). Accreditation was unrelated to organisational climate (rho=0.378, p=0.110) and consumer involvement (rho=0.215, p=0.377). Accreditation results predict leadership behaviours and cultural characteristics of healthcare organisations but not organisational climate or consumer participation, and a positive trend between accreditation and clinical performance is noted.

  9. Bioenergetic evaluation of diel vertical migration by bull trout (Salvelinus confluentus) in a thermally stratified reservoir

    Science.gov (United States)

    Eckmann, Madeleine; Dunham, Jason B.; Connor, Edward J.; Welch, Carmen A.

    2018-01-01

    Many species living in deeper lentic ecosystems exhibit daily movements that cycle through the water column, generally referred to as diel vertical migration (DVM). In this study, we applied bioenergetics modelling to evaluate growth as a hypothesis to explain DVM by bull trout (Salvelinus confluentus) in a thermally stratified reservoir (Ross Lake, WA, USA) during the peak of thermal stratification in July and August. Bioenergetics model parameters were derived from observed vertical distributions of temperature, prey and bull trout. Field sampling confirmed that bull trout prey almost exclusively on recently introduced redside shiner (Richardsonius balteatus). Model predictions revealed that deeper (>25 m) DVMs commonly exhibited by bull trout during peak thermal stratification cannot be explained by maximising growth. Survival, another common explanation for DVM, may have influenced bull trout depth use, but observations suggest there may be additional drivers of DVM. We propose these deeper summertime excursions may be partly explained by an alternative hypothesis: the importance of colder water for gametogenesis. In Ross Lake, reliance of bull trout on warm water prey (redside shiner) for consumption and growth poses a potential trade-off with the need for colder water for gametogenesis.

  10. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  11. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  12. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
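
    For context, the sketch below shows the classical Poisson out-crossing approximation that such integral-equation refinements improve upon, for a stationary zero-mean Gaussian process with barrier b; all parameter values are assumed examples.

```python
import numpy as np

def first_passage_poisson(T, b, sigma_x=1.0, sigma_xdot=2.0):
    """Poisson (independent out-crossing) first-passage approximation.

    Uses Rice's formula for the mean up-crossing rate of level b by a
    stationary zero-mean Gaussian process with standard deviations
    sigma_x (process) and sigma_xdot (derivative).
    """
    nu_plus = (sigma_xdot / (2 * np.pi * sigma_x)) * np.exp(-b**2 / (2 * sigma_x**2))
    return 1.0 - np.exp(-nu_plus * T)

print(first_passage_poisson(T=60.0, b=3.0))
```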

  13. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  14. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
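
    For contrast with the paraconsistent version, classical Bayes conditionalization is the consistent special case; the numbers below are made up for illustration.

```python
# Classical Bayes conditionalization: posterior(H) proportional to
# prior(H) * P(evidence | H), normalized over the hypotheses.
prior = {"H": 0.01, "not_H": 0.99}
likelihood = {"H": 0.95, "not_H": 0.05}   # P(evidence | hypothesis)

evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
print(posterior)   # P(H | evidence) is about 0.161
```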

  15. Probability of Alzheimer's disease in breast cancer survivors based on gray-matter structural network efficiency.

    Science.gov (United States)

    Kesler, Shelli R; Rao, Vikram; Ray, William J; Rao, Arvind

    2017-01-01

    Breast cancer chemotherapy is associated with accelerated aging and potentially increased risk for Alzheimer's disease (AD). We calculated the probability of AD diagnosis from brain network, demographic, and genetic data obtained from 47 female AD converters and 47 matched healthy controls. We then applied this algorithm to data from 78 breast cancer survivors. The classifier discriminated between AD and healthy controls with 86% accuracy (P < .0001). Chemotherapy-treated breast cancer survivors demonstrated significantly higher probability of AD compared to healthy controls (P < .0001) and chemotherapy-naïve survivors (P = .007), even after stratifying for apolipoprotein E4 genotype. Chemotherapy-naïve survivors also showed higher AD probability compared to healthy controls (P = .014). Chemotherapy-treated breast cancer survivors who have a particular profile of brain structure may have a higher risk for AD, especially those who are older and have lower cognitive reserve.

  16. Monoplane 3D-2D registration of cerebral angiograms based on multi-objective stratified optimization

    Science.gov (United States)

    Aksoy, T.; Špiclin, Ž.; Pernuš, F.; Unal, G.

    2017-12-01

    Registration of 3D pre-interventional to 2D intra-interventional medical images has an increasingly important role in surgical planning, navigation and treatment, because it enables the physician to co-locate depth information given by pre-interventional 3D images with the live information in intra-interventional 2D images such as x-ray. Most tasks during image-guided interventions are carried out under a monoplane x-ray, which is a highly ill-posed problem for state-of-the-art 3D to 2D registration methods. To address the problem of rigid 3D-2D monoplane registration we propose a novel multi-objective stratified parameter optimization, wherein a small set of high-magnitude intensity gradients are matched between the 3D and 2D images. The stratified parameter optimization matches rotation templates to depth templates, the first sampled from projected 3D gradients and the second from the 2D image gradients, so as to recover 3D rigid-body rotations and out-of-plane translation. The objective for matching was the gradient magnitude correlation coefficient, which is invariant to in-plane translation. The in-plane translations are then found by locating the maximum of the gradient phase correlation between the best matching pair of rotation and depth templates. On twenty pairs of 3D and 2D images of ten patients undergoing cerebral endovascular image-guided intervention, the 3D to monoplane 2D registration experiments were set up with a rather high range of initial mean target registration error from 0 to 100 mm. The proposed method effectively reduced the registration error to below 2 mm, which was further refined by a fast iterative method and resulted in a high final registration accuracy (0.40 mm) and high success rate (> 96%). Taking into account a fast execution time below 10 s, the observed performance of the proposed method shows a high potential for application into clinical image-guidance systems.
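
    A minimal sketch of FFT-based phase correlation, the generic technique used for the final in-plane translation step; this uses synthetic data and is not the authors' implementation.

```python
import numpy as np

def phase_correlation(a, b):
    """Recover the (row, col) translation between two same-size images.

    Normalizes the cross-power spectrum to keep only phase, so the
    inverse FFT peaks at the relative shift.
    """
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Interpret peaks past the midpoint as negative shifts.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(3)
img = rng.random((128, 128))
shifted = np.roll(img, shift=(5, -7), axis=(0, 1))
print(phase_correlation(shifted, img))      # expect (5, -7)
```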

  17. Hydrogeology and water quality of the Nanticoke Creek stratified-drift aquifer, near Endicott, New York

    Science.gov (United States)

    Kreitinger, Elizabeth A.; Kappel, William M.

    2014-01-01

    The Village of Endicott, New York, is seeking an alternate source of public drinking water with the potential to supplement their current supply, which requires treatment due to legacy contamination. The southerly-draining Nanticoke Creek valley, located north of the village, was identified as a potential water source and the local stratified-drift (valley fill) aquifer was investigated to determine its hydrogeologic and water-quality characteristics. Nanticoke Creek and its aquifer extend from the hamlet of Glen Aubrey, N.Y., to the village of Endicott, a distance of about 15 miles, where it joins the Susquehanna River and its aquifer. The glacial sediments that comprise the stratified-drift aquifer vary in thickness and are generally underlain by glacial till over Devonian-aged shale and siltstone. Groundwater is more plentiful in the northern part of the aquifer where sand and gravel deposits are generally more permeable than in the southern part of the aquifer where less-permeable unconsolidated deposits are found. Generally there is enough groundwater to supply most homeowner wells and in some cases, supply small public-water systems such as schools, mobile-home parks, and small commercial/industrial facilities. The aquifer is recharged by precipitation, runoff, and tributary streams. Most tributary streams flowing across alluvial deposits lose water to the aquifer as they flow off of their bedrock-lined channels and into the more permeable alluvial deposits at the edges of the valley. The quality of both surface water and groundwater is generally good. Some water wells do have water-quality issues related to natural constituents (manganese and iron) and several homeowners noted either the smell and (or) taste of hydrogen sulfide in their drinking water. Dissolved methane concentrations from five drinking-water wells were well below the potentially explosive value of 28 milligrams per liter. Samples from surface and groundwater met nearly all State and Federal

  18. New numerical approaches for modeling thermochemical convection in a compositionally stratified fluid

    Science.gov (United States)

    Puckett, Elbridge Gerry; Turcotte, Donald L.; He, Ying; Lokavarapu, Harsha; Robey, Jonathan M.; Kellogg, Louise H.

    2018-03-01

    Geochemical observations of mantle-derived rocks favor a nearly homogeneous upper mantle, the source of mid-ocean ridge basalts (MORB), and heterogeneous lower mantle regions. Plumes that generate ocean island basalts are thought to sample the lower mantle regions and exhibit more heterogeneity than MORB. These regions have been associated with lower mantle structures known as large low shear velocity provinces (LLSVPs) below Africa and the South Pacific. The isolation of these regions is attributed to compositional differences and density stratification that, consequently, have been the subject of computational and laboratory modeling designed to determine the parameter regime in which layering is stable and to understand how layering evolves. Mathematical models of persistent compositional interfaces in the Earth's mantle may be inherently unstable, at least in some regions of the parameter space relevant to the mantle. Computing approximations to solutions of such problems presents severe challenges, even to state-of-the-art numerical methods. Some numerical algorithms for modeling the interface between distinct compositions smear the interface at the boundary between compositions, such as methods that add numerical diffusion or 'artificial viscosity' in order to stabilize the algorithm. We present two new algorithms for maintaining high-resolution and sharp computational boundaries in computations of these types of problems: a discontinuous Galerkin method with a bound-preserving limiter and a Volume-of-Fluid interface tracking algorithm. We compare these new methods with two approaches widely used for modeling the advection of two distinct thermally driven compositional fields in mantle convection computations: a high-order accurate finite element advection algorithm with entropy viscosity and a particle method that carries a scalar quantity representing the location of each compositional field. All four algorithms are implemented in the open source finite

  19. Experimental investigation of droplet separation in a horizontal counter-current air/water stratified flow

    International Nuclear Information System (INIS)

    Gabriel, Stephan Gerhard

    2015-01-01

    Measurements in the gaseous phase were carried out using conventional oil droplets as tracer particles. The volumetric phase distribution was investigated with the new OVM method, which was developed within this work. This method was validated by simultaneous measurements with the new method and an electrical conductivity probe in the WENKA channel. Finally, the droplet mass flux was measured with an isokinetic sampling probe, which was also developed within this work. The functional capability of the probe and the accuracy of isokinetic conditions were demonstrated by PIV measurements under various flow conditions. The investigations include both supercritical and subcritical stratified flows, with partially and fully reversed conditions. The behavior of both fluids was analyzed at four measurement sites and under 31 different volumetric flux conditions. The results include sequences of images and numerical data, providing an accurate impression of the flow behavior in the channel. This dataset can now be used for the development and validation of new turbulence and phase-interaction models for stratified counter-current two-phase flows.

  20. Economic viability of Stratified Medicine concepts : An investor perspective on drivers and conditions that favour using Stratified Medicine approaches in a cost-contained healthcare environment

    NARCIS (Netherlands)

    Fugel, Hans-Joerg; Nuijten, Mark; Postma, Maarten

    2016-01-01

    RATIONALE: Stratified Medicine (SM) is becoming a natural result of advances in biomedical science and a promising path for the innovation-based biopharmaceutical industry to create new investment opportunities. While the use of biomarkers to improve R&D efficiency and productivity is very much