Øren Anita
2008-12-01
Abstract Background Prior studies on the impact of problem gambling in the family have mainly included help-seeking populations with small numbers of participants. The objective of the present stratified probability sample study was to explore the epidemiology of problem gambling in the family in the general population. Methods Men and women 16–74 years old randomly selected from the Norwegian national population database received an invitation to participate in this postal questionnaire study. The response rate was 36.1% (3,483/9,638). Given the lack of validated criteria, two survey questions ("Have you ever noticed that a close relative spent more and more money on gambling?" and "Have you ever experienced that a close relative lied to you about how much he/she gambles?") were extrapolated from the Lie/Bet Screen for pathological gambling. Respondents answering "yes" to both questions were defined as Concerned Significant Others (CSOs). Results Overall, 2.0% of the study population was defined as CSOs. Young age, female gender, and divorced marital status were factors positively associated with being a CSO. CSOs often reported having experienced conflicts in the family related to gambling, worsening of the family's financial situation, and impaired mental and physical health. Conclusion Problematic gambling behaviour not only affects the gambling individual but also has a strong impact on the quality of life of family members.
PROBABILITY SAMPLING DESIGNS FOR VETERINARY EPIDEMIOLOGY
Xhelil Koleci; Coryn, Chris L.S.; Kristin A. Hobson; Rruzhdi Keci
2011-01-01
The objective of sampling is to estimate population parameters, such as incidence or prevalence, from information contained in a sample. In this paper, the authors describe sources of error in sampling; basic probability sampling designs, including simple random sampling, stratified sampling, systematic sampling, and cluster sampling; estimating a population size if unknown; and factors influencing sample size determination for epidemiological studies in veterinary medicine.
Bayesian Stratified Sampling to Assess Corpus Utility
Hochberg, J; Thomas, T; Hall, S; Hochberg, Judith; Scovel, Clint; Thomas, Timothy; Hall, Sam
1998-01-01
This paper describes a method for asking statistical questions about a large text corpus. We exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" We estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Stratified sampling is used to reduce the sampling uncertainty of the estimate from over 3100 documents to fewer than 1000. The stratification is based on observed characteristics of real documents, while the sampling procedure incorporates a Bayesian version of Neyman allocation. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
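The classical Neyman allocation underlying the paper's Bayesian variant puts more of the sample into large, high-variance strata: n_h proportional to N_h * S_h. The sketch below uses invented stratum sizes and within-stratum standard deviations (the paper's actual strata and counts are not reproduced here).

```python
def neyman_allocation(strata, n_total):
    """Allocate n_total across strata proportional to N_h * S_h."""
    weights = {h: N * S for h, (N, S) in strata.items()}
    total = sum(weights.values())
    return {h: round(n_total * w / total) for h, w in weights.items()}

# Hypothetical strata: (stratum size N_h, estimated std. dev. S_h of
# the "is this a real document" indicator within the stratum).
strata = {"likely_real": (30000, 0.2),
          "uncertain":   (10000, 0.5),
          "likely_junk": (5820, 0.1)}
alloc = neyman_allocation(strata, 200)
print(alloc)  # the "uncertain" stratum gets far more than its share
```

Under proportional allocation the "uncertain" stratum would get about 44 of the 200 documents; Neyman allocation roughly doubles that, which is what drives the reduction in sampling uncertainty the abstract reports.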
Stratified sampling design based on data mining.
Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung
2013-09-01
The objective was to explore classification rules, derived with data mining methodologies, for defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics and then constructed decision trees on the cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, together with population data from Statistics Korea. From this database, we used the data for single-specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and the population density of the provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and the number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of variance, respectively. This study demonstrated that data mining methods can be used to design efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
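The paper's pipeline (multivariate k-means, then decision trees on the cluster labels) can be illustrated in one dimension with a stdlib-only sketch: midpoints between adjacent cluster centers play the role of the tree's split points. The "inpatients per specialist" values below are invented, and this is a simplification of the multivariate method the paper actually uses.

```python
def kmeans_1d(values, k, iters=100):
    """Minimal Lloyd's algorithm on one stratification variable."""
    vals = sorted(values)
    if k == 1:
        return [sum(vals) / len(vals)]
    # spread the initial centers across the sorted values
    centers = [vals[i * (len(vals) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vals:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            groups[nearest].append(v)
        new = [sum(g) / len(g) if g else centers[i]
               for i, g in enumerate(groups)]
        if new == centers:
            break
        centers = new
    return centers

def split_points(centers):
    """Decision-tree-style stratum boundaries between adjacent centers."""
    c = sorted(centers)
    return [(a + b) / 2 for a, b in zip(c, c[1:])]

# hypothetical "inpatients per specialist" values for ten providers
inpatients = [1, 2, 2, 3, 10, 11, 12, 30, 31, 33]
centers = kmeans_1d(inpatients, 3)
print(split_points(centers))  # two boundaries defining three strata
```

The resulting boundaries are exactly the kind of simple, interpretable rule ("fewer than ~7 inpatients per specialist", etc.) that makes a data-mined stratification usable by an insurer or survey agency.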
Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.
Mohamed, A.
1998-07-10
In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
Deal, J. H.
1975-01-01
One approach to the problem of simplifying complex nonlinear filtering algorithms is the use of stratified probability approximations, in which the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms developed for the optimum receiver for signals corrupted by both additive and multiplicative noise.
Global and Partial Errors in Stratified and Clustering Sampling
Giovanna Nicolini; Anna Lo Presti
2005-01-01
In this paper we split up the sampling error incurred in stratified and cluster sampling, called the global error and measured by the variance of the estimator, into partial errors, each referring to a single stratum or cluster. In particular, for cluster sampling, we study the empirical distribution of the homogeneity coefficient, which is very important for the determination of the partial errors.
On the Impact of Bootstrap in Stratified Random Sampling
LIU Cheng; ZHAO Lian-wen
2009-01-01
In general, the accuracy of the mean estimator can be improved by stratified random sampling. In this paper, we provide an idea, different from empirical methods, whereby the accuracy can be further improved through the bootstrap resampling method under some conditions. The determination of sample size by the bootstrap method is also discussed, and a simulation is made to verify the accuracy of the proposed method. The simulation results show that the sample size based on bootstrapping is smaller than that based on the central limit theorem.
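The core idea can be sketched as follows: resample independently within each stratum, recompute the stratified mean, and read the standard error off the bootstrap distribution. The strata, weights, and distributions below are invented, and this shows only the resampling step, not the paper's sample-size determination.

```python
import random
import statistics

random.seed(7)

def stratified_mean(samples, weights):
    """Weighted mean across strata; weights are population shares W_h."""
    return sum(w * statistics.mean(s) for s, w in zip(samples, weights))

def bootstrap_se(samples, weights, reps=2000):
    """Resample within each stratum independently, then recompute
    the stratified mean; the spread of the replicates estimates the
    standard error."""
    means = []
    for _ in range(reps):
        resampled = [[random.choice(s) for _ in s] for s in samples]
        means.append(stratified_mean(resampled, weights))
    return statistics.stdev(means)

# Two invented strata with different means and variances.
strata = [[random.gauss(10, 1) for _ in range(30)],
          [random.gauss(20, 3) for _ in range(20)]]
weights = [0.6, 0.4]
print(round(bootstrap_se(strata, weights), 3))
```

For this setup the textbook standard error is about sqrt(0.6**2 * 1/30 + 0.4**2 * 9/20) ≈ 0.29, and the bootstrap replicate spread lands close to it; one could then shrink the per-stratum sizes until this bootstrap SE just meets a precision target, which is the spirit of the paper's sample-size procedure.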
Probability sampling design in ethnobotanical surveys of medicinal plants
Mariano Martinez Espinosa
2012-12-01
Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient of 95%; sampling error (d) of 0.05; and a proportion (p) of 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
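The reported n of at least 290 follows from Cochran's sample-size formula with the finite-population correction, using the abstract's own inputs (N = 1179, p = 0.5, d = 0.05, and z = 1.96 for the 95% confidence coefficient):

```python
import math

def sample_size(N, p=0.5, d=0.05, z=1.96):
    """Cochran's formula with the finite-population correction."""
    n0 = z ** 2 * p * (1 - p) / d ** 2   # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(sample_size(1179))  # -> 290, matching the NSACD case study
```

Without the correction, n0 is about 384; dividing by 1 + (n0 - 1)/N shrinks it to 290 because the district's 1179 families are a relatively small population.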
Sequential stratified sampling belief propagation for multiple targets tracking
(no author listed)
2006-01-01
Beyond the difficulties posed by the highly non-linear, non-Gaussian observation process and state distribution in single-target tracking, the presence of a large, varying number of targets and their interactions places still greater challenges on visual tracking. To overcome these difficulties, we formulate the multiple-target tracking problem in a dynamic Markov network consisting of three coupled Markov random fields: a field for the joint state of the multi-target, one binary process for the existence of each individual target, and another binary process for the occlusion of each pair of adjacent targets. By introducing two robust functions, we eliminate the two binary processes, and then apply a novel version of belief propagation, called the sequential stratified sampling belief propagation algorithm, to obtain the maximum a posteriori (MAP) estimate in the dynamic Markov network. By using a stratified sampler, we incorporate bottom-up information provided by a learned detector (e.g. an SVM classifier) and belief information for the message updating. Other low-level visual cues (e.g. color and shape) can easily be incorporated in our multi-target tracking model to obtain better tracking results. Experimental results suggest that our method is comparable to the state-of-the-art multiple-target tracking methods in several test cases.
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
Approximation of Failure Probability Using Conditional Sampling
Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.
2008-01-01
In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
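A toy version of the idea, with an invented failure set and a bounding box whose probability is known analytically: crude Monte Carlo wastes nearly all of its samples outside the small failure region, while conditional sampling inside the bounding set spends every sample near the failure boundary. The disk-shaped failure set and box bound below are illustrative, not the geometry used in the paper.

```python
import math
import random

random.seed(1)

# Two uncertain parameters, uniform on the unit square. "Failure" is a
# disk of radius 0.05 around (0.5, 0.5); true probability = pi * 0.05**2.
R = 0.05

def fails(x, y):
    return (x - 0.5) ** 2 + (y - 0.5) ** 2 <= R ** 2

def crude_mc(n):
    """Sample the whole parameter domain; most points miss the disk."""
    hits = sum(fails(random.random(), random.random()) for _ in range(n))
    return hits / n

def conditional_mc(n):
    """Sample only inside the bounding square [0.45, 0.55]^2, whose
    probability under the uniform distribution is exactly (2R)^2, and
    rescale the conditional hit rate by that known probability."""
    p_box = (2 * R) ** 2
    hits = sum(fails(random.uniform(0.5 - R, 0.5 + R),
                     random.uniform(0.5 - R, 0.5 + R))
               for _ in range(n))
    return p_box * hits / n

true_p = math.pi * R ** 2
print(round(conditional_mc(20_000), 5), "vs true", round(true_p, 5))
```

With 20,000 points the conditional estimator is accurate to a few parts in a thousand of the true value, whereas the crude estimator sees only about 160 failures in the same budget; this is the variance reduction the abstract attributes to sampling within analytically bounded failure sets.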
Probabilities of exoplanet signals from posterior samplings
Tuomi, Mikko
2011-01-01
Estimating marginal likelihoods is an essential feature of model selection in the Bayesian context. It is especially crucial to have good estimates when assessing the number of planets orbiting stars, when the models explain the noisy data with different numbers of Keplerian signals. We introduce a simple method for approximating the marginal likelihoods in practice when a statistically representative sample from the parameter posterior density is available. We use our truncated posterior mixture estimate to obtain accurate model probabilities for models with differing numbers of Keplerian signals in radial velocity data. We test this estimate in simple scenarios to assess its accuracy and rate of convergence in practice, when the corresponding estimates calculated using the deviance information criterion can be applied to obtain trustworthy results for reliable comparison. As a test case, we determine the posterior probability of a planet orbiting HD 3651 given Lick and Keck radial velocity data. The posterio...
Atta Ullah
2014-01-01
In the practical use of a stratified random sampling scheme, the investigator faces the problem of selecting a sample that maximizes the precision of a finite population mean under a cost constraint. Allocation of the sample size becomes complicated when more than one characteristic is observed from each selected unit in a sample. In many real-life situations, a linear cost function of the sample size n_h is not a good approximation to the actual cost of a sample survey when the traveling cost between selected units in a stratum is significant. In this paper, the sample allocation problem in multivariate stratified random sampling with the proposed cost function is formulated as an integer nonlinear multiobjective mathematical programming problem. A solution procedure is proposed using an extended lexicographic goal programming approach. A numerical example is presented to illustrate the computational details and to compare the efficiency of the proposed compromise allocation.
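The shape of the problem can be sketched with a simple greedy heuristic: add one unit at a time to whichever stratum buys the most variance reduction per unit of cost, under a nonlinear cost with a sqrt-shaped travel term, as the abstract suggests. This is not the paper's extended lexicographic goal programming method, and the weights, variances, and costs below are invented; it only illustrates the single-characteristic version of the allocation problem.

```python
import math

def variance(n, W, S):
    """Variance of the stratified mean: sum of W_h^2 * S_h^2 / n_h."""
    return sum(w * w * s * s / nh for nh, w, s in zip(n, W, S))

def cost(n, c, t):
    """Nonlinear survey cost: per-unit cost plus a travel term that
    grows like sqrt(n_h) within each stratum."""
    return sum(ch * nh + th * math.sqrt(nh)
               for nh, ch, th in zip(n, c, t))

def greedy_allocation(W, S, c, t, budget):
    """Add units one at a time where the variance reduction per unit
    of extra cost is largest, while the budget allows."""
    n = [1] * len(W)  # at least one unit per stratum
    while True:
        best, best_gain = None, 0.0
        for h in range(len(W)):
            trial = n[:]
            trial[h] += 1
            if cost(trial, c, t) > budget:
                continue
            extra = cost(trial, c, t) - cost(n, c, t)
            gain = (variance(n, W, S) - variance(trial, W, S)) / extra
            if gain > best_gain:
                best, best_gain = h, gain
        if best is None:
            return n
        n[best] += 1

W, S = [0.5, 0.3, 0.2], [2.0, 1.0, 3.0]   # invented stratum weights, SDs
c, t = [1.0, 1.0, 2.0], [4.0, 2.0, 6.0]   # invented unit and travel costs
n = greedy_allocation(W, S, c, t, budget=60)
print(n, round(cost(n, c, t), 1))
```

Because the travel term makes the marginal cost of a stratum's first few units high, the greedy allocation deviates from the textbook Neyman proportions, which is precisely why the paper treats the problem with integer nonlinear programming rather than a closed-form rule.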
Hillson, Roger; Alejandre, Joel D; Jacobsen, Kathryn H; Ansumana, Rashid; Bockarie, Alfred S; Bangura, Umaru; Lamin, Joseph M; Stenger, David A
2015-01-01
There is a need for better estimators of population size in places that have undergone rapid growth and where collection of census data is difficult. We explored simulated estimates of urban population based on survey data from Bo, Sierra Leone, using two approaches: (1) stratified sampling from across 20 neighborhoods and (2) stratified single-stage cluster sampling of only four randomly-sampled neighborhoods. The stratification variables evaluated were (a) occupants per individual residence, (b) occupants per neighborhood, and (c) residential structures per neighborhood. For method (1), stratification variable (a) yielded the most accurate re-estimate of the current total population. Stratification variable (c), which can be estimated from aerial photography and zoning type verification, and variable (b), which could be ascertained by surveying a limited number of households, increased the accuracy of method (2). Small household-level surveys with appropriate sampling methods can yield reasonably accurate estimations of urban populations.
Defining Sample Quantiles by the True Rank Probability
Lasse Makkonen
2014-01-01
Many definitions exist for sample quantiles and are included in statistical software. The need to adopt a standard definition of sample quantiles has been recognized and different definitions have been compared in terms of satisfying some desirable properties, but no consensus has been found. We outline here that comparisons of the sample quantile definitions are irrelevant because the probabilities associated with order-ranked sample values are known exactly. Accordingly, the standard definition for sample quantiles should be based on the true rank probabilities. We show that this allows more accurate inference of the tails of the distribution, and thus improves estimation of the probability of extreme events.
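The claim can be stated concretely: for a sample of size n from any continuous distribution, the non-exceedance probability of the i-th order statistic has mean i/(n + 1) (the mean of a Beta(i, n - i + 1) variable), independent of the parent distribution. A minimal sketch of the resulting distribution-free plotting positions:

```python
def rank_probabilities(sample):
    """Pair each order statistic x_(i) with its mean non-exceedance
    probability i / (n + 1), which holds for any continuous parent
    distribution."""
    n = len(sample)
    return [(x, i / (n + 1))
            for i, x in enumerate(sorted(sample), start=1)]

# nine observations -> probabilities 0.1, 0.2, ..., 0.9
print(rank_probabilities([7, 3, 9, 1, 5, 2, 8, 4, 6]))
```

Extrapolating a line fitted through these (value, probability) pairs toward probability 1 is the basis of the extreme-event estimation the abstract mentions.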
Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling
Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun
2011-01-01
Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based, probability cluster…
A FAMILY OF ESTIMATORS FOR ESTIMATING POPULATION MEAN IN STRATIFIED SAMPLING UNDER NON-RESPONSE
Manoj K. Chaudhary
2009-01-01
Khoshnevisan et al. (2007) proposed a general family of estimators for the population mean using known values of some population parameters in simple random sampling. The objective of this paper is to propose a family of combined-type estimators in stratified random sampling, adapting the family of estimators proposed by Khoshnevisan et al. (2007) under non-response. The properties of the proposed family have been discussed. We have also obtained the expressions for the optimum sample sizes of the strata with respect to the cost of the survey. Results are also supported by numerical analysis.
Sampling of quasidistributions, nonclassical behavior and negative probabilities
Peřina, J. [Department of Optics and Joint Laboratory of Optics of Palacký University and Institute of Physics of AS CR, Faculty of Science, Palacký University, 17. listopadu 12, 771 46 Olomouc (Czech Republic); Křepelka, J., E-mail: jaromir.krepelka@upol.cz [Joint Laboratory of Optics of Palacký University and Institute of Physics of AS CR, 17. listopadu 50a, 771 46 Olomouc (Czech Republic)
2016-05-20
Highlights: • Joint quasidistributions for parametric down-conversion in the nonclassical region are derived. • These quasidistributions are sampled with the use of the Shannon–Kotelnikov theorem. • The sampling is used for the interpretation of negative probabilities in the nonclassical region. - Abstract: We perform a sampling of the quasidistribution for the process of optical down-conversion in the nonclassical regime, in which negative values of the quasidistribution are exhibited, using the Shannon–Kotelnikov sampling formula. We show that negative values of the quasidistribution do not directly represent probabilities; however, negative terms in the sampling formula related to the nonclassical behavior can be interpreted as positive probabilities in the negative orthogonal sinc-basis, whereas positive probabilities in the positive sinc-basis describe classical cases.
Kretzschmar, A; Durand, E; Maisonnasse, A; Vallon, J; Le Conte, Y
2015-06-01
A new procedure of stratified sampling is proposed in order to establish an accurate estimation of Varroa destructor populations on the sticky bottom boards of the hive. It is based on spatial sampling theory, which recommends regular grid stratification in the case of a spatially structured process. As the distribution of varroa mites on the sticky board is observed to be spatially structured, we designed a sampling scheme based on a regular grid with circles centered on each grid element. This new procedure is then compared with a former method using partially random sampling. Relative error improvements are presented on the basis of a large sample of simulated sticky boards (n=20,000), which provides a complete range of spatial structures, from a random structure to a highly frame-driven structure. The improvement in the estimation of varroa mite numbers is then measured by the percentage of counts with an error greater than a given level.
Brus, D.J.; Nieuwenhuizen, W.; Koomen, A.J.M.
2006-01-01
Seventy-two squares of 100 ha were selected by stratified random sampling with probabilities proportional to size (pps) to survey landscape changes in the period 1996-2003. The area of the plots times the urbanization pressure was used as a size measure. The central question of this study is whether
Quota Sampling as an Alternative to Probability Sampling? An Experimental Study
Yang, Keming; Banamah, Ahmad
2014-01-01
In spite of the establishment of probability sampling methods since the 1930s, non-probability sampling methods have remained popular among many commercial and polling agents, and they have also survived the embarrassment from a few incorrect predictions in American presidential elections. The increase of costs and the decline of response rates for administering probability samples have led some survey researchers to search for a non-probability sampling method as an alternative to probabilit...
Nonprobability and probability-based sampling strategies in sexual science.
Catania, Joseph A; Dolcini, M Margaret; Orellana, Roberto; Narayanan, Vasudah
2015-01-01
With few exceptions, much of sexual science builds upon data from opportunistic nonprobability samples of limited generalizability. Although probability-based studies are considered the gold standard in terms of generalizability, they are costly to apply to many of the hard-to-reach populations of interest to sexologists. The present article discusses recent conclusions by sampling experts that are relevant to the advocacy of nonprobability methods within sexual science. In this regard, we provide an overview of Internet sampling as a useful, cost-efficient, nonprobability sampling method of value to sex researchers conducting modeling work or clinical trials. We also argue that probability-based sampling methods may be more readily applied in sex research with hard-to-reach populations than is typically thought. In this context, we provide three case studies that utilize qualitative and quantitative techniques directed at reducing limitations in applying probability-based sampling to hard-to-reach populations: indigenous Peruvians, African American youth, and urban men who have sex with men (MSM). Recommendations are made with regard to presampling studies, adaptive and disproportionate sampling methods, and strategies that may be utilized in evaluating nonprobability and probability-based sampling methods.
Brus, D J; de Gruijter, J J
2003-04-01
In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be increased by interpolating the values at the nonprobability sample points to the probability sample points, and using these interpolated values as an auxiliary variable in the difference or regression estimator. These estimators are (approximately) unbiased, even when the nonprobability sample is severely biased such as in preferential samples. The gain in precision compared to the pi estimator in combination with Simple Random Sampling is controlled by the correlation between the target variable and interpolated variable. This correlation is determined by the size (density) and spatial coverage of the nonprobability sample, and the spatial continuity of the target variable. In a case study the average ratio of the variances of the simple regression estimator and pi estimator was 0.68 for preferential samples of size 150 with moderate spatial clustering, and 0.80 for preferential samples of similar size with strong spatial clustering. In the latter case the simple regression estimator was substantially more precise than the simple difference estimator.
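A minimal sketch of the regression-estimator idea on synthetic data (SRS, no finite-population correction): values interpolated from the nonprobability sample play the role of an auxiliary variable x that is known at every probability-sample point and whose population mean is known. All of the data below are invented.

```python
import random
import statistics

random.seed(3)

def regression_estimator(y, x, x_pop_mean):
    """ybar_reg = ybar + b * (Xbar - xbar), with the slope b fitted by
    least squares on the probability sample."""
    ybar, xbar = statistics.mean(y), statistics.mean(x)
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return ybar + b * (x_pop_mean - xbar)

# Synthetic region: x stands for the map interpolated from the
# nonprobability sample; y is the target variable, well correlated
# with x (the case the abstract calls favorable).
x_pop = [random.gauss(0, 1) for _ in range(1000)]
y_pop = [2 * xi + random.gauss(0, 0.1) for xi in x_pop]

idx = random.sample(range(1000), 50)   # probability sample (SRS)
y = [y_pop[i] for i in idx]
x = [x_pop[i] for i in idx]

pi_est = statistics.mean(y)            # plain pi estimator under SRS
reg_est = regression_estimator(y, x, statistics.mean(x_pop))
print(round(pi_est, 3), round(reg_est, 3),
      "true:", round(statistics.mean(y_pop), 3))
```

As the abstract notes, the gain over the pi estimator is controlled by the correlation between y and the interpolated x: here the correlation is strong, so the regression estimator's error is driven by the small residual noise rather than by the full spread of y.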
Tracing selection effects in three non-probability samples.
Barendregt, Cas; van der Poel, Agnes; van de Mheen, Dike
2005-01-01
Snowball sampling and targeted sampling are widely applied techniques for recruiting samples from hidden populations, such as problematic drug users. The disadvantage is that they yield non-probability samples, which cannot be generalised to the population. Despite thorough preparatory mapping procedures, selection effects continue to occur. This paper proposes an interpretation frame that allows the direction of selection bias to be estimated after data collection. Critical examination of the recruitment procedure and comparison with statistical and non-statistical external data sources are the core features of the interpretation frame. Applying the interpretation frame increases insight into the reliability of the results and allows one to estimate where selection bias may have occurred.
Bounds for Tail Probabilities of the Sample Variance
V. Bentkus
2009-01-01
We provide bounds for the tail probabilities of the sample variance. The bounds are expressed in terms of Hoeffding functions and are the sharpest known. They are designed with applications in auditing, as well as in processing environmental data, in mind.
Estimating Income Variances by Probability Sampling: A Case Study
Akbar Ali Shah
2010-08-01
The main focus of the study is to estimate the variability in the income distribution of households by conducting a survey. The variances in the income distribution have been calculated by probability sampling techniques. The variances are compared and relative gains are also obtained. It is concluded that the income distribution has improved compared to the first Household Income and Expenditure Survey (HIES), conducted in Pakistan in 1993-94.
A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin
Blaschek, Michael; Duttmann, Rainer
2015-04-01
The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters - topographic wetness index and potential incoming solar radiation - derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. The selection of sample point locations has been done using
Huang, S.R. [Feng Chia Univ., Taichung (Taiwan, Province of China). Electrical Engineering Dept.
1997-03-01
A combined Monte Carlo and optimum stratified sampling method is presented to better estimate the copper loss of a transmission system during a prespecified future period. This design seeks to enhance the precision of the transmission-system copper-loss estimate while reducing computation time. The techniques included are optimum stratified sampling and separate ratio estimation. The optimum stratification rule aims to remove any judgemental input and to render the stratification process entirely mechanistic. The estimator, provided by the ratio statistics of the sample, can avoid identification of the regression model and thus save computation time. The effectiveness of the precision improvement is demonstrated. (UK)
Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions
Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette
2016-01-01
We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next-generation sequencing, gene finding, and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found by assembling DNA from fragments (reads), locating a gene in this sequence, and translating the gene to a protein. Sampling using this program generates random instances of the puzzle, but it is possible to constrain the difficulty and to customize the secret protein word. Because of these constraints and the randomness of the generation process, sampling may fail to generate a satisfactory puzzle. To avoid failure, we employ a strategy using adaptive probabilities which change in response to previous steps of the generative process, thus minimizing the risk of failure.
Vollert, Jan; Maier, Christoph; Attal, Nadine; Bennett, David L.H.; Bouhassira, Didier; Enax-Krumova, Elena K.; Finnerup, Nanna B.; Freynhagen, Rainer; Gierthmühlen, Janne; Haanpää, Maija; Hansson, Per; Hüllemann, Philipp; Jensen, Troels S.; Magerl, Walter; Ramirez, Juan D.; Rice, Andrew S.C.; Schuh-Hofer, Sigrid; Segerdahl, Märta; Serra, Jordi; Shillo, Pallai R.; Sindrup, Soeren; Tesfaye, Solomon; Themistocleous, Andreas C.; Tölle, Thomas R.; Treede, Rolf-Detlef; Baron, Ralf
2017-01-01
Abstract In a recent cluster analysis, it has been shown that patients with peripheral neuropathic pain can be grouped into 3 sensory phenotypes based on quantitative sensory testing profiles, which are mainly characterized by either sensory loss, intact sensory function and mild thermal hyperalgesia and/or allodynia, or loss of thermal detection and mild mechanical hyperalgesia and/or allodynia. Here, we present an algorithm for allocation of individual patients to these subgroups. The algorithm is nondeterministic (i.e., a patient can be sorted to more than one phenotype) and can separate patients with neuropathic pain from healthy subjects (sensitivity: 78%, specificity: 94%). We evaluated the frequency of each phenotype in a population of patients with painful diabetic polyneuropathy (n = 151), painful peripheral nerve injury (n = 335), and postherpetic neuralgia (n = 97) and propose sample sizes of study populations that need to be screened to reach a subpopulation large enough to conduct a phenotype-stratified study. The most common phenotype in diabetic polyneuropathy was sensory loss (83%), followed by mechanical hyperalgesia (75%) and thermal hyperalgesia (34%; note that percentages are overlapping and not additive). In peripheral nerve injury, frequencies were 37%, 59%, and 50%, and in postherpetic neuralgia, frequencies were 31%, 63%, and 46%. For a parallel study design, either the estimated effect size of the treatment needs to be high (>0.7) or the study can realistically be performed only for phenotypes that are frequent in the clinical entity under study. For a crossover design, populations under 200 patients screened are sufficient for all phenotypes and clinical entities with a minimum estimated treatment effect size of 0.5. PMID:28595241
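The screening arithmetic behind such sample size proposals reduces to dividing the required subgroup size by the phenotype frequency. A small sketch follows; the numbers are hypothetical, not taken from the paper's tables:

```python
import math

def patients_to_screen(n_required, phenotype_frequency):
    """Patients to screen so that, in expectation, n_required of them
    fall into the target sensory phenotype."""
    return math.ceil(n_required / phenotype_frequency)

# Hypothetical example: a phenotype occurring in 31% of patients,
# with 60 phenotype-positive patients needed for the trial arm
n_screen = patients_to_screen(60, 0.31)   # ceil(60 / 0.31) = 194
```

Rarer phenotypes inflate the screening burden inversely, which is why the abstract restricts parallel designs to frequent phenotypes or large effect sizes.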
Petraki, Ioanna; Arkoudis, Chrisoula; Terzidis, Agis; Smyrnakis, Emmanouil; Benos, Alexis; Panagiotopoulos, Takis
2017-01-01
Abstract Background: Research on Roma health is fragmentary as major methodological obstacles often exist. Reliable estimates on vaccination coverage of Roma children at a national level and identification of risk factors for low coverage could play an instrumental role in developing evidence-based policies to promote vaccination in this marginalized population group. Methods: We carried out a national vaccination coverage survey of Roma children. Thirty Roma settlements, stratified by geographical region and settlement type, were included; 7–10 children aged 24–77 months were selected from each settlement using systematic sampling. Information on children’s vaccination coverage was collected from multiple sources. In the analysis we applied weights for each stratum, identified through a consensus process. Results: A total of 251 Roma children participated in the study. A vaccination document was presented for the large majority (86%). We found very low vaccination coverage for all vaccines. In 35–39% of children ‘minimum vaccination’ (DTP3 and IPV2 and MMR1) was administered, while 34–38% had received HepB3 and 31–35% Hib3; no child was vaccinated against tuberculosis in the first year of life. Better living conditions and primary care services close to Roma settlements were associated with higher vaccination indices. Conclusions: Our study showed inadequate vaccination coverage of Roma children in Greece, much lower than that of the non-minority child population. This serious public health challenge should be systematically addressed, or, amid continuing economic recession, the gap may widen. Valid national estimates on important characteristics of the Roma population can contribute to planning inclusion policies. PMID:27694159
Jian-Gao Fan; Xiao-Bu Cai; Lui Li; Xing-Jian Li; Fei Dai; Jun Zhu
2008-01-01
AIM: To examine the relations of alcohol consumption to the prevalence of metabolic syndrome in Shanghai adults. METHODS: We performed a cross-sectional analysis of data from the randomized multistage stratified cluster sampling of Shanghai adults, who were evaluated for alcohol consumption and each component of metabolic syndrome, using the adapted U.S. National Cholesterol Education Program criteria. Current alcohol consumption was defined as drinking alcohol more than once per month. RESULTS: The study population consisted of 3953 participants (1524 men) with a mean age of 54.3 ± 12.1 years. Among them, 448 subjects (11.3%) were current alcohol drinkers, including 405 males and 43 females. After adjustment for age and sex, the prevalence of current alcohol drinking and metabolic syndrome in the general population of Shanghai was 13.0% and 15.3%, respectively. Compared with nondrinkers, the prevalence of hypertriglyceridemia and hypertension was higher while the prevalence of abdominal obesity, low serum high-density-lipoprotein cholesterol (HDL-C) and diabetes mellitus was lower in subjects who consumed alcohol twice or more per month, with a trend toward reducing the prevalence of metabolic syndrome. Among the current alcohol drinkers, systolic blood pressure, HDL-C, fasting plasma glucose, and prevalence of hypertriglyceridemia tended to increase with increased alcohol consumption. However, low-density-lipoprotein cholesterol concentration, prevalence of abdominal obesity, low serum HDL-C and metabolic syndrome showed the tendency to decrease. Moreover, these statistically significant differences were independent of gender and age. CONCLUSION: Current alcohol consumption is associated with a lower prevalence of metabolic syndrome irrespective of alcohol intake (g/d), and has a favorable influence on HDL-C, waist circumference, and possible diabetes mellitus. However, alcohol intake increases the likelihood of hypertension, hypertriglyceridemia and hyperglycemia.
Khewal Bhupendra Kesur
2013-01-01
Full Text Available This paper examines the application of Latin Hypercube Sampling (LHS) and Antithetic Variables (AVs) to reduce the variance of estimated performance measures from microscopic traffic simulators. LHS and AV allow for a more representative coverage of input probability distributions through stratification, reducing the standard error of simulation outputs. Two methods of implementation are examined, one where stratification is applied to headways and routing decisions of individual vehicles and another where vehicle counts and entry times are more evenly sampled. The proposed methods have wider applicability in general queuing systems. LHS is found to outperform AV, and reductions of up to 71% in the standard error of estimates of traffic network performance relative to independent sampling are obtained. LHS allows for a reduction in the execution time of computationally expensive microscopic traffic simulators as fewer simulations are required to achieve a fixed level of precision with reductions of up to 84% in computing time noted on the test cases considered. The benefits of LHS are amplified for more congested networks and as the required level of precision increases.
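A minimal Latin Hypercube Sampling routine illustrating the stratification property the paper exploits (a generic sketch, not the simulator-specific implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sample on [0, 1)^d: each axis is divided into
    n_samples equal strata, and every stratum is hit exactly once."""
    jitter = rng.random((n_samples, n_dims))    # position within stratum
    strata = np.array([rng.permutation(n_samples)
                       for _ in range(n_dims)]).T
    return (strata + jitter) / n_samples

rng = np.random.default_rng(42)
pts = latin_hypercube(100, 2, rng)

# Each of the 100 strata per dimension contains exactly one point
stratum_ids = np.floor(pts * 100).astype(int)
```

Compared with 100 independent uniform draws, the per-axis coverage is perfectly even, which is the source of the standard-error reductions reported above.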
Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B; Pereira, Nuno Sousa; Behrman, Jere
2012-05-01
Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization's Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples' statistical properties.
Jung, Minsoo
2015-01-01
When there is no sampling frame within a certain group, or the group is concerned that making its population public would bring social stigma, we say the population is hidden. Such a population is difficult to approach with standard survey methodology because the response rate is low and members are not quite honest in their responses when probability sampling is used. The only alternative known to address the problems of earlier methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results, the dependence of the chain-referral sample on its starting points tends to diminish as the sample gets bigger, and the sample composition stabilizes as the waves progress. Therefore, the final sample can be effectively independent of the initial seeds if a sufficient sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that improves upon both key informant sampling and ethnographic surveys, and it should be applied to a variety of cases domestically as well.
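The wave-by-wave loss of dependence on the initial seeds can be illustrated with a toy two-group Markov chain for the referral process (an illustration of the Markov-chain argument only, not the authors' simulation; the transition probabilities are invented):

```python
import numpy as np

# Hypothetical referral process between two groups (A and B):
# row = current respondent's group, column = next recruit's group.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def composition_after_waves(seed_dist, waves):
    """Expected group composition of recruits after a number of waves."""
    dist = np.asarray(seed_dist, dtype=float)
    for _ in range(waves):
        dist = dist @ P
    return dist

# Two opposite seed choices converge to the same composition
from_a = composition_after_waves([1.0, 0.0], waves=20)
from_b = composition_after_waves([0.0, 1.0], waves=20)
```

Both runs approach the chain's stationary distribution (here 4/7 in group A), which is the formal sense in which the final sample "forgets" the convenience-selected seeds.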
Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou
2010-01-01
Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....
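A bare-bones random-walk Metropolis sampler of the kind referred to, shown on a toy one-dimensional posterior (a generic sketch, not the paper's extended algorithm; the target density is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

def log_posterior(m):
    """Toy 1-D posterior (standard normal) standing in for the
    posterior of a nonlinear inverse problem."""
    return -0.5 * m * m

def metropolis(log_p, m0, n_steps, step=1.0):
    """Random-walk Metropolis sampler."""
    samples, m, lp = [], m0, log_p(m0)
    for _ in range(n_steps):
        proposal = m + step * rng.normal()
        lp_prop = log_p(proposal)
        if np.log(rng.random()) < lp_prop - lp:   # accept or reject
            m, lp = proposal, lp_prop
        samples.append(m)
    return np.array(samples)

chain = metropolis(log_posterior, m0=5.0, n_steps=20000)
posterior_mean = chain[2000:].mean()   # discard burn-in
```

Only the ability to evaluate the (unnormalized) posterior is needed, which is what makes the scheme attachable to arbitrarily complex a priori information.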
Tim A. Moore
2016-01-01
Full Text Available DOI: 10.17014/ijog.3.1.29-51 Stratified sampling of coal seams for petrographic analysis using block samples is a viable alternative to standard methods of channel sampling and particulate pellet mounts. Although petrographic analysis of particulate pellets is employed widely, it is both time consuming and does not allow variation within sampling units to be assessed - an important measure in any study, whether it be for paleoenvironmental reconstruction or for obtaining estimates of industrial attributes. Also, samples taken as intact blocks provide additional information, such as texture and botanical affinity, that cannot be gained using particulate pellets. Stratified sampling can be employed on both ‘fine’- and ‘coarse’-grained coal units. Fine-grained coals are defined as those coal intervals that do not contain vitrain bands greater than approximately 1 mm in thickness (as measured perpendicular to bedding). In fine-grained coal seams, a reasonably sized block sample (with a polished surface area of ~3 cm²) can be taken that encapsulates the macroscopic variability. However, for coarse-grained coals (vitrain bands >1 mm) a different system has to be employed in order to accurately account for the larger particles. Macroscopic point counting of vitrain bands can accurately account for those particles >1 mm within a coal interval. This point counting method is conducted using something as simple as string on a coal face with marked intervals greater than the largest particle expected to be encountered (although new technologies are being developed to capture this type of information digitally). Comparative analyses of particulate pellets and blocks on the same interval show less than 6% variation between the two sample types when blocks are recalculated to include macroscopic counts of vitrain. Therefore, even in coarse-grained coals, stratified sampling can be used effectively and representatively.
Komada, Kenichi; Sugiyama, Masaya; Vongphrachanh, Phengta; Xeuatvongsa, Anonh; Khamphaphongphane, Bouaphan; Kitamura, Tomomi; Kiyohara, Tomoko; Wakita, Takaji; Oshitani, Hitoshi; Hachiya, Masahiko
2015-07-01
There is limited information regarding the prevalence of hepatitis B in Lao PDR, where the hepatitis disease burden is substantial. Thus, reliable seroprevalence data is needed for the disease, based on probability sampling. A stratified, multistage, cluster sampling survey of hepatitis B surface antigen (HBsAg) positivity among children aged 5-9 years and their mothers aged 15-45 years was conducted. Participants were selected randomly from the central region of Lao PDR via probability-proportional-to-size sampling. Blood samples were collected onto filter paper and subsequently analyzed using a chemiluminescent microparticle immunoassay. A total of 911 mother-and-child pairs were collected; the seroprevalence of HBsAg was estimated to be 2.1% (95% confidence interval 0.8-3.4%) among children and 4.1% (95% confidence interval 2.6-5.5%) in their mothers after taking into account the sampling design and the weight of each sample. The children's HBsAg positivity was positively associated with maternal infection and being born in a non-health facility, while the maternal infection status was not associated with any background characteristic. Lao PDR has a relatively lower HBsAg prevalence in the general population compared to surrounding countries. To ensure comparability to other countries and to future data, rapid field tests are recommended for a nationwide prevalence survey. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
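Probability-proportional-to-size selection of clusters is commonly implemented with a systematic cumulative-size rule; a hedged sketch follows (the village sizes are synthetic, and this is the generic textbook scheme rather than the survey's exact protocol):

```python
import numpy as np

rng = np.random.default_rng(3)

def pps_systematic(sizes, n_clusters, rng):
    """Systematic probability-proportional-to-size cluster selection:
    walk a random-start, fixed-interval grid along cumulative sizes."""
    cum = np.cumsum(np.asarray(sizes, dtype=float))
    interval = cum[-1] / n_clusters
    picks = rng.uniform(0, interval) + interval * np.arange(n_clusters)
    return np.searchsorted(cum, picks)

# 50 hypothetical villages with populations 100-5000; choose 10 clusters
sizes = rng.integers(100, 5000, size=50)
chosen = pps_systematic(sizes, 10, rng)
```

A cluster's chance of selection is proportional to its share of the cumulative total, so drawing a fixed number of households per chosen cluster yields approximately equal overall inclusion probabilities.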
Probability Sampling - A Guideline for Quantitative Health Care ...
Sampling has received varied definitions by major authors on social ... A more direct definition is the method used for selecting a given number of people (or things) from a population (2). ... Internet-based calculators provide immediate sample...
(no author listed)
2009-01-01
[Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably; perhaps", it indicates a strong likelihood, usually an affirmative inference or judgment based on the present situation.
For what applications can probability and non-probability sampling be used?
Schreuder, H T; Gregoire, T G; Weyer, J P
2001-02-01
Almost any type of sample has some utility when estimating population quantities. The focus in this paper is to indicate what type or combination of types of sampling can be used in various situations ranging from a sample designed to establish cause-effect or legal challenge to one involving a simple subjective judgment. Several of these methods have little or no utility in the scientific area but even in the best of circumstances, particularly complex ones, both probabilistic and non-probabilistic procedures have to be used because of lack of knowledge and cost. We illustrate this with a marbled murrelet example.
Greil, Arthur L; McQuillan, Julia; Shreffler, Karina M; Johnson, Katherine M; Slauson-Blevins, Kathleen S
2011-12-01
Evidence of group differences in reproductive control and access to reproductive health care suggests the continued existence of "stratified reproduction" in the United States. Women of color are overrepresented among people with infertility but are underrepresented among those who receive medical services. The authors employ path analysis to uncover mechanisms accounting for these differences among black, Hispanic, Asian, and non-Hispanic white women using a probability-based sample of 2,162 U.S. women. Black and Hispanic women are less likely to receive services than other women. The enabling conditions of income, education, and private insurance partially mediate the relationship between race-ethnicity and receipt of services but do not fully account for the association at all levels of service. For black and Hispanic women, social cues, enabling conditions, and predisposing conditions contribute to disparities in receipt of services. Most of the association between race-ethnicity and service receipt is indirect rather than direct.
Data-driven probability concentration and sampling on manifold
Soize, C.; Ghanem, R.
2016-09-01
A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
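Steps (i)-(ii) alone can be caricatured with a Gaussian kernel-density resampler on data concentrated near a manifold. This is a drastic simplification: the diffusion-maps reduction of steps (iii)-(iv) is omitted, MCMC is replaced by direct KDE resampling, and the bandwidth is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(11)

# Data concentrated near a 1-D manifold (the unit circle) in R^2
theta = rng.uniform(0, 2 * np.pi, size=500)
data = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(500, 2))

def kde_resample(data, n_new, bandwidth, rng):
    """New realizations from a Gaussian kernel-density estimate:
    pick a stored point at random and perturb it with the kernel."""
    idx = rng.integers(0, len(data), size=n_new)
    return data[idx] + bandwidth * rng.normal(size=(n_new, data.shape[1]))

new = kde_resample(data, n_new=1000, bandwidth=0.05, rng=rng)
radii = np.linalg.norm(new, axis=1)   # stays concentrated near radius 1
```

The generated points remain concentrated near the circle rather than filling the ambient square, which is the basic "concentration on a subset" behaviour the methodology formalizes and then strengthens via diffusion maps.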
Batch Mode Active Sampling based on Marginal Probability Distribution Matching.
Chattopadhyay, Rita; Wang, Zheng; Fan, Wei; Davidson, Ian; Panchanathan, Sethuraman; Ye, Jieping
2012-01-01
Active Learning is a machine learning and data mining technique that selects the most informative samples for labeling and uses them as training data; it is especially useful when there are large amounts of unlabeled data and labeling them is expensive. Recently, batch-mode active learning, in which a set of samples is selected concurrently for labeling based on their collective merit, has attracted a lot of attention. The objective of batch-mode active learning is to select a set of informative samples so that a classifier learned on these samples has good generalization performance on the unlabeled data. Most of the existing batch-mode active learning methodologies try to achieve this by selecting samples based on varied criteria. In this paper we propose a novel criterion which achieves good generalization performance of a classifier by specifically selecting a set of query samples that minimizes the difference in distribution between the labeled and the unlabeled data, after annotation. We explicitly measure this difference based on all candidate subsets of the unlabeled data and select the best subset. The proposed objective is an NP-hard integer programming optimization problem. We provide two optimization techniques to solve this problem. In the first one, the problem is transformed into a convex quadratic programming problem and in the second method the problem is transformed into a linear programming problem. Our empirical studies using publicly available UCI datasets and a biomedical image dataset demonstrate the effectiveness of the proposed approach in comparison with the state-of-the-art batch-mode active learning methods. We also present two extensions of the proposed approach, which incorporate uncertainty of the predicted labels of the unlabeled data and transfer learning in the proposed formulation. Our empirical studies on UCI datasets show that incorporation of uncertainty information improves performance at later iterations while our studies on 20
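The distribution-matching idea can be caricatured with a greedy batch selector that tracks only the pool's marginal mean. This is a crude stand-in for the paper's NP-hard subset optimization (which matches full distributions, not just means), offered for intuition only:

```python
import numpy as np

rng = np.random.default_rng(5)
pool = rng.normal(size=(200, 3))      # unlabeled pool, 3 features
target_mean = pool.mean(axis=0)

def greedy_matching_batch(pool, k, target_mean):
    """Greedily pick k samples whose running mean tracks the pool mean
    (a crude proxy for matching the full marginal distribution)."""
    chosen, remaining = [], list(range(len(pool)))
    total = np.zeros(pool.shape[1])
    for step in range(1, k + 1):
        best = min(remaining,
                   key=lambda i: np.linalg.norm(
                       (total + pool[i]) / step - target_mean))
        chosen.append(best)
        remaining.remove(best)
        total += pool[best]
    return chosen

batch = greedy_matching_batch(pool, 10, target_mean)
gap = np.linalg.norm(pool[batch].mean(axis=0) - target_mean)
```

Even this greedy heuristic leaves a far smaller mean discrepancy than a random batch of the same size, illustrating why matching the labeled set to the pool distribution is an attractive query criterion.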
Sutor, Malinda M.; Dagg, Michael J.
2008-06-01
The effects of vertical sampling resolution on estimates of plankton biomass and grazing calculations were examined using data collected in two different areas with vertically stratified water columns. Data were collected from one site in the upwelling region off Oregon and from four sites in the Northern Gulf of Mexico, three within the Mississippi River plume and one in adjacent oceanic waters. Plankton were found to be concentrated in discrete layers with sharp vertical gradients at all the stations. Phytoplankton distributions were correlated with gradients in temperature and salinity, but microzooplankton and mesozooplankton distributions were not. Layers of zooplankton were sometimes collocated with layers of phytoplankton, but this was not always the case. Simulated calculations demonstrate that when averages are taken over the water column, or coarser scale vertical sampling resolution is used, biomass and mesozooplankton grazing and filtration rates can be greatly underestimated. This has important implications for understanding the ecological significance of discrete layers of plankton and for assessing rates of grazing and production in stratified water columns.
Leonardo, Lydia; Rivera, Pilarita; Saniel, Ofelia; Villacorte, Elena; Lebanan, May Antonnette; Crisostomo, Bobby; Hernandez, Leda; Baquilod, Mario; Erce, Edgardo; Martinez, Ruth; Velayudhan, Raman
2012-01-01
For the first time in the country, a national baseline prevalence survey using a well-defined sampling design, a stratified two-step systematic cluster sampling, was conducted from 2005 to 2008. The purpose of the survey was to stratify the provinces according to prevalence of schistosomiasis as high, moderate, or low, which in turn would be used as the basis for the intervention program to be implemented. The national survey was divided into four phases. Results of the first two phases, conducted in Mindanao and the Visayas, were published in 2008. Data from the last two phases showed three provinces with prevalence rates higher than the endemic provinces surveyed in the first two phases, thus changing the overall ranking of endemic provinces at the national level. Age and sex distribution of schistosomiasis remained the same in Luzon and Maguindanao. Soil-transmitted and food-borne helminths were also recorded in these surveys. This paper deals with the results of the last two phases, done in Luzon and Maguindanao, and integrates all four phases in the discussion.
The Estimation of Probability of Extreme Events for Small Samples
Pisarenko, V. F.; Rodkin, M. V.
2017-02-01
The most general approach to the study of rare extreme events is based on extreme value theory. The fundamental Generalized Extreme Value distribution lies at the basis of this theory, serving as the limit distribution for normalized maxima; it depends on three parameters. Usually the method of maximum likelihood (ML) is used for the estimation, as it possesses well-known optimal asymptotic properties. However, this method works efficiently only when the sample size is large enough (roughly 200-500), whereas in many applications the sample size does not exceed 50-100. For such sizes, the advantage of the ML method in efficiency is not guaranteed. We have found that in this situation the method of statistical moments (SM) works more efficiently than other methods. The details of the estimation for small samples are studied. The SM is applied to the study of extreme earthquakes in three large virtual seismic zones, representing the regime of seismicity in subduction zones, the intracontinental regime of seismicity, and the regime in mid-ocean ridge zones. The 68%-confidence domains for the parameter pairs (ξ, σ) and (σ, μ) are derived.
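For the ξ = 0 (Gumbel) member of the GEV family, the method of statistical moments reduces to matching the sample mean and variance. A small sketch under that simplifying assumption (not the paper's full three-parameter procedure):

```python
import numpy as np

rng = np.random.default_rng(9)

EULER_GAMMA = 0.5772156649015329

def gumbel_moment_fit(x):
    """Method-of-moments fit of the Gumbel (xi = 0) distribution:
    sd = sigma * pi / sqrt(6), mean = mu + gamma * sigma."""
    x = np.asarray(x, dtype=float)
    sigma = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    mu = x.mean() - EULER_GAMMA * sigma
    return mu, sigma

# Small sample (n = 50) from a true Gumbel(mu = 2, sigma = 0.5)
u = rng.random(50)
sample = 2.0 - 0.5 * np.log(-np.log(u))   # inverse-CDF sampling
mu_hat, sigma_hat = gumbel_moment_fit(sample)
```

Moment estimators like this remain stable at n = 50, where a three-parameter ML fit can be erratic, which is the regime the abstract is concerned with.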
Importance Sampling for Failure Probabilities in Computing and Data Transmission
Asmussen, Søren
We study efficient simulation algorithms for estimating P(Χ > χ), where Χ is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where one tries to identify a good importance distribution via an asymptotic description of the c...... the computational effort is taken into account. To resolve this problem, an alternative algorithm using two-sided Lundberg bounds is suggested....
Importance sampling for failure probabilities in computing and data transmission
Asmussen, Søren
2009-01-01
In this paper we study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where a good importance distribution is identified via an asymptotic description...... these asymptotic descriptions have bounded relative error as x→∞ when combined with the ideas used for a fixed t. Nevertheless, we give examples of algorithms carefully designed to enjoy bounded relative error that may provide little or no asymptotic improvement over crude Monte Carlo simulation when......
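The basic importance-sampling idea (reweight draws from a distribution under which the rare event is common) can be sketched for a toy tail probability; this is illustrative only, not the paper's restart model or its Lundberg-bound algorithms:

```python
import numpy as np

rng = np.random.default_rng(13)

# Toy rare event: p = P(X > x) for X ~ Exp(1), so p = exp(-x) is tiny
x, n = 20.0, 10000
true_p = np.exp(-x)

# Importance distribution: Exp(rate) with rate = 1/x, under which the
# region {X > x} is hit often; correct with the likelihood ratio f/g.
rate = 1.0 / x
draws = rng.exponential(scale=1.0 / rate, size=n)
weights = np.exp(-draws) / (rate * np.exp(-rate * draws))
estimate = np.mean((draws > x) * weights)
rel_error = abs(estimate - true_p) / true_p
# Crude Monte Carlo would need on the order of 1/p (about 5e8) draws
# to observe the event even once; here a large share of draws exceed x.
```

The art, as the abstract notes, is choosing the importance distribution so that the relative error stays bounded as the threshold grows.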
A log rank type test in observational survival studies with stratified sampling.
Bai, Xiaofei; Tsiatis, Anastasios A
2016-04-01
In randomized clinical trials, the log rank test is often used to test the null hypothesis of the equality of treatment-specific survival distributions. In observational studies, however, the ordinary log rank test is no longer guaranteed to be valid. In such studies we must be cautious about potential confounders, that is, covariates that affect both the treatment assignment and the survival distribution. In this paper, two cases are considered: the first, when it is believed that all potential confounders are captured in the primary database; and the second, when a substudy is conducted to capture additional confounding covariates. We generalize the augmented inverse probability weighted complete case estimators for treatment-specific survival distributions proposed in Bai et al. (Biometrics 69:830-839, 2013) and develop the log rank type test in both cases. The consistency and double robustness of the proposed test statistics are shown in simulation studies. These statistics are then applied to the data from the observational study that motivated this research.
A. Martín Andrés
2015-01-01
Full Text Available The Mantel-Haenszel test is the most frequent asymptotic test used for analyzing stratified 2 × 2 tables. Its exact alternative is the test of Birch, which has recently been reconsidered by Jung. Both tests have a conditional origin: Pearson's chi-squared test and Fisher's exact test, respectively. But both tests share the same drawback: the result of the global test (the stratified test) may not be compatible with the results of the individual tests (the test for each stratum). In this paper, we propose to carry out the global test using a multiple comparisons method (MC method), which does not have this disadvantage. By refining the method (the MCB method), an alternative to the Mantel-Haenszel and Birch tests may be obtained. The new MC and MCB methods have the advantage that they may be applied from an unconditional view, a methodology which until now has not been applied to this problem. We also propose some sample size calculation methods.
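For reference, the classical Mantel-Haenszel statistic that the proposed MC method is set against can be computed directly (this is the standard textbook formula, not the authors' new method; the two strata below are hypothetical):

```python
def mantel_haenszel_chi2(tables, correction=True):
    """Mantel-Haenszel chi-squared statistic for stratified 2x2 tables.
    Each table is ((a, b), (c, d)): rows = groups, columns = outcomes."""
    num, var = 0.0, 0.0
    for (a, b), (c, d) in tables:
        n1, n0 = a + b, c + d            # row totals
        m1, m0 = a + c, b + d            # column totals
        N = n1 + n0
        num += a - n1 * m1 / N           # observed minus expected
        var += n1 * n0 * m1 * m0 / (N ** 2 * (N - 1))
    adj = 0.5 if correction else 0.0     # continuity correction
    return (abs(num) - adj) ** 2 / var

# Two hypothetical strata showing the same direction of association
tables = [((20, 10), (10, 20)),
          ((15, 5), (5, 15))]
chi2 = mantel_haenszel_chi2(tables)   # refer to chi-square with 1 df
```

Because the statistic pools observed-minus-expected counts across strata before squaring, a significant global value can coexist with non-significant strata, which is exactly the incompatibility the MC method is designed to avoid.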
Miller, Peter G; Johnston, Jennifer; Dunn, Matthew; Fry, Craig L; Degenhardt, Louisa
2010-02-01
The use of ecstasy and related drugs (ERDs) has increasingly been the focus of epidemiological and other public health-related research. One of the more promising methods is the use of the Internet as a recruitment and survey tool. However, methodological concerns and questions about representativeness remain. Three samples of ERD users in Melbourne, Australia, surveyed in 2004 are compared in terms of a number of key demographic and drug use variables. The Internet, face-to-face, and probability sampling methods appear to access similar but not identical groups of ERD users. Implications and limitations of the study are noted and future research is recommended.
Bhatia, Triptish; Gettig, Elizabeth A; Gottesman, Irving I; Berliner, Jonathan; Mishra, N N; Nimgaonkar, Vishwajit L; Deshpande, Smita N
2016-12-01
Schizophrenia (SZ) has an estimated heritability of 64-88%, with the higher values based on twin studies. Conventionally, family history of psychosis is the best individual-level predictor of risk, but reliable risk estimates are unavailable for Indian populations. Genetic, environmental, and epigenetic factors are equally important and should be considered when predicting risk in 'at risk' individuals. To estimate risk based on an Indian schizophrenia participant's family history combined with selected demographic factors, and to stratify risk, we constructed a regression equation that included demographic variables in addition to family history. The equation was tested in two independent Indian samples: (i) an initial sample of SZ participants (N=128) with one sibling or offspring; and (ii) a second, independent sample consisting of multiply affected families (N=138 families, with two or more sibs/offspring affected with SZ). The overall estimated risk was 4.31±0.27 (mean±standard deviation). In the initial sample, 19 individuals (14.8%) were in the high-risk group, 75 (58.6%) in the moderate-risk group, and 34 (26.6%) in the above-average-risk group. In the validation sample, risks were distributed as: high (45%), moderate (38%), and above average (17%). Consistent risk estimates were obtained from both samples using the regression equation. Familial risk can be combined with demographic factors to estimate risk for SZ in India. If replicated, the proposed stratification of risk may be easier and more realistic for family members. Copyright © 2016. Published by Elsevier B.V.
Predominantly Low Metallicities Measured in a Stratified Sample of Lyman Limit Systems at z=3.7
Glidden, Ana; Cooksey, Kathy L; Simcoe, Robert A; O'Meara, John M
2016-01-01
We analyzed metallicities for 33 z = 3.4-4.2 absorption line systems with large neutral hydrogen column densities, drawn from an H I-selected sample of Lyman limit systems (LLSs) identified in Sloan Digital Sky Survey (SDSS) quasar spectra and stratified based on metal line features. We obtained higher-resolution spectra with the Keck Echellette Spectrograph and Imager (ESI), selecting targets according to our stratification scheme in an effort to fully sample the metallicity distribution of the LLS population. We established a plausible range of H I column densities and measured the metal column densities (or limits) for ions of carbon, silicon, and aluminum. With simulations, we found ionization-corrected metallicities or upper limits, when appropriate. Interestingly, our ionization models were better constrained with enhanced α-to-aluminum abundances, with a median abundance ratio of [α/Al] = 0.3. Measured metallicities were generally low, ranging from [M/H] = -3 to -1.68, with even lower metallicities...
Quantifying recent erosion and sediment delivery using probability sampling: A case study
Jack Lewis
2002-01-01
Abstract - Estimates of erosion and sediment delivery have often relied on measurements from locations that were selected to be representative of particular terrain types. Such judgement samples are likely to overestimate or underestimate the mean of the quantity of interest. Probability sampling can eliminate the bias due to sample selection, and it permits the...
Estimating probability curves of rock variables using orthogonal polynomials and sample moments
DENG Jian; BIAN Li
2005-01-01
A new algorithm using orthogonal polynomials and sample moments is presented for estimating probability curves directly from experimental or field data of rock variables. The moments estimated directly from a sample of observed values of a random variable may be conventional moments (moments about the origin or central moments) or probability-weighted moments (PWMs). Probability curves derived from orthogonal polynomials and conventional moments are probability density functions (PDFs), while probability curves derived from orthogonal polynomials and PWMs are inverse cumulative distribution functions (CDFs) of random variables. The proposed approach is verified using the two most commonly used theoretical standard distributions: the normal and the exponential. Examples from observed data on the uniaxial compressive strength of a rock and on concrete strength are presented for illustrative purposes. The results show that probability curves of rock variables can be accurately derived from orthogonal polynomials and sample moments. Orthogonal polynomials and PWMs enable more secure inferences to be made from relatively small samples about an underlying probability curve.
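A minimal sketch of sample probability-weighted moments, using the standard unbiased order-statistics estimator of b_r = E[X F(X)^r] (the function name is ours, and the tiny data set is purely illustrative):

```python
def sample_pwms(data, nmom=3):
    """Unbiased sample probability-weighted moments b_r = E[X F(X)^r],
    r = 0..nmom-1, computed from the sorted sample (order statistics)."""
    x = sorted(data)
    n = len(x)
    betas = []
    for r in range(nmom):
        total = 0.0
        for i, xi in enumerate(x, start=1):
            w = 1.0
            for j in range(r):
                # weight (i-1)(i-2)...(i-r) / ((n-1)(n-2)...(n-r))
                w *= (i - 1 - j) / (n - 1 - j)
            total += w * xi
        betas.append(total / n)
    return betas

b0, b1, b2 = sample_pwms([1.0, 2.0, 3.0, 4.0])
```

The first two L-moments follow as λ1 = b0 and λ2 = 2·b1 − b0; for exponential data the ratio λ2/λ1 is close to 1/2, which is one way PWMs characterize a distribution from a small sample.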
Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino
2012-01-01
Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...
Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.
Ojeda, Mario Miguel; Sahai, Hardeo
2002-01-01
Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…
Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere
2011-01-01
Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling me...
A country-wide probability sample of public attitudes toward stuttering in Portugal.
Valente, Ana Rita S; St Louis, Kenneth O; Leahy, Margaret; Hall, Andreia; Jesus, Luis M T
2017-06-01
Negative public attitudes toward stuttering have been widely reported, although differences among countries and regions exist. Clear reasons for these differences remain obscure. No published research exists on public attitudes toward stuttering in Portugal, nor on a representative sample exploring stuttering attitudes in an entire country. This study sought to (a) determine the feasibility of a country-wide probability sampling scheme to measure public stuttering attitudes in Portugal using a standard instrument (the Public Opinion Survey of Human Attributes-Stuttering [POSHA-S]) and (b) identify demographic variables that predict Portuguese attitudes. The POSHA-S was translated to European Portuguese through a five-step process. Thereafter, a local administrative office-based, three-stage, cluster, probability sampling scheme was carried out to obtain 311 adult respondents who filled out the questionnaire. The Portuguese population held stuttering attitudes that were generally within the average range of those observed from numerous previous POSHA-S samples. Demographic variables that predicted more versus less positive stuttering attitudes were respondents' age, region of the country, years of school completed, working situation, and number of languages spoken. Non-predicting variables were respondents' sex, marital status, and parental status. A local administrative office-based, probability sampling scheme generated a respondent profile similar to census data and indicated that Portuguese attitudes are generally typical. Copyright © 2017 Elsevier Inc. All rights reserved.
Statistical model for degraded DNA samples and adjusted probabilities for allelic drop-out
Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt
2012-01-01
DNA samples found at a scene of crime or obtained from the debris of a mass disaster accident are often subject to degradation. When using the STR DNA technology, the DNA profile is observed via a so-called electropherogram (EPG), where the alleles are identified as signal peaks above a certain ... -outs. In this paper, we present a method for measuring the degree of degradation of a sample and demonstrate how to incorporate this in estimating the probability of allelic drop-out. This is done by extending an existing method derived for non-degraded samples. The performance of the methodology is evaluated using ...
Estimation of failure probabilities of linear dynamic systems by importance sampling
Anna Ivanova Olsen; Arvid Naess
2006-08-01
An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold. The iteration procedure is a two-step method. On the first iteration, a simple control function promoting failure is constructed using the design point weighting principle. After time discretization, two points are chosen to construct a compound deterministic control function. It is based on the time point when the first maximum of the homogeneous solution has occurred and on the point at the end of the considered time interval. An importance sampling technique is used in order to estimate the failure probability functional on a set of initial values of state space variables and time. On the second iteration, the concept of an optimal control function is implemented to construct a Markov control, which allows much better accuracy in the failure probability estimate than the simple control function. On both iterations, the concept of changing the probability measure by the Girsanov transformation is utilized. As a result, the CPU time is substantially reduced compared with the crude Monte Carlo procedure.
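The measure-change idea can be illustrated on a far simpler static problem: estimating a small normal tail probability by sampling from a density shifted into the failure region and reweighting by the likelihood ratio. This sketch only shows the reweighting step; the oscillator setting above needs the full Girsanov machinery, and the function name and parameters here are ours:

```python
import math
import random

def failure_prob_is(threshold, n=100_000, seed=1):
    """Importance-sampling estimate of P(Z > threshold) for Z ~ N(0,1),
    drawing from the shifted proposal N(threshold, 1) - a simple change
    of probability measure concentrating samples near failure."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)      # proposal centred at the threshold
        if x > threshold:                  # indicator of failure
            # likelihood ratio phi(x) / phi(x - threshold)
            total += math.exp(-threshold * x + threshold ** 2 / 2)
    return total / n

p = failure_prob_is(3.0)   # true value is 1 - Phi(3) ~ 1.35e-3
```

Roughly half the proposal draws land in the failure region, versus about 0.1% under crude Monte Carlo, which is where the variance (and CPU time) reduction comes from.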
Dai, Mi; Wang, Yun
2016-06-01
In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (pdfs) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the 'joint lightcurve analysis (JLA)' data set of SNe Ia, we find that sampling the SN Ia lightcurve parameter pdfs leads to cosmological parameters closer to those of a flat Universe with a cosmological constant, compared to the usual practice of using only the best-fitting values of the SN Ia lightcurve parameters. Our method will be useful in the use of SN Ia data for precision cosmology.
Effect of Sampling Depth on Air-Sea CO2 Flux Estimates in River-Stratified Arctic Coastal Waters
Miller, L. A.; Papakyriakou, T. N.
2015-12-01
In summer-time Arctic coastal waters that are strongly influenced by river run-off, extreme stratification severely limits wind mixing, making it difficult to effectively sample the surface 'mixed layer', which can be as shallow as 1 m, from a ship. During two expeditions in southwestern Hudson Bay, off the Nelson, Hayes, and Churchill River estuaries, we confirmed that sampling depth has a strong impact on estimates of 'surface' pCO2 and calculated air-sea CO2 fluxes. We determined pCO2 in samples collected from 5 m, using a typical underway system on the ship's seawater supply; from the 'surface' rosette bottle, which was generally between 1 and 3 m; and using a niskin bottle deployed at 1 m and just below the surface from a small boat away from the ship. Our samples confirmed that the error in pCO2 derived from typical ship-board versus small-boat sampling at a single station could be nearly 90 μatm, leading to errors in the calculated air-sea CO2 flux of more than 0.1 mmol/(m2s). Attempting to extrapolate such fluxes over the 6,000,000 km2 area of the Arctic shelves would generate an error approaching a gigamol CO2/s. Averaging the station data over a cruise still resulted in an error of nearly 50% in the total flux estimate. Our results have implications not only for the design and execution of expedition-based sampling, but also for placement of in-situ sensors. Particularly in polar waters, sensors are usually deployed on moorings, well below the surface, to avoid damage and destruction from drifting ice. However, to obtain accurate information on air-sea fluxes in these areas, it is necessary to deploy sensors on ice-capable buoys that can position the sensors in true 'surface' waters.
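The sensitivity to sampling depth follows directly from the bulk flux formula F = k·K0·(pCO2,water − pCO2,air). With assumed, purely illustrative values for the gas transfer velocity k and the solubility K0 (not taken from the study), a ~90 μatm error in 'surface' pCO2 can even flip the sign of the computed flux:

```python
def co2_flux(k_m_per_day, K0_mol_m3_uatm, pco2_water, pco2_air):
    """Bulk air-sea CO2 flux F = k * K0 * (pCO2_water - pCO2_air),
    positive = outgassing. Units: k in m/day, K0 in mol m^-3 uatm^-1,
    pCO2 in uatm; F is returned in mol m^-2 day^-1."""
    return k_m_per_day * K0_mol_m3_uatm * (pco2_water - pco2_air)

# Assumed values: k = 3 m/day, K0 = 4.0e-5 mol m^-3 uatm^-1, air pCO2 = 400 uatm
flux_5m = co2_flux(3.0, 4.0e-5, 450.0, 400.0)       # pCO2 sampled at 5 m depth
flux_surface = co2_flux(3.0, 4.0e-5, 360.0, 400.0)  # true surface (< 1 m) pCO2
bias = flux_5m - flux_surface   # the 90 uatm sampling error flips the flux sign
```

Here the 5 m sample implies outgassing while the true surface value implies uptake, which is the qualitative error the expedition data demonstrated.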
Carlisle, J B; Dexter, F; Pandit, J J; Shafer, S L; Yentis, S M
2015-07-01
In a previous paper, one of the authors (JBC) used a chi-squared method to analyse the means (SD) of baseline variables, such as height or weight, from randomised controlled trials by Fujii et al., concluding that the probabilities that the reported distributions arose by chance were infinitesimally small. Subsequent testing of that chi-squared method, using simulation, suggested that the method was incorrect. This paper corrects the chi-squared method and tests its performance and the performance of Monte Carlo simulations and ANOVA to analyse the probability of random sampling. The corrected chi-squared method and the ANOVA method became inaccurate when applied to means that were reported imprecisely. Monte Carlo simulations confirmed that baseline data from 158 randomised controlled trials by Fujii et al. were different to those from 329 trials published by other authors, and that the distribution of Fujii et al.'s data was different to the expected distribution, both with extremely small p values, identifying non-random (i.e. unreliable) data in randomised controlled trials submitted to journals. © 2015 The Association of Anaesthetists of Great Britain and Ireland.
Reliable Sampled-Data Control of Fuzzy Markovian Systems with Partly Known Transition Probabilities
Sakthivel, R.; Kaviarasan, B.; Kwon, O. M.; Rathika, M.
2016-08-01
This article presents a fuzzy dynamic reliable sampled-data control design for nonlinear Markovian jump systems, where the nonlinear plant is represented by a Takagi-Sugeno fuzzy model and the transition probability matrix for the Markov process is permitted to be partially known. In addition, a generalised and more practical real-world actuator fault model, consisting of both linear and nonlinear fault terms, is incorporated into the system. Then, based on the construction of an appropriate Lyapunov-Krasovskii functional and the employment of a convex combination technique together with the free-weighting matrices method, sufficient conditions guaranteeing the robust stochastic stability of the system under consideration and the existence of the proposed controller are derived in terms of linear matrix inequalities, which can be easily solved by any available standard numerical software. Finally, a numerical example is provided to illustrate the validity of the proposed methodology.
EFFECTS OF VARYING THE PROBABILITY OF REINFORCEMENT ON MATCHING-TO-SAMPLE TASKS IN PIGEONS
CLAUDIO CARPIO
2003-07-01
Full Text Available The effects of varying the probability of reinforcement of responses to the identical (PSRi) and different (PSRd) comparison stimuli in matching-to-sample tasks were evaluated. PSRi was varied in a descending-ascending manner while PSRd was varied in an ascending-descending manner. The values of PSRi were 1.0, 0.75, 0.50, 0.25, 0.0, 0.25, 0.50, 0.75, 1.0, while the values of PSRd were 0.0, 0.25, 0.50, 0.75, 1.0, 0.75, 0.50, 0.25, 0.0. The results show that the distribution of responses to the identical and different comparison stimuli was a positive function of the PSRi and PSRd values.
Valverde Arias, Omar; Valencia, José; Saa Requejo, António; Garrido, Alberto
2017-04-01
Index-based insurance for farming has become an efficient alternative for farmers to transfer risk. Currently, in Ecuador, there is conventional agricultural insurance for the rice crop, but it is necessary to develop index-based insurance that could cover more farmers against extreme events with catastrophic consequences. Such index-based insurance could estimate crop losses from drought through the NDVI (Normalized Difference Vegetation Index). A first step was to establish homogeneous areas where rice is cultivated, based on a Principal Component Analysis of soil properties (Valverde et al., 2016). Two main areas were found (f7 and f15), distinguished mainly by slope, texture, and effective depth. These are the sites considered for sampling and studying the NDVI. MODIS images of 250 x 250 m resolution were selected for the study area, Babahoyo canton (Los Rios province, Ecuador), and the NDVI was calculated at the rice growth stage in both sites over several years. The number of samples in each site was proportional to the area of cultivated rice. NDVI distribution values were calculated in each homogeneous zone (f7 and f15) through the years. Several statistical analyses were performed to investigate the differences between the two sites. Results are discussed in the context of index-based insurance.
Miller, Ezer; Huppert, Amit; Novikov, Ilya; Warburg, Alon; Hailu, Asrat; Abbasi, Ibrahim; Freedman, Laurence S
2015-11-10
In this work, we describe a two-stage sampling design to estimate the infection prevalence in a population. In the first stage, an imperfect diagnostic test was performed on a random sample of the population. In the second stage, a different imperfect test was performed in a stratified random sample of the first sample. To estimate infection prevalence, we assumed conditional independence between the diagnostic tests and develop method of moments estimators based on expectations of the proportions of people with positive and negative results on both tests that are functions of the tests' sensitivity, specificity, and the infection prevalence. A closed-form solution of the estimating equations was obtained assuming a specificity of 100% for both tests. We applied our method to estimate the infection prevalence of visceral leishmaniasis according to two quantitative polymerase chain reaction tests performed on blood samples taken from 4756 patients in northern Ethiopia. The sensitivities of the tests were also estimated, as well as the standard errors of all estimates, using a parametric bootstrap. We also examined the impact of departures from our assumptions of 100% specificity and conditional independence on the estimated prevalence. Copyright © 2015 John Wiley & Sons, Ltd.
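Under the stated assumptions (conditional independence of the tests and 100% specificity for both), the moment equations have a closed-form solution. A toy sketch with hypothetical positive-test proportions follows; the real design estimates the second-test proportions from the stratified second-stage subsample, which this simplified version ignores:

```python
def prevalence_mom(p1, p2, p12):
    """Method-of-moments estimates assuming conditional independence and
    100% specificity for both tests:
      p1  = P(test 1 positive) = pi * s1
      p2  = P(test 2 positive) = pi * s2
      p12 = P(both positive)   = pi * s1 * s2
    which gives pi = p1*p2/p12, s1 = p12/p2, s2 = p12/p1."""
    return p1 * p2 / p12, p12 / p2, p12 / p1

# Hypothetical truth pi=0.2, s1=0.9, s2=0.7 implies these observed proportions:
pi_hat, s1_hat, s2_hat = prevalence_mom(0.18, 0.14, 0.126)
```

The estimator recovers the generating values exactly on these noise-free proportions; with real data, sampling error propagates through the same equations (hence the bootstrap standard errors in the study).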
Marcos Adami
2010-06-01
Full Text Available The objective of this work was to evaluate the performance of a probabilistic model of point-based stratified sampling and to define an adequate sample size for estimating the soybean-cultivated area in the state of Rio Grande do Sul, Brazil. The area was stratified according to the percentage of soybean cultivated in each municipality of the state: less than 20%, from 20 to 40%, and more than 40%. Estimates obtained from six sample sizes were evaluated, resulting from the combination of three significance levels (10, 5, and 1%) and two sampling errors (5 and 2.5%); 400 random draws were performed for each sample size. The estimates were evaluated against the soybean area obtained from a reference thematic map, derived from a careful automatic and visual classification of multitemporal TM/Landsat-5 and ETM+/Landsat-7 satellite images, available for the 2000/2001 crop year. The soybean area in Rio Grande do Sul can be estimated by a point-based stratified probabilistic sampling model, with the best estimate obtained for the largest sample size (1,990 points), differing by only -0.14% from the reference-map estimate, with a coefficient of variation of 6.98%.
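The stratified point-sampling estimator behind this kind of area study can be sketched as follows; the stratum weights, point counts, and hit counts below are invented for illustration:

```python
def stratified_area_estimate(strata):
    """Stratified point-sampling estimate of a cultivated-area fraction.
    strata: list of (W_h, n_h, y_h) = stratum weight (share of total area),
    points sampled in the stratum, and points falling on the target crop.
    Returns (p_hat, standard error), ignoring finite-population correction."""
    p_hat, var = 0.0, 0.0
    for W, n, y in strata:
        p = y / n                      # per-stratum crop fraction
        p_hat += W * p                 # weighted combination
        var += W ** 2 * p * (1 - p) / n
    return p_hat, var ** 0.5

# Hypothetical strata mirroring the <20%, 20-40%, >40% soybean-cover classes
strata = [(0.5, 400, 40), (0.3, 300, 90), (0.2, 300, 180)]
p_hat, se = stratified_area_estimate(strata)
```

Multiplying p_hat by the total state area converts the fraction into hectares; the stratification pays off because within-stratum proportions vary much less than the overall one.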
Bakri, Barbara; Weimer, Marco; Hauck, Gerrit; Reich, Gabriele
2015-11-01
The scope of the study was (1) to develop a lean quantitative calibration for real-time near-infrared (NIR) blend monitoring that meets the requirements of early development of pharmaceutical products and (2) to compare the prediction performance of this approach with the results obtained from stratified sampling using a sample thief in combination with off-line high pressure liquid chromatography (HPLC) and at-line near-infrared chemical imaging (NIRCI). Tablets were manufactured from powder blends and analyzed with NIRCI and HPLC to verify the real-time results. The model formulation contained 25% w/w naproxen as a cohesive active pharmaceutical ingredient (API), microcrystalline cellulose and croscarmellose sodium as cohesive excipients, and free-flowing mannitol. Five in-line NIR calibration approaches, all using the spectra from the end of the blending process as reference for PLS modeling, were compared in terms of selectivity, precision, prediction accuracy, and robustness. High selectivity could be achieved with a "reduced" (i.e., API- and time-saving) approach (35% reduction in API amount) based on six concentration levels of the API, with three levels realized by three independent powder blends and the additional levels obtained by simply increasing the API concentration in these blends. Accuracy and robustness were further improved by combining this calibration set with a second independent data set comprising different excipient concentrations and reflecting different environmental conditions. The combined calibration model was used to monitor the blending process of independent batches. For this model formulation, the target concentration of the API could be achieved within 3 min, indicating a short blending time. The in-line NIR approach was verified by stratified sampling HPLC and NIRCI results. All three methods revealed comparable results regarding blend end point determination. Differences in both mean API concentration and RSD values could be
Rosanowski, S M; Cogger, N; Rogers, C W; Benschop, J; Stevenson, M A
2012-12-01
We conducted a cross-sectional survey to determine the demographic characteristics of non-commercial horses in New Zealand. A sampling frame of properties with non-commercial horses was derived from the national farms database, AgriBase™. Horse properties were stratified by property size and a generalised random-tessellated stratified (GRTS) sampling strategy was used to select properties (n=2912) to take part in the survey. The GRTS sampling design allowed for the selection of properties that were spatially balanced relative to the distribution of horse properties throughout the country. The registered decision maker of the property, as identified in AgriBase™, was sent a questionnaire asking them to describe the demographic characteristics of horses on the property, including the number and reason for keeping horses, as well as information about other animals kept on the property and the proximity of boundary neighbours with horses. The response rate to the survey was 38% (1044/2912) and the response rate was not associated with property size or region. A total of 5322 horses were kept for recreation, competition, racing, breeding, stock work, or as pets. The reasons for keeping horses and the number and class of horses varied significantly between regions and by property size. Of the properties sampled, less than half kept horses that could have been registered with Equestrian Sports New Zealand or either of the racing codes. Of the respondents that reported knowing whether their neighbours had horses, 58.6% (455/776) of properties had at least one boundary neighbour that kept horses. The results of this study have important implications for New Zealand, which has an equine population that is naïve to many equine diseases considered endemic worldwide. The ability to identify, and apply accurate knowledge of the population at risk to infectious disease control strategies would lead to more effective strategies to control and prevent disease spread during an
Victimization and PTSD-like states in an Icelandic youth probability sample
Elklit Ask
2007-10-01
Full Text Available Abstract Background Although adolescence in many cases is a period of rebellion and experimentation with new behaviors and roles, the exposure of adolescents to life-threatening and violent events has rarely been investigated in national probability studies using a broad range of events. Methods In an Icelandic national representative sample of 206 9th-grade students (mean age = 14.5 years), the prevalence of 20 potentially traumatic events and negative life events was reported, along with the psychological impact of these events. Results Seventy-four percent of the girls and 79 percent of the boys were exposed to at least one event. The most common events were the death of a family member, threat of violence, and traffic accidents. The estimated lifetime prevalence of posttraumatic stress disorder-like states (PTSD; DSM-IV, APA, 1994) was 16 percent, whereas another 12 percent reached a sub-clinical level of PTSD-like states (missing the full diagnosis by one symptom). Following exposure, girls suffered from PTSD-like states almost twice as often as boys. Gender, mothers' education, and single-parenthood were associated with specific events. The odds ratios and 95% CIs for PTSD-like states given a specific event are reported. Being exposed to multiple potentially traumatic events was associated with an increase in PTSD-like states. Conclusion The findings indicate substantial mental health problems in adolescents that are associated with various types of potentially traumatic exposure.
Epidemiology of undiagnosed trichomoniasis in a probability sample of urban young adults.
Rogers, Susan M; Turner, Charles F; Hobbs, Marcia; Miller, William C; Tan, Sylvia; Roman, Anthony M; Eggleston, Elizabeth; Villarroel, Maria A; Ganapathi, Laxminarayana; Chromy, James R; Erbelding, Emily
2014-01-01
T. vaginalis infection (trichomoniasis) is the most common curable sexually transmitted infection (STI) in the U.S. It is associated with increased HIV risk and adverse pregnancy outcomes. Trichomoniasis surveillance data do not exist for either national or local populations. The Monitoring STIs Survey Program (MSSP) collected survey data and specimens which were tested using nucleic acid amplification tests to monitor trichomoniasis and other STIs in 2006-09 among a probability sample of young adults (N = 2,936) in Baltimore, Maryland--an urban area with high rates of reported STIs. The estimated prevalence of trichomoniasis was 7.5% (95% CI 6.3, 9.1) in the overall population and 16.1% (95% CI 13.0, 19.8) among Black women. The overwhelming majority of infected men (98.5%) and women (73.3%) were asymptomatic. Infections were more common in both women (OR = 3.6, 95% CI 1.6, 8.2) and men (OR = 9.0, 95% CI 1.8, 44.3) with concurrent chlamydial infection. Trichomoniasis did not vary significantly by age for either men or women. Women with two or more partners in the past year and women with a history of personal or partner incarceration were more likely to have an infection. Overall, these results suggest that routine T vaginalis screening in populations at elevated risk of infection should be considered.
Virginia Homfray
Full Text Available It is well established that male circumcision reduces acquisition of HIV, herpes simplex virus 2, chancroid, and syphilis. However, the effect on the acquisition of non-ulcerative sexually transmitted infections (STIs) remains unclear. We examined the relationship between circumcision and biological measures of three STIs: human papillomavirus (HPV), Chlamydia trachomatis, and Mycoplasma genitalium. A probability sample survey of 15,162 men and women aged 16-74 years (including 4,060 men aged 16-44 years) was carried out in Britain between 2010 and 2012. Participants completed a computer-assisted personal interview, including a computer-assisted self-interview, which asked about experience of STI diagnoses and circumcision. Additionally, 1,850 urine samples from sexually experienced men aged 16-44 years were collected and tested for STIs. Multivariable logistic regression was used to calculate adjusted odds ratios (AORs) to quantify associations between circumcision and (i) self-reporting any STI diagnosis and (ii) presence of STIs in urine, in men aged 16-44 years, adjusting for key socio-demographic and sexual behavioural factors. The prevalence of circumcision in sexually experienced men aged 16-44 years was 17.4% (95% CI 16.0-19.0). There was no association between circumcision and reporting any previous STI diagnoses, specifically previous chlamydia or genital warts. However, circumcised men were less likely to have any HPV type (AOR 0.26, 95% confidence interval (CI) 0.13-0.50), including high-risk HPV types (HPV-16, 18, 31, 33, 35, 39, 45, 51, 52, 56, 58, 59 and/or 68; AOR 0.14, 95% CI 0.05-0.40), detected in urine. Circumcised men had reduced odds of HPV detection in urine. These findings have implications for improving the precision of models of STI transmission in populations with different circumcision prevalence and in designing interventions to reduce STI acquisition.
Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb
2008-01-01
The proportionator is a novel and radically different approach to sampling with microscopes, based on well-known statistical theory (probability proportional to size, PPS, sampling). It uses automatic image analysis, with a large range of options, to assign to every field of view in the section a weight proportional to some characteristic of the structure under study. The desired number of fields is then sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections. Because of its entirely different sampling strategy, based on known but non-uniform sampling probabilities, the proportionator for the first time allows the real CE at the section level to be automatically estimated (not just predicted), unbiased, for all estimators and at no extra cost to the user.
Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
Lapeyrouse, L M; Morera, O; Heyman, J M C; Amaya, M A; Pingitore, N E; Balcazar, H
2012-04-01
Examination of border-specific characteristics such as trans-border mobility and transborder health service illuminates the heterogeneity of border Hispanics and may provide greater insight toward understanding differential health behaviors and status among these populations. In this study, we create a descriptive profile of the concept of trans-border mobility by exploring the relationship between mobility status and a series of demographic, economic and socio-cultural characteristics among mobile and non-mobile Hispanics living in the El Paso-Juarez border region. Using a two-stage stratified random sampling design, bilingual interviewers collected survey data from border residents (n = 1,002). Findings show that significant economic, cultural, and behavioral differences exist between mobile and non-mobile respondents. While non-mobile respondents were found to have higher social economic status than their mobile counterparts, mobility across the border was found to offer less acculturated and poorer Hispanics access to alternative sources of health care and other services.
Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro
2016-02-01
Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies.
Williams, Michael S; Ebel, Eric D
2014-11-18
The fitting of statistical distributions to chemical and microbial contamination data is a common application in risk assessment. These distributions are used to make inferences regarding even the most pedestrian of statistics, such as the population mean. The reason for the heavy reliance on a fitted distribution is the presence of left-, right-, and interval-censored observations in the data sets, with censored observations being the result of nondetects in an assay, the use of screening tests, and other practical limitations. Considerable effort has been expended to develop statistical distributions and fitting techniques for a wide variety of applications. Of the various fitting methods, Markov Chain Monte Carlo methods are common. An underlying assumption for many of the proposed Markov Chain Monte Carlo methods is that the data represent independent and identically distributed (iid) observations from an assumed distribution. This condition is satisfied when samples are collected using a simple random sampling design. Unfortunately, samples of food commodities are generally not collected in accordance with a strict probability design. Nevertheless, pseudosystematic sampling efforts (e.g., collection of a sample hourly or weekly) from a single location in the farm-to-table continuum are reasonable approximations of a simple random sample. The assumption that the data represent an iid sample from a single distribution is more difficult to defend if samples are collected at multiple locations in the farm-to-table continuum or risk-based sampling methods are employed to preferentially select samples that are more likely to be contaminated. This paper develops a weighted bootstrap estimation framework that is appropriate for fitting a distribution to microbiological samples that are collected with unequal probabilities of selection. An example based on microbial data, derived by the Most Probable Number technique, demonstrates the method and highlights the
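The core weighted-bootstrap idea can be sketched in a few lines: each replicate resamples observations with probability proportional to their design weights w_i = 1/pi_i before recomputing the statistic. This is a generic illustration under invented toy data, not the authors' exact estimation framework.

```python
import random

def weighted_bootstrap_mean(values, sel_probs, n_boot=2000, seed=1):
    """Bootstrap the population mean for data collected with unequal
    selection probabilities: replicates draw observations with probability
    proportional to the design weight w_i = 1 / pi_i."""
    rng = random.Random(seed)
    weights = [1.0 / p for p in sel_probs]
    n = len(values)
    reps = []
    for _ in range(n_boot):
        draw = rng.choices(values, weights=weights, k=n)
        reps.append(sum(draw) / n)
    reps.sort()
    # Point estimate plus a percentile 95% interval from the replicates
    return sum(reps) / n_boot, (reps[int(0.025 * n_boot)], reps[int(0.975 * n_boot)])

# Unit 3 was preferentially under-sampled (pi = 0.1), so it gets up-weighted
mean, ci = weighted_bootstrap_mean([2.1, 3.5, 0.4, 5.0], [0.8, 0.6, 0.1, 0.9])
```

The percentile interval is the simplest choice; the weighting, not the interval construction, is the point being illustrated.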
Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...
2015-01-01
The prevalence of premarital sex among senior students at universities in Anhui province was investigated. To protect the privacy of respondents, the randomized response technique (RRT) was combined with a stratified three-stage sampling method, and the proportion of senior students reporting premarital sex was analyzed using the Warner model for a sensitive attribute characteristic. Using the law of total probability and the basic properties of variance from probability and mathematical statistics, together with Cochran's classical sampling theory, the proportion of senior students reporting premarital sex and its variance were derived at each stratum and stage. The survey reveals that the proportion of senior students reporting premarital sex is high; undergraduates should therefore be actively guided to treat questions of premarital sex properly and rationally.
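The Warner model referenced above has a closed-form estimator: each respondent is directed by a randomizing device, with known probability p, to answer either the sensitive statement or its negation, so the interviewer never learns which question was answered. A minimal sketch (function name and the example figures are invented for illustration):

```python
def warner_estimate(n_yes, n, p):
    """Warner randomized-response estimator of a sensitive proportion pi.

    lam_hat = observed proportion of 'yes' answers
    pi_hat  = (lam_hat - (1 - p)) / (2p - 1),  valid for p != 0.5
    var_hat = lam_hat * (1 - lam_hat) / (n * (2p - 1)**2)
    """
    if abs(2 * p - 1) < 1e-12:
        raise ValueError("p = 0.5 makes the model unidentifiable")
    lam_hat = n_yes / n
    pi_hat = (lam_hat - (1 - p)) / (2 * p - 1)
    var_hat = lam_hat * (1 - lam_hat) / (n * (2 * p - 1) ** 2)
    return pi_hat, var_hat

# Hypothetical data: 420 'yes' answers from 1,000 students, device p = 0.7
pi_hat, var_hat = warner_estimate(420, 1000, 0.7)  # pi_hat = 0.30
```

Note the privacy-variance trade-off: as p approaches 0.5 the respondent's answer reveals less, but the variance of the estimator blows up through the (2p - 1)^2 term.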
Eide, Helene K; Šaltytė Benth, Jūratė; Sortland, Kjersti; Halvorsen, Kristin; Almendingen, Kari
2015-01-01
There is a lack of accurate prevalence data on undernutrition and the risk of undernutrition among the hospitalised elderly in Europe and Norway. We aimed at estimating the prevalence of nutritional risk by using stratified sampling along with adequate power calculations. A cross-sectional study was carried out in the period 2011 to 2013 at a university hospital in Norway. Second-year nursing students in acute care clinical studies in twenty hospital wards screened non-demented elderly patients for nutritional risk, employing the Nutritional Risk Screening 2002 (NRS2002) form. In total, 508 patients (48·8 % women and 51·2 % men) with a mean age of 79·6 (sd 6·4) years were screened by the students. Mean BMI was 24·9 (sd 4·9) kg/m(2), and the patients had been hospitalised for on average 5·3 (sd 6·3) d. WHO's BMI cut-off values identified 6·5 % as underweight, 48·0 % as of normal weight and 45·5 % as overweight. Patients nutritionally at risk had been in hospital longer and had lower average weight and BMI compared with those not at risk (all P<0·05). The prevalence of nutritional risk was estimated to be 45·4 % (95 % CI 41·7, 49·0), ranging between 20·0 and 65·0 % on different hospital wards. The present results show that the prevalence of nutritional risk among elderly patients without dementia is high, suggesting that a large proportion of the hospitalised elderly are in need of nutritional treatment.
PIGS: improved estimates of identity-by-descent probabilities by probabilistic IBD graph sampling.
Park, Danny S; Baran, Yael; Hormozdiari, Farhad; Eng, Celeste; Torgerson, Dara G; Burchard, Esteban G; Zaitlen, Noah
2015-01-01
Identifying segments in the genome of different individuals that are identical-by-descent (IBD) is a fundamental element of genetics. IBD data is used for numerous applications including demographic inference, heritability estimation, and mapping disease loci. Simultaneous detection of IBD over multiple haplotypes has proven to be computationally difficult. To overcome this, many state of the art methods estimate the probability of IBD between each pair of haplotypes separately. While computationally efficient, these methods fail to leverage the clique structure of IBD resulting in less powerful IBD identification, especially for small IBD segments.
Cheon, Sooyoung
2013-02-16
Importance sampling and Markov chain Monte Carlo methods have long been used in exact inference for contingency tables; however, their performance is not always satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: it can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for tables with high degrees of freedom.
Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy
2007-07-01
Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random digit dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample, but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.
Statistical model for degraded DNA samples and adjusted probabilities for allelic drop-out
Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt;
2012-01-01
Abstract DNA samples found at a scene of crime or obtained from the debris of a mass disaster accident are often subject to degradation. When using the STR DNA technology, the DNA profile is observed via a so-called electropherogram (EPG), where the alleles are identified as signal peaks above a ...
Comments on “Estimating Income Variances by Probability Sampling: A Case Study by Shah and Aleem”
Jamal Abdul Nasir
2012-06-01
In this article, we offer comments on the recently published article “Shah, A.A. and Aleem, M. (2010). Estimating income variances by probability sampling: a case study. Pakistan Journal of Commerce and Social Sciences, 4(2), 194-201”, suggesting improvements as well as criticism of the paper, and thereby contributing to the journal's repute and ranking.
Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H
2016-09-01
To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. We compared 148 MSM aged 18-64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010-2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%-95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys.
伍长春; 张润楚
2006-01-01
In stratified survey sampling, sometimes we have complete auxiliary information. One of the fundamental questions is how to use the complete auxiliary information effectively at the estimation stage. In this paper, we extend the model-calibration method to obtain estimators of the finite population mean by using complete auxiliary information from stratified sampling survey data. We show that the resulting estimators effectively use auxiliary information at the estimation stage and possess a number of attractive features, such as being asymptotically design-unbiased irrespective of the working model and approximately model-unbiased under the model. When a linear working model is used, the resulting estimators reduce to the usual calibration estimator (or GREG).
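With a single auxiliary variable and a linear working model, the calibration estimator mentioned above reduces to the classical regression (GREG-type) estimator of the mean. A toy sketch with invented data, not the paper's general stratified formulation:

```python
def greg_mean(y, x, x_bar_pop):
    """Regression (GREG-type) estimator of the population mean:
    y_bar_greg = y_bar + b * (X_bar_pop - x_bar),
    with slope b fitted by ordinary least squares on the sample."""
    n = len(y)
    y_bar = sum(y) / n
    x_bar = sum(x) / n
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    b = sxy / sxx
    return y_bar + b * (x_bar_pop - x_bar)

# If y is exactly linear in x, the estimator recovers the model-implied mean:
est = greg_mean(y=[2.0, 4.0, 6.0, 8.0], x=[1.0, 2.0, 3.0, 4.0], x_bar_pop=5.0)
# est == 10.0 (y = 2x, population mean of x is 5)
```

This illustrates why such estimators are called model-assisted: the auxiliary mean X_bar_pop corrects the sample mean, and the correction vanishes when the sample happens to match the population on x.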
Valero, Antonio; Pasquali, Frédérique; De Cesare, Alessandra; Manfreda, Gerardo
2014-08-01
Current sampling plans assume a random distribution of microorganisms in food. However, food-borne pathogens are estimated to be heterogeneously distributed in powdered foods. This spatial distribution, together with very low levels of contamination, raises concern about the efficiency of current sampling plans for the detection of food-borne pathogens like Cronobacter and Salmonella in powdered foods such as powdered infant formula or powdered eggs. An alternative approach based on a Poisson distribution of the contaminated part of the lot (the Habraken approach) was used to evaluate the probability of falsely accepting a contaminated lot of powdered food when different sampling strategies were simulated, considering variables such as lot size, sample size, microbial concentration in the contaminated part of the lot, and proportion of the lot contaminated. The simulated results suggest that a sample size of 100 g or more corresponds to the lowest number of samples to be tested, in comparison with sample sizes of 10 or 1 g. Moreover, the number of samples to be tested decreases greatly if the microbial concentration is 1 CFU/g instead of 0.1 CFU/g, or if the proportion of contamination is 0.05 instead of 0.01. Mean contaminations higher than 1 CFU/g or proportions higher than 0.05 did not affect the number of samples. The Habraken approach represents a useful tool for risk management in designing a fit-for-purpose sampling plan for the detection of low levels of food-borne pathogens in heterogeneously contaminated powdered food. However, although effective in detecting pathogens, these sampling plans are difficult to apply because of the huge number of samples that must be tested. Sampling does not seem to be an effective measure to control pathogens in powdered food.
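Under a Habraken-style model like the one described above, the probability of falsely accepting a contaminated lot has a simple closed form: a sample of mass m misses the contaminated fraction f with probability 1 - f, and otherwise contains zero cells with Poisson probability e^(-c*m) at concentration c. A hedged sketch (the function name and parameter values are illustrative, not the paper's exact simulation):

```python
import math

def p_accept(n_samples, sample_mass_g, conc_cfu_per_g, frac_contaminated):
    """Probability that all n samples test negative (lot falsely accepted):
    P(accept) = ((1 - f) + f * exp(-c * m)) ** n."""
    p_negative = (1 - frac_contaminated) + frac_contaminated * math.exp(
        -conc_cfu_per_g * sample_mass_g)
    return p_negative ** n_samples

# Larger sample masses drive the false-acceptance probability down:
p_small = p_accept(n_samples=10, sample_mass_g=1,
                   conc_cfu_per_g=0.1, frac_contaminated=0.01)
p_large = p_accept(n_samples=10, sample_mass_g=100,
                   conc_cfu_per_g=0.1, frac_contaminated=0.01)
```

The comparison mirrors the abstract's finding: with 100 g samples a negative result is essentially only possible by missing the contaminated fraction, so fewer samples are needed for the same consumer's risk.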
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.; Halappanavar, Mahantesh
2016-09-16
Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches, that model actions of strategic decision-makers, are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker’s payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques including two-phase Monte Carlo sampling and probability bounds analysis.
Gardi, J E; Nyengaard, J R; Gundersen, H J G
2008-03-01
The proportionator is a novel and radically different approach to sampling with microscopes, based on well-known statistical theory (probability proportional to size, PPS, sampling). It uses automatic image analysis, with a large range of options, to assign to every field of view in the section a weight proportional to some characteristic of the structure under study. A typical and very simple example, examined here, is the amount of color characteristic for the structure, marked with a stain with known properties. The color may be specific or not. In the recorded list of weights in all fields, the desired number of fields is sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation being positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noise in possibly realistic ranges. In all cases examined, the proportionator is 2-15-fold more efficient than the common systematic, uniformly random sampling. The simulations also indicate that the lack of a simple predictor of the coefficient of error (CE) due to field-to-field variation is a more severe problem for uniform sampling strategies than anticipated. Because of its entirely different sampling strategy, based on known but non-uniform sampling probabilities, the proportionator for the first time allows the real CE at the section level to be automatically estimated (not just predicted), unbiased, for all estimators and at no extra cost to the user.
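The PPS estimation step can be illustrated with a Hansen-Hurwitz-type sketch: draw fields with probability proportional to their image-analysis weight, then inverse-weight the observed counts. The toy data below are my own, not the paper's; they are constructed so the weight is exactly proportional to the count, the zero-variance ideal that the proportionator's efficiency gains approach:

```python
import random

def pps_total(field_weights, count_field, n_draws=200, seed=7):
    """Hansen-Hurwitz estimator of a section total under PPS sampling
    with replacement: T_hat = (1/n) * sum(y_i / p_i), p_i = w_i / sum(w)."""
    rng = random.Random(seed)
    total_w = sum(field_weights)
    probs = [w / total_w for w in field_weights]
    idx = rng.choices(range(len(field_weights)), weights=probs, k=n_draws)
    return sum(count_field(i) / probs[i] for i in idx) / n_draws

# Toy section: the true count in field i is 5 * weight_i, so every draw
# returns the same inverse-weighted value and the estimate is exact.
weights = [1.0, 2.0, 3.0, 4.0]
est = pps_total(weights, count_field=lambda i: 5.0 * weights[i])
# est == 50.0, the true total 5 * (1 + 2 + 3 + 4)
```

In practice the weight-count relation is noisy, and the estimator's variance grows with the mismatch, which is exactly why the abstract stresses that efficiency (not unbiasedness) depends on a positive weight-structure relation.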
Bogaert, Anthony F
2003-03-01
The extent to which number of older brothers or "fraternal birth order" predicted the 2 main components that researchers have traditionally used to conceptualize sexual orientation-that is, psychological attraction and sexual behavior-was examined in 2 recent national probability samples. In both studies, fraternal birth order predicted same-sex attraction in men, with each additional older brother increasing the odds of homosexual attraction by an average of 38%. Results also indicated that the fraternal birth order/same-sex attraction relationship in men was independent of sexual behavior, including early same-sex behavior. No sibling characteristics predicted sexual orientation in women. Results suggest experience-based theories (e.g., early same-sex play) of the fraternal birth order effect in men are unlikely to be correct.
Letant, S E; Kane, S R; Murphy, G A; Alfaro, T M; Hodges, L; Rose, L; Raber, E
2008-05-30
This note presents a comparison of Most-Probable-Number Rapid Viability (MPN-RV) PCR and traditional culture methods for the quantification of Bacillus anthracis Sterne spores in macrofoam swabs generated by the Centers for Disease Control and Prevention (CDC) for a multi-center validation study aimed at testing environmental swab processing methods for recovery, detection, and quantification of viable B. anthracis spores from surfaces. Results show that spore numbers provided by the MPN-RV PCR method were in statistical agreement with the CDC conventional culture method for all three levels of spores tested (10^4, 10^2, and 10 spores), even in the presence of dirt. In addition to detecting low levels of spores in environmental conditions, the MPN-RV PCR method is specific, and compatible with automated high-throughput sample processing and analysis protocols.
Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo
2015-01-01
A known issue in route choice model estimation is the dependency of the parameter estimates on the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes extracted with stochastic route generation. The term is easily applicable to large-scale networks and various environments, given its dependence only on a random number generator and the Dijkstra shortest path algorithm. The implementation for revealed preference data, consisting of actual route choices collected in Cagliari, Italy, shows the feasibility of generating routes stochastically in a high-resolution network and calculating the correction factor. Model estimation with and without the correction illustrates how the correction not only improves the goodness of fit but also turns illogical signs…
Herbenick, Debby; Friedman, M. Reuel; Schick, Vanessa; Fu, Tsung-Chieh (Jane); Bostwick, Wendy; Bartelt, Elizabeth; Muñoz-Laboy, Miguel; Pletta, David; Reece, Michael; Sandfort, Theo G. M.
2016-01-01
As bisexual individuals in the United States (U.S.) face significant health disparities, researchers have posited that these differences may be fueled, at least in part, by negative attitudes, prejudice, stigma, and discrimination toward bisexual individuals from heterosexual and gay/lesbian individuals. Previous studies of individual and social attitudes toward bisexual men and women have been conducted almost exclusively with convenience samples, with limited generalizability to the broader U.S. population. Our study provides an assessment of attitudes toward bisexual men and women among a nationally representative probability sample of heterosexual, gay, lesbian, and other-identified adults in the U.S. Data were collected from the 2015 National Survey of Sexual Health and Behavior (NSSHB), via an online questionnaire with a probability sample of adults (18 years and over) from throughout the U.S. We included two modified 5-item versions of the Bisexualities: Indiana Attitudes Scale (BIAS), validated sub-scales that were developed to measure attitudes toward bisexual men and women. Data were analyzed using descriptive statistics, gamma regression, and paired t-tests. Gender, sexual identity, age, race/ethnicity, income, and educational attainment were all significantly associated with participants' attitudes toward bisexual individuals. In terms of responses to individual scale items, participants were most likely to “neither agree nor disagree” with all attitudinal statements. Across sexual identities, self-identified other participants reported the most positive attitudes, while heterosexual male participants reported the least positive attitudes. As in previous research on convenience samples, we found a wide range of demographic characteristics were related with attitudes toward bisexual individuals in our nationally-representative study of heterosexual, gay/lesbian, and other-identified adults in the U.S. In particular, gender emerged as a significant
Herek, Gregory M; Norton, Aaron T; Allen, Thomas J; Sims, Charles L
2010-09-01
Using data from a US national probability sample of self-identified lesbian, gay, and bisexual adults (N = 662), this article reports population parameter estimates for a variety of demographic, psychological, and social variables. Special emphasis is given to information with relevance to public policy and law. Compared with the US adult population, respondents were younger, more highly educated, and less likely to be non-Hispanic White, but differences were observed between gender and sexual orientation groups on all of these variables. Overall, respondents tended to be politically liberal, not highly religious, and supportive of marriage equality for same-sex couples. Women were more likely than men to be in a committed relationship. Virtually all coupled gay men and lesbians had a same-sex partner, whereas the vast majority of coupled bisexuals were in a heterosexual relationship. Compared with bisexuals, gay men and lesbians reported stronger commitment to a sexual-minority identity, greater community identification and involvement, and more extensive disclosure of their sexual orientation to others. Most respondents reported experiencing little or no choice about their sexual orientation. The importance of distinguishing among lesbians, gay men, bisexual women, and bisexual men in behavioral and social research is discussed.
Whisman, Mark A
2016-12-01
Prior research has found that humiliating marital events are associated with depression. Building on this research, the current study investigated the association between one specific humiliating marital event-discovering that one's partner had an affair-and past-year major depressive episode (MDE) in a probability sample of married or cohabiting men and women who were at high risk for depression based on the criterion that they scored below the midpoint on a measure of marital satisfaction (N = 227). Results indicate that (i) women were more likely than men to report discovering their partner had an affair in the prior 12 months; (ii) discovering a partner affair was associated with a higher prevalence of past-year MDE and a lower level of marital adjustment; and (iii) the association between discovering a partner affair and MDE remained statistically significant when holding constant demographic variables and marital adjustment. These results support continued investigation into the impact that finding out about an affair has on the mental health of the person discovering a partner affair. © 2015 Family Process Institute.
Cruz, Cristina D; Win, Jessicah K; Chantarachoti, Jiraporn; Mutukumira, Anthony N; Fletcher, Graham C
2012-02-15
The standard Bacteriological Analytical Manual (BAM) protocol for detecting Listeria in food and on environmental surfaces takes about 96 h. Some studies indicate that rapid methods, which produce results within 48 h, may be as sensitive and accurate as the culture protocol. As they only give presence/absence results, it can be difficult to compare the accuracy of results generated. We used the Most Probable Number (MPN) technique to evaluate the performance and detection limits of six rapid kits for detecting Listeria in seafood and on an environmental surface compared with the standard protocol. Three seafood products and an environmental surface were inoculated with similar known cell concentrations of Listeria and analyzed according to the manufacturers' instructions. The MPN was estimated using the MPN-BAM spreadsheet. For the seafood products no differences were observed among the rapid kits, and efficiency was similar to the BAM method. On the environmental surface the BAM protocol had a higher recovery rate (sensitivity) than any of the rapid kits tested. Clearview™, Reveal®, TECRA® and VIDAS® LDUO detected the cells but only at high concentrations (>10^2 CFU/10 cm^2). Two kits (VIP™ and Petrifilm™) failed to detect 10^4 CFU/10 cm^2. The MPN method was a useful tool for comparing the results generated by these presence/absence test kits. There remains a need to develop a rapid and sensitive method for detecting Listeria in environmental samples that performs as well as the BAM protocol, since none of the rapid tests used in this study achieved a satisfactory result.
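The MPN point estimate behind such comparisons is the root of a simple maximum-likelihood score equation across the dilution series. A minimal numeric sketch (solved by bisection; the function and data are illustrative, not the MPN-BAM spreadsheet itself):

```python
import math

def mpn(tubes, positives, volumes, lo=1e-6, hi=1e3):
    """Maximum-likelihood Most Probable Number (organisms per unit volume)
    for a dilution series: solve, by bisection on a log scale,
        sum_j p_j * v_j / (exp(lam * v_j) - 1) == sum_j (t_j - p_j) * v_j
    where t_j tubes each receive volume v_j and p_j of them are positive.
    Assumes an intermediate result (not all tubes negative or all positive)."""
    def score(lam):
        left = sum(p * v / math.expm1(lam * v)
                   for p, v in zip(positives, volumes))
        right = sum((t - p) * v
                    for t, p, v in zip(tubes, positives, volumes))
        return left - right
    for _ in range(200):
        mid = math.sqrt(lo * hi)      # geometric midpoint: log-scale bisection
        if score(mid) > 0:            # score decreases in lam
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic 3-tube series with 10/1/0.1 mL inocula and 3-1-0 positives:
est = mpn(tubes=[3, 3, 3], positives=[3, 1, 0], volumes=[10.0, 1.0, 0.1])
# est is about 0.43 organisms/mL (the familiar table value of 43/100 mL)
```

Published MPN tables are precomputed solutions of exactly this equation, which is why a spreadsheet implementation and the tables agree.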
Goldberg, Samuel
1960-01-01
Excellent basic text covering set theory, probability theory for finite sample spaces, the binomial theorem, probability distributions, means, standard deviations, the probability function of the binomial distribution, and more. Includes 360 problems, with answers for half of them.
Sevelius, Jae M.
2017-01-01
Background. Transgender individuals have a gender identity that differs from the sex they were assigned at birth. The population size of transgender individuals in the United States is not well-known, in part because official records, including the US Census, do not include data on gender identity. Population surveys today more often collect transgender-inclusive gender-identity data, and secular trends in culture and the media have created a somewhat more favorable environment for transgender people. Objectives. To estimate the current population size of transgender individuals in the United States and evaluate any trend over time. Search methods. In June and July 2016, we searched PubMed, Cumulative Index to Nursing and Allied Health Literature, and Web of Science for national surveys, as well as “gray” literature, through an Internet search. We limited the search to 2006 through 2016. Selection criteria. We selected population-based surveys that used probability sampling and included self-reported transgender-identity data. Data collection and analysis. We used random-effects meta-analysis to pool eligible surveys and used meta-regression to address our hypothesis that the transgender population size estimate would increase over time. We used subsample and leave-one-out analysis to assess for bias. Main results. Our meta-regression model, based on 12 surveys covering 2007 to 2015, explained 62.5% of model heterogeneity, with a significant effect for each unit increase in survey year (F = 17.122; df = 1,10; b = 0.026%; P = .002). Extrapolating these results to 2016 suggested a current US population size of 390 adults per 100 000, or almost 1 million adults nationally. This estimate may be more indicative for younger adults, who represented more than 50% of the respondents in our analysis. Authors’ conclusions. Future national surveys are likely to observe higher numbers of transgender people. The large variety in questions used to ask
Tchoubi, Sébastien; Sobngwi-Tambekou, Joëlle; Noubiap, Jean Jacques N.; Asangbeh, Serra Lem; Nkoum, Benjamin Alexandre; Sobngwi, Eugene
2015-01-01
Background Childhood obesity is one of the most serious public health challenges of the 21st century. This study assessed the prevalence and risk factors of overweight and obesity among children aged 6 months to 5 years in Cameroon in 2011. Methods Four thousand five hundred and eighteen children (2205 boys and 2313 girls) aged 6 to 59 months were sampled from the 2011 Demographic and Health Survey (DHS) database. Body Mass Index (BMI) z-scores based on the WHO 2006 reference population were used to define overweight (BMI z-score > 2) and obesity (BMI z-score > 3). Regression analyses were performed to investigate risk factors for overweight/obesity. Results The prevalence of overweight and obesity was 8% (1.7% for obesity alone). Boys were more affected than girls, with prevalences of 9.7% and 6.4% respectively. The highest prevalence of overweight was observed in the Grassfield area (including people living in the West and North-West regions) (15.3%). Factors independently associated with overweight and obesity included: having an overweight mother (adjusted odds ratio (aOR) = 1.51; 95% CI 1.15 to 1.97) or an obese mother (aOR = 2.19; 95% CI 1.55 to 3.07), compared to a normal-weight mother; high birth weight (aOR = 1.69; 95% CI 1.24 to 2.28) compared to normal birth weight; male gender (aOR = 1.56; 95% CI 1.24 to 1.95); low birth rank (aOR = 1.35; 95% CI 1.06 to 1.72); being aged 13–24 months (aOR = 1.81; 95% CI 1.21 to 2.66) or 25–36 months (aOR = 2.79; 95% CI 1.93 to 4.13) compared to being aged 45 to 49 months; and living in the Grassfield area (aOR = 2.65; 95% CI 1.87 to 3.79) compared to the Forest area. Muslim religion appeared protective (aOR = 0.67; 95% CI 0.46 to 0.95) compared to Christian religion. Conclusion This study underlines a high prevalence of early childhood overweight, with significant disparities between ecological areas of Cameroon. Risk factors for overweight included high maternal BMI, high birth weight, male
Subjective Interpretation of Probability under the Analysis of the Sampling Paradox
黄闪闪; 李铁
2014-01-01
In scientific reasoning, there is a long-standing debate between subjective and objective probability. Frequentists insist on the frequency interpretation of probability and hold that probabilities can only be objective. The random sampling paradox, however, shows that classical statistical reasoning cannot avoid the subjectivity of random samples. Bayesians, following the subjective interpretation of probability, quantify prior knowledge through betting quotients and thereby reconcile the contradiction between the subjective and the objective. The applicability of probability shows that the choice of interpretation is plural: there are various viable interpretations of probability, and the inter-subjective interpretation offers a new approach through which the subjective interpretation can incorporate objective factors.
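The Bayesian reconciliation described above quantifies prior knowledge as a betting quotient: an agent willing to bet on hypothesis H at odds q : (1 − q) holds subjective probability q, which Bayes' rule then updates with evidence. A minimal sketch (the numbers are illustrative, not from the paper):

```python
from fractions import Fraction

def bayes_update(prior, lik_h, lik_not_h):
    """Posterior P(H | E) from a prior betting quotient and two likelihoods.

    prior     -- subjective probability of H (the agent's betting quotient)
    lik_h     -- P(E | H)
    lik_not_h -- P(E | not H)
    """
    prior_odds = Fraction(prior) / (1 - Fraction(prior))
    post_odds = prior_odds * Fraction(lik_h) / Fraction(lik_not_h)
    return post_odds / (1 + post_odds)  # odds back to probability

# betting quotient 1/2, then evidence three times as likely under H
posterior = bayes_update(Fraction(1, 2), Fraction(3, 4), Fraction(1, 4))  # -> 3/4
```

A betting quotient of 1/2 combined with evidence three times likelier under H yields a posterior of 3/4: the subjective prior is revised by the objective likelihood ratio.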
Fluttering in Stratified Flows
Lam, Try; Vincent, Lionel; Kanso, Eva
2016-11-01
The descent motion of heavy objects under the influence of gravitational and aerodynamic forces is relevant to many branches of engineering and science. Examples range from estimating the behavior of re-entry space vehicles to studying the settlement of marine larvae and its influence on underwater ecology. The behavior of regularly shaped objects freely falling in homogeneous fluids is relatively well understood. For example, the complex interaction of a rigid coin with the surrounding fluid will cause it to either fall steadily, flutter, tumble, or descend chaotically. Less is known about the effect of density stratification on the descent behavior. Here, we experimentally investigate the descent of discs in both pure water and in linearly salt-stratified fluids, where the density varies from 1.0 to 1.14 times that of water, the Brunt–Väisälä frequency is 1.7 rad/s, and the Froude number Fr robots for space exploration and underwater missions.
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Stably stratified magnetized stars in general relativity
Yoshida, Shijun; Shibata, Masaru
2012-01-01
We construct magnetized stars composed of a fluid stably stratified by entropy gradients in the framework of general relativity, assuming ideal magnetohydrodynamics and employing a barotropic equation of state. We first revisit the basic equations describing stably stratified, stationary, axisymmetric stars containing both poloidal and toroidal magnetic fields. As sample models, the magnetized stars considered by Ioka and Sasaki (2004), inside which the magnetic fields are confined, are modified into stably stratified configurations. The magnetized stars newly constructed in this study are believed to be more stable than the existing relativistic models because they have poloidal and toroidal magnetic fields of comparable strength, and because the magnetic buoyancy instabilities near the surface of the star are suppressed by the stratification.
Fredslund, Line; Ekelund, Flemming; Jacobsen, Carsten Suhr
2001-01-01
This paper reports on the first successful molecular detection and quantification of soil protozoa. Quantification of heterotrophic flagellates and naked amoebae in soil has traditionally relied on dilution culturing techniques, followed by most-probable-number (MPN) calculations. Such methods are biased by differences in the culturability of soil protozoa, are unable to quantify specific taxonomic groups, and yield results highly dependent on the choice of media and the skills of the microscopists. Successful detection of protozoa in soil by DNA techniques requires (i) the development...
Andersen, Mikkel Meyer; Mogensen, Helle Smidt; Eriksen, Poul Svante; Morling, Niels
2017-03-07
The Yfiler® Plus Amplification Kit amplifies 27 Y-chromosomal short tandem repeat (STR) markers. The kit has five-fluorescent-dye chemistry and the improved PCR buffer system of modern STR kits. We validated the kit for accredited investigations of crime scene samples by a thorough study of kit dynamics and performance. We determined dye-dependent analytical thresholds by receiver operating characteristics (ROC) and made a customised artefact filter that includes theoretically known artefacts, using previously analysed population samples. Dilution series of known male DNA and a selection of crime scene samples were analysed with the customised thresholds and artefact filters. The Yfiler® Plus Amplification Kit was sensitive, giving full profiles down to 70 pg of male DNA. The balances between the fluorescent dyes, as well as between loci, were very good. The kit produced full Y-STR profiles from crime scene samples containing small amounts of male DNA and large amounts of female DNA, although unspecific reactions were evident for very unbalanced mixtures. A decrease in the drop-out rate was found for both the dilution series and population samples, as well as a small increase in the drop-in rate for population samples, using the customised thresholds and artefact filters compared with the company-provided ones. The additional drop-ins were all of a nature that would be detected by inspection of the results. For the crime scene samples, large amounts of female DNA complicated the analysis by causing drop-ins of characteristic female DNA artefacts. Even though the customised analytical threshold in combination with the custom-made artefact filters gave more alleles, crime scene samples still needed special attention from the forensic geneticist.
Brus, D.J.; Slim, P.A.; Heidema, A.H.; Dobben, van H.F.
2014-01-01
The European Habitats Directive requires regular reporting of areal changes in the habitat types defined under this Directive. To monitor changes in habitat types in a dune and salt meadow area in the eastern part of the back-barrier island of Ameland (The Netherlands), a sampling scheme was designed
Chen, Cao; Xiao, Di; Zhou, Wei; Zhang, Yong-Chan; Shi, Qi; Tian, Chan; Zhang, Jin; Zhou, Chun-Xi; Zhang, Jian-Zhong; Dong, Xiao-Ping
2012-01-01
The shotgun strategy based on tandem mass spectrometry has been widely used to identify proteins differentially distributed among diseases, owing to its high reliability and efficiency. To identify potential differences in the protein profiles of cerebrospinal fluid (CSF) between Creutzfeldt-Jakob disease (CJD) and non-CJD patients, especially in the fraction ranging from 1–10 kDa, CSF samples from 40 probable sporadic CJD (sCJD) patients, 32 non-CJD cases with dementia, and 17 non-CJD cases without dementia were separately pooled and enriched by magnetic-bead-based weak cation exchange chromatography (MB-WCX). After trypsin digestion, each enriched CSF pool was separated and identified by RP-HPLC-ESI-QTOF MS/MS. In total, 42, 53 and 47 protein signals were identified in the pooled CSF fractions below 10 kDa of probable sCJD, non-CJD with dementia, and non-CJD without dementia, respectively. Compared with probable sCJD, the similarity of the CSF protein profile of non-CJD with dementia (76.2%) was higher than that of non-CJD without dementia (57.1%). Nine CSF proteins were observed only in the probable sCJD group. These data may help in selecting potential biomarkers for the diagnosis of CJD. Additionally, further studies of the small fragments of cellular proteins in the CSF of CJD patients may provide scientific clues for understanding the neuropathogenesis of TSEs.
Auchincloss, P.S.; De Barbaro, P.; Bodek, A.; Budd, H.; Pillai, M.; Qun, F.; Sakumoto, W.K.; Merritt, F.S.; Oreglia, M.J.; Schumm, B.; Bolton, T.; Arroyo, C.; Bachmann, K.T.; Bazarko, A.O.; Blair, R.E.; Foudas, C.; King, B.J.; Lefmann, W.C.; Leung, W.C.; Mishra, S.R.; Oltman, E.; Quintas, P.Z.; Rabinowitz, S.A.; Sciulli, F.; Seligman, W.G.; Shaevitz, M.H.; Bernstein, R.H.; Borcherding, F.; Fisk, H.E.; Lamm, M.; Marsh, W.; Merritt, K.W.B.; Schellman, H.; Yovanovitch, D.; Kinnel, T.S.; Sandler, P.; Smith, W.H. (Dept. of Physics and Astronomy, Univ. of Rochester, NY (United States) Dept. of Physics, Univ. of Chicago, IL (United States) Dept. of Physics, Columbia Univ. New York, NY (United States) Fermilab, Batavia, IL (United States) Dept. of Physics, Univ. of Wisconsin, Madison, WI (United States))
1994-04-11
We have extracted the momentum dependence of the mean, the truncated mean and the most probable value of the energy deposited in a segmented, iron-scintillator, hadron calorimeter by high-energy muons. Data were drawn from a sample of momentum-analyzed, high-energy muons produced in charged-current neutrino interactions. The truncated mean energy deposition of high-energy muons traversing 20 calorimeter segments increases by approximately 16% per 100 GeV/c increase in muon momentum over the range 25-125 GeV/c; the most probable energy deposition increases by approximately 7%. These results are important for experiments at high-energy colliders (e.g., Tevatron, SSC and LHC) which use the dE/dx of high-energy muons to calibrate the response of electromagnetic and hadron calorimeters with tower geometry. The data are in qualitative agreement with GEANT3 (v3.15/308a) simulations. (orig.)
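The truncated mean used above tames the long tail of muon energy deposition: the highest fraction of measurements is discarded before averaging, while the most probable value is essentially the mode of the deposition spectrum. A minimal sketch of both estimators (the 80% retention fraction and the histogram binning are illustrative choices, not the experiment's):

```python
def truncated_mean(values, keep=0.8):
    """Mean of the lowest `keep` fraction -- suppresses the long high-energy tail."""
    s = sorted(values)
    k = max(1, int(len(s) * keep))
    return sum(s[:k]) / k

def most_probable(values, bins=50):
    """Mode estimated as the centre of the most populated histogram bin."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in values:
        i = min(int((v - lo) / width), bins - 1)
        counts[i] += 1
    return lo + (counts.index(max(counts)) + 0.5) * width
```

On a strongly skewed sample the ordinary mean is pulled far above both the truncated mean and the most probable value, which is why the two robust statistics rise much more slowly with muon momentum.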
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2013-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Elmore, Stacey A; Huyvaert, Kathryn P; Bailey, Larissa L; Iqbal, Asma; Su, Chunlei; Dixon, Brent R; Alisauskas, Ray T; Gajadhar, Alvin A; Jenkins, Emily J
2016-08-01
Increasingly, birds are recognised as important hosts for the ubiquitous parasite Toxoplasma gondii, although little experimental evidence exists to determine which tissues should be tested to maximise the detection probability of T. gondii. Also, Arctic-nesting geese are suspected to be important sources of T. gondii in terrestrial Arctic ecosystems, but the parasite has not previously been reported in the tissues of these geese. Using a domestic goose model, we applied a multi-scale occupancy framework to demonstrate that the probability of detection of T. gondii was highest in the brain (0.689, 95% confidence interval=0.486, 0.839) and the heart (0.809, 95% confidence interval=0.693, 0.888). Inoculated geese had an estimated T. gondii infection probability of 0.849, (95% confidence interval=0.643, 0.946), highlighting uncertainty in the system, even under experimental conditions. Guided by these results, we tested the brains and hearts of wild Ross's Geese (Chen rossii, n=50) and Lesser Snow Geese (Chen caerulescens, n=50) from Karrak Lake, Nunavut, Canada. We detected 51 suspected positive tissue samples from 33 wild geese using real-time PCR with melt-curve analysis. The wild goose prevalence estimates generated by our multi-scale occupancy analysis were higher than the naïve estimates of prevalence, indicating that multiple PCR repetitions on the same organs and testing more than one organ could improve T. gondii detection. Genetic characterisation revealed Type III T. gondii alleles in six wild geese and Sarcocystis spp. in 25 samples. Our study demonstrates that Arctic nesting geese are capable of harbouring T. gondii in their tissues and could transport the parasite from their southern overwintering grounds into the Arctic region. We demonstrate how a multi-scale occupancy framework can be used in a domestic animal model to guide resource-limited sample collection and tissue analysis in wildlife. Secondly, we confirm the value of traditional occupancy in
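The per-tissue detection probabilities above imply the gain from testing multiple organs. If tests were independent (an assumption of this sketch only; the paper's multi-scale occupancy model does not require it), testing both brain and heart would detect an infected goose with probability 1 − (1 − 0.689)(1 − 0.809) ≈ 0.94:

```python
def detect_at_least_once(p_list):
    """P(at least one positive result) from independent per-test detection probabilities."""
    p_miss_all = 1.0
    for p in p_list:
        p_miss_all *= 1.0 - p      # all tests must miss for an overall miss
    return 1.0 - p_miss_all

# point estimates for brain and heart from the abstract
p_both_organs = detect_at_least_once([0.689, 0.809])
```

Repeating the PCR on the same organ raises detection probability the same way, which is consistent with the authors' finding that multiple repetitions and testing more than one organ improve T. gondii detection over the naïve single-test estimate.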
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
Electromagnetic waves in stratified media
Wait, James R; Fock, V A; Wait, J R
2013-01-01
International Series of Monographs in Electromagnetic Waves, Volume 3: Electromagnetic Waves in Stratified Media provides information pertinent to the electromagnetic waves in media whose properties differ in one particular direction. This book discusses the important feature of the waves that enables communications at global distances. Organized into 13 chapters, this volume begins with an overview of the general analysis for the electromagnetic response of a plane stratified medium comprising of any number of parallel homogeneous layers. This text then explains the reflection of electromagne
Stratified medicine and reimbursement issues
Fugel, Hans-Joerg; Nuijten, Mark; Postma, Maarten
2012-01-01
Stratified Medicine (SM) has the potential to target patient populations who will most benefit from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation to
Three Examples of the Monte Carlo Method Applied in Probability and Statistics
杨晓霞
2014-01-01
Monte Carlo methods are powerful for some complicated problems in probability and statistics. Using three typical problems encountered in the teaching of probability and statistics (a civil-aviation passenger problem, the efficiency of estimators, and estimating a probability via the central limit theorem), which students often find confusing, this paper presents the process of simulating their solutions with the Monte Carlo method in the R software. These examples can be used in the experimental teaching of probability and statistics courses.
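As an illustration of the simulation approach the paper teaches, here is a classic overbooking-style passenger problem, sketched in Python rather than the paper's R; the abstract does not give the problem's parameters, so the seat count, ticket count, and show-up probability below are illustrative assumptions:

```python
import random

def prob_overbooked(seats=100, tickets=110, p_show=0.9,
                    trials=20_000, seed=1):
    """Monte Carlo estimate of P(more passengers show up than there are seats)."""
    rng = random.Random(seed)
    over = 0
    for _ in range(trials):
        # each ticket holder shows up independently with probability p_show
        shows = sum(rng.random() < p_show for _ in range(tickets))
        if shows > seats:
            over += 1
    return over / trials

estimate = prob_overbooked()
```

The central limit theorem provides a quick sanity check: the number of shows is approximately N(99, 9.9), so P(shows > 100) is roughly 0.32, which the simulation reproduces.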
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite-horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov modulation, periodicity, change-of-measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially...
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
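Among the material new to this edition is the inclusion-exclusion principle, which computes the probability of a union by alternating the signs of intersection terms. A minimal sketch for events in a finite equiprobable sample space (the dice events are illustrative):

```python
from itertools import combinations

def prob_union(events, space):
    """P(A1 ∪ ... ∪ An) by inclusion-exclusion on a finite equiprobable space."""
    total = 0.0
    for k in range(1, len(events) + 1):
        sign = (-1) ** (k + 1)  # + for odd-sized intersections, - for even
        for combo in combinations(events, k):
            inter = set.intersection(*(set(e) for e in combo))
            total += sign * len(inter) / len(set(space))
    return total

# one fair die: A = even roll {2,4,6}, B = roll of at least four {4,5,6}
p = prob_union([{2, 4, 6}, {4, 5, 6}], range(1, 7))  # union is {2,4,5,6}, so 4/6
```

The k = 1 terms overcount the overlap {4, 6}, and the k = 2 term subtracts it back out, giving P(A ∪ B) = 3/6 + 3/6 − 2/6 = 4/6.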
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
the following conditions:
CP1. µ(U | U) = 1 if U ∈ F′.
CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F.
CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F.
Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
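For a finite space W with the equiprobable measure, the natural conditional probability µ(V | U) = |V ∩ U| / |U| satisfies CP1–CP3. A brute-force check over all subsets (the three-point space is an illustrative choice):

```python
from itertools import chain, combinations

def mu(V, U):
    """Conditional probability on a finite equiprobable space: |V ∩ U| / |U|."""
    return len(V & U) / len(U)

def satisfies_cp_axioms(W):
    subsets = [set(s) for s in chain.from_iterable(
        combinations(W, r) for r in range(len(W) + 1))]
    nonempty = [U for U in subsets if U]
    ok = all(mu(U, U) == 1 for U in nonempty)                      # CP1
    for U in nonempty:
        for V1 in subsets:
            for V2 in subsets:
                if not (V1 & V2):                                  # CP2: disjoint additivity
                    ok &= abs(mu(V1 | V2, U) - mu(V1, U) - mu(V2, U)) < 1e-12
        for X in nonempty:
            if X <= U:
                for V in subsets:
                    if V <= X:                                     # CP3: chain rule
                        ok &= abs(mu(V, U) - mu(V, X) * mu(X, U)) < 1e-12
    return ok
```

CP3 holds here because for V ⊆ X ⊆ U we get µ(V | X) × µ(X | U) = (|V|/|X|) × (|X|/|U|) = |V|/|U| = µ(V | U).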
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems, or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as the extension theorem, construction of measures, integration, product spaces, the Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables
Judith P Andersen
Full Text Available BACKGROUND: Adverse childhood experiences (e.g., physical, sexual and emotional abuse, neglect, exposure to domestic violence, parental discord, familial mental illness, incarceration and substance abuse constitute a major public health problem in the United States. The Adverse Childhood Experiences (ACE scale is a standardized measure that captures multiple developmental risk factors beyond sexual, physical and emotional abuse. Lesbian, gay, and bisexual (i.e., sexual minority individuals may experience disproportionately higher prevalence of adverse childhood experiences. PURPOSE: To examine, using the ACE scale, prevalence of childhood physical, emotional, and sexual abuse and childhood household dysfunction among sexual minority and heterosexual adults. METHODS: Analyses were conducted using a probability-based sample of data pooled from three U.S. states' Behavioral Risk Factor Surveillance System (BRFSS surveys (Maine, Washington, Wisconsin that administered the ACE scale and collected information on sexual identity (n = 22,071. RESULTS: Compared with heterosexual respondents, gay/lesbian and bisexual individuals experienced increased odds of six of eight and seven of eight adverse childhood experiences, respectively. Sexual minority persons had higher rates of adverse childhood experiences (IRR = 1.66 gay/lesbian; 1.58 bisexual compared to their heterosexual peers. CONCLUSIONS: Sexual minority individuals have increased exposure to multiple developmental risk factors beyond physical, sexual and emotional abuse. We recommend the use of the Adverse Childhood Experiences scale in future research examining health disparities among this minority population.
Information content of household-stratified epidemics
T.M. Kinyanjui
2016-09-01
Full Text Available Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs.
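The Shannon entropy used above to score each study design can be computed from a discretized posterior: lower entropy means a more concentrated posterior and hence a more informative design. A minimal sketch (the example weights are illustrative):

```python
import math

def shannon_entropy(weights):
    """Entropy (in nats) of a discretized posterior given unnormalized weights."""
    total = sum(weights)
    return -sum((w / total) * math.log(w / total) for w in weights if w > 0)

flat = shannon_entropy([1, 1, 1, 1])     # posterior left flat by an uninformative design
peaked = shannon_entropy([97, 1, 1, 1])  # posterior concentrated by an informative design
```

A uniform posterior over n bins attains the maximum entropy log n, so a design whose data concentrate the posterior scores strictly lower, which is the sense in which it carries more information about the parameters.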
Stratified Medicine and Reimbursement Issues
Hans-Joerg eFugel
2012-10-01
Full Text Available Stratified Medicine (SM) has the potential to target patient populations who will most benefit from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation and strengthens the value proposition to pricing and reimbursement (P&R) authorities. However, the introduction of SM challenges current reimbursement schemes in many EU countries and the US, as different P&R policies have been adopted for drugs and diagnostics. There is also a lack of a consistent process for the value assessment of more complex diagnostics in these markets. New, innovative approaches and more flexible P&R systems are needed to reflect the added value of diagnostic tests and to stimulate investments in new technologies. Yet the framework for access to diagnostic-based therapies still requires further development, setting the right incentives and appropriately aligning stakeholder interests while realizing long-term patient benefits. This article addresses the reimbursement challenges of SM approaches in several EU countries and the US, outlining some options to overcome existing reimbursement barriers for stratified medicine.
[Phylogenetic diversity of bacteria in soda lake stratified sediments].
Tourova, T P; Grechnikova, M A; Kuznetsov, V V; Sorokin, D Yu
2014-01-01
Various previously developed techniques for DNA extraction from samples with complex physicochemical structure (soils, silts, and sediments), and modifications of these techniques developed in the present work, were tested. Their usability for DNA extraction from the sediments of the Kulunda Steppe hypersaline soda lakes was assessed, and the most efficient procedure for indirect (two-stage) DNA extraction was proposed. Almost complete separation of the cell fraction was shown, as well as the inefficiency of nested PCR for analysis of the clone libraries obtained from washed sediments by amplification of the 16S rRNA gene fragments. Analysis of the clone library obtained from the cell fractions of stratified sediments (upper, middle, and lower layers) revealed that in the sediments of Lake Gorchina-3 most eubacterial phylotypes belonged to the class Clostridia, phylum Firmicutes. They were probably specific for this habitat and formed a new, presently unknown high-rank taxon. The data obtained revealed no pronounced stratification of the species diversity of the eubacterial component of the microbial community inhabiting the sediments (0-20 cm) in the inshore zone of Lake Gorchina-3.
Survival analysis of cervical cancer using stratified Cox regression
Purnami, S. W.; Inayati, K. D.; Sari, N. W. Wulan; Chosuvivatwong, V.; Sriplung, H.
2016-04-01
Cervical cancer is one of the most common causes of cancer death among women worldwide, including in Indonesia. Most cervical cancer patients come to the hospital already at an advanced stadium. As a result, treatment becomes more difficult and the risk of death can even increase. One parameter that can be used to assess the success of treatment is the probability of survival. This study examines the survival of cervical cancer patients at Dr. Soetomo Hospital using stratified Cox regression based on six factors: age, stadium, treatment initiation, companion disease, complication, and anemia. The stratified Cox model is used because one independent variable, stadium, does not satisfy the proportional hazards assumption. The results of the stratified Cox model show that the complication variable is a significant factor influencing the survival probability of cervical cancer patients. The obtained hazard ratio is 7.35, meaning that a cervical cancer patient with complications is at a 7.35 times greater risk of dying than a patient without complications. The adjusted survival curves showed that stadium IV had the lowest probability of survival.
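Within each stratum, the stratified Cox model keeps the proportional hazards relation S1(t) = S0(t)^HR, with HR = exp(β) for a Cox coefficient β. A minimal sketch using the abstract's hazard ratio of 7.35 (the baseline survival value below is an illustrative assumption, not a figure from the study):

```python
import math

HR_COMPLICATION = 7.35  # adjusted hazard ratio for complications, from the abstract

def survival_under_hr(s_baseline, hazard_ratio=HR_COMPLICATION):
    """Proportional hazards within a stratum: S1(t) = S0(t) ** HR."""
    return s_baseline ** hazard_ratio

def hazard_ratio(beta):
    """A Cox regression coefficient beta maps to a hazard ratio of exp(beta)."""
    return math.exp(beta)

# if 90% of patients without complications survive to some time t,
# roughly 46% of comparable patients with complications do
s_complication = survival_under_hr(0.9)
```

Stratifying by stadium means each stadium gets its own baseline S0(t), so the hazard ratio for complications applies within, not across, stadium groups.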
McCauley, Jenna L.; Kilpatrick, Dean G.; Walsh, Kate; Resnick, Heidi S.
2013-01-01
Objective To examine post-rape substance use, associated post-rape medical and social concern variables, and past-year substance abuse among women reporting having received medical care following a most recent or only lifetime incident of rape. Method Using a subsample of women who received post-rape medical care following a most recent or only rape incident (n=104), drawn from a national household probability sample of U.S. women, the current study described the extent of peritraumatic substance use, past-year substance misuse behaviors, post-rape HIV and pregnancy concerns, and lifetime mental health service utilization as a function of substance use at the time of the incident. Results One-third (33%) of women seeking post-rape medical attention reported consuming alcohol or drugs at the time of their rape incident. Nearly one in four (24.7%) and one in seven (15%) women seeking medical attention following their most recent rape incident endorsed drug (marijuana, illicit, non-medical use of prescription drugs, or club drug) use or met substance abuse criteria, respectively, in the past year. One in twelve (8.4%) women reported at least monthly binge drinking in the past year. Approximately two-thirds of women reported seeking services for mental health needs in their lifetime. Post-rape concerns among women reporting peritraumatic substance use were not significantly different from those of women not reporting such use. Conclusions Substance use at the time of the incident was reported by approximately one-third of women, and past-year substance abuse was common among those seeking post-rape medical care. Implications for service delivery, intervention implementation, and future research are discussed. PMID:23380490
Suppression of stratified explosive interactions
Meeks, M.K.; Shamoun, B.I.; Bonazza, R.; Corradini, M.L. [Wisconsin Univ., Madison, WI (United States). Dept. of Nuclear Engineering and Engineering Physics
1998-01-01
Stratified Fuel-Coolant Interaction (FCI) experiments with Refrigerant-134a and water were performed in a large-scale system. Air was uniformly injected into the coolant pool to establish a pre-existing void which could suppress the explosion. Two competing effects due to the variation of the air flow rate seem to influence the intensity of the explosion in this geometrical configuration. At low flow rates, although the injected air increases the void fraction, the concurrent agitation and mixing increases the intensity of the interaction. At higher flow rates, the increase in void fraction tends to attenuate the propagated pressure wave generated by the explosion. Experimental results show a complete suppression of the vapor explosion at high rates of air injection, corresponding to an average void fraction of larger than 30%. (author)
Dabiri, Sasan; Ghadimi, Fatemeh; Firouzifar, Mohammadreza; Yazdani, Nasrin; Mohammad-Amoli, Mahsa; Vakili, Varasteh; Mahvi, Zahra
2016-07-01
Several lines of evidence support the contribution of autoimmune mechanisms to the pathogenesis of Meniere's disease. The aim of this study was to determine the association of HLA-Cw alleles with definite Meniere's disease and probable Meniere's disease relative to a control group. HLA-Cw genotyping was performed in 23 patients with definite Meniere's disease, 24 with probable Meniere's disease, and 91 healthy normal subjects, using the sequence-specific primer polymerase chain reaction technique. The statistical analysis was performed using Stata 8 software. There was a significant association of HLA-Cw*04 and HLA-Cw*16 with both definite and probable Meniere's disease compared to normal healthy controls. We observed a significant difference in HLA-Cw*12 frequencies between patients with definite Meniere's disease and patients with probable Meniere's disease (P=0.04). The frequency of HLA-Cw*18 was significantly higher in healthy controls (P=0.002). Our findings support the role of HLA-Cw alleles in both definite and probable Meniere's disease. In addition, the difference in HLA-Cw*12 frequency between definite and probable Meniere's disease in our study population might indicate distinct immune and inflammatory mechanisms involved in each condition.
Dabiri, Sasan; Ghadimi, Fatemeh; Firouzifar, Mohammadreza; Yazdani, Nasrin; Mohammad-Amoli, Mahsa; Vakili, Varasteh; Mahvi, Zahra
2016-01-01
Introduction Several lines of evidence support the contribution of autoimmune mechanisms to the pathogenesis of Meniere’s disease. The aim of this study was to determine the association of HLA-Cw alleles with definite Meniere’s disease and probable Meniere’s disease relative to a control group. Materials and Methods: HLA-Cw genotyping was performed in 23 patients with definite Meniere’s disease, 24 with probable Meniere’s disease, and 91 healthy normal subjects, using the sequence-specific primer polymerase chain reaction technique. The statistical analysis was performed using Stata 8 software. Results: There was a significant association of HLA-Cw*04 and HLA-Cw*16 with both definite and probable Meniere’s disease compared to normal healthy controls. We observed a significant difference in HLA-Cw*12 frequencies between patients with definite Meniere’s disease and patients with probable Meniere’s disease (P=0.04). The frequency of HLA-Cw*18 was significantly higher in healthy controls (P=0.002). Conclusion: Our findings support the role of HLA-Cw alleles in both definite and probable Meniere’s disease. In addition, the difference in HLA-Cw*12 frequency between definite and probable Meniere’s disease in our study’s population might indicate distinct immune and inflammatory mechanisms involved in each condition. PMID:27602337
Sasan Dabiri
2016-05-01
Full Text Available Introduction Several lines of evidence support the contribution of autoimmune mechanisms to the pathogenesis of Meniere’s disease. The aim of this study was to determine the association of HLA-Cw alleles with definite Meniere’s disease and probable Meniere’s disease relative to a control group. Materials and Methods: HLA-Cw genotyping was performed in 23 patients with definite Meniere’s disease, 24 with probable Meniere’s disease, and 91 healthy normal subjects, using the sequence-specific primer polymerase chain reaction technique. The statistical analysis was performed using Stata 8 software. Results: There was a significant association of HLA-Cw*04 and HLA-Cw*16 with both definite and probable Meniere’s disease compared to normal healthy controls. We observed a significant difference in HLA-Cw*12 frequencies between patients with definite Meniere’s disease and patients with probable Meniere’s disease (P=0.04). The frequency of HLA-Cw*18 was significantly higher in healthy controls (P=0.002). Conclusion: Our findings support the role of HLA-Cw alleles in both definite and probable Meniere’s disease. In addition, the difference in HLA-Cw*12 frequency between definite and probable Meniere’s disease in our study’s population might indicate distinct immune and inflammatory mechanisms involved in each condition.
Stratified wake of an accelerating hydrofoil
Ben-Gida, Hadar; Gurka, Roi
2015-01-01
Wakes of towed and self-propelled bodies in stratified fluids are significantly different from non-stratified wakes. The long-time effects of stratification on the development of the wakes of bluff bodies moving at constant speed are well known. In this experimental study we demonstrate how buoyancy affects the initial growth of vortices developing in the wake of a hydrofoil accelerating from rest. Particle image velocimetry measurements were applied to characterize the wake evolution behind a NACA 0015 hydrofoil accelerating in water at low Reynolds number in a relatively strong, stably stratified fluid (Re=5,000, Fr~O(1)). The analysis of velocity and vorticity fields, following vortex identification and an estimate of the circulation, reveals that the vortices in the stratified fluid case are stretched along the streamwise direction in the near wake. The momentum thickness profiles show lower momentum thickness values for the stratified late wake compared to the non-stratified wake, implying that the dra...
How stratified is mantle convection?
Puster, Peter; Jordan, Thomas H.
1997-04-01
We quantify the flow stratification in the Earth's mid-mantle (600-1500 km) in terms of a stratification index for the vertical mass flux, Sƒ(z) = 1 - ƒ(z)/ƒref(z), in which the reference value ƒref(z) approximates the local flux at depth z expected for unstratified convection (Sƒ=0). Although this flux stratification index cannot be directly constrained by observations, we show from a series of two-dimensional convection simulations that its value can be related to a thermal stratification index ST(z) defined in terms of the radial correlation length of the temperature-perturbation field δT(z, Ω). ST is a good proxy for Sƒ at low stratifications (Sƒ ... Uniformitarian Principle. The bound obtained here from global tomography is consistent with local seismological evidence for slab flux into the lower mantle; however, the total material flux has to be significantly greater (by a factor of 2-3) than that due to slabs alone. A stratification index Sƒ ≲ 0.2 is sufficient to exclude many stratified convection models still under active consideration, including most forms of chemical layering between the upper and lower mantle, as well as the more extreme versions of avalanching convection governed by a strong endothermic phase change.
Methodology series module 5: Sampling strategies
Maninder Singh Setia
2016-01-01
Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice and a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
Methodology Series Module 5: Sampling Strategies.
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice and a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
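The distinction drawn above between simple random and stratified random sampling can be sketched as follows; the population, strata, and sample sizes are hypothetical:

```python
# Sketch: simple random vs. stratified random sampling on illustrative data.
# The population and the "group" stratum variable are made up for this example.
import random

random.seed(0)

population = [{"id": i, "group": "A" if i < 700 else "B"} for i in range(1000)]

def simple_random_sample(pop, n):
    """Every unit has the same chance of selection."""
    return random.sample(pop, n)

def stratified_sample(pop, n, key):
    """Draw within each stratum, proportionally to the stratum's size."""
    strata = {}
    for unit in pop:
        strata.setdefault(unit[key], []).append(unit)
    out = []
    for members in strata.values():
        k = round(n * len(members) / len(pop))
        out.extend(random.sample(members, k))
    return out

srs = simple_random_sample(population, 100)
strat = stratified_sample(population, 100, "group")
print(len(srs), len(strat))
```

With proportional allocation, the stratified sample reproduces the population's group proportions exactly (70 from group A, 30 from group B here), whereas a simple random sample only matches them in expectation.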
Core science: Stratified by a sunken impactor
Nakajima, Miki
2016-10-01
There is potential evidence for a stratified layer at the top of the Earth's core, but its origin is not well understood. Laboratory experiments suggest that the stratified layer could be a sunken remnant of the giant impact that formed the Moon.
A Fixpoint Semantics for Stratified Databases
沈一栋
1993-01-01
Przymusinski extended the notion of stratified logic programs, developed by Apt, Blair and Walker, and by van Gelder, to stratified databases that allow both negative premises and disjunctive consequents. However, he did not provide a fixpoint theory for this class of databases. On the other hand, although a fixpoint semantics has been developed by Minker and Rajasekar for non-Horn logic programs, it is tantamount to traditional minimal model semantics, which is not sufficient to capture the intended meaning of negation in the premises of clauses in stratified databases. In this paper, a fixpoint approach to stratified databases is developed, which corresponds to the perfect model semantics. Moreover, algorithms are proposed for computing the set of perfect models of a stratified database.
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
孙道德
2003-01-01
Based on the multinomial distribution and its properties, the paper analyzes a method for unequal-probability stratified cluster sample surveys and the unbiased estimation of the mean, and further derives the variance of the estimator and an unbiased estimator of that variance.
Thompson, Steven K
2012-01-01
Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
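As a baseline for the weighted estimators the paper compares against, the empirical estimate of a tail probability is simply the exceedance fraction; a minimal sketch with made-up data (not the paper's estimators or sampling distributions):

```python
# Sketch: empirical estimate of a tail probability P(X > t) from a sample.
# This is the simple empirical-cdf baseline; the data below are made up.
def empirical_tail_prob(sample, t):
    """Fraction of observations exceeding t."""
    return sum(1 for x in sample if x > t) / len(sample)

data = [0.2, 0.5, 1.1, 2.3, 3.8, 0.9, 1.7, 4.2]
print(empirical_tail_prob(data, 2.0))
```

The weakness motivating the paper's weighted estimators is visible here: beyond the largest observation the empirical estimate is exactly zero, so extrapolating tail probabilities requires assumptions about the density's shape in the tail.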
Towards Cost-efficient Sampling Methods
Peng, Luo; Chong, Wu
2014-01-01
The sampling method has been paid much attention in the field of complex network in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small part of vertices with high node degree can possess the most structure information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and selects the high degree nodes with higher probability by classifying the nodes according to their degree distribution. The second sampling method improves the existing snowball sampling method so that it enables to sample the targeted nodes selectively in every sampling step. Besides, the two proposed sampling methods not only sample the nodes but also pick the edges directly connected to these nodes. In order to demonstrate the two methods' availability and accuracy, we compare them with the existing sampling methods in...
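The idea of preferentially sampling high-degree nodes can be illustrated with a degree-weighted draw; the adjacency list is a toy example, not the authors' algorithm:

```python
# Sketch: sampling network nodes with probability weighted by node degree,
# loosely following the idea that high-degree nodes carry most structural
# information. Toy adjacency list; not the paper's method.
import random

random.seed(1)

adj = {
    "a": ["b", "c", "d", "e"],  # hub node
    "b": ["a"], "c": ["a"], "d": ["a"], "e": ["a", "f"], "f": ["e"],
}

def degree_weighted_sample(adjacency, k):
    """Draw k distinct nodes, favoring high-degree nodes."""
    nodes = list(adjacency)
    weights = [len(adjacency[n]) for n in nodes]
    chosen = []
    while len(chosen) < k:
        pick = random.choices(nodes, weights=weights)[0]
        if pick not in chosen:
            chosen.append(pick)
    return chosen

sample = degree_weighted_sample(adj, 2)
print(sample)
```

In this toy graph, node "a" has degree 4 and is four times more likely than a leaf to be picked on any draw, which mimics the paper's strategy of classifying nodes by degree and selecting high-degree nodes with higher probability.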
Sample size estimation and sampling techniques for selecting a representative sample
Aamir Omair
2014-01-01
Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are more recommended than the non-probability sampling techniques, because the results of the study can be generalized to the target population.
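The sample-size considerations above follow the standard formula for estimating a proportion, n = z²p(1−p)/d²; a minimal sketch, with an optional finite-population correction (the numbers are illustrative):

```python
# Sketch: sample size for estimating a proportion, n = z^2 * p * (1-p) / d^2,
# with an optional finite-population correction. Inputs are illustrative.
import math

def sample_size(p, d, z=1.96, population=None):
    """n for expected proportion p, margin of error d, z for confidence level."""
    n = z ** 2 * p * (1 - p) / d ** 2
    if population is not None:  # finite-population correction
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(sample_size(0.5, 0.05))                  # p = 0.5 is the worst case
print(sample_size(0.5, 0.05, population=2000)) # smaller n for a finite frame
```

Using p = 0.5 maximizes p(1−p) and therefore gives the most conservative sample size; tightening the margin d from 0.05 to 0.025 quadruples the required n, which is the "more precision, larger sample" relationship stated in the abstract.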
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatrics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Nichols, J. D.; Gialdini, M.; Jaakkola, S.
1974-01-01
A quasi-operational study demonstrated that a timely, cost-effective timber inventory can be made using multistage sampling techniques based on manual and automated analysis of ERTS-1 imagery, supporting aircraft data, and ground data, as an alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in overall classification accuracy.
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
LTRMP Fisheries Data - Stratified Random and Fixed Site Sampling
U.S. Geological Survey, Department of the Interior — The Long Term Resource Monitoring Program's (LTRMP) annual fish monitoring began on the Upper Mississippi and Illinois Rivers in 1989. During the first two years...
Prototypic Features of Loneliness in a Stratified Sample of Adolescents
Lasgaard, Mathias; Elklit, Ask
2009-01-01
Dominant theoretical approaches in loneliness research emphasize the value of personality characteristics in explaining loneliness. The present study examines whether dysfunctional social strategies and attributions in lonely adolescents can be explained by personality characteristics. A question ... loneliness independent of personality characteristics, demographics and social desirability. The study indicates that dysfunctional strategies and attributions in affiliative situations are directly related to loneliness in adolescence. These strategies and attributions may preclude lonely adolescents from ... guidance and intervention. Thus, professionals need to be knowledgeable about prototypic features of loneliness in addition to employing a pro-active approach when assisting adolescents who display prototypic features.
Sampling Methods in Cardiovascular Nursing Research: An Overview.
Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie
2014-01-01
Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
Vazsonyi, Alexander T; Harris, Charlene; Terveer, Agnes M; Pagava, Karaman; Phagava, Helen; Michaud, Pierre-Andre
2015-02-01
Previous research has documented the importance of parenting on adolescent health and well-being; however, some of the underlying mechanisms that link the quality of parent-child relationship to health, adjustment, and well-being are not clearly understood. The current study seeks to address this gap by examining the extent to which sleep functioning mediates the effects by parental warmth on different measures of adolescent problem behaviors. Specifically, we test whether sleep functioning, operationalized by sleep quality and sleep quantity, mediates the relationship between the parental warmth and three measures of problem behaviors, namely alcohol use, illegal drug use, and deviance, in two nationally representative samples of Georgian (N = 6,992; M = 15.83, 60% females) and Swiss (N = 5,575; M = 17.17, 50% females) adolescents. Based on tests for parallel mediating effects by sleep functioning of parental warmth on problem behaviors in the MEDIATE macro in SPSS, the findings provided evidence that both sleep quality and sleep quantity independently and cumulatively mediated the effects of parental warmth on each of the three problem behaviors in both samples, with one exception. These results highlight the salience of positive parenting on sleep functioning among teens in two different cultural contexts, and, in turn, on measures of problem behaviors.
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...
SAS procedures for designing and analyzing sample surveys
Stafford, Joshua D.; Reinecke, Kenneth J.; Kaminski, Richard M.
2003-01-01
Complex surveys often are necessary to estimate occurrence (or distribution), density, and abundance of plants and animals for purposes of research and conservation. Most scientists are familiar with simple random sampling, where sample units are selected from a population of interest (sampling frame) with equal probability. However, the goal of ecological surveys often is to make inferences about populations over large or complex spatial areas where organisms are not homogeneously distributed or sampling frames are inconvenient or impossible to construct. Candidate sampling strategies for such complex surveys include stratified, multistage, and adaptive sampling (Thompson 1992, Buckland 1994).
Thermals in stratified regions of the ISM
Rodriguez-Gonzalez, Ary
2013-01-01
We present a model of a "thermal" (i.e., a hot bubble) rising within an exponentially stratified region of the ISM. This model includes terms representing the ram pressure braking and the entrainment of environmental gas into the thermal. We then calibrate the free parameters associated with these two terms through a comparison with 3D numerical simulations of a rising bubble. Finally, we apply our "thermal" model to the case of a hot bubble produced by a SN within the stratified ISM of the Galactic disk.
On Stratified Vortex Motions under Gravity.
2014-09-26
AD-A156 930: On Stratified Vortex Motions Under Gravity (U). Y. T. Fung, Fluid Dynamics Branch, Marine Technology Division, Naval Research Laboratory, Washington, DC. Report NRL-MR-5564, June 20, 1985. Unclassified, F/G 20/4.
Mixing by microorganisms in stratified fluids
Wagner, Gregory L; Lauga, Eric
2014-01-01
We examine the vertical mixing induced by the swimming of microorganisms at low Reynolds and Péclet numbers in a stably stratified ocean, and show that the global contribution of oceanic microswimmers to vertical mixing is negligible. We propose two approaches to estimating the mixing efficiency, $\eta$, or the ratio of the rate of potential energy creation to the total rate-of-working on the ocean by microswimmers. The first is based on scaling arguments and estimates $\eta$ in terms of the ratio between the typical organism size, $a$, and an intrinsic length scale for the stratified flow, $\ell = \left ( \
THERMALS IN STRATIFIED REGIONS OF THE ISM
A. Rodríguez-González
2013-01-01
Full Text Available We present a model of a “thermal” (i.e., a hot bubble) rising within an exponentially stratified region of the ISM. This model includes terms representing the ram pressure braking and the entrainment of environmental gas into the thermal. We then calibrate the free parameters associated with these two terms through a comparison with 3D numerical simulations of a rising bubble. Finally, we apply our “thermal” model to the case of a hot bubble produced by a SN within the stratified ISM of the Galactic disk.
Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs
Faqir Muhammad
2007-01-01
Full Text Available In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using the bootstrap and the jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSU). The sample PSUs are selected with probability proportional to size. Secondary Sampling Units (SSU), i.e., households, are selected by systematic sampling with a random start. They have used a single study variable. We have compared the HIES technique with some other designs, which are: stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). The jackknife and the bootstrap are used for variance estimation by replication. Simple random sampling with sample sizes (462 to 561) gave moderate variances by both jackknife and bootstrap. By applying systematic sampling, we obtained moderate variance with sample size (467). With the jackknife under systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes (467 to 631). At a sample size of (952) the variance of the ratio estimator becomes greater than that of the regression estimator. The most efficient design turns out to be ranked set sampling compared with the other designs. Ranked set sampling with jackknife and bootstrap gives minimum variance even with the smallest sample size (467). Two-phase sampling gave poor performance. The multistage sampling applied by HIES gave large variances, especially when used with a single study variable.
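The ratio and regression estimators used in the study have simple closed forms; a sketch with hypothetical income and household-size data (not the HIES data):

```python
# Sketch: ratio and regression estimators of a population mean of y using an
# auxiliary variable x with known population mean. Toy numbers, not HIES data.
def ratio_estimate(y, x, x_pop_mean):
    """Ratio estimator: ybar_r = (ybar / xbar) * Xbar."""
    ybar = sum(y) / len(y)
    xbar = sum(x) / len(x)
    return (ybar / xbar) * x_pop_mean

def regression_estimate(y, x, x_pop_mean):
    """Linear regression estimator: ybar_lr = ybar + b * (Xbar - xbar)."""
    n = len(y)
    ybar = sum(y) / n
    xbar = sum(x) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    return ybar + b * (x_pop_mean - xbar)

# Hypothetical sample: household size x, income y (arbitrary units)
x = [2, 3, 4, 5, 6]
y = [20, 28, 41, 52, 59]
print(ratio_estimate(y, x, x_pop_mean=4.5))
print(regression_estimate(y, x, x_pop_mean=4.5))
```

Both estimators adjust the sample mean of income using the known population mean of household size; the regression estimator reduces to the ratio estimator only when the fitted line passes through the origin.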
Turbulent Mixing in Stably Stratified Flows
2008-03-01
Nitrogen transformations in stratified aquatic microbial ecosystems
Revsbech, Niels Peter; Risgaard-Petersen, N.; Schramm, Andreas
2006-01-01
Abstract New analytical methods such as advanced molecular techniques and microsensors have resulted in new insights about how nitrogen transformations in stratified microbial systems such as sediments and biofilms are regulated at a µm-mm scale. A large and ever-expanding knowledge base about n...
Probably Almost Bayes Decisions
Anoulova, S.; Fischer, Paul; Poelt, S.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
Methane metabolism in a stratified boreal lake
Nykänen, Hannu; Peura, Sari; Kankaala, Paula; Jones, Roger
2013-04-01
Stratified lakes, typical of the boreal zone, are naturally anoxic near their bottoms. In these lakes methanogenesis can account for up to half of organic matter degradation. However, a major part of the methane (CH4) is oxidized in the water column before reaching the atmosphere. Since methanotrophs use CH4 as their sole carbon and energy source, much CH4-derived carbon is incorporated into their biomass. Microbially produced CH4 has strongly negative δ13C compared to other carbon forms in ecosystems, making it possible to follow its route in food webs. However, only a few studies have estimated the amount of this microbial biomass or its carbon stable isotopic composition due to difficulties in separating it from other biomass or from other carbon forms in the water column. We estimated methanotrophic biomass from measured CH4 oxidation, and δ13C of the biomass from measured δ13C values of CH4, DIC, POM and DOC. An estimate of the fraction of methanotrophs in total microbial biomass is derived from bacterial community composition measurements. The study was made in Alinen Mustajärvi, a small (area 0.75 ha, maximum depth 6.5 m, mean depth 4.2 m), oligotrophic, mesohumic headwater lake located in boreal coniferous forest in southern Finland. CH4 and DIC concentrations and their δ13C were measured over the deepest point of the lake at 1 m intervals. δ13C of DOM and POM were analyzed from composite samples from the epi-, meta-, and hypolimnion. Evasion of CH4 and carbon dioxide from the lake surface to the atmosphere was estimated with boundary layer diffusion equations. CH4 oxidation was estimated by comparing differences between observed concentrations and CH4 potentially transported by turbulent diffusion between different vertical layers in the lake, and also by actual methanotrophy measurements and from vertical differences in δ13C-CH4. The estimate of CH4 production was based on the sum of oxidized and released CH4. Molecular microbiology methods were used to
Robust risk prediction with biomarkers under two-phase stratified cohort design.
Payne, Rebecca; Yang, Ming; Zheng, Yingye; Jensen, Majken K; Cai, Tianxi
2016-12-01
Identification of novel biomarkers for risk prediction is important for disease prevention and optimal treatment selection. However, studies aiming to discover which biomarkers are useful for risk prediction often require the use of stored biological samples from large assembled cohorts, and thus the depletion of a finite and precious resource. To make efficient use of such stored samples, two-phase sampling designs are often adopted as resource-efficient sampling strategies, especially when the outcome of interest is rare. Existing methods for analyzing data from two-phase studies focus primarily on single-marker analysis or fitting the Cox regression model to combine information from multiple markers. However, the Cox model may not fit the data well. Under model misspecification, the composite score derived from the Cox model may not perform well in predicting the outcome. Under a general two-phase stratified cohort sampling design, we present a novel approach to combining multiple markers to optimize prediction by fitting a flexible nonparametric transformation model. Using inverse probability weighting to account for the outcome-dependent sampling, we propose to estimate the model parameters by maximizing an objective function which can be interpreted as a weighted C-statistic for survival outcomes. Regardless of model adequacy, the proposed procedure yields a sensible composite risk score for prediction. A major obstacle to inference under two-phase studies is the correlation induced by finite-population sampling, which prevents standard inference procedures such as the bootstrap from being used for variance estimation. We propose a resampling procedure to derive valid confidence intervals for the model parameters and the C-statistic accuracy measure. We illustrate the new methods with simulation studies and an analysis of a two-phase study of high-density lipoprotein cholesterol (HDL-C) subtypes for predicting the risk of coronary heart
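The weighted-concordance idea behind this objective function can be illustrated on a toy binary-outcome version. This is a simplified sketch under stated assumptions: the inverse-probability weights are taken as known, censoring is ignored (the paper's estimator handles censored survival times), and all names are hypothetical.

```python
def ipw_c_statistic(scores, events, weights):
    """Inverse-probability-weighted C-statistic for a binary outcome:
    over all (case, control) pairs, the weighted fraction in which the
    case has the higher risk score (ties count half). A toy analogue of
    the weighted C-statistic objective; weights would come from the
    two-phase sampling probabilities."""
    data = list(zip(scores, events, weights))
    num = den = 0.0
    for s_case, y_case, w_case in data:
        if y_case != 1:
            continue
        for s_ctrl, y_ctrl, w_ctrl in data:
            if y_ctrl != 0:
                continue
            w = w_case * w_ctrl            # pair weight
            den += w
            if s_case > s_ctrl:
                num += w
            elif s_case == s_ctrl:
                num += 0.5 * w
    return num / den
```

With perfectly discriminating scores the statistic is 1; a composite score maximizing this criterion remains a sensible ranking rule even when a working regression model is misspecified, which is the point the abstract makes.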
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
Luciane Martins Borowsky
2007-09-01
Full Text Available The quantification of microorganisms present in food samples is an important step in assessing the risk to consumers. Risk identification includes a step in which the microorganism is quantified in the product under evaluation. The number of Salmonella sp. in 20 samples of minced pork previously identified as positive was determined by the Most Probable Number (MPN) method. Counts ranging between <3 and 240 cfu.g-1 were found in the analysed samples.
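The MPN calculation behind counts like these can be sketched as a maximum-likelihood estimate over a serial-dilution tube series. The function below is an illustrative sketch (the function name, the search grid, and the three-dilution example are assumptions, not taken from the paper); practical MPN work normally uses published tables or dedicated software.

```python
import math

def mpn_estimate(amounts, tubes, positives, grid=None):
    """Maximum-likelihood Most Probable Number estimate (organisms per
    gram) for a multiple-tube dilution series, assuming Poisson-
    distributed organisms.
    amounts[k]  : sample amount per tube at dilution k (g)
    tubes[k]    : number of tubes inoculated at dilution k
    positives[k]: number of positive tubes at dilution k
    """
    if grid is None:
        # geometric grid from ~0.01 to ~3e6 organisms/g, 5% steps
        grid = [0.01 * 1.05 ** i for i in range(400)]

    def log_lik(lam):
        ll = 0.0
        for v, n, p in zip(amounts, tubes, positives):
            prob_pos = 1.0 - math.exp(-lam * v)  # P(tube turns positive)
            if p > 0:
                if prob_pos == 0.0:
                    return float('-inf')
                ll += p * math.log(prob_pos)
            ll += (n - p) * (-lam * v)           # negative tubes
        return ll

    return max(grid, key=log_lik)
```

For the standard 3-tube series at 0.1, 0.01 and 0.001 g with a 3-1-0 positive pattern, this maximum-likelihood estimate comes out near the familiar tabulated value of roughly 43 MPN/g.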
The Risk-Stratified Osteoporosis Strategy Evaluation study (ROSE)
Rubin, Katrine Hass; Holmberg, Teresa; Rothmann, Mette Juel
2015-01-01
The risk-stratified osteoporosis strategy evaluation study (ROSE) is a randomized prospective population-based study investigating the effectiveness of a two-step screening program for osteoporosis in women. This paper reports the study design and baseline characteristics of the study population....... 35,000 women aged 65-80 years were selected at random from the population in the Region of Southern Denmark and-before inclusion-randomized to either a screening group or a control group. As first step, a self-administered questionnaire regarding risk factors for osteoporosis based on FRAX......(®) was issued to both groups. As second step, subjects in the screening group with a 10-year probability of major osteoporotic fractures ≥15 % were offered a DXA scan. Patients diagnosed with osteoporosis from the DXA scan were advised to see their GP and discuss pharmaceutical treatment according to Danish...
Ross, Kenneth N.
1987-01-01
This article considers various kinds of probability and non-probability samples in both experimental and survey studies. Throughout, how a sample is chosen is stressed. Size alone is not the determining consideration in sample selection. Good samples do not occur by accident; they are the result of a careful design. (Author/JAZ)
Drainage in a model stratified porous medium
Datta, Sujit S; 10.1209/0295-5075/101/14002
2013-01-01
We show that when a non-wetting fluid drains a stratified porous medium at sufficiently small capillary numbers Ca, it flows only through the coarsest stratum of the medium; by contrast, above a threshold Ca, the non-wetting fluid is also forced laterally, into part of the adjacent, finer strata. The spatial extent of this partial invasion increases with Ca. We quantitatively understand this behavior by balancing the stratum-scale viscous pressure driving the flow with the capillary pressure required to invade individual pores. Because geological formations are frequently stratified, we anticipate that our results will be relevant to a number of important applications, including understanding oil migration, preventing groundwater contamination, and sub-surface CO2 storage.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Agreeing Probability Measures for Comparative Probability Structures
P.P. Wakker (Peter)
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, none of the proofs the author encountered is valid
Stably Stratified Flow in a Shallow Valley
Mahrt, L.
2017-01-01
Stratified nocturnal flow above and within a small valley of approximately 12-m depth and a few hundred metres width is examined as a case study, based on a network of 20 sonic anemometers and a central 20-m tower with eight levels of sonic anemometers. Several regimes of stratified flow over gentle topography are conceptually defined for organizing the data analysis and comparing with the existing literature. In our case study, a marginal cold pool forms within the shallow valley in the early evening but yields to larger ambient wind speeds after a few hours, corresponding to stratified terrain-following flow where the flow outside the valley descends to the valley floor. The terrain-following flow lasts about 10 h and then undergoes transition to an intermittent marginal cold pool towards the end of the night when the larger-scale flow collapses. During this 10-h period, the stratified terrain-following flow is characterized by a three-layer structure, consisting of a thin surface boundary layer of a few metres depth on the valley floor, a deeper boundary layer corresponding to the larger-scale flow, and an intermediate transition layer with significant wind-directional shear and possible advection of lee turbulence that is generated even for the gentle topography of our study. The flow in the valley is often modulated by oscillations with a typical period of 10 min. Cold events with smaller turbulent intensity and duration of tens of minutes move through the observational domain throughout the terrain-following period. One of these events is examined in detail.
Multi Dimensional CTL and Stratified Datalog
Theodore Andronikos
2010-02-01
Full Text Available In this work we define Multi Dimensional CTL (MD-CTL in short) by extending CTL, which is the dominant temporal specification language in practice. The need for Multi Dimensional CTL is mainly due to the advent of semi-structured data. The common path nature of CTL and XPath, which provides a suitable model for semi-structured data, has caused the emergence of work on specifying a relation among them aiming at exploiting the nice properties of CTL. Although the advantages of such an approach have already been noticed [36, 26, 5], no formal definition of MD-CTL has been given. The goal of this work is twofold; (a) we define MD-CTL and prove that the “nice” properties of CTL (linear model checking and bounded model property) transfer also to MD-CTL, (b) we establish new results on stratified Datalog. In particular, we define a fragment of stratified Datalog called Multi Branching Temporal (MBT in short) programs that has the same expressive power as MD-CTL. We prove that by devising a linear translation between MBT and MD-CTL. We actually give the exact translation rules for both directions. We further build on this relation to prove that query evaluation is linear and checking satisfiability, containment and equivalence are EXPTIME-complete for MBT programs. The class MBT is the largest fragment of stratified Datalog for which such results exist in the literature.
Thermal mixing in a stratified environment
Kraemer, Damian; Cotel, Aline
1999-11-01
Laboratory experiments of a thermal impinging on a stratified interface have been performed. The thermal was released from a cylindrical reservoir located at the bottom of a Lucite tank. The stratified interface was created by filling the tank with two different saline solutions. The density of the lower layer is greater than that of the upper layer and the thermal fluid, thereby creating a stable stratification. A pH indicator, phenolphthalein, is used to visualize and quantify the amount of mixing produced by the impingement of the thermal at the interface. The upper layer contains a mixture of water, salt and sodium hydroxide. The thermal fluid is composed of water, sulfuric acid and phenolphthalein. When the thermal entrains and mixes fluid from the upper layer, a chemical reaction takes place, and the resulting mixed fluid is now visible. The ratio of base to acid, called the equivalence ratio, was varied throughout the experiments, as well as the Richardson number. The Richardson number is the ratio of potential to kinetic energy, and is based on the thermal quantities at the interface. Results indicate that the amount of mixing produced is proportional to the Richardson number raised to the -3/2 power. Previous experiments (Zhang and Cotel 1999) revealed that the entrainment rate of a thermal in a stratified environment follows the same power law.
STRATIFIED MODEL FOR ESTIMATING FATIGUE CRACK GROWTH RATE OF METALLIC MATERIALS
YANG Yong-yu; LIU Xin-wei; YANG Fan
2005-01-01
The curve relating fatigue crack growth rate to the stress intensity factor amplitude represents an important fatigue property in designing damage tolerance limits and predicting the life of metallic component parts. In order to make more reasonable use of testing data, samples from the population were stratified as suggested by the stratified random sample model (SRAM). The data in each stratum corresponded to the same experimental conditions. A suitable weight was assigned to each stratified sample according to the actual working states of the pressure vessel, so that the estimation of the fatigue crack growth rate equation was more accurate for practice. An empirical study shows that the SRAM estimation using fatigue crack growth rate data from different stoves is obviously better than the estimation from a simple random sample model.
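The stratum-weighting idea can be sketched as a weighted least-squares fit of the Paris-type growth law da/dN = C (ΔK)^m on log-transformed data, with one weight per stratum. This is an illustrative reconstruction under assumed names and synthetic data, not the authors' implementation:

```python
import math

def fit_paris_law(strata, weights):
    """Weighted least-squares fit of log10(da/dN) = log10(C) + m*log10(dK),
    pooling stratified samples with one weight per stratum (a hypothetical
    SRAM-style weighting reflecting operating conditions).
    strata[k] : list of (dK, da_dN) pairs for stratum k
    weights[k]: weight assigned to stratum k
    Returns (C, m)."""
    sw = sx = sy = sxx = sxy = 0.0
    for points, w in zip(strata, weights):
        for dk, rate in points:
            x, y = math.log10(dk), math.log10(rate)
            sw += w; sx += w * x; sy += w * y
            sxx += w * x * x; sxy += w * x * y
    m = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    log_c = (sy - m * sx) / sw
    return 10 ** log_c, m
```

On data generated exactly from C = 1e-11, m = 3, the fit recovers both constants regardless of the weights; with real scattered data, the weights shift the fit toward the strata that matter most in service.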
任丽梅; 徐伟; 李战国
2013-01-01
The failure probability is one of the most important reliability measures in structural reliability assessment of dynamical systems. Here, a procedure for estimating failure probabilities of non-linear systems based on the importance sampling technique is presented. Firstly, by using the Rice formula, an equivalent linearized version of the non-linear system is derived. Using the equivalent linear equation, the design point of the equivalent linear system is used to construct a control function. Secondly, the importance sampling technique is used to estimate the first-excursion probabilities for the non-linear system. Finally, a Duffing oscillator is taken as an example. The simulation results show that the proposed method is correct and effective; the number of samples and the computational time are reduced significantly compared with those of direct Monte Carlo simulation.
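The importance-sampling step can be illustrated on a toy tail-probability problem: estimating P(X > b) for a standard normal X with a proposal density centred at the design point b. This is a hedged sketch of the general technique, not the paper's Duffing-oscillator procedure; all names are illustrative.

```python
import math
import random

def failure_prob_is(threshold, n, seed=0):
    """Importance-sampling estimate of P(X > threshold) for X ~ N(0,1),
    using proposal N(threshold, 1) centred at the design point.
    Each sample from the proposal is reweighted by the density ratio
    phi(x) / phi(x - threshold)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)              # draw from proposal
        if x > threshold:                          # failure indicator
            # likelihood ratio of target N(0,1) to proposal N(threshold,1)
            total += math.exp(-0.5 * x * x + 0.5 * (x - threshold) ** 2)
    return total / n
```

For threshold 4 the true probability is about 3.2e-5, so direct Monte Carlo with 20,000 samples would typically see the event zero or once; the shifted proposal hits it on roughly half the draws and the weighted average converges to within a few percent, which is the sample-size saving the abstract reports for the structural problem.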
A Multi-Threaded Algorithm for Random Sampling with Unequal Probabilities
容飞龙
2015-01-01
Random sampling with unequal probabilities based on variable parameters is widely used in practice, for example in lottery draws weighted by amount of consumption and in the shuffle mode of a music player. This paper presents a multi-threaded algorithm for unequal-probability random selection. The core of the algorithm is that each thread's sleep() time is proportional to its variable parameter. The algorithm can meet a variety of complex and changing needs.
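Taken literally, a deterministic sleep proportional to the parameter would always pick the same item, so some randomization of the sleep time is needed for the selection probabilities to come out proportional to the weights. A standard randomized variant of this thread-race idea is the "exponential race"; the sketch below (an assumption, not the paper's code) simulates the race directly, and a thread-based version would have each thread sleep for its sampled time and take the first to wake:

```python
import random

def weighted_pick(weights, rng):
    """'Exponential race' selection: item i draws a sleep time from
    Exp(rate=weights[i]) and the smallest draw wins. By the minimum
    property of exponentials, item i is selected with probability
    weights[i] / sum(weights)."""
    draws = [rng.expovariate(w) for w in weights]
    return min(range(len(weights)), key=draws.__getitem__)
```

Over many repetitions the empirical selection frequencies match the normalized weights, which is the behaviour a weighted lottery or weighted shuffle needs.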
Probabilities for Solar Siblings
Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.
2015-02-01
We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.
Probabilities for Solar Siblings
Valtonen, M; Bobylev, V V; Myllari, A
2015-01-01
We have shown previously (Bobylev et al 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.
The fully nonlinear stratified geostrophic adjustment problem
Coutino, Aaron; Stastna, Marek
2017-01-01
The study of the adjustment to equilibrium by a stratified fluid in a rotating reference frame is a classical problem in geophysical fluid dynamics. We consider the fully nonlinear, stratified adjustment problem from a numerical point of view. We present results of smoothed dam break simulations based on experiments in the published literature, with a focus on both the wave trains that propagate away from the nascent geostrophic state and the geostrophic state itself. We demonstrate that for Rossby numbers in excess of roughly 2 the wave train cannot be interpreted in terms of linear theory. This wave train consists of a leading solitary-like packet and a trailing tail of dispersive waves. However, it is found that the leading wave packet never completely separates from the trailing tail. Somewhat surprisingly, the inertial oscillations associated with the geostrophic state exhibit evidence of nonlinearity even when the Rossby number falls below 1. We vary the width of the initial disturbance and the rotation rate so as to keep the Rossby number fixed, and find that while the qualitative response remains consistent, the Froude number varies, and these variations are manifested in the form of the emanating wave train. For wider initial disturbances we find clear evidence of a wave train that initially propagates toward the near wall, reflects, and propagates away from the geostrophic state behind the leading wave train. We compare kinetic energy inside and outside of the geostrophic state, finding that for long times a Rossby number of around one-quarter yields an equal split between the two, with lower (higher) Rossby numbers yielding more energy in the geostrophic state (wave train). Finally we compare the energetics of the geostrophic state as the Rossby number varies, finding long-lived inertial oscillations in the majority of the cases and a general agreement with the past literature that employed either hydrostatic, shallow-water equation-based theory or
Inverse scattering of dispersive stratified structures
Skaar, Johannes
2012-01-01
We consider the inverse scattering problem of retrieving the structural parameters of a stratified medium consisting of dispersive materials, given knowledge of the complex reflection coefficient in a finite frequency range. It is shown that the inverse scattering problem does not have a unique solution in general. When the dispersion is sufficiently small, such that the time-domain Fresnel reflections have durations less than the round-trip time in the layers, the solution is unique and can be found by layer peeling. Numerical examples with dispersive and lossy media are given, demonstrating the usefulness of the method for e.g. THz technology.
Topological Structures in Rotating Stratified Flows
Redondo, J. M.; Carrillo, A.; Perez, E.
2003-04-01
Detailed 2D particle tracking and PIV visualizations performed on a series of large-scale laboratory experiments at the Coriolis Platform of SINTEF in Trondheim have revealed several resonances which scale on the Strouhal, the Rossby and the Richardson numbers. More than 100 experiments spanned a wide range of Rossby deformation radii, and the topological structures (parabolic/elliptic/hyperbolic) of the quasi-balanced stratified-rotating flows were studied when stirring (akin to coastal mixing) occurred at a side of the tank. The strong asymmetry favored by the total vorticity produces a wealth of mixing patterns.
Probability and Relative Frequency
Drieschner, Michael
2016-01-01
The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related to relative frequency. So the task seems to be to interpret that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call this the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder" structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
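A minimal example of the kind of scoring rule being evaluated is the Brier score, a classic proper scoring rule for probability forecasts of binary events (function names here are illustrative):

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and the
    0/1 outcomes that occurred. Lower is better, and the rule is proper:
    reporting one's true probabilities minimizes the expected score."""
    pairs = list(zip(forecasts, outcomes))
    return sum((p - y) ** 2 for p, y in pairs) / len(pairs)
```

A perfectly confident and correct forecaster scores 0, while always hedging at 0.5 scores 0.25; the loss-function approach the abstract proposes generalizes this idea by comparing predicted to actual event probabilities over time.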
Hernandez Voth, Ana; Mora Ortega, Gemma; Moreno Zabaleta, Raul; Montoro Zulueta, Javier; Verdugo Cartas, Maria I; Rojo Moreno-Arrones, Blas; Lores Gutierrez, Vanesa; Ramirez Prieto, María T
2016-03-04
Polysomnography (PSG) is the gold standard technique for the diagnosis of obstructive sleep apnea syndrome (OSAS). It is an expensive, complex and not always available technique, so respiratory polygraphy (RP) has become common. Although RP is not validated in low-probability patients, Spanish guidelines recommend conservative treatment in patients with negative RP. We aimed to study the prevalence and severity of OSAS through PSG in a sample of patients with low probability and negative RP. Retrospective, observational, descriptive and analytic study of low-probability OSAS patients with negative RP in whom a PSG was performed. Anthropometric, clinical and sleep data were collected. Eighty-two patients were included. After PSG, a greater number of hypopneas (137.8±70.1 vs. 51.2±38.4 [P<.05]) and a higher apnea-hypopnea index (27.8±15.6 vs. 11.7±7.1 [P<.05]) were observed, as well as an increase in OSAS prevalence of 17% overall and of 35% for severe OSAS; the prevalence of mild OSAS decreased by 41%. According to the results of this study, RP significantly underestimates the prevalence and severity of OSAS in low-probability patients. While it is necessary to adequately stratify OSAS probability in order to correctly indicate diagnostic tests, we recommend performing a PSG in low-probability patients with negative RP. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Corbellini, Luís Gustavo; Júnior, Alfredo Bianco; de Freitas Costa, Eduardo; Duarte, Ana Sofia Ribeiro; Albuquerque, Elenita Ruttscheidt; Kich, Jalusa Deon; Cardoso, Marisa; Nauta, Maarten
2016-07-02
Sources of contamination of carcasses during slaughter include infected pigs as well as environmentally related sources. There are many microbial indicators that can be used in the processing of food to assess food hygiene and the safety of food processing. The presence of some microbial indicators can be viewed as a result of direct or indirect contamination of a food with fecal material. The presence of Enterobacteriaceae is often used as a hygiene indicator, as they are found both in the environment and in the intestine of warm-blooded animals. An association between Salmonella isolation and Enterobacteriaceae count (EC) on pre-chill carcasses has been described, however the impact of slaughterhouse and the day of sampling on the occurrence of Salmonella has not been previously investigated. To this end, mixed logistic regressions (MLRs) with random effects and fixed slopes were performed to assess the change in EC and its correlation with Salmonella occurrence using two data sets. The first describes the EC and Salmonella isolation in 60 pork carcasses in one slaughterhouse sampled at 11 different slaughter steps, including the carcass as a random effect. The second describes the EC and Salmonella isolation on 1150 pre-chill carcasses sampled in 13 slaughterhouses over 230 sampling days, and the model combined two random intercepts, slaughterhouse and date of sampling nested with slaughterhouse (day/slaughterhouse). Statistically significant associations between the EC and Salmonella occurrence were found in all models. Nevertheless, although a strong association was found between Enterobacteriaceae and Salmonella contamination in pork carcasses, this association was not constant, given that there was high variation in the probability of a carcass being positive for Salmonella according to the EC, mainly between sampling days. The effect of the day of sampling on Salmonella prevalence was so large that the predictive value of the EC count for Salmonella isolation on a daily
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...... of subjective probabilities in subjects with certain Non-Expected Utility preference representations that satisfy weak conditions that we identify....
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.
2015-01-01
We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
Paula Costa Mosca Macedo
2009-06-01
Full Text Available OBJECTIVE: To evaluate the quality of life of medical residents during the first three years of training and identify its association with sociodemographic-occupational characteristics, leisure time and health habits. METHOD: A cross-sectional study with a random sample of 128 residents stratified by year of training was conducted. The Medical Outcome Study Short Form 36 was administered. Mann-Whitney tests were carried out to compare percentile distributions of the eight quality of life domains according to sociodemographic variables, and a multiple linear regression analysis was performed, followed by a validity check of the resulting models. RESULTS: The physical component presented higher quality of life medians than the mental component. Comparisons between the three years showed that in almost all domains the quality of life scores of the second-year residents were higher than those of the first-year residents (p < 0.01); for the mental component, scores were higher in the third year than in the other two years (p < 0.01). Predictors of higher quality of life were: being in the second or
Dynamical Simulation of Probabilities
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention is focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Stratified growth in Pseudomonas aeruginosa biofilms
Werner, E.; Roe, F.; Bugnicourt, A.;
2004-01-01
In this study, stratified patterns of protein synthesis and growth were demonstrated in Pseudomonas aeruginosa biofilms. Spatial patterns of protein synthetic activity inside biofilms were characterized by the use of two green fluorescent protein (GFP) reporter gene constructs. One construct... carried an isopropyl-beta-D-thiogalactopyranoside (IPTG)-inducible gfpmut2 gene encoding a stable GFP. The second construct carried a GFP derivative, gfp-AGA, encoding an unstable GFP under the control of the growth-rate-dependent rrnBp(1) promoter. Both GFP reporters indicated that active protein... of oxygen limitation in the biofilm. Oxygen microelectrode measurements showed that oxygen only penetrated approximately 50 μm into the biofilm. P. aeruginosa was incapable of anaerobic growth in the medium used for this investigation. These results show that while mature P. aeruginosa biofilms contain...
Clustering of floating particles in stratified turbulence
Boffetta, Guido; de Lillo, Filippo; Musacchio, Stefano; Sozza, Alessandro
2016-11-01
We study the dynamics of small floating particles transported by stratified turbulence in the presence of a mean linear density profile as a simple model for the confinement and accumulation of plankton in the ocean. By means of extensive direct numerical simulations we investigate the statistical distribution of floaters as a function of the two dimensionless parameters of the problem. We find that vertical confinement of particles is mainly ruled by the degree of stratification, with a weak dependence on the particle properties. Conversely, small-scale fractal clustering, typical of non-neutral particles in turbulence, depends on the particle relaxation time and is only weakly dependent on the flow stratification. The implications of our findings for the formation of thin phytoplankton layers are discussed.
On turbulence in a stratified environment
Sarkar, Sutanu
2015-11-01
John Lumley, motivated by atmospheric observations, made seminal contributions to the statistical theory (Lumley and Panofsky 1964, Lumley 1964) and second-order modeling (Zeman and Lumley 1976) of turbulence in the environment. Turbulent processes in the ocean share many features with the atmosphere, e.g., shear, stratification, rotation and rough topography. Results from direct and large eddy simulations of two model problems will be used to illustrate some of the features of turbulence in a stratified environment. The first problem concerns a shear layer in nonuniform stratification, a situation typical of both the atmosphere and the ocean. The second problem, considered to be responsible for much of the turbulent mixing that occurs in the ocean interior, concerns topographically generated internal gravity waves. Connections will be made to data taken during observational campaigns in the ocean.
董庆利; 宋筱瑜; 丁甜; 刘箐
2016-01-01
This study was designed to verify the effects of pathogen levels in negative samples on quantitative microbiological risk assessment (QMRA). Previous research on QMRA of Aeromonas spp. in chilled pork was taken as an example, and two scenarios for Aeromonas spp. in the negative samples, zero and the maximum value (detection limit), were simulated in quantitative exposure assessment. The predicted food-poisoning probabilities for the two scenarios were 33.6% and 69.3%, respectively, both significantly higher (P < 0.01) than the previous result of 22.1% based on the Jarvis function for estimating the possible pathogen distribution in negative samples. Moreover, the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), chi-square tests (X2), and other parameters were applied to evaluate different continuous probability distributions for pathogen levels in negative samples. The exponential distribution proved better than the Logistic, Normal, Triangle and Uniform distributions, with AIC values of -41.24 and -135.62 under the two simulated scenarios, respectively, lower than the results of the other distributions. In conclusion, the pathogen distribution in negative samples should be noted and further optimized during QMRA in the future.
Stratified scaffold design for engineering composite tissues.
Mosher, Christopher Z; Spalazzi, Jeffrey P; Lu, Helen H
2015-08-01
A significant challenge to orthopaedic soft tissue repair is the biological fixation of autologous or allogeneic grafts with bone, whereby the lack of functional integration between such grafts and host bone has limited the clinical success of anterior cruciate ligament (ACL) and other common soft tissue-based reconstructive grafts. The inability of current surgical reconstruction to restore the native fibrocartilaginous insertion between the ACL and the femur or tibia, which minimizes stress concentration and facilitates load transfer between the soft and hard tissues, compromises the long-term clinical functionality of these grafts. To enable integration, a stratified scaffold design that mimics the multiple tissue regions of the ACL interface (ligament-fibrocartilage-bone) represents a promising strategy for composite tissue formation. Moreover, distinct cellular organization and phase-specific matrix heterogeneity achieved through co- or tri-culture within the scaffold system can promote biomimetic multi-tissue regeneration. Here, we describe the methods for fabricating a tri-phasic scaffold intended for ligament-bone integration, as well as the tri-culture of fibroblasts, chondrocytes, and osteoblasts on the stratified scaffold for the formation of structurally contiguous and compositionally distinct regions of ligament, fibrocartilage and bone. The primary advantage of the tri-phasic scaffold is the recapitulation of the multi-tissue organization across the native interface through the layered design. Moreover, in addition to ease of fabrication, each scaffold phase is similar in polymer composition and therefore can be joined together by sintering, enabling the seamless integration of each region and avoiding delamination between scaffold layers.
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
Real Analysis and Probability
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Handbook of Probability
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
On Quantum Conditional Probability
Isabel Guerra Bobo
2013-02-01
Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
Choice Probability Generating Functions
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Probability, Nondeterminism and Concurrency
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...
Local properties of countercurrent stratified steam-water flow
Kim, H J
1985-10-01
A study of steam condensation in countercurrent stratified flow of steam and subcooled water has been carried out in a rectangular channel/flat plate geometry over a wide range of inclination angles (4°-87°) at several aspect ratios. Variables were inlet water and steam flow rates, and inlet water temperature. Local condensation rates and pressure gradients were measured, and local condensation heat transfer coefficients and interfacial shear stress were calculated. Contact probe traverses of the surface waves were made, which allowed a statistical analysis of the wave properties. The local condensation Nusselt number was correlated in terms of local water and steam Reynolds or Froude numbers, as well as the liquid Prandtl number. A turbulence-centered model developed by Theofanous et al., principally for gas absorption in several geometries, was modified. A correlation for the interfacial shear stress and the pressure gradient agreed with measured values. Mean water layer thicknesses were calculated. Interfacial wave parameters, such as the mean water layer thickness, liquid fraction probability distribution, wave amplitude and wave frequency, are analyzed.
Numerical Study of Stratified Charge Combustion in Wave Rotors
Nalim, M. Razi
1997-01-01
A wave rotor may be used as a pressure-gain combustor effecting non-steady flow, and intermittent, confined combustion to enhance gas turbine engine performance. It will be more compact and probably lighter than an equivalent pressure-exchange wave rotor, yet will have similar thermodynamic and mechanical characteristics. Because the allowable turbine blade temperature limits overall fuel/air ratio to sub-flammable values, premixed stratification techniques are necessary to burn hydrocarbon fuels in small engines with compressor discharge temperature well below autoignition conditions. One-dimensional, unsteady numerical simulations of stratified-charge combustion are performed using an eddy-diffusivity turbulence model and a simple reaction model incorporating a flammability limit temperature. For good combustion efficiency, a stratification strategy is developed which concentrates fuel at the leading and trailing edges of the inlet port. Rotor and exhaust temperature profiles and performance predictions are presented at three representative operating conditions of the engine: full design load, 40% load, and idle. The results indicate that peak local gas temperatures will result in excessive temperatures within the rotor housing unless additional cooling methods are used. The rotor itself will have acceptable temperatures, but the pattern factor presented to the turbine may be of concern, depending on exhaust duct design and duct-rotor interaction.
Analysing stratified medicine business models and value systems: innovation-regulation interactions.
Mittra, James; Tait, Joyce
2012-09-15
Stratified medicine offers both opportunities and challenges to the conventional business models that drive pharmaceutical R&D. Given the increasingly unsustainable blockbuster model of drug development, due in part to maturing product pipelines, alongside increasing demands from regulators, healthcare providers and patients for higher standards of safety, efficacy and cost-effectiveness of new therapies, stratified medicine promises a range of benefits to pharmaceutical and diagnostic firms as well as healthcare providers and patients. However, the transition from 'blockbusters' to what might now be termed 'niche-busters' will require the adoption of new, innovative business models, the identification of different and perhaps novel types of value along the R&D pathway, and a smarter approach to regulation to facilitate innovation in this area. In this paper we apply the Innogen Centre's interdisciplinary ALSIS methodology, which we have developed for the analysis of life science innovation systems in contexts where the value creation process is lengthy, expensive and highly uncertain, to this emerging field of stratified medicine. In doing so, we consider the complex collaboration, timing, coordination and regulatory interactions that shape business models, value chains and value systems relevant to stratified medicine. More specifically, we explore in some depth two convergence models for co-development of a therapy and diagnostic before market authorisation, highlighting the regulatory requirements and policy initiatives within the broader value system environment that have a key role in determining the probable success and sustainability of these models.
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
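The idea of augmenting a normal probability plot with intervals can be sketched numerically. The paper derives exact simultaneous 1-α intervals; the sketch below instead builds a simple Monte Carlo envelope of pointwise intervals for the order statistics of a standard normal sample, which conveys the same reading rule (points inside the envelope are consistent with normality). All variable names and the envelope construction are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, B, alpha = 50, 2000, 0.05

# Simulate B standard-normal samples of size n and sort each one:
# the columns give the Monte Carlo distribution of each order statistic.
sims = np.sort(rng.standard_normal((B, n)), axis=1)
lo = np.quantile(sims, alpha / 2, axis=0)       # pointwise lower envelope
hi = np.quantile(sims, 1 - alpha / 2, axis=0)   # pointwise upper envelope

# A sample genuinely drawn from N(0,1): most sorted points should
# fall inside the envelope (these intervals are pointwise, so a few
# excursions are expected; the paper's intervals hold simultaneously).
x = np.sort(rng.standard_normal(n))
inside = np.mean((x >= lo) & (x <= hi))
print(f"fraction of points inside envelope: {inside:.2f}")
```

Plotting `x` against the normal quantiles together with `lo` and `hi` would reproduce the augmented probability plot described in the abstract.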
Magnetic flux concentrations from turbulent stratified convection
Käpylä, P J; Kleeorin, N; Käpylä, M J; Rogachevskii, I
2015-01-01
(abridged) Context: The mechanisms that cause the formation of sunspots are still unclear. Aims: We study the self-organisation of initially uniform sub-equipartition magnetic fields by highly stratified turbulent convection. Methods: We perform simulations of magnetoconvection in Cartesian domains that are $8.5$-$24$ Mm deep and $34$-$96$ Mm wide. We impose either a vertical or a horizontal uniform magnetic field in a convection-driven turbulent flow. Results: We find that super-equipartition magnetic flux concentrations are formed near the surface with domain depths of $12.5$ and $24$ Mm. The size of the concentrations increases as the box size increases and the largest structures ($20$ Mm horizontally) are obtained in the 24 Mm deep models. The field strength in the concentrations is in the range of $3$-$5$ kG. The concentrations grow approximately linearly in time. The effective magnetic pressure measured in the simulations is positive near the surface and negative in the bulk of the convection zone. Its ...
Probability and Measure
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
The estimation of tree posterior probabilities using conditional clade probability distributions.
Larget, Bret
2013-07-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
FIRST EXCURSION PROBABILITIES OF DYNAMICAL SYSTEMS BY IMPORTANCE SAMPLING
任丽梅; 徐伟; 肖玉柱; 王文杰
2012-01-01
Based on the Girsanov transformation, this paper develops an importance sampling method for estimating the first excursion probability of structural dynamical systems excited by stationary Gaussian white noise. The focus is the construction of a control function that concentrates the sample paths in the part of the sample space most likely to produce a first excursion, so as to reduce the variance of the estimator. The control function is constructed from design points. For linear systems, the design points are obtained by combining the time-invariant structural reliability theory and solving a constrained optimization problem; for nonlinear systems, the design point excitation proposed by Heonsang Koo is used and the design points are obtained by the mirror-image method. Finally, two examples are given; comparison with crude Monte Carlo simulation shows the method to be correct and effective.
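The variance-reduction mechanism behind the paper's Girsanov-based method can be shown in a minimal one-dimensional sketch: to estimate a rare excursion probability P(X > a) for X ~ N(0,1), shift the sampling mean to the "design point" a and reweight by the likelihood ratio. This is a toy stand-in for the paper's path-space change of measure, with the threshold `a` and sample size chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
a, n = 4.0, 100_000            # excursion threshold and sample size

# Crude Monte Carlo: the event {X > a} is rare, so the estimate is noisy.
x = rng.standard_normal(n)
p_mc = np.mean(x > a)

# Importance sampling with a mean shift to the design point a:
# sample Y ~ N(a, 1) and reweight by the likelihood ratio
# dN(0,1)/dN(a,1)(y) = exp(-a*y + a**2/2), the discrete analogue of
# the Girsanov change of measure used for path-space problems.
y = rng.standard_normal(n) + a
w = np.exp(-a * y + a**2 / 2)
p_is = np.mean((y > a) * w)

exact = 3.1671e-5              # Phi(-4), for reference
print(p_mc, p_is, exact)
```

With the mean-shifted proposal, roughly half the samples land in the excursion region, so the importance sampling estimate attains a relative error of well under 1% here, while the crude estimate rests on only a handful of hits.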
邬明权; 杨良闯; 于博; 王玉; 赵昕; 牛铮; 王长耀
2014-01-01
MPPS (multivariate probability proportional to size) sampling is widely used in the Chinese crop area statistical investigation business. However, the method has two drawbacks: the basic survey data are outdated, and the field survey workload is large. The second land use survey data used as the basic data are updated only every 10 years, which makes it difficult to react to inter-annual changes in crop areas, and the artificial field survey of every sampled village is time-consuming and laborious. To solve these problems, this paper proposes a crop planting area measurement method combining remote sensing with MPPS sampling. The sampling frame was compiled from the second national agricultural census data; crop planting areas were classified from time-series medium-resolution remote sensing imagery (multi-temporal HJ-1 satellite data); MPPS sampling was performed on the medium-resolution classification map; the selected sample villages were classified with object-oriented methods using high-spatial-resolution imagery; and the total crop planting area was then inferred according to the MPPS method. Sampling precision was evaluated with the CV value, and overall area accuracy was assessed against data published by the National Bureau of Statistics. The method was tested in Beizhen City, Liaoning Province. The results show that it can effectively extract county-level crop planting areas, with an extraction accuracy better than 92%.
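The estimation step of a probability-proportional-to-size design can be sketched with a single-variable stand-in for the multivariate MPPS design described above: villages are drawn with probability proportional to an auxiliary size measure (here, a hypothetical count of classified crop pixels), and the total area is estimated with the Hansen-Hurwitz estimator. The data, correlation structure, and sample sizes below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical village-level frame: an auxiliary "size" measure (e.g.
# classified crop pixels per village) and the true crop area, which is
# strongly correlated with the size measure.
size = rng.uniform(50, 500, 200)                    # auxiliary size measure
true_area = size * 0.9 + rng.normal(0, 10, 200)     # correlated truth

N, m = len(size), 20
p = size / size.sum()                               # selection probabilities

# PPS sampling with replacement + Hansen-Hurwitz estimator of the total:
# each sampled value is inflated by 1/p_i, then averaged over the m draws.
idx = rng.choice(N, size=m, replace=True, p=p)
total_hat = np.mean(true_area[idx] / p[idx])
print(f"estimated total = {total_hat:.0f}, true total = {true_area.sum():.0f}")
```

Because the size measure is nearly proportional to the true area, the ratios `true_area[i] / p[i]` are almost constant, which is exactly why a PPS design yields a low-variance estimate from only 20 sampled villages.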
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
THE NUCLEAR ENCOUNTER PROBABILITY
SMULDERS, PJM
1994-01-01
This Letter discusses the nuclear encounter probability as used in ion channeling analysis. A formulation is given, incorporating effects of large beam angles and beam divergence. A critical examination of previous definitions is made.
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Probability in quantum mechanics
J. G. Gilson
1982-01-01
Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
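The "time spent on small sections of an orbit" picture in the abstract above can be checked numerically in a purely classical setting. For a harmonic orbit x(t) = A sin t, the fraction of time spent near a position x is proportional to 1/|v(x)|, which gives the classical density 1/(π sqrt(A² - x²)). The sketch below is an illustrative check of that classical relationship only, not of the fluid theory itself.

```python
import numpy as np

A = 1.0
t = np.linspace(0.0, 2 * np.pi, 200_001)[:-1]   # one full period, uniform in time
x = A * np.sin(t)                                # harmonic orbit

# Time-weighted histogram of position: since t is sampled uniformly,
# the histogram approximates the fraction of time spent per position bin.
counts, edges = np.histogram(x, bins=20, range=(-A, A), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Classical "time spent" density 1 / (pi * sqrt(A^2 - x^2)):
analytic = 1.0 / (np.pi * np.sqrt(A**2 - centers**2))
err = np.abs(counts - analytic) / analytic
print(err.max())  # large only near the turning points x = +/-A, where
                  # the density diverges and bin averages smear it out
```

Away from the turning points the histogram matches the analytic density to within a fraction of a percent, illustrating how a probability density can encode time spent along an orbit.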
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Stratified spaces constitute a Fraïssé category
Mijares, José Gregorio
2010-01-01
We prove that stratified spaces and stratified pseudomanifolds satisfy categorical Fraïssé properties. This result was presented at the First Meeting of Logic and Algebra in Bogotá, in September 2010. This article has been submitted to the Revista Colombiana de Matemáticas.
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
Gas slug ascent through rheologically stratified conduits
Capponi, Antonio; James, Mike R.; Lane, Steve J.
2016-04-01
Textural and petrological evidence has indicated the presence of viscous, degassed magma layers at the top of the conduit at Stromboli. This layer acts as a plug through which gas slugs burst, and it is thought to have a role in controlling the eruptive dynamics. Here, we present the results of laboratory experiments which detail the range of slug flow configurations that can develop in a rheologically stratified conduit. A gas slug can burst (1) after being fully accommodated within the plug volume, (2) whilst its base is still in the underlying low-viscosity liquid or (3) within a low-viscosity layer dynamically emplaced above the plug during the slug ascent. We illustrate the relevance of the same flow configurations at the volcanic scale through a new experimentally-validated 1D model and 3D computational fluid dynamic simulations. Applied to Stromboli, our results show that gas volume, plug thickness, plug viscosity and conduit radius control the transition between each configuration; in contrast, the configuration distribution seems insensitive to the viscosity of magma beneath the plug, which acts mainly to deliver the slug into the plug. Each identified flow configuration encompasses a variety of processes including dynamic narrowing and widening of the conduit, generation of instabilities along the falling liquid film, transient blockages of the slug path and slug break-up. All these complexities, in turn, lead to variations in the slug overpressure, mirrored by changes in infrasonic signatures which are also associated with different eruptive styles. Acoustic amplitudes are strongly dependent on the flow configuration in which the slugs burst, with both acoustic peak amplitudes and waveform shapes reflecting different burst dynamics. When compared to infrasonic signals from Stromboli, the similarity between real signals and laboratory waveforms suggests that the burst of a slug through a plug may represent a viable first-order mechanism for the generation of
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Hameed, Omar; Humphrey, Peter A
2006-07-01
Typically glands of prostatic adenocarcinoma have a single cell lining, although stratification can be seen in invasive carcinomas with a cribriform architecture, including ductal carcinoma. The presence and diagnostic significance of stratified cells within non-cribriform carcinomatous prostatic glands has not been well addressed. The histomorphological features and immunohistochemical profile of cases of non-cribriform prostatic adenocarcinoma with stratified malignant glandular epithelium were analyzed. These cases were identified from needle biopsy cases from the consultation files of one of the authors and from a review of 150 consecutive in-house needle biopsy cases of prostatic adenocarcinoma. Immunohistochemistry was performed utilizing antibodies reactive against high molecular weight cytokeratin (34betaE12), p63 and alpha-methylacyl-coenzyme-A racemase (AMACR). A total of 8 cases were identified, including 2 from the 150 consecutive in-house cases (1.3%). In 4 cases, the focus with glands having stratified epithelium was the sole carcinomatous component in the biopsy, while such a component represented 5-30% of the invasive carcinoma seen elsewhere in the remaining cases. The main attribute in all these foci was the presence of glandular profiles lined by several layers of epithelial cells with cytological and architectural features resembling flat or tufted high-grade prostatic intraepithelial neoplasia, but lacking basal cells as confirmed by negative 34betaE12 and/or p63 immunostains in all cases. The AMACR staining profile of the stratified foci was variable, with 4 foci showing positivity, and 3 foci being negative, including two cases that displayed AMACR positivity in adjacent non-stratified prostatic adenocarcinoma. Prostatic adenocarcinoma with stratified malignant glandular epithelium can be identified in prostate needle biopsy samples harboring non-cribriform prostatic adenocarcinoma and resembles glands with high-grade prostatic
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
The Universal Aspect Ratio of Vortices in Rotating Stratified Flows: Experiments and Observations
Aubert, Oriane; Gal, Patrice Le; Marcus, Philip S
2012-01-01
We validate a new law for the aspect ratio $\alpha = H/L$ of vortices in a rotating, stratified flow, where $H$ and $L$ are the vertical half-height and horizontal length scale of the vortices. The aspect ratio depends not only on the Coriolis parameter $f$ and buoyancy (or Brunt-Väisälä) frequency $\bar{N}$ of the background flow, but also on the buoyancy frequency $N_c$ within the vortex and on the Rossby number $Ro$ of the vortex, such that $\alpha = f \sqrt{Ro (1 + Ro)/(N_c^2 - \bar{N}^2)}$. This law for $\alpha$ is obeyed precisely by the exact equilibrium solution of the inviscid Boussinesq equations that we show to be a useful model of our laboratory vortices. The law is valid for both cyclones and anticyclones. Our anticyclones are generated by injecting fluid into a rotating tank filled with linearly-stratified salt water. The vortices are far from the top and bottom boundaries of the tank, so there is no Ekman circulation. In one set of experiments, the vortices viscously decay, but as they do, they c...
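As a quick numerical check of the law above, the following sketch evaluates $\alpha = f\sqrt{Ro(1+Ro)/(N_c^2-\bar{N}^2)}$ for one illustrative parameter set; the values of $f$, $Ro$, $N_c$, and $\bar{N}$ are assumptions chosen only to show the pancake-like regime, not data from the experiments.

```python
import math

# Worked example of the vortex aspect-ratio law from the abstract:
#   alpha = H/L = f * sqrt( Ro * (1 + Ro) / (N_c**2 - Nbar**2) )
# Parameter values below are illustrative assumptions.

def aspect_ratio(f, Ro, N_c, Nbar):
    """Vertical-to-horizontal aspect ratio H/L of a vortex."""
    return f * math.sqrt(Ro * (1.0 + Ro) / (N_c**2 - Nbar**2))

# A cyclone-like case (Ro > 0): with N_c > Nbar the square-root argument
# is positive.
f, Ro, N_c, Nbar = 1.0e-4, 0.1, 2.0e-3, 1.0e-3
alpha = aspect_ratio(f, Ro, N_c, Nbar)
print(alpha)  # small alpha: a pancake-shaped vortex with H << L
```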
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability when the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Zurek, W H
2004-01-01
I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system: they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
Collision Probability Analysis
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Sampling Theory of Food Safety System
LI, Bing; Chen, Guohua; Zhu, Ning
2009-01-01
We introduce the stratified sampling method, and put forward theoretical unbiased estimation of stratified sampling program, as well as the model and statistics of experimental design test. We also discuss the establishment of dietary exposure model, pollutant distribution model, and risk evaluation model. Finally, we present some methods for sampling design in China.
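The unbiased stratified estimator referred to above can be sketched as follows: each stratum is sampled separately and its sample mean is weighted by the stratum's population share. The strata and values are illustrative assumptions, not data from the paper.

```python
import random

# Sketch of the unbiased stratified-sampling estimator of a population
# mean: sample each stratum separately, then weight each stratum mean
# by its population share. Strata and values here are illustrative.

def stratified_mean(strata, n_per_stratum, rng):
    """strata: list of population lists; returns the weighted estimate."""
    N = sum(len(s) for s in strata)
    estimate = 0.0
    for stratum in strata:
        sample = rng.sample(stratum, n_per_stratum)
        stratum_mean = sum(sample) / len(sample)
        estimate += (len(stratum) / N) * stratum_mean
    return estimate

rng = random.Random(0)
strata = [[1.0] * 50, [2.0] * 30, [4.0] * 20]  # homogeneous strata
# True population mean: (50*1 + 30*2 + 20*4) / 100 = 1.9
print(stratified_mean(strata, 5, rng))  # equals 1.9 up to float rounding
```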
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
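For the multinomial logit, the best-known RUM special case, the CPGF takes the log-sum-exp form $G(v)=\log\sum_i e^{v_i}$ and its gradient recovers the choice probabilities, as the abstract describes. A sketch with made-up utilities, taking the gradient numerically:

```python
import math

# Sketch of the logit special case: the CPGF is G(v) = log(sum_i exp(v_i))
# and its gradient gives the choice probabilities. Utilities below are
# illustrative assumptions.

def logit_cpgf(v):
    m = max(v)  # subtract the max to keep the exponentials stable
    return m + math.log(sum(math.exp(x - m) for x in v))

def choice_probabilities(v, eps=1e-6):
    """Forward-difference gradient of the CPGF."""
    base = logit_cpgf(v)
    probs = []
    for i in range(len(v)):
        bumped = list(v)
        bumped[i] += eps
        probs.append((logit_cpgf(bumped) - base) / eps)
    return probs

utilities = [1.0, 2.0, 0.5]
p = choice_probabilities(utilities)
print(p)  # matches the softmax of the utilities and sums to ~1
```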
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Varga, Tamas
This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…
Frič, Roman; Papčo, Martin
2010-12-01
Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex $S_n$ of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs $(\mu, \nu)$ of fuzzy sets $\mu, \nu \in [0,1]^X$ such that $\mu(x) + \nu(x) \le 1$ for all $x \in X$, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where $(\mu_1, \nu_1) \le (\mu_2, \nu_2)$ whenever $\mu_1 \le \mu_2$ and $\nu_2 \le \nu_1$) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I=[0,1] (objects of ID are subobjects of powers $I^X$), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category $S_nD$ cogenerated by $S_n = \{(x_1, x_2, \ldots, x_n) \in I^n : \sum_{i=1}^{n} x_i \le 1\}$ carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within $S_nD$.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial parameters and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
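The bias mechanism described above is easy to reproduce: when the correct-classification probability differs across sampling units, the naive category proportion is pulled toward uniformity. All numbers in this toy simulation are illustrative assumptions.

```python
import random

# Toy simulation of the bias discussed above: counts from a two-category
# multinomial are observed through an imperfect classifier whose
# correct-classification probability varies across sampling units.

def observed_category(true_cat, p_correct, rng):
    if rng.random() < p_correct:
        return true_cat
    return 1 - true_cat  # misclassified into the other category

rng = random.Random(1)
true_pi = 0.7           # true probability of category 1
units = [0.95, 0.60]    # per-unit correct-classification probabilities
n_per_unit = 50000

count_cat1 = 0
total = 0
for p_correct in units:
    for _ in range(n_per_unit):
        true_cat = 1 if rng.random() < true_pi else 0
        count_cat1 += observed_category(true_cat, p_correct, rng)
        total += 1

naive_estimate = count_cat1 / total
print(naive_estimate)  # pulled toward 0.5 relative to the true 0.7
```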
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Contributions to quantum probability
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Superpositions of probability distributions
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
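The construction described above, drawing a variance v from a smearing distribution and then a Gaussian sample with that variance, can be sketched as follows. The inverse-gamma smearing law and its parameters are illustrative assumptions (a choice that yields Student-t-like heavy tails), not the specific distributions treated in the paper.

```python
import random

# Sketch of a "superposition of Gaussians of different variances":
# draw a variance v from a smearing distribution, then draw a Gaussian
# sample with that variance. The inverse-gamma smearing law and its
# parameters are illustrative assumptions.

def smeared_gaussian_sample(rng, shape=3.0, scale=1.0):
    v = scale / rng.gammavariate(shape, 1.0)  # inverse-gamma variance draw
    return rng.gauss(0.0, v ** 0.5)           # Gaussian with that variance

rng = random.Random(42)
samples = [smeared_gaussian_sample(rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(round(mean, 3))  # near zero by symmetry
```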
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Eliazar, Iddo; Klafter, Joseph
2008-06-01
We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law, commonly perceived as the "universal fractal probability distribution", is merely a special case of the hyper Pareto class.
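Two of the named classes, Weibull and Pareto, are easy to sample by inverting their CDFs, which makes their power-law structure concrete. The tail parameters below are illustrative assumptions, not values from the paper.

```python
import math
import random

# Inverse-transform sampling sketches for two of the fractal law
# classes named in the abstract (Weibull and Pareto).

def weibull_sample(rng, shape=1.5, scale=1.0):
    # CDF F(x) = 1 - exp(-(x/scale)**shape), so x = scale*(-ln(1-U))**(1/shape)
    u = rng.random()
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

def pareto_sample(rng, alpha=2.5, x_min=1.0):
    # CDF F(x) = 1 - (x_min/x)**alpha, so x = x_min * (1-U)**(-1/alpha)
    u = rng.random()
    return x_min * (1.0 - u) ** (-1.0 / alpha)

rng = random.Random(7)
p_draws = [pareto_sample(rng) for _ in range(20000)]
w_draws = [weibull_sample(rng) for _ in range(5000)]
print(min(p_draws) >= 1.0)  # True: Pareto draws never fall below x_min
```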
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Tangling clustering instability for small particles in temperature stratified turbulence
Elperin, Tov; Liberman, Michael; Rogachevskii, Igor
2013-01-01
We study particle clustering in temperature-stratified turbulence with a small finite correlation time. It is shown that temperature-stratified turbulence strongly increases the degree of compressibility of the particle velocity field. This results in a strong decrease of the threshold for the excitation of the tangling clustering instability, even for small particles. The tangling clustering instability in temperature-stratified turbulence is essentially different from the inertial clustering instability that occurs in non-stratified isotropic and homogeneous turbulence. While the inertial clustering instability is caused by the centrifugal effect of the turbulent eddies, the mechanism of the tangling clustering instability is related to the temperature fluctuations generated by the tangling of the mean temperature gradient by the velocity fluctuations. Temperature fluctuations produce pressure fluctuations and cause particle clustering in regions with increased pressure fluctuations. It is shown that t...
Effects of rotation on turbulent buoyant plumes in stratified environments
Fabregat Tomàs, Alexandre; Poje, Andrew C; Özgökmen, Tamay M; Dewar, William K
2016-01-01
We numerically investigate the effects of rotation on the turbulent dynamics of thermally driven buoyant plumes in stratified environments at the large Rossby numbers characteristic of deep oceanic releases...
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Numerical Study on Saltwater Intrusion in a Heterogeneous Stratified Aquifer
2000-01-01
In a coastal aquifer, saltwater intrusion is frequently observed due to excess exploitation. There are many studies focused on saltwater intrusion. However, few take into consideration the mixing processes in a stratified heterogeneous aquifer. In the present study, a laboratory experiment and a numerical simulation were carried out in order to understand the phenomena in a stratified heterogeneous aquifer. The result of the numerical analysis agrees well with the m...
Understanding Y haplotype matching probability.
Brenner, Charles H
2014-01-01
The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary raised in rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
Deepak Swami; P K Sharma; C S P Ojha
2014-12-01
In this paper, we have studied the behaviour of reactive solute transport through stratified porous medium under the influence of a multi-process non-equilibrium transport model. Various experiments were carried out in the laboratory, and the experimental breakthrough curves were observed at spatially placed sampling points for the stratified porous medium. Batch sorption studies were also performed to estimate the sorption parameters of the material used in the stratified aquifer system. The effects of distance-dependent dispersion and tailing are visible in the experimental breakthrough curves. The presence of physical and chemical non-equilibrium is observed from the pattern of breakthrough curves. The multi-process non-equilibrium model represents the combined effect of physical and chemical non-ideality in the stratified aquifer system. The results show that the incorporation of distance-dependent dispersivity in the multi-process non-equilibrium model provides the best fit of observed data through stratified porous media. Also, exponential distance-dependent dispersivity is more suitable at large distances, while at small distances a linear or constant dispersivity function can be considered for simulating reactive solute in stratified porous medium.
Assessing iron dynamics in the release from a stratified reservoir
Ashby, S.L.; Faulkner, S.P.; Gambrell, R.P.; Smith, B.A.
2004-01-01
Field and laboratory studies were conducted to describe the fate of total, dissolved, and ferrous (Fe2+) iron in the release from a stratified reservoir with an anoxic hypolimnion. Concentrations of total iron in the tailwater indicated a first-order removal process during a low-flow release (0.6 m³ s⁻¹), yet negligible loss was observed during a period of increased discharge (2.8 m³ s⁻¹). Dissolved and ferrous iron concentrations in the tailwater were highly variable during both release regimes and did not follow responses based on theoretical predictions. Ferrous iron concentrations in unfiltered samples were consistently greater than concentrations observed in samples filtered separately through 0.4, 0.2, and 0.1 µm filters. Total iron removal in laboratory studies followed first-order kinetics, but the rate (0.077 mg L⁻¹ hr⁻¹) was twice that observed during low-flow discharge in the tailwater (0.036 mg L⁻¹ hr⁻¹). Dissolved and ferrous iron losses in laboratory studies were rapid (≈75% in the first 15 minutes and 95% within 1 hour), followed theoretical predictions, and were much faster than observations in the tailwater (≈30% within the first hour). The presence of particulate forms of ferrous iron in the field and differences in removal rates observed in field and laboratory studies indicate a need for improved field assessment techniques and consideration of complexation reactions when assessing the dynamics of iron in reservoir releases and downstream impacts as a result of operation regimes. © Copyright by the North American Lake Management Society 2004.
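First-order removal as invoked in the abstract follows C(t) = C0·exp(−kt). A minimal sketch comparing the two reported rates; treating them as first-order constants in hr⁻¹ is an assumption made here for illustration (the printed units read mg·L⁻¹·hr⁻¹), as is the 1.0 mg/L starting concentration.

```python
import math

def first_order(c0, k, t):
    """Concentration after time t (hours) under first-order removal:
    C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k * t)

# Compare the laboratory (k = 0.077/hr) and tailwater (k = 0.036/hr) rates
# over 24 h from an assumed initial 1.0 mg/L total iron.  The faster
# laboratory rate leaves roughly half as much iron as the tailwater rate.
c_lab = first_order(1.0, 0.077, 24.0)
c_tail = first_order(1.0, 0.036, 24.0)
```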
苗志敏; 赵世华; 王颜刚; 李长贵; 王忠超; 陈颖; 陈新焰; 阎胜利
2007-01-01
old in Shandong coastal area. DESIGN: A randomized, stratified cluster sampling survey. SETTING: D
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, given the same probability of false alarm and the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability within domains other than data management, we conjecture that a system that can implement subspace-based detectors will be more effective than a system that implements set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.
Whiting, Alan B
2014-01-01
Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. When his system is applied to problems identified by himself and his supporters, it is shown that it lacks some features he intended and does not solve the problems they have identified.
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Generalized Probability Functions
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
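A sketch of the one-parameter generalized logarithm and its inverse, assuming the common convention ln_q(x) = (x^q − 1)/q (the paper's own parameterization may differ in sign or symbol); the ordinary log/exp pair is recovered as q → 0.

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm ln_q(x) = (x**q - 1)/q for q != 0;
    the ordinary natural log is recovered in the limit q -> 0.
    (Convention assumed here; the paper's parameterization may differ.)"""
    if q == 0.0:
        return math.log(x)
    return (x ** q - 1.0) / q

def gen_exp(x, q):
    """Inverse of gen_log: exp_q(x) = (1 + q*x)**(1/q), defined where 1 + q*x > 0."""
    if q == 0.0:
        return math.exp(x)
    return (1.0 + q * x) ** (1.0 / q)

# Small q approximates the ordinary log, and the pair are exact inverses:
assert abs(gen_log(2.0, 1e-9) - math.log(2.0)) < 1e-6
assert abs(gen_exp(gen_log(5.0, 0.3), 0.3) - 5.0) < 1e-9
```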
Data Interpretation: Using Probability
Drummond, Gordon B.; Vowler, Sarah L.
2011-01-01
Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…
Kemp, Alan E. S.; Villareal, Tracy A.
2013-12-01
It is widely held that increased stratification and reduced vertical mixing in the ocean driven by global warming will promote the replacement of diatoms by smaller phytoplankton and lead to an overall decrease in productivity and carbon export. Here we present contrary evidence from a synergy of modern observations and palaeo-records that reveal high diatom production and export from stratified waters. Diatom adaptations to stratified waters include the ability to grow in low light conditions in deep chlorophyll maxima; vertical migrations between nutricline depths and the surface, and symbioses with N2-fixing cyanobacteria in diatom-diazotroph associations (DDA). These strategies foster the maintenance of seed populations that may then exploit mixing events induced by storms or eddies, but may also inherently promote blooms. Recent oceanographic observations in the subtropical gyres, at increasingly high temporal and spatial resolutions, have monitored short-lived but often substantial blooms and export of stratified-adapted diatoms including rhizosolenids and the diazotroph-associated Hemiaulus hauckii. Aggregate formation by such diatoms is common and promotes rapid settling thereby minimizing water column remineralization and optimizing carbon flux. Convergence zones associated with oceanic fronts or mesoscale features may also generate substantial flux of stratified-adapted diatom species. Conventional oceanographic observing strategies and sampling techniques under-represent such activity due to the lack of adequate capability to sample the large sized diatoms and colonies involved, the subsurface location of many of these blooms, their common development in thin global warming. However, the key genera involved in such potential feedbacks are underrepresented in both laboratory and field studies and are poorly represented in models. Our findings suggest that a reappraisal is necessary of the way diatoms are represented as plankton functional types (PFTs) in
Self-regulation of mean flows in strongly stratified sheared turbulence
Salehipour, Hesam; Caulfield, Colm-Cille; Peltier, W. Richard
2016-11-01
We investigate the near-equilibrium state of shear-driven stratified turbulence generated by the breaking of Holmboe wave instability (HWI) and Kelvin-Helmholtz instability (KHI). We discuss DNS analyses associated with HWI under various initial conditions. We analyze the time-dependent distribution of the gradient Richardson number, Rig(z, t), associated with the horizontally-averaged velocity and density fields. We demonstrate that unlike the KHI-induced turbulence, the fully turbulent flow that is generated by HWI is robustly characterized by its high probability of Rig ≈ 0.2-0.25, independent of the strength of the initial stratification, and furthermore that the turbulence evolves in a 'near-equilibrium' state. The KHI-induced turbulence may become grossly 'out of equilibrium', however, and therefore decays rapidly when the initial value at the interface, Rig(0, 0), is closer to the critical value of 1/4; otherwise, as Rig(0, 0) -> 0, the KHI-induced turbulence is close to a state of equilibrium and hence is much more long-lived. We conjecture that stratified shear turbulence tends to adjust to a state of 'near-equilibrium' with horizontally-averaged flows characterized by a high probability of Rig <= 1/4, and hence sustained turbulence over relatively long times.
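The gradient Richardson number used above, Ri_g = N²/(dU/dz)² with N² = −(g/ρ0)·dρ/dz, can be computed from horizontally averaged profiles. A minimal finite-difference sketch (the grid, g, and reference density below are illustrative, not from the paper):

```python
def gradient_richardson(z, u, rho, g=9.81, rho0=1000.0):
    """Gradient Richardson number Ri_g = N^2 / (dU/dz)^2 at interior grid
    points, with N^2 = -(g/rho0) * d(rho)/dz.  Central differences;
    illustrative only."""
    ri = []
    for i in range(1, len(z) - 1):
        dz = z[i + 1] - z[i - 1]
        dudz = (u[i + 1] - u[i - 1]) / dz
        n2 = -(g / rho0) * (rho[i + 1] - rho[i - 1]) / dz
        ri.append(n2 / dudz ** 2 if dudz != 0 else float("inf"))
    return ri

# Uniform shear dU/dz = 1 s^-1 with the density gradient tuned so that
# N^2 = 0.25 s^-2, putting Ri_g at the classical marginal value of 1/4.
drho = 0.25 * 1000.0 / 9.81
z = [0.0, 1.0, 2.0]
u = [0.0, 1.0, 2.0]
rho = [1000.0 + drho, 1000.0, 1000.0 - drho]
ri = gradient_richardson(z, u, rho)  # single interior point, Ri_g close to 0.25
```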
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Emptiness Formation Probability
Crawford, Nicholas; Ng, Stephen; Starr, Shannon
2016-08-01
We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.
Learning unbelievable marginal probabilities
Pitkow, Xaq; Miller, Ken D
2011-01-01
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...
A new scoring system to stratify risk in unstable angina
Salzberg Simón
2003-08-01
Full Text Available Abstract Background We performed this study to develop a new scoring system to stratify different levels of risk in patients admitted to hospital with a diagnosis of unstable angina (UA), a complex syndrome that encompasses different outcomes. Many prognostic variables have been described, but few efforts have been made to group them in order to enhance their individual predictive power. Methods In a first phase, 473 patients were prospectively analyzed to determine which factors were significantly associated with the in-hospital occurrence of refractory ischemia, acute myocardial infarction (AMI) or death. A risk score ranging from 0 to 10 points was developed using a multivariate analysis. In a second phase, this score was validated in a new sample of 242 patients, and it was finally applied to the entire population (n = 715). Results ST-segment deviation on the electrocardiogram, age ≥ 70 years, previous bypass surgery, and troponin T ≥ 0.1 ng/mL were found to be independent prognostic variables. A clear distinction was shown among categories of low, intermediate and high risk, defined according to the risk score. The incidence of the triple end-point was 6%, 19.2% and 44.7% respectively, and the figures for AMI or death were 2%, 11.4% and 27.6% respectively (p Conclusions This new scoring system is simple and easy to apply. It allows very good risk stratification in patients with a clinical diagnosis of UA. They may be divided into three categories, which could help in the decision-making process.
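The four independent predictors named in the abstract can be combined into a score. Since the abstract does not give the point weights or the category cut-offs, the ones below are hypothetical; only the predictors themselves come from the study.

```python
def ua_risk_category(st_deviation, age, prior_cabg, troponin_t):
    """Illustrative stratification using the four independent predictors from
    the abstract (ST-segment deviation, age >= 70, previous bypass surgery,
    troponin T >= 0.1 ng/mL).  The point weights and the low/intermediate/high
    cut-offs below are HYPOTHETICAL; the paper's actual 0-10 scoring is not
    reproduced here."""
    score = 0
    score += 3 if st_deviation else 0       # hypothetical weight
    score += 2 if age >= 70 else 0          # hypothetical weight
    score += 2 if prior_cabg else 0         # hypothetical weight
    score += 3 if troponin_t >= 0.1 else 0  # hypothetical weight
    if score <= 2:
        return score, "low"
    if score <= 5:
        return score, "intermediate"
    return score, "high"

# A 72-year-old with ST deviation and troponin T of 0.15 ng/mL: 3 + 2 + 0 + 3
score, category = ua_risk_category(True, 72, False, 0.15)
```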
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2014-01-01
We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with popular non-expected utility preference representations that satisfy weak conditions.
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with certain Non-Expected Utility preference representations that satisfy weak conditions that we identify.
People's conditional probability judgments follow probability theory (plus noise).
Costello, Fintan; Watts, Paul
2016-09-01
A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
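The "noise-cancelling identity" idea can be illustrated directly. Assuming the model's expectation form E[estimate] = (1 − 2d)p + d for a noise rate d (the form Costello & Watts use for direct estimates), an identity such as P(A) + P(B) = P(A∧B) + P(A∨B) survives the noise, because the noise terms cancel, even though each individual estimate is biased:

```python
def noisy_mean(p, d):
    """Expected value of a probability estimate under the
    probability-theory-plus-noise model: each sampled instance is misread
    with rate d, giving E[estimate] = (1 - 2d)*p + d."""
    return (1 - 2 * d) * p + d

# True probabilities for two events (illustrative numbers)
pA, pB, pAandB = 0.6, 0.5, 0.3
pAorB = pA + pB - pAandB   # addition law: 0.8
d = 0.15                   # noise rate (illustrative)

# Noise-cancelling identity: E[A] + E[B] equals E[A and B] + E[A or B]
lhs = noisy_mean(pA, d) + noisy_mean(pB, d)
rhs = noisy_mean(pAandB, d) + noisy_mean(pAorB, d)
```

Each estimate on its own deviates from probability theory (noisy_mean(0.6, 0.15) is 0.57, not 0.6), yet the sums agree exactly, which is the pattern the two experiments test for.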
Savage's Concept of Probability
熊卫
2003-01-01
Starting with personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...
Probability Theory without Bayes' Rule
Rodriques, Samuel G.
2014-01-01
Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...
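For reference, the familiar Kolmogorov-framework inference step that the paper generalizes can be written in a few lines; the diagnostic-test numbers in this sketch are made up for illustration.

```python
def bayes(p_a_given_b, p_b, p_a):
    """Standard Bayes' rule: P(B|A) = P(A|B) * P(B) / P(A)."""
    return p_a_given_b * p_b / p_a

# Illustrative diagnostic test: prevalence 1%, sensitivity 95%,
# false-positive rate 5%.  P(positive) via the law of total probability.
p_disease = 0.01
p_pos_given_disease = 0.95
p_pos = p_pos_given_disease * p_disease + 0.05 * (1 - p_disease)

# Posterior probability of disease given a positive result
p_disease_given_pos = bayes(p_pos_given_disease, p_disease, p_pos)
```

Any alternative inference rule of the kind the abstract describes would have to reproduce this posterior from the same inputs.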
Stability of stratified two-phase flows in horizontal channels
Barmak, Ilya; Ullmann, Amos; Brauner, Neima; Vitoshkin, Helen
2016-01-01
Linear stability of stratified two-phase flows in horizontal channels to arbitrary wavenumber disturbances is studied. The problem is reduced to Orr-Sommerfeld equations for the stream function disturbances, defined in each sublayer and coupled via boundary conditions that also account for possible interface deformation and capillary forces. Applying the Chebyshev collocation method, the equations and interface boundary conditions are reduced to generalized eigenvalue problems solved by standard means of numerical linear algebra for the entire spectrum of eigenvalues and the associated eigenvectors. Some additional conclusions concerning the instability nature are derived from the most unstable perturbation patterns. The results are summarized in the form of stability maps showing the operational conditions at which a stratified-smooth flow pattern is stable. It is found that for gas-liquid and liquid-liquid systems the stratified flow with a smooth interface is stable only in a confined zone of relatively lo...
Background Oriented Schlieren in a Density Stratified Fluid
Verso, Lilly
2015-01-01
Non-intrusive quantitative fluid density measurement methods are essential in stratified flow experiments. Digital imaging leads to synthetic Schlieren methods in which the variations of the index of refraction are reconstructed computationally. In this study, an important extension to one of these methods, called Background Oriented Schlieren (BOS), is proposed. The extension enables an accurate reconstruction of the density field in stratified liquid experiments. Typically, the experiments are performed with the light source, background pattern, and camera positioned on opposite sides of a transparent vessel. The multi-media imaging through air-glass-water-glass-air leads to an additional aberration that destroys the reconstruction. A two-step calibration and an image remapping transform are the key components that correct the images through the stratified media and provide non-intrusive full-field density measurements of transparent liquids.
Background oriented schlieren in a density stratified fluid
Verso, Lilly; Liberzon, Alex
2015-10-01
Non-intrusive quantitative fluid density measurement methods are essential in stratified flow experiments. Digital imaging leads to synthetic schlieren methods in which the variations of the index of refraction are reconstructed computationally. In this study, an extension to one of these methods, called background oriented schlieren, is proposed. The extension enables an accurate reconstruction of the density field in stratified liquid experiments. Typically, the experiments are performed with the light source, background pattern, and camera positioned on opposite sides of a transparent vessel. The multimedia imaging through air-glass-water-glass-air leads to an additional aberration that destroys the reconstruction. A two-step calibration and image remapping transform are the key components that correct the images through the stratified media and provide non-intrusive full-field density measurements of transparent liquids.
SINDA/FLUINT Stratified Tank Modeling for Cryogenic Propellant Tanks
Sakowski, Barbara
2014-01-01
A general-purpose SINDA/FLUINT (S/F) stratified tank model was created to simulate self-pressurization and axial jet TVS. Stratified layers in the vapor and liquid are modeled using S/F lumps. The stratified tank model was constructed to permit incorporating the following additional features: multiple or singular lumps in the liquid and vapor regions of the tank; real gases (also mixtures) and compressible liquids; venting, pressurizing, and draining; condensation and evaporation/boiling; wall heat transfer; and elliptical, cylindrical, and spherical tank geometries. Extensive user logic is used to allow detailed tailoring, so that everything does not have to be rebuilt from scratch. Most code input for a specific case is done through the Registers Data Block. Lump volumes are determined through user input of geometric tank dimensions (height, width, etc.); the liquid level can be input either as a volume percentage of fill level or as an actual liquid level height.
Fuel Burning Rate Model for Stratified Charge Engine
SONG Jin'ou; JIANG Zejun; YAO Chunde; WANG Hongfu
2006-01-01
A zero-dimensional single-zone double-curve model is presented to predict fuel burning rate in stratified charge engines, and it is integrated with GT-Power to predict the overall performance of stratified charge engines. The model consists of two exponential functions for calculating the fuel burning rate in different charge zones. The model factors are determined by a non-linear curve fitting technique, based on experimental data obtained from 30 cases at middle and low loads. The results show good agreement between the measured and calculated cylinder pressures, and the deviation between calculated and measured cylinder pressures is less than 5%. The zero-dimensional single-zone double-curve model is successful in the combustion modeling of stratified charge engines.
Numerical Simulation on Stratified Flow over an Isolated Mountain Ridge
LI Ling; Shigeo Kimura
2007-01-01
The characteristics of stratified flow over an isolated mountain ridge have been investigated numerically. The two-dimensional model equations, based on the time-dependent Reynolds-averaged Navier-Stokes equations, are solved numerically using an implicit time integration in a body-fitted grid arrangement to simulate stratified flow over an isolated, ideally bell-shaped mountain. The simulation results are in good agreement with the existing corresponding analytical and approximate solutions. It is shown that for atmospheric conditions where non-hydrostatic effects become dominant, the model is able to reproduce typical flow features. The dispersion characteristics of gaseous pollutants in the stratified flow have also been studied. The dispersion patterns for two typical atmospheric conditions are compared. The results show that the presence of a gravity wave causes vertical stratification of the pollutant concentration and affects the diffusive characteristics of the pollutants.
Stability of stratified two-phase flows in inclined channels
Barmak, Ilya; Ullmann, Amos; Brauner, Neima
2016-01-01
Linear stability of stratified gas-liquid and liquid-liquid plane-parallel flows in inclined channels is studied with respect to all wavenumber perturbations. The main objective is to predict parameter regions in which stable stratified configuration in inclined channels exists. Up to three distinct base states with different holdups exist in inclined flows, so that the stability analysis has to be carried out for each branch separately. Special attention is paid to the multiple solution regions to reveal the feasibility of non-unique stable stratified configurations in inclined channels. The stability boundaries of each branch of steady state solutions are presented on the flow pattern map and are accompanied by critical wavenumbers and spatial profiles of the most unstable perturbations. Instabilities of different nature are visualized by streamlines of the neutrally stable perturbed flows, consisting of the critical perturbation superimposed on the base flow. The present analysis confirms the existence of ...
Linear Inviscid Damping for Couette Flow in Stratified Fluid
Yang, Jincheng
2016-01-01
We study the inviscid damping of Couette flow with an exponentially stratified density. The optimal decay rates of the velocity field and density are obtained for general perturbations with minimal regularity. For the Boussinesq approximation model, the decay rates we get are consistent with the previous results in the literature. We also study the decay rates for the full equations of stratified fluids, which were not studied before. For both models, the decay rates depend on the Richardson number in a very similar way. Besides, we also study the inviscid damping of perturbations due to the exponential stratification when there is no shear.
Bases of Schur algebras associated to cellularly stratified diagram algebras
Bowman, C
2011-01-01
We examine homomorphisms between induced modules for a certain class of cellularly stratified diagram algebras, including the BMW algebra, Temperley-Lieb algebra, Brauer algebra, and (quantum) walled Brauer algebra. We define the `permutation' modules for these algebras, these are one-sided ideals which allow us to study the diagrammatic Schur algebras of Hartmann, Henke, Koenig and Paget. We construct bases of these Schur algebras in terms of modified tableaux. On the way we prove that the (quantum) walled Brauer algebra and the Temperley-Lieb algebra are both cellularly stratified and therefore have well-defined Specht filtrations.
OF PROBABILITY SAMPLING TECHNIQUES
pling, Generalization of Probability Sampling, Consistent Estimation ... verse, U is the probability that element u will be included in the sample. The second order, or pair ..... the final sample can be partitioned into non-overlapping networks.
Probably Almost Bayes Decisions
Anoulova, S.; Fischer, Paul; Poelt, S.;
1996-01-01
discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials...
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
Probability distributions for magnetotellurics
Stodt, John A.
1982-11-01
Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
RANDOM VARIABLE WITH FUZZY PROBABILITY
吕恩琳; 钟佑明
2003-01-01
A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and the fuzzy resolution theorem, a feasibility condition for a fuzzy-number probability set was given; going a step further, the definition and properties of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, with the closure of fuzzy probability operations, was stated and proved. The definition and properties of the mathematical expectation and variance of an RVFP were also studied. Every part of this mathematical description of the RVFP is closed under fuzzy probability operations; as a result, a foundation is laid for a complete fuzzy probability calculus.
van Haren, H.
2015-01-01
The character of turbulent overturns in a weakly stratified deep-sea is investigated in some detail using 144 high-resolution temperature sensors at 0.7 m intervals, starting 5 m above the bottom. A 9-day, 1 Hz sampled record from the 912 m depth flat-bottom (<0.5% bottom-slope) mooring site in the
Analysis of photonic band-gap structures in stratified medium
Tong, Ming-Sze; Yinchao, Chen; Lu, Yilong
2005-01-01
Purpose - To demonstrate the flexibility and advantages of a non-uniform pseudo-spectral time domain (nu-PSTD) method through studies of the wave propagation characteristics on photonic band-gap (PBG) structures in stratified medium. Design/methodology/approach - A nu-PSTD method is proposed...
Plane Stratified Flow in a Room Ventilated by Displacement Ventilation
Nielsen, Peter Vilhelm; Nickel, J.; Baron, D. J. G.
2004-01-01
The air movement in the occupied zone of a room ventilated by displacement ventilation exists as a stratified flow along the floor. This flow can be radial or plane according to the number of wall-mounted diffusers and the room geometry. The paper addresses the situations where plane flow...
Bacterial production, protozoan grazing and mineralization in stratified lake Vechten.
Bloem, J.
1989-01-01
The role of heterotrophic nanoflagellates (HNAN, size 2-20 μm) in grazing on bacteria and mineralization of organic matter in stratified Lake Vechten was studied.Quantitative effects of manipulation and fixation on HNAN were checked. Considerable losses were caused by centrifugation, even at low spe
Population dynamics of sinking phytoplankton in stratified waters
Huisman, J.; Sommeijer, B.P.
2002-01-01
We analyze the predictions of a reaction-advection-diffusion model to pinpoint the necessary conditions for bloom development of sinking phytoplanktonspecies in stratified waters. This reveals that there are two parameter windows that can sustain sinking phytoplankton, a turbulence window and atherm
Gravity-induced stresses in stratified rock masses
Amadei, B.; Swolfs, H.S.; Savage, W.Z.
1988-01-01
This paper presents closed-form solutions for the stress field induced by gravity in anisotropic and stratified rock masses. These rocks are assumed to be laterally restrained. The rock mass consists of finite mechanical units, each unit being modeled as a homogeneous, transversely isotropic or isotropic linearly elastic material. The following results are found. The nature of the gravity-induced stress field in a stratified rock mass depends on the elastic properties of each rock unit and how these properties vary with depth. It is thermodynamically admissible for the induced horizontal stress component in a given stratified rock mass to exceed the vertical stress component in certain units and to be smaller in other units; this is not possible for the classical unstratified isotropic solution. Examples are presented to explore the nature of the gravity-induced stress field in stratified rock masses. It is found that a decrease in rock mass anisotropy and a stiffening of rock masses with depth can generate stress distributions comparable to empirical hyperbolic distributions previously proposed in the literature. © 1988 Springer-Verlag.
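The key point, that a laterally restrained stratified mass can carry horizontal stress exceeding the vertical stress in some units but not others, can be sketched with the standard zero-lateral-strain result for a transversely isotropic layer. The layer properties below are illustrative (though thermodynamically admissible), not taken from the paper:

```python
# Vertical overburden stress and the horizontal-to-vertical stress ratio in a
# laterally restrained, layered rock mass. Layer properties are illustrative.
g = 9.81  # m/s^2

# (thickness m, density kg/m^3, E_h/E_v, nu_vh, nu_h) per unit, top to bottom
units = [
    (100.0, 2400.0, 4.0, 0.30, 0.20),  # strongly anisotropic, stiff in-plane
    (100.0, 2600.0, 0.8, 0.20, 0.25),  # mildly anisotropic
]

sigma_v, ratios = 0.0, []
for thickness, rho, e_ratio, nu_vh, nu_h in units:
    sigma_v += rho * g * thickness  # overburden at the base of the unit (Pa)
    # Zero lateral strain in a transversely isotropic unit (vertical axis):
    #   sigma_h / sigma_v = (E_h / E_v) * nu_vh / (1 - nu_h)
    ratios.append(e_ratio * nu_vh / (1.0 - nu_h))

print(ratios)  # horizontal stress exceeds vertical in the first unit only
```

With these constants the first unit carries sigma_h = 1.5 sigma_v while the second carries sigma_h ≈ 0.21 sigma_v, illustrating the unit-by-unit jump in stress ratio that the paper attributes to stratification.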
Dispersion of (light) inertial particles in stratified turbulence
van Aartrijk, M.; Clercx, H.J.H.; Armenio, Vincenzo; Geurts, Bernardus J.; Fröhlich, Jochen
2010-01-01
We present a brief overview of a numerical study of the dispersion of particles in stably stratified turbulence. Three types of particles are examined: fluid particles, light inertial particles ($\rho_p/\rho_f = \mathcal{O}(1)$) and heavy inertial particles ($\rho_p/\rho_f \gg 1$). Stratification
The dynamics of small inertial particles in weakly stratified turbulence
van Aartrijk, M.; Clercx, H.J.H.
We present an overview of a numerical study on the small-scale dynamics and the large-scale dispersion of small inertial particles in stably stratified turbulence. Three types of particles are examined: fluid particles, light inertial particles (with particle-to-fluid density ratio 1 ≤ ρp/ρf ≤ 25) and
Characterization of Inlet Diffuser Performance for Stratified Thermal Storage
Cimbala, John M.; Bahnfleth, William; Song, Jing
1999-11-01
Storage of sensible heating or cooling capacity in stratified vessels has important applications in central heating and cooling plants, power production, and solar energy utilization, among others. In stratified thermal storage systems, diffusers at the top and bottom of a stratified tank introduce and withdraw fluid while maintaining a stable density gradient and causing as little mixing as possible. In chilled water storage applications, mixing during the formation of the thermocline near an inlet diffuser is the single greatest source of thermal losses. Most stratified chilled water storage tanks are cylindrical vessels with diffusers that are either circular disks that distribute flow radially outward or octagonal rings of perforated pipe that distribute flow both inward and outward radially. Both types produce gravity currents that are strongly influenced by the inlet Richardson number, but the significance of other parameters is not clear. The present investigation considers the dependence of the thermal performance of a perforated pipe diffuser on design parameters including inlet velocity, ambient and inlet fluid temperatures, and tank dimensions for a range of conditions representative of typical chilled water applications. Dimensional analysis is combined with a parametric study using results from computational fluid dynamics to obtain quantitative relationships between design parameters and expected thermal performance.
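As a rough illustration of the governing parameter named above, the inlet Richardson number for a chilled-water diffuser can be estimated as Ri = g (Δρ/ρ) h / U², with h the inlet opening height; the geometry and temperatures below are hypothetical operating conditions, not the paper's test matrix:

```python
# Inlet Richardson number for a perforated-pipe diffuser in a chilled-water
# tank: Ri = g * (delta_rho / rho) * h / U**2. All numbers are hypothetical.
g = 9.81             # m/s^2
rho_ambient = 999.1  # kg/m^3, warm tank water (~15 C)
rho_inlet = 1000.0   # kg/m^3, chilled inflow (~4 C)
h = 0.05             # inlet opening height, m
U = 0.02             # inlet velocity, m/s

Ri = g * (rho_inlet - rho_ambient) / rho_ambient * h / U**2
print(f"Ri = {Ri:.2f}")  # Ri of order 1 or larger favors a stable, low-mixing gravity current
```

Halving the inlet velocity quadruples Ri, which is why diffuser design trades off fill rate against thermocline mixing losses.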
Diamessis, P.; Gurka, R.; Liberzon, A.
2008-11-01
Proper orthogonal decomposition (POD) is applied to 2-D slices of vorticity and horizontal divergence obtained from the 3-D DNS of the stratified turbulent wake of a towed sphere at Re=5x10^3 and Fr=4. Slices are sampled along the stream-depth (Oxz) and stream-span (Oxy) planes at 231 times during the interval Nt ∈ [12, 35]. POD was chosen amongst the available statistical tools due to its advantage in the characterization of simulated and experimentally measured velocity gradient fields, as previously demonstrated for turbulent boundary layers. In the Oxz planes, at the wake centerline, the most energetic modes reveal a structure similar to that of late-time stratified wakes. Offset from the centerline, the signature of internal waves in the form of forward-inclined coherent beams extending into the ambient becomes evident. The angle of inclination becomes progressively vertical with increasing POD mode. Lower POD modes on the Oyz planes show a layered structure in the wake core with coherent beams radiating out into the ambient over a broad range of angles. Further insight is provided through the relative energy spectra distribution of the vorticity eigenmodes. POD analysis has provided a statistical description of the geometrical features previously observed in instantaneous flow fields of stratified turbulent wakes.
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Probability Constructs in Preschool Education and How they Are Taught
Antonopoulos, Konstantinos; Zacharos, Konstantinos
2013-01-01
The teaching of Probability Theory constitutes a new trend in mathematics education internationally. The purpose of this research project was to explore the degree to which preschoolers understand key concepts of probabilistic thinking, such as sample space, the probability of an event and probability comparisons. At the same time, we evaluated an…
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
Conditionals, probability, and belief revision
Voorbraak, F.
1989-01-01
A famous result obtained in the mid-seventies by David Lewis shows that a straightforward interpretation of probabilities of conditionals as conditional probabilities runs into serious trouble. In this paper we try to circumvent this trouble by defining extensions of probability functions, called
Assessing Schematic Knowledge of Introductory Probability Theory
Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley
2005-01-01
The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…
Haavelmo's Probability Approach and the Cointegrated VAR
Juselius, Katarina
dependent residuals, normalization, reduced rank, model selection, missing variables, simultaneity, autonomy and identification. Specifically, the paper discusses (1) the conditions under which the VAR model represents a full probability formulation of a sample of time-series observations, (2...
The Art of Probability Assignment
Dimitrov, Vesselin I
2012-01-01
The problem of assigning probabilities when little is known is analyzed for the case where the quantities of interest are physical observables, i.e. can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data are available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that probabilities be assigned so that they are least sensitive to specific variations of the prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Rényi distance between the original and the shifted distribution.
Probability workshop to be better in probability topic
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education affect their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance on the probability topic is not related to anxiety level; that is, higher statistics anxiety does not lead to lower scores on the probability topic. The study also revealed that motivated students who attended the probability workshop showed a positive improvement in performance compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more learning-program initiatives with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Theory of sampling and its application in tissue based diagnosis
Kayser Gian
2009-02-01
Full Text Available Abstract Background A general theory of sampling and its application in tissue-based diagnosis is presented. Sampling is defined as the extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e. give the same results when applied under similar circumstances. Sampling includes two different aspects: the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on the search for the localization of specific compartments within the basic space, and the search for the presence of specific compartments. Methods When a sampling procedure is applied in diagnostic processes, two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability of detecting these basic units. Sampling can be performed without or with external knowledge, such as the size of the searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation-invariant transformation results in Krige's formula, which is widely used in the search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationships. The first method is called random sampling, the second stratified sampling. Results Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction and numerical, boundary and surface densities. Stratified sampling requires knowledge of the objects (and their features) and evaluates spatial features in relation to
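The random-sampling case described above can be illustrated with a minimal point-counting estimate of an area fraction; the synthetic binary "tissue" image and its single rectangular compartment are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic binary reference space with one rectangular "compartment"
# (geometry invented for the example).
W = 1000
field = np.zeros((W, W), dtype=bool)
field[200:600, 300:800] = True
true_fraction = field.mean()  # exact area fraction of the compartment

# Random sampling: n independent uniform points in the reference space.
n = 20_000
ys = rng.integers(0, W, n)
xs = rng.integers(0, W, n)
est = field[ys, xs].mean()    # point-count estimate of the area fraction

# Binomial standard error of the point-count estimate.
se = np.sqrt(est * (1.0 - est) / n)
print(true_fraction, est, se)
```

Stratified sampling would instead fix the number of points per stratum of the reference space, which requires the prior knowledge of the objects mentioned in the abstract but reduces variance when object density differs between strata.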
Site occupancy models with heterogeneous detection probabilities
Royle, J. Andrew
2006-01-01
Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
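As a sketch of the zero-inflated binomial mixture, here is the integrated likelihood for one common mixing distribution, p ~ Beta(a, b), for which the integral over p has a closed beta-binomial form; the detection histories and parameter values are hypothetical, not Royle's data or code:

```python
import math

def beta_binom_pmf(y, J, a, b):
    # Binomial likelihood for y detections in J visits, integrated over
    # p ~ Beta(a, b): the beta-binomial pmf, via log-gamma for stability.
    return math.comb(J, y) * math.exp(
        math.lgamma(a + y) + math.lgamma(b + J - y) - math.lgamma(a + b + J)
        - (math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))
    )

def site_likelihood(y, J, psi, a, b):
    # psi is the occupancy probability. A site with y = 0 is either
    # unoccupied (zero-inflation term) or occupied but never detected.
    lik = psi * beta_binom_pmf(y, J, a, b)
    if y == 0:
        lik += 1.0 - psi
    return lik

# Integrated log-likelihood over independent sites, each summarized as
# (detections, visits); values are hypothetical.
data = [(0, 5), (2, 5), (0, 5), (4, 5)]
loglik = sum(math.log(site_likelihood(y, J, psi=0.6, a=2.0, b=3.0))
             for y, J in data)
print(loglik)
```

Maximizing this log-likelihood over (psi, a, b) gives the integrated-likelihood inference the abstract refers to; swapping in a different mixing distribution only changes `beta_binom_pmf`, which is what makes Link's indistinguishability problem possible.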
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Study of MRI in Stratified Viscous Plasma Configuration
Carlevaro, Nakia; Renzi, Fabrizio
2016-01-01
We analyze the morphology of the Magneto-rotational Instability (MRI) for a stratified viscous plasma disk configuration in differential rotation, taking into account the so-called corotation theorem for the background profile. In order to select the intrinsic Alfvénic nature of MRI, we deal with an incompressible plasma and we adopt a formulation of the perturbation analysis based on the use of the magnetic flux function as a dynamical variable. Our study outlines, as consequence of the corotation condition, a marked asymmetry of the MRI with respect to the equatorial plane, particularly evident in a complete damping of the instability over a positive critical height on the equatorial plane. We also emphasize how such a feature is already present (although less pronounced) even in the ideal case, restoring a dependence of the MRI on the stratified morphology of the gravitational field.
FC-normal and extended stratified logic program
许道云; 丁德成
2002-01-01
This paper investigates the consistency property of FC-normal logic programs and presents an equivalent deciding condition for whether a logic program P is an FC-normal program. The deciding condition describes the characterizations of FC-normal programs. Through the Petri-net presentation of a logic program, the characterizations of stratification of FC-normal programs are investigated. The stratification of FC-normal programs motivates us to introduce a new kind of stratification, extended stratification, over logic programs. It is shown that an extended (locally) stratified logic program is an FC-normal program; thus, an extended (locally) stratified logic program has at least one stable model. Finally, we present algorithms for computing the consistency property and a few equivalent deciding methods for finite FC-normal programs.
Turbulent thermal diffusion in strongly stratified turbulence: theory and experiments
Amir, G; Eidelman, A; Elperin, T; Kleeorin, N; Rogachevskii, I
2016-01-01
Turbulent thermal diffusion is a combined effect of temperature-stratified turbulence and the inertia of small particles. It causes the appearance of a non-diffusive turbulent flux of particles in the direction of the turbulent heat flux. This non-diffusive turbulent flux of particles is proportional to the product of the mean particle number density and the effective velocity of inertial particles. The theory of this effect has been previously developed only for small temperature gradients and small Stokes numbers (Phys. Rev. Lett. 76, 224, 1996). In this study a generalized theory of turbulent thermal diffusion for arbitrary temperature gradients and Stokes numbers has been developed. The laboratory experiments in the oscillating grid turbulence and in the multi-fan produced turbulence have been performed to validate the theory of turbulent thermal diffusion in strongly stratified turbulent flows. It has been shown that the ratio of the effective velocity of inertial particles to the characteristic ve...
Numerical Simulation of Wakes in a Weakly Stratified Fluid
Rottman, James W; Innis, George E; O'Shea, Thomas T; Novikov, Evgeny
2014-01-01
This paper describes some preliminary numerical studies using large eddy simulation of full-scale submarine wakes. Submarine wakes are a combination of the wake generated by a smooth slender body and a number of superimposed vortex pairs generated by various control surfaces and other body appendages. For this preliminary study, we attempt to gain some insight into the behavior of full-scale submarine wakes by computing separately the evolution of the self-propelled wake of a slender body and the motion of a single vortex pair in both a non-stratified and a stratified environment. An important aspect of the simulations is the use of an iterative procedure to relax the initial turbulence field so that turbulent production and dissipation are in balance.
Helicity dynamics in stratified turbulence in the absence of forcing
Rorai, C; Pouquet, A; Mininni, P D
2012-01-01
A numerical study of decaying stably-stratified flows is performed. Relatively high stratification and moderate Reynolds numbers are considered, and a particular emphasis is placed on the role of helicity (velocity-vorticity correlations). The problem is tackled by integrating the Boussinesq equations in a periodic cubical domain using different initial conditions: a non-helical Taylor-Green (TG) flow, a fully helical Beltrami (ABC) flow, and random flows with a tunable helicity. We show that for stratified ABC flows helicity undergoes a substantially slower decay than for unstratified ABC flows. This fact is likely associated with the combined effect of stratification and large-scale coherent structures. Indeed, when the latter are missing, as in random flows, helicity is rapidly destroyed by the onset of gravitational waves. A type of large-scale dissipative "cyclostrophic" balance can be invoked to explain this behavior. When helicity survives in the system it strongly affects the temporal energy decay and t...
Axisymmetric modes in vertically stratified self-gravitating discs
Mamatsashvili, George
2010-01-01
We perform linear analysis of axisymmetric vertical normal modes in stratified compressible self-gravitating polytropic discs in the shearing box approximation. We study specific dynamics for subadiabatic, adiabatic and superadiabatic vertical stratifications. In the absence of self-gravity, four well-known principal modes can be identified in a stratified disc: acoustic p-, surface gravity f-, buoyancy g- and inertial r-modes. After characterizing modes in the non-self-gravitating case, we include self-gravity and investigate how it modifies the properties of these modes. We find that self-gravity, to a certain degree, reduces their frequencies and changes the structure of the dispersion curves and eigenfunctions at radial wavelengths comparable to the disc height. Its influence on the basic branch of the r-mode, in the case of subadiabatic and adiabatic stratifications, and on the basic branch of the g-mode, in the case of superadiabatic stratification (which in addition exhibits convective instability), do...
Elementary stratified flows with stability at low Richardson number
Barros, Ricardo [Mathematics Applications Consortium for Science and Industry (MACSI), Department of Mathematics and Statistics, University of Limerick, Limerick (Ireland); Choi, Wooyoung [Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, New Jersey 07102-1982 (United States)
2014-12-15
We revisit the stability analysis for three classical configurations of multiple fluid layers proposed by Goldstein [“On the stability of superposed streams of fluids of different densities,” Proc. R. Soc. A. 132, 524 (1931)], Taylor [“Effect of variation in density on the stability of superposed streams of fluid,” Proc. R. Soc. A 132, 499 (1931)], and Holmboe [“On the behaviour of symmetric waves in stratified shear layers,” Geophys. Publ. 24, 67 (1962)] as simple prototypes to understand stability characteristics of stratified shear flows with sharp density transitions. When such flows are confined in a finite domain, it is shown that a large shear across the layers that is often considered a source of instability plays a stabilizing role. Presented are simple analytical criteria for stability of these low Richardson number flows.
Royle, J. Andrew; Converse, Sarah J.
2014-01-01
Capture–recapture studies are often conducted on populations that are stratified by space, time or other factors. In this paper, we develop a Bayesian spatial capture–recapture (SCR) modelling framework for stratified populations in which sampling occurs within multiple distinct spatial and temporal strata. We describe a hierarchical model that integrates distinct models for both the spatial encounter history data from capture–recapture sampling, and also for modelling variation in density among strata. We use an implementation of data augmentation to parameterize the model in terms of a latent categorical stratum or group membership variable, which provides a convenient implementation in popular BUGS software packages. We provide an example application to an experimental study involving small-mammal sampling on multiple trapping grids over multiple years, where the main interest is in modelling a treatment effect on population density among the trapping grids. Many capture–recapture studies involve some aspect of spatial or temporal replication that requires some attention to modelling variation among groups or strata. We propose a hierarchical model that allows explicit modelling of group or strata effects. Because the model is formulated for individual encounter histories and is easily implemented in the BUGS language and other free software, it also provides a general framework for modelling individual effects, such as are present in SCR models.
Introduction to probability and statistics for science, engineering, and finance
Rosenkrantz, Walter A
2008-01-01
Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye
Experiments on the dryout behavior of stratified debris beds
Leininger, Simon; Kulenovic, Rudi; Laurien, Eckart [Stuttgart Univ. (Germany). Inst. of Nuclear Technology and Energy Systems (IKE)
2015-10-15
In case of a severe accident with loss of coolant and core meltdown a particle bed (debris) can be formed. The removal of decay heat from the debris bed is of prime importance for the bed's long-term coolability to guarantee the integrity of the RPV. In contrast to previous experiments, the focus is on stratified beds. The experiments have pointed out that the bed's coolability is significantly affected.
Computation of mixing in large stably stratified enclosures
Zhao, Haihua
This dissertation presents a set of new numerical models for mixing and heat transfer problems in large stably stratified enclosures. Based on these models, a new computer code, BMIX++ (Berkeley mechanistic MIXing code in C++), was developed by Christensen (2001) and the author. Traditional lumped control volume methods and zone models cannot capture the detailed distributions of temperature, density, and pressure in enclosures and therefore can have significant errors. 2-D and 3-D CFD methods require very fine grid resolution to resolve thin substructures such as jets and wall boundaries, yet such fine grid resolution is difficult or impossible to provide due to computational expense. Peterson's scaling (1994) showed that stratified mixing processes in large stably stratified enclosures can be described using one-dimensional differential equations, with the vertical transport by free and wall jets modeled using standard integral techniques. This allows very large reductions in computational effort compared to three-dimensional numerical modeling of turbulent mixing in large enclosures. The BMIX++ code was developed to implement the above ideas. The code uses a Lagrangian approach to solve 1-D transient governing equations for the ambient fluid and uses analytical models or 1-D integral models to compute substructures. A 1-D transient conduction model for the solid boundaries, pressure computation, and opening models are also included to make the code more versatile. The BMIX++ code was implemented in C++, making intensive use of object-oriented programming (OOP) techniques. The BMIX++ code was successfully applied to different types of mixing problems such as stratification in a water tank due to a heater inside, water tank exchange flow experiment simulation, early stage building fire analysis, stratification produced by multiple plumes, and simulations for the UCB large enclosure experiments. Most of these simulations gave satisfying
A statistical mechanics approach to mixing in stratified fluids
Venaille, A.; Gostiaux, L.; Sommeria, J.
2017-01-01
Predicting how much mixing occurs when a given amount of energy is injected into a Boussinesq fluid is a longstanding problem in stratified turbulence. The huge number of degrees of freedom involved in these processes makes a deterministic approach to the problem extremely difficult. Here we present a statistical mechanics approach yielding a prediction for a cumulative, global mixing efficiency as a function of a global Richardson number and the background buoyancy profile.
Corticosteroids and pediatric septic shock outcomes: a risk stratified analysis.
Sarah J Atkinson
Full Text Available The potential benefits of corticosteroids for septic shock may depend on initial mortality risk. We determined associations between corticosteroids and outcomes in children with septic shock who were stratified by initial mortality risk. We conducted a retrospective analysis of an ongoing, multi-center pediatric septic shock clinical and biological database. Using a validated biomarker-based stratification tool (PERSEVERE), 496 subjects were stratified into three initial mortality risk strata (low, intermediate, and high). Subjects receiving corticosteroids during the initial 7 days of admission (n = 252) were compared to subjects who did not receive corticosteroids (n = 244). Logistic regression was used to model the effects of corticosteroids on 28-day mortality and complicated course, defined as death within 28 days or persistence of two or more organ failures at 7 days. Subjects who received corticosteroids had greater organ failure burden, higher illness severity, higher mortality, and a greater requirement for vasoactive medications, compared to subjects who did not receive corticosteroids. PERSEVERE-based mortality risk did not differ between the two groups. For the entire cohort, corticosteroids were associated with increased risk of mortality (OR 2.3, 95% CI 1.3-4.0, p = 0.004) and a complicated course (OR 1.7, 95% CI 1.1-2.5, p = 0.012). Within each PERSEVERE-based stratum, corticosteroid administration was not associated with improved outcomes. Similarly, corticosteroid administration was not associated with improved outcomes among patients with no comorbidities, nor in groups of patients stratified by PRISM. Risk-stratified analysis failed to demonstrate any benefit from corticosteroids in this pediatric septic shock cohort.
Dust particle charge distribution in a stratified glow discharge
Sukhinin, Gennady I [Institute of Thermophysics, Siberian Branch, Russian Academy of Sciences, Lavrentyev Ave., 1, Novosibirsk 630090 (Russian Federation); Fedoseev, Alexander V [Institute of Thermophysics, Siberian Branch, Russian Academy of Sciences, Lavrentyev Ave., 1, Novosibirsk 630090 (Russian Federation); Ramazanov, Tlekkabul S [Institute of Experimental and Theoretical Physics, Al Farabi Kazakh National University, Tole Bi, 96a, Almaty 050012 (Kazakhstan); Dzhumagulova, Karlygash N [Institute of Experimental and Theoretical Physics, Al Farabi Kazakh National University, Tole Bi, 96a, Almaty 050012 (Kazakhstan); Amangaliyeva, Rauan Zh [Institute of Experimental and Theoretical Physics, Al Farabi Kazakh National University, Tole Bi, 96a, Almaty 050012 (Kazakhstan)
2007-12-21
The influence of a highly pronounced non-equilibrium characteristic of the electron energy distribution function in a stratified dc glow discharge on the process of dust particle charging in a complex plasma is taken into account for the first time. The calculated particle charge spatial distribution is essentially non-homogeneous and it can explain the vortex motion of particles at the periphery of a dusty cloud obtained in experiments.
Stability of stratified two-phase flows in inclined channels
Barmak, I.; Gelfgat, A. Yu.; Ullmann, A.; Brauner, N.
2016-08-01
Linear stability of stratified gas-liquid and liquid-liquid plane-parallel flows in inclined channels is studied with respect to all wavenumber perturbations. The main objective is to predict the parameter regions in which a stable stratified configuration in inclined channels exists. Up to three distinct base states with different holdups exist in inclined flows, so the stability analysis has to be carried out for each branch separately. Special attention is paid to the multiple-solution regions to reveal the feasibility of non-unique stable stratified configurations in inclined channels. The stability boundaries of each branch of the steady-state solutions are presented on the flow pattern map and are accompanied by the critical wavenumbers and the spatial profiles of the most unstable perturbations. Instabilities of different nature are visualized by the streamlines of the neutrally stable perturbed flows, consisting of the critical perturbation superimposed on the base flow. The present analysis confirms the existence of two stable stratified flow configurations in a region of low flow rates in countercurrent liquid-liquid flows. These configurations become unstable with respect to the shear mode of instability. It was revealed that in slightly upward-inclined flows the lower and middle solutions for the holdup are stable in part of the triple-solution region, while the upper solution is always unstable. In the case of downward flows, in the triple-solution region, none of the solutions are stable with respect to short-wave perturbations. These flows are stable only in the single-solution region at low flow rates of the heavy phase, and the long-wave perturbations are the most unstable ones.
Thermal stratification built up in hot water tank with different inlet stratifiers
Dragsted, Janne; Furbo, Simon; Dannemand, Mark
2017-01-01
Tests have been carried out in order to elucidate how well thermal stratification is established in the tank with differently designed inlet stratifiers under different controlled laboratory conditions. The investigated inlet stratifiers are from Solvis GmbH & Co KG and EyeCular Technologies ApS. The inlet stratifier from Solvis GmbH & Co KG is a rigid plastic pipe with holes for every 30 cm. The holes are designed with flaps preventing counter flow into the pipe. The inlet stratifier from EyeCular Technologies ApS is made of a flexible polymer with openings all along the side and in the full length of the stratifier. The flexibility of the stratifier prevents counterflow. The tests have shown that both types of inlet stratifiers had an ability to create stratification in the test tank under the different test conditions. The stratifier from EyeCular Technologies ApS had a better performance at low flows of 1-2 l/min and the stratifier...
Hidden Variables or Positive Probabilities?
Rothman, T; Rothman, Tony
2001-01-01
Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Stability of stratified two-phase flows in horizontal channels
Barmak, I.; Gelfgat, A.; Vitoshkin, H.; Ullmann, A.; Brauner, N.
2016-04-01
Linear stability of stratified two-phase flows in horizontal channels to arbitrary wavenumber disturbances is studied. The problem is reduced to Orr-Sommerfeld equations for the stream function disturbances, defined in each sublayer and coupled via boundary conditions that also account for possible interface deformation and capillary forces. Applying the Chebyshev collocation method, the equations and interface boundary conditions are reduced to generalized eigenvalue problems, which are solved by standard means of numerical linear algebra for the entire spectrum of eigenvalues and the associated eigenvectors. Some additional conclusions concerning the nature of the instability are derived from the most unstable perturbation patterns. The results are summarized in the form of stability maps showing the operational conditions at which a stratified-smooth flow pattern is stable. It is found that for gas-liquid and liquid-liquid systems, the stratified flow with a smooth interface is stable only in a confined zone of relatively low flow rates, which is in agreement with experiments, but is not predicted by long-wave analysis. Depending on the flow conditions, the critical perturbations can originate mainly at the interface (so-called "interfacial modes of instability") or in the bulk of one of the phases (i.e., "shear modes"). The present analysis revealed that there is no definite correlation between the type of instability and the perturbation wavelength.
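After discretization, such stability problems take the generic form A x = c B x for the complex wave speed c. A toy sketch of that final step (the matrices below are stand-ins, not the paper's discretized operators), reducing the generalized problem to a standard one when B is invertible:

```python
import numpy as np

# Toy generalized eigenvalue problem A x = c B x standing in for the
# discretized Orr-Sommerfeld system (the real A, B couple the two
# sublayers through interface boundary conditions).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# For invertible B, reduce to a standard eigenvalue problem B^{-1} A x = c x.
c = np.linalg.eigvals(np.linalg.solve(B, A))

# The base flow is linearly unstable if any eigenvalue has a positive
# imaginary part (temporal growth rate = k * Im(c) at wavenumber k).
unstable = np.any(c.imag > 1e-12)
```

In practice a dedicated generalized solver (e.g. a QZ decomposition) is preferred, since the boundary-condition rows often make B singular.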
Continuous Dependence on the Density for Stratified Steady Water Waves
Chen, Robin Ming; Walsh, Samuel
2016-02-01
There are two distinct regimes commonly used to model traveling waves in stratified water: continuous stratification, where the density is smooth throughout the fluid, and layer-wise continuous stratification, where the fluid consists of multiple immiscible strata. The former is the more physically accurate description, but the latter is frequently more amenable to analysis and computation. By the conservation of mass, the density is constant along the streamlines of the flow; the stratification can therefore be specified by prescribing the value of the density on each streamline. We call this the streamline density function. Our main result states that, for every smoothly stratified periodic traveling wave in a certain small-amplitude regime, there is an L ∞ neighborhood of its streamline density function such that, for any piecewise smooth streamline density function in that neighborhood, there is a corresponding traveling wave solution. Moreover, the mapping from streamline density function to wave is Lipschitz continuous in a certain function space framework. As this neighborhood includes piecewise smooth densities with arbitrarily many jump discontinuities, this theorem provides a rigorous justification for the ubiquitous practice of approximating a smoothly stratified wave by a layered one. We also discuss some applications of this result to the study of the qualitative features of such waves.
Probabilities for separating sets of order statistics.
Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E
2010-04-01
Consider a set of order statistics that arise from sorting samples from two different populations, each with their own, possibly different distribution functions. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
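The Benjamini-Hochberg step-up procedure mentioned in the application is short enough to state in full. A sketch of the standard procedure (not the authors' joint-probability computation):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Number of rejections under the BH step-up rule: reject the k
    smallest p-values, where k is the largest i with p_(i) <= i*alpha/m."""
    m = len(pvals)
    ranked = sorted(pvals)
    k = 0
    for i, p in enumerate(ranked, start=1):
        if p <= i * alpha / m:
            k = i
    return k

# Example: 4 of these 5 p-values are rejected at alpha = 0.05.
print(benjamini_hochberg([0.001, 0.008, 0.03, 0.04, 0.6]))  # -> 4
```

The "number of false rejections" studied in the paper is then the count of rejected hypotheses that were actually null, which is a random quantity the order-statistics result characterizes.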
The Black Hole Formation Probability
Clausen, Drew; Ott, Christian D
2014-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...
PROBABILITY SURVEYS , CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Modulation Based on Probability Density Functions
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
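The sorting-by-occurrence step is an ordinary histogram. A sketch (the waveform, sample count, and bin count are arbitrary choices, not the proposed system's parameters) showing that uniformly timed samples of a sinusoid pile up near the extremes, i.e. the familiar arcsine-shaped PDF:

```python
import math

N_SAMPLES, N_BINS = 10000, 10

# Sample one full cycle of a unit sinusoid at uniform time steps.
samples = [math.sin(2 * math.pi * n / N_SAMPLES) for n in range(N_SAMPLES)]

# Histogram over [-1, 1]: the empirical PDF of the waveform.
counts = [0] * N_BINS
for s in samples:
    idx = min(int((s + 1) / 2 * N_BINS), N_BINS - 1)
    counts[idx] += 1

# The edge bins (near +/-1) collect far more samples than the middle bins.
assert counts[0] > counts[N_BINS // 2]
```

A receiver comparing such histograms against reference PDFs could, in principle, distinguish which waveform shape was transmitted during each interval.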
Marín-Méndez, J J; Borra-Ruiz, M C; Álvarez-Gómez, M J; Soutullo Esperón, C
2017-10-01
ADHD symptoms begin to appear at preschool age. ADHD may have a significant negative impact on academic performance. In Spain, there are no standardized tools for detecting ADHD at preschool age, nor are there data about the incidence of this disorder. To evaluate developmental factors and learning difficulties associated with probable ADHD and to assess the impact of ADHD on school performance, we conducted a population-based study with a stratified multistage proportional cluster sample design. We found significant differences between probable ADHD and parents' perception of difficulties in expressive language, comprehension, and fine motor skills, as well as in emotions, concentration, behaviour, and relationships. Around 34% of preschool children with probable ADHD showed global learning difficulties, mainly in patients with the inattentive type. According to the multivariate analysis, learning difficulties were significantly associated with both delayed psychomotor development during the first 3 years of life (OR: 5.57), as assessed by parents, and probable ADHD (OR: 2.34). CONCLUSIONS: There is a connection between probable ADHD in preschool children and parents' perception of difficulties in several dimensions of development and learning. Early detection of ADHD at preschool ages is necessary to start prompt and effective clinical and educational interventions. Copyright © 2016 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.
A Thermodynamical Approach for Probability Estimation
Isozaki, Takashi
2012-01-01
The issue of discrete probability estimation for samples of small size is addressed in this study. The maximum likelihood method often suffers over-fitting when insufficient data is available. Although the Bayesian approach can avoid over-fitting by using prior distributions, it still has problems with objective analysis. In response to these drawbacks, a new theoretical framework based on thermodynamics, where energy and temperature are introduced, was developed. Entropy and likelihood are placed at the center of this method. The key principle of inference for probability mass functions is the minimum free energy, which is shown to unify the two principles of maximum likelihood and maximum entropy. Our method can robustly estimate probability functions from small size data.
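The claimed interpolation between maximum likelihood and maximum entropy can be seen in a two-category toy model. A sketch (the free-energy form F = -log-likelihood - T·H is my paraphrase of the idea, not the paper's exact functional): at T = 0 the minimizer is the ML estimate, and as T grows it is pulled toward the uniform distribution.

```python
import math

def min_free_energy(n1, n2, T, grid=10000):
    """Grid-minimize F(p) = -(n1*log p + n2*log(1-p)) - T*H(p)
    over p in (0, 1), where H is the binary entropy."""
    best_p, best_f = None, float("inf")
    for i in range(1, grid):
        p = i / grid
        H = -(p * math.log(p) + (1 - p) * math.log(1 - p))
        F = -(n1 * math.log(p) + n2 * math.log(1 - p)) - T * H
        if F < best_f:
            best_p, best_f = p, F
    return best_p

p_ml = min_free_energy(9, 1, T=0.0)    # the ML estimate, 9/10
p_T  = min_free_energy(9, 1, T=20.0)   # pulled toward the uniform 0.5
```

The "temperature" thus plays the role of a regularization strength, which is what lets the method avoid over-fitting on small samples.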
Correlations and Non-Linear Probability Models
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
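For a logit model with a single predictor, the latent-variable formulation makes such a correlation computable in closed form. A sketch under textbook assumptions (standard logistic error with variance π²/3; this is the generic latent-index identity, not necessarily the exact estimator the authors develop):

```python
import math

LOGISTIC_VAR = math.pi ** 2 / 3  # variance of the standard logistic error

def latent_correlation(beta, sd_x):
    """Correlation between a predictor x and the latent outcome
    y* = beta*x + e implied by a logit coefficient beta and the SD of x."""
    num = beta * sd_x
    return num / math.sqrt(num ** 2 + LOGISTic_VAR) if False else num / math.sqrt(num ** 2 + LOGISTIC_VAR)

print(latent_correlation(0.0, 1.0))  # -> 0.0: no effect, no correlation
```

Because the error variance is fixed by the link function, this correlation is comparable across samples in a way the raw logit coefficients are not, which is the motivation the abstract describes.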
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
A new estimator of the discovery probability.
Favaro, Stefano; Lijoi, Antonio; Prünster, Igor
2012-12-01
Species sampling problems have a long history in ecological and biological studies, where a number of issues, including the evaluation of species richness, the design of sampling experiments, and the estimation of rare species variety, are to be addressed. Such inferential problems have recently emerged also in genomic applications, exhibiting, however, some peculiar features that make them more challenging: specifically, one has to deal with very large populations (genomic libraries) containing a huge number of distinct species (genes), and only a small portion of the library has been sampled (sequenced). These aspects motivate the Bayesian nonparametric approach we undertake, since it allows us to achieve the degree of flexibility typically needed in this framework. Based on an observed sample of size n, focus will be on prediction of a key aspect of the outcome from an additional sample of size m, namely, the so-called discovery probability. In particular, conditionally on an observed basic sample of size n, we derive a novel estimator of the probability of detecting, at the (n+m+1)th observation, species that have been observed with any given frequency in the enlarged sample of size n+m. Such an estimator admits a closed-form expression that can be exactly evaluated. The result we obtain allows us to quantify both the rate at which rare species are detected and the achieved sample coverage of abundant species, as m increases. Natural applications are represented by the estimation of the probability of discovering rare genes within genomic libraries, and the results are illustrated by means of two expressed sequence tags datasets.
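A classical benchmark for the discovery probability is the Good-Turing estimate, which estimators of this kind generalize: the chance that the very next observation is a new species is approximated by the fraction of singletons in the sample. A sketch (Good-Turing, not the authors' Bayesian nonparametric estimator):

```python
from collections import Counter

def good_turing_discovery(sample):
    """Estimate P(next observation is an unseen species) as m1/n,
    where m1 = number of species seen exactly once and n = sample size."""
    freqs = Counter(sample)
    m1 = sum(1 for count in freqs.values() if count == 1)
    return m1 / len(sample)

# Two singletons ("c", "d") out of 6 draws -> estimate 1/3.
print(good_turing_discovery(["a", "a", "b", "b", "c", "d"]))
```

The paper's estimator goes further by predicting discovery probabilities after m additional, not-yet-observed draws, which Good-Turing does not address.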
Understanding Students' Beliefs about Probability.
Konold, Clifford
The concept of probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes the students' alternative frameworks from the viewpoint of constructivism. There are various interpretations of probability through mathematical history: classical, frequentist, and subjectivist interpretation.…
Expected utility with lower probabilities
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Varieties of Belief and Probability
D.J.N. van Eijck (Jan); S. Ghosh; J. Szymanik
2015-01-01
For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for
Landau-Zener Probability Reviewed
Valencia, C
2008-01-01
We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant in the interpretation of the neutrino flux from a supernova explosion.
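For reference, the textbook two-level result that the paper revisits (standard form; the corrected constant factor discussed in the paper is not reproduced here): for diabatic energies separating linearly in time, E_1(t) - E_2(t) = αt, with constant coupling H_{12}, the probability of remaining in the original diabatic state after the crossing is

```latex
P_{\mathrm{LZ}} \;=\; e^{-2\pi\Gamma},
\qquad
\Gamma \;=\; \frac{H_{12}^{2}}{\hbar\,\bigl|\tfrac{d}{dt}\bigl(E_1 - E_2\bigr)\bigr|}
       \;=\; \frac{H_{12}^{2}}{\hbar\,\alpha}.
```

In the neutrino context, H_{12} is set by the mixing angle and Δm², and α by the matter-density gradient along the propagation path.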
Probability and Statistics: 5 Questions
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...
A graduate course in probability
Tucker, Howard G
2014-01-01
Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Linear Positivity and Virtual Probability
Hartle, J B
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...
Survival probability and ruin probability of a risk model
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.
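For comparison, the classical compound-Poisson benchmark that such models modify has a well-known closed form in the exponential-claims case (the standard Cramér-Lundberg result, not the paper's p-thinning model): with claim sizes Exp(μ) (mean 1/μ) and relative safety loading θ > 0, the ruin probability from initial surplus u is

```latex
\psi(u) \;=\; \frac{1}{1+\theta}\,
\exp\!\left(-\frac{\theta\mu}{1+\theta}\,u\right),
```

so the adjustment (Lundberg) coefficient is R = θμ/(1+θ), and the Lundberg inequality ψ(u) ≤ e^{-Ru} is attained up to the factor 1/(1+θ).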
Effects of unstratified and centre-stratified randomization in multi-centre clinical trials.
Anisimov, Vladimir V
2011-01-01
This paper deals with the analysis of randomization effects in multi-centre clinical trials. The two randomization schemes most often used in clinical trials are considered: unstratified and centre-stratified block-permuted randomization. The prediction of the number of patients randomized to different treatment arms in different regions during the recruitment period accounting for the stochastic nature of the recruitment and effects of multiple centres is investigated. A new analytic approach using a Poisson-gamma patient recruitment model (patients arrive at different centres according to Poisson processes with rates sampled from a gamma distributed population) and its further extensions is proposed. Closed-form expressions for corresponding distributions of the predicted number of the patients randomized in different regions are derived. In the case of two treatments, the properties of the total imbalance in the number of patients on treatment arms caused by using centre-stratified randomization are investigated and for a large number of centres a normal approximation of imbalance is proved. The impact of imbalance on the power of the study is considered. It is shown that the loss of statistical power is practically negligible and can be compensated by a minor increase in sample size. The influence of patient dropout is also investigated. The impact of randomization on predicted drug supply overage is discussed.
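The Poisson-gamma recruitment model is easy to simulate directly. A sketch (all parameter values are hypothetical, chosen only for illustration) that draws a rate for each centre from a gamma population and then Poisson counts of recruited patients:

```python
import math
import random

def poisson(lam, rng):
    """Draw from Poisson(lam) by Knuth's product method (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_recruitment(n_centres, shape, rate, t, rng):
    """Predicted total recruitment over time t: each centre's rate is drawn
    from Gamma(shape, rate) (mean shape/rate), counts are Poisson(rate_i * t)."""
    total = 0
    for _ in range(n_centres):
        lam_i = rng.gammavariate(shape, 1.0 / rate)  # gammavariate takes scale = 1/rate
        total += poisson(lam_i * t, rng)
    return total

# Hypothetical trial: 50 centres, mean rate 2 patients/month, over 3 months
# (expected total around 50 * 2 * 3 = 300 patients).
rng = random.Random(42)
total = simulate_recruitment(50, shape=4.0, rate=2.0, t=3.0, rng=rng)
```

The paper derives the corresponding distributions in closed form; simulation like this is mainly useful as a cross-check or for extensions without closed-form answers.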
Jet-mixing of initially-stratified liquid-liquid pipe flows: experiments and numerical simulations
Wright, Stuart; Ibarra-Hernandes, Roberto; Xie, Zhihua; Markides, Christos; Matar, Omar
2016-11-01
Low pipeline velocities lead to stratification and so-called 'phase slip' in horizontal liquid-liquid flows due to differences in liquid densities and viscosities. Stratified flows have no suitable single point for sampling, from which average phase properties (e.g. fractions) can be established. Inline mixing, achieved by static mixers or jets in cross-flow (JICF), is often used to overcome liquid-liquid stratification by establishing unstable two-phase dispersions for sampling. Achieving dispersions in liquid-liquid pipeline flows using JICF is the subject of this experimental and modelling work. The experimental facility involves a matched refractive index liquid-liquid-solid system, featuring an ETFE test section, and experimental liquids which are silicone oil and a 51-wt% glycerol solution. The matching then allows the dispersed fluid phase fractions and velocity fields to be established through advanced optical techniques, namely PLIF (for phase) and PTV or PIV (for velocity fields). CFD codes using the volume of fluid (VOF) method are then used to demonstrate JICF breakup and dispersion in stratified pipeline flows. A number of simple jet configurations are described and their dispersion effectiveness is compared with the experimental results. Funding from Cameron for Ph.D. studentship (SW) gratefully acknowledged.
Anonymous
2005-01-01
People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.
Transgender Demographics: A Household Probability Sample of US Adults, 2014
Berger, Mitchell B.; Graham, Louis F.; Dalton, Vanessa K.
2017-01-01
Objectives. To estimate the proportion of US adults who identify as transgender and to compare the demographics of the transgender and nontransgender populations. Methods. We conducted a secondary analysis of data from states and territories in the 2014 Behavioral Risk Factor Surveillance System that asked about transgender status. The proportion of adults identified as transgender was calculated from affirmative and negative responses (n = 151 456). We analyzed data with a design-adjusted χ2 test. We also explored differences between male-to-female and nontransgender females and female-to-male and nontransgender males. Results. Transgender individuals made up 0.53% (95% confidence interval = 0.46, 0.61) of the population and were more likely to be non-White (40.0% vs 27.3%) and below the poverty line (26.0% vs 15.5%); as likely to be married (50.5% vs 47.7%), living in a rural area (28.7% vs 22.6%), and employed (54.3% vs 57.7%); and less likely to attend college (35.6% vs 56.6%) compared with nontransgender individuals. Conclusions. Our findings suggest that the transgender population is a racially diverse population present across US communities. Inequalities in the education and socioeconomic status have negative implications for the health of the transgender population. PMID:27997239
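The headline proportion can be sanity-checked with a simple normal-approximation interval. A sketch (this ignores the survey's complex design weights, which is one reason the published design-adjusted interval of 0.46-0.61% is wider):

```python
import math

def proportion_ci(p, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a proportion under
    simple random sampling."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

lo, hi = proportion_ci(0.0053, 151456)
print(f"{100*lo:.2f}%-{100*hi:.2f}%")  # narrower than the design-adjusted CI
```

Design-based surveys like BRFSS require variance estimation that accounts for stratification and weighting (e.g. Taylor linearization), which inflates the interval relative to this naive calculation.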
Probability of identifying different salmonella serotypes in poultry samples
Recent work has called attention to the unequal competitive abilities of different Salmonella serotypes in standard broth culture and plating media. Such serotypes include Enteritidis and Typhimurium that are specifically targeted in some regulatory and certification programs because they cause a l...
Magnetic Field in the Gravitationally Stratified Coronal Loops
B. N. Dwivedi; A. K. Srivastava
2015-03-01
We study the effect of gravitational stratification on the estimation of magnetic fields in the coronal loops. By using the method of MHD seismology of kink waves for the estimation of magnetic field of coronal loops, we derive a new formula for the magnetic field considering the effect of gravitational stratification. The fast-kink wave is a potential diagnostic tool for the estimation of magnetic field in fluxtubes. We consider the eleven kink oscillation cases observed by TRACE between July 1998 and June 2001. We calculate the magnetic field in the stratified loops (B_str) and compare it with the previously calculated absolute magnetic field (B_abs). The gravitational stratification efficiently affects the magnetic field estimation in the coronal loops as it also affects the properties of kink waves. We find an ≈22% increment in the magnetic field for the smallest loop (length 72 Mm), and an ≈42% increment in the absolute magnetic field for the longest (406 Mm) coronal loops. The magnetic fields B_str and B_abs also increase with the number density, if the loop length does not vary much. The increment in the magnetic field due to gravitational stratification is small at lower number densities, but large at higher number densities. We find that the damping time of kink waves due to phase-mixing is less in gravitationally stratified loops than in non-stratified ones, indicating more rapid damping of kink waves in stratified loops. In conclusion, we find that gravitational stratification efficiently affects the estimation of the magnetic field and of the damping time, especially in the longer coronal loops.
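The underlying seismological estimate being corrected is the standard kink-speed inversion (textbook form; the paper's stratification correction modifies it): for a loop of length L oscillating in its fundamental kink mode with period P, internal density ρ_i and external density ρ_e,

```latex
B \;=\; \frac{2L}{P}\,
\sqrt{\frac{\mu_0\,\rho_i\left(1 + \rho_e/\rho_i\right)}{2}},
```

since the fundamental kink phase speed is c_k = 2L/P and c_k relates to the internal Alfvén speed through the density contrast. Gravitational stratification makes ρ_i vary along the loop, which is what drives the percentage corrections quoted above.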
Experimental Study of Fluorine Transport Rules in Unsaturated Stratified Soil
ZHANG Hong-mei; SU Bao-yu; LIU Peng-hua; ZHANG Wei
2007-01-01
With the aid of soil column test models, the transport rules of fluorine contaminants in unsaturated stratified soils are discussed. Curves of F- concentrations at different times and sites in the unsaturated stratified soil were obtained under conditions of continuous injection of fluoride contaminants and water. Based on the analysis of the actual observation data, the computed results and observed data were compared. It is shown that the chemical properties of fluorine ions are active. The migration process of fluorine ions in soils is complex. Because of the effect of adsorption and desorption, the fluorine ion breakthrough curve is not symmetric. Its concentration peak value at each measuring point gradually decays. The tail of the breakthrough curve is long and the process of leaching and purifying using water requires considerable time. Along with the release of OHˉ in the process of fluorine absorption, the pH value of the soil solution changed from neutral to alkaline during the test process. The first part of the breakthrough curve fitted better than the second part. The main reason is that fluorine does not always exist in the form of fluorine ions in groundwater. Given the long test time, fluorine ions possibly react with other ions in the soil solution to form complex water-soluble fluorine compounds. Only the retardation factor and source-sink term have been considered in our numerical model, which may lead to errors in the computed values. But as a whole the migration rules of fluorine ions are basically correct, which indicates that the established numerical model can be used to simulate the transport rules of fluorine contaminants in unsaturated stratified soils.
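The numerical model described (retardation plus a source-sink term) is the standard one-dimensional advection-dispersion equation (generic form; the coefficients here are the usual textbook ones, not values fitted in the experiments):

```latex
R\,\frac{\partial C}{\partial t}
  \;=\; D\,\frac{\partial^{2} C}{\partial x^{2}}
  \;-\; v\,\frac{\partial C}{\partial x} \;+\; W,
```

where C is the solute concentration, R the retardation factor (capturing adsorption/desorption), D the hydrodynamic dispersion coefficient, v the pore-water velocity, and W the source-sink term. The asymmetric, long-tailed breakthrough curves reported above are the typical signature of R > 1 with rate-limited sorption.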
Stratified spin-up in a sliced, square cylinder
Munro, R. J. [Faculty of Engineering, University of Nottingham, Nottingham NG7 2RD (United Kingdom); Foster, M. R. [Department of Mathematical Sciences, Rensselaer Polytechnic Institute, Troy, New York 12180 (United States)
2014-02-15
We previously reported experimental and theoretical results on the linear spin-up of a linearly stratified, rotating fluid in a uniform-depth square cylinder [M. R. Foster and R. J. Munro, "The linear spin-up of a stratified, rotating fluid in a square cylinder," J. Fluid Mech. 712, 7–40 (2012)]. Here we extend that analysis to a "sliced" square cylinder, which has a base-plane inclined at a shallow angle α. Asymptotic results are derived that show the spin-up phase is achieved by a combination of the Ekman-layer eruptions (from the perimeter region of the cylinder's lid and base) and cross-slope-propagating stratified Rossby waves. The final, steady state limit for this spin-up phase is identical to that found previously for the uniform-depth cylinder, but is reached somewhat more rapidly on a time scale of order E^{-1/2}Ω^{-1}/log(α/E^{1/2}) (compared to E^{-1/2}Ω^{-1} for the uniform-depth cylinder), where Ω is the rotation rate and E the Ekman number. Experiments were performed for Burger numbers, S, between 0.4 and 16, and showed that for S ≳ O(1) the Rossby modes are severely damped; it is only at small S, and during the early stages, that the presence of these wave modes was evident. These observations are supported by the theory, which shows the damping factors increase with S and are numerically large for S ≳ O(1).
2013-01-01
Background Antibiograms created by aggregating hospital-wide susceptibility data from diverse patients can be misleading. To demonstrate the utility of age- and location-stratified antibiograms, we compared stratified antibiograms for three common bacterial pathogens, E. coli, S. aureus, and S. pneumoniae. We created stratified antibiograms based on patient age (/=65 years), and inpatient or outpatient location using all 2009 E. coli and S. aureus, and all 2008–2009 S. pneumoniae isolates sub...
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
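The location-scale result can be illustrated numerically: if the threshold is set at the estimated 95% quantile from a small sample of a normal risk factor, the realized failure probability exceeds the nominal 5% and, as the article shows for location-scale families, does not depend on the true parameters. A hedged Monte Carlo sketch (sample size, nominal level, and parameter values are illustrative choices):

```python
import random
import statistics

def mc_failure_prob(mu, sigma, n=10, reps=40_000, seed=1):
    """Monte Carlo check: set the threshold at the sample-based 95%
    quantile (mean + z * sample std) estimated from n observations,
    then measure how often a fresh draw of the risk factor exceeds it."""
    rng = random.Random(seed)
    z95 = 1.6449  # standard normal 95% quantile: the nominal target
    failures = 0
    for _ in range(reps):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        threshold = statistics.mean(sample) + z95 * statistics.stdev(sample)
        if rng.gauss(mu, sigma) > threshold:  # a new realization "fails"
            failures += 1
    return failures / reps
```

With n = 10 the realized failure probability comes out near 7-8% for any (μ, σ), illustrating both the parameter-free property and the inflation above the nominal 5%.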
Electromagnetic fields due to dipole antennas over stratified anisotropic media.
Kong, J. A.
1972-01-01
Solutions to the problem of radiation of dipole antennas in the presence of a stratified anisotropic media are facilitated by decomposing a general wave field into transverse magnetic (TM) and transverse electric (TE) modes. Employing the propagation matrices, wave amplitudes in any region are related to those in any other regions. The reflection coefficients, which embed all the information about the geometrical configuration and the physical constituents of the medium, are obtained in closed form. In view of the general formulation, various special cases are discussed.
Instabilities developed in stratified flows over pronounced obstacles
Varela, J.; Araújo, M.; Bove, I.; Cabeza, C.; Usera, G.; Martí, Arturo C.; Montagne, R.; Sarasúa, L. G.
2007-12-01
In the present work we study numerically and experimentally the flow of a two-layer stratified fluid over a topographic obstacle. The problem reflects a wide range of oceanographic and meteorological situations in which stratification plays an important role. We identify the different instabilities that develop by studying the pycnocline deformation due to a pronounced obstacle. The numerical simulations were made using the model caffa3D.MB, which solves the Navier-Stokes equations with finite-volume elements on curvilinear meshes. The experimental results are contrasted with the numerical simulations, and linear stability analysis predictions are checked against particle image velocimetry (PIV) measurements.
Stratified waveguide grating coupler for normal fiber incidence.
Wang, Bin; Jiang, Jianhua; Chambers, Diana M; Cai, Jingbo; Nordin, Gregory P
2005-04-15
We propose a new stratified waveguide grating coupler (SWGC) to couple light from a fiber at normal incidence into a planar waveguide. SWGCs are designed to operate in the strong coupling regime without intermediate optics between the fiber and the waveguide. Two-dimensional finite-difference time-domain simulation in conjunction with microgenetic algorithm optimization shows that approximately 72% coupling efficiency is possible for fiber (core size of 8.3 μm and Δ = 0.36%) to slab waveguide (1.2-μm core and Δ = 3.1%) coupling. We show that the phase-matching and Bragg conditions are simultaneously satisfied through the fundamental leaky mode.
Magnetorotational instability in weakly ionised, stratified accretion discs
Salmeron, Roberto Aureliano; Salmeron, Raquel; Wardle, Mark
2003-01-01
The magnetorotational instability (MRI) (Balbus and Hawley 1991, Hawley and Balbus 1991) transports angular momentum radially outwards in accretion discs through the distortion of the magnetic field lines that connect fluid elements. In protostellar discs, low conductivity is important, especially in the inner regions (Gammie 1996, Wardle 1997). As a result, low k modes are relevant and vertical stratification is a key factor of the analysis. However, most models of the MRI in these environments have adopted either the ambipolar diffusion or resistive approximations and have not simultaneously treated stratification and Hall conductivity. We present here a linear analysis of the MRI, including the Hall effect, in a stratified disc.
Enhanced charge transport kinetics in anisotropic, stratified photoanodes.
Yazdani, Nuri; Bozyigit, Deniz; Utke, Ivo; Buchheim, Jakob; Youn, Seul Ki; Patscheider, Jörg; Wood, Vanessa; Park, Hyung Gyu
2014-02-12
The kinetics of charge transport in mesoporous photoanodes strongly constrains the design and power conversion efficiencies of dye sensitized solar cells (DSSCs). Here, we report a stratified photoanode design with enhanced kinetics achieved through the incorporation of a fast charge transport intermediary between the titania and charge collector. Proof of concept photoanodes demonstrate that the inclusion of the intermediary not only enhances effective diffusion coefficients but also significantly suppresses charge recombination, leading to diffusion lengths two orders of magnitude greater than in standard mesoporous titania photoanodes. The intermediary concept holds promise for higher-efficiency DSSCs.
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
Probability Ranking in Vector Spaces
Melucci, Massimo
2011-01-01
The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.
Holographic probabilities in eternal inflation.
Bousso, Raphael
2006-11-10
In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.
Local Causality, Probability and Explanation
Healey, Richard A
2016-01-01
In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
A-Stratified Computerized Adaptive Testing with Unequal Item Exposure across Strata.
Deng, Hui; Chang, Hua-Hua
The purpose of this study was to compare a proposed revised a-stratified, or alpha-stratified, USTR method of test item selection with the original alpha-stratified multistage computerized adaptive testing approach (STR) and the use of maximum Fisher information (FSH) with respect to test efficiency and item pool usage using simulated computerized…
Recent progresses in outcome-dependent sampling with failure time data.
Ding, Jieli; Lu, Tsui-Shan; Cai, Jianwen; Zhou, Haibo
2017-01-01
An outcome-dependent sampling (ODS) design is a retrospective sampling scheme in which the primary exposure variables are observed with a probability that depends on the observed value of the outcome variable. When the outcome of interest is failure time, the observed data are often censored. By allowing the selection of the supplemental samples to depend on whether the event of interest happens, and by oversampling subjects from the most informative regions, an ODS design for time-to-event data can reduce the cost of a study and improve its efficiency. We review recent progress and advances in research on ODS designs with failure time data, including related designs such as the case-cohort design, generalized case-cohort design, stratified case-cohort design, general failure-time ODS design, length-biased sampling design, and interval sampling design.
Diurnal distribution of sunshine probability
Aydinli, S.
1982-01-01
The diurnal distribution of sunshine probability is essential for predetermining the average irradiances and illuminances produced by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of sunshine duration available; it is therefore necessary to compute the diurnal distribution of sunshine probability from the monthly averages. It is shown how the symmetric component of the sunshine probability distribution, which is a consequence of a "sidescene effect" of the clouds, can be calculated. The asymmetric components of the sunshine probability, which depend on location and season, and their influence on the predetermination of global radiation are investigated and discussed.
Probability representation of classical states
Man'ko, OV; Man'ko; Pilyavets, OV
2005-01-01
Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
The probabilities of unique events.
Sangeet S Khemlani
Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
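The qualitative claim — that loans with higher default probabilities also show higher default correlations — can be checked in a one-factor Gaussian (Merton-style) model, which is assumed here purely for illustration; the asset correlation and simulation sizes are arbitrary choices, not values from the paper:

```python
import random
from statistics import NormalDist

def default_correlation(p, asset_rho=0.3, reps=100_000, seed=42):
    """One-factor Gaussian (Merton-style) sketch: a loan defaults when its
    asset value, driven by a shared systematic factor, falls below the
    p-quantile. Returns the Monte Carlo correlation between two loans'
    default indicators."""
    rng = random.Random(seed)
    thresh = NormalDist().inv_cdf(p)  # default barrier for marginal prob p
    a = asset_rho ** 0.5              # loading on the common factor
    b = (1.0 - asset_rho) ** 0.5      # idiosyncratic loading
    both = 0
    for _ in range(reps):
        f = rng.gauss(0.0, 1.0)       # common systematic factor
        d1 = a * f + b * rng.gauss(0.0, 1.0) < thresh
        d2 = a * f + b * rng.gauss(0.0, 1.0) < thresh
        both += int(d1 and d2)
    p_joint = both / reps
    # correlation of two Bernoulli(p) indicators with joint probability p_joint
    return (p_joint - p * p) / (p * (1.0 - p))
```

For a fixed asset correlation, the simulated correlation between default indicators grows as the marginal default probability rises, consistent with the portfolio-risk implication above.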
Joint probabilities and quantum cognition
de Barros, J Acacio
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Three lectures on free probability
2012-01-01
These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.
Penetrative convection in stratified fluids: velocity and temperature measurements
M. Moroni
2006-01-01
The flux through the interface between a mixing layer and a stable layer plays a fundamental role in characterizing and forecasting the quality of water in stratified lakes and oceans, and the quality of air in the atmosphere. The evolution of the mixing layer in a stably stratified fluid body is simulated in the laboratory when "penetrative convection" occurs. The laboratory model consists of a tank filled with water and heated from below. The methods employed to detect the mixing-layer growth were thermocouples for temperature data and two image analysis techniques, namely Laser Induced Fluorescence (LIF) and Feature Tracking (FT). LIF allows the mixing-layer evolution to be visualized, while Feature Tracking is used to detect the trajectories of tracer particles moving within the measurement volume. Pollutant dispersion phenomena are naturally described in the Lagrangian approach, as the pollutant acts as a tag on the fluid particles. The transilient matrix represents one of the possible tools for quantifying particle dispersion during the evolution of the phenomenon.
STRESS DISTRIBUTION IN THE STRATIFIED MASS CONTAINING VERTICAL ALVEOLE
Bobileva Tatiana Nikolaevna
2017-08-01
Almost all subsurface rocks used as foundations for various types of structures are stratified. Such heterogeneity may cause specific behaviour of the materials under strain. Differential equations describing the behaviour of such materials contain rapidly fluctuating coefficients; in view of this, solving such equations directly is time-consuming even on today's computers. The method of asymptotic averaging reduces the heterogeneous medium under study to averaged equations with constant coefficients. The present article is concerned with a stratified soil mass consisting of pairwise alternating isotropic elastic layers. As a result of averaging the elastic moduli, the soil mass with horizontal stratification is modelled by a homogeneous transversely isotropic half-space whose plane of isotropy is perpendicular to the vertical axis. The half-space is weakened by a vertical alveole of circular cross-section, and the virgin ground is under its own weight. For the horizontal parting planes of the layers, two types of contact conditions are considered: ideal contact and slippage without separation. For the resulting transversely isotropic half-space with a vertical alveole, the well-known analytical solution of S.G. Lekhnitsky is used. The author gives expressions for the stress components and displacements in the soil mass for different boundary conditions on the alveole surface. Such problems arise in the construction and maintenance of buildings and in the use of composite materials.
Stability of steam-water countercurrent stratified flow
Lee, S C
1985-10-01
Two flow instabilities which limit the normal condensation processes in countercurrent stratified steam-water flow have been identified experimentally: flooding and condensation-induced waterhammer. In order to initiate condensation-induced waterhammer in nearly horizontal or moderately inclined steam/subcooled-water flow, two conditions are necessary: the appearance of a wavy interface and complete condensation of the incoming steam. Analyses of these conditions are performed on the basis of flow stability and heat transfer considerations. Flooding data for several inclinations and channel heights are collected, and the effects of condensation, inclination angle, and channel height on the flooding characteristics are discussed. An envelope theory for the onset of flooding in inclined stratified flow is developed, which agrees well with the experimental data. Some empirical information on the basic flow parameters required for this theory, such as mean film thickness and interfacial friction factor, is measured. The previous viewpoints on flooding appear not to conflict with the present experimental data in nearly horizontal flow, but the flooding phenomena in nearly vertical flow appear to be more complicated than those viewpoints describe, because of liquid droplet entrainment.
Strongly Stratified Turbulence Wakes and Mixing Produced by Fractal Wakes
Dimitrieva, Natalia; Redondo, Jose Manuel; Chashechkin, Yuli; Fraunie, Philippe; Velascos, David
2017-04-01
This paper describes schlieren and shadowgraph experiments on the wake-induced mixing produced by traversing a vertical or horizontal fractal grid through the interface between two miscible fluids at low Atwood and Reynolds numbers. This configuration is designed to model mixing across isopycnals in stably stratified flows in many environmentally relevant situations (either in the atmosphere or in the ocean). The initial unstable stratification is characterized by a reduced gravity g' = g Δρ/ρ, where g is gravity, Δρ the initial density step, and ρ the reference density; the Atwood number is then A = g'/(2g). The topology of the fractal wake within the strong stratification and the internal wave field produce both a turbulent cascade and a wave cascade, with frequent parametric resonances. The envelope of the mixing front is found to follow a complex, non-steady third-order polynomial function with a maximum at about 4-5 Brunt-Väisälä non-dimensional time scales: δ = c1(t/N) + c2 g(Δρ/ρ)(t/N)² - c3(t/N)³. Conductivity probes and schlieren and shadowgraph visual techniques, including CIV (with laser-induced fluorescence and digitization of the light attenuation across the tank), are used to investigate the density gradients and the three-dimensionality of the expanding and contracting wake. Fractal analysis is also used to estimate the fastest and slowest growing wavelengths. The large-scale structures are observed to increase in wavelength as the mixing progresses, and the processes involved in this increase in scale are also examined. Measurements of the pointwise and horizontally averaged concentrations confirm the picture obtained from past flow-visualization studies: the fluid passes through the mixing region with relatively small amounts of molecular mixing, and molecular effects only dominate on longer time scales, when the small scales have penetrated through the large-scale structures.
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as how travel time is affected by congestion, driving speed, and traffic lights, and why different gambling ...
Correlations and Non-Linear Probability Models
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
尚彦龙; 蔡琦; 陈力生; 张杨伟
2012-01-01
In this paper, a combined response-surface and importance-sampling method is applied to calculate the parameter failure probability of a complex thermodynamic system. A mathematical model of parameter failure in the physical processes of the thermodynamic system is presented, on the basis of which the combined response-surface/importance-sampling algorithm is established; the performance-degradation model of the system's components and the simulation procedure for parameter failure based on importance sampling are also given. The parameter failure probability of the purification water system of a nuclear reactor is then calculated with the combined method. The results show that, for computing the parameter failure probability of a high-dimensional, markedly non-linear thermodynamic system with performance degradation, importance sampling achieves satisfactory precision with less computing time than direct sampling, whereas the response-surface method alone has limitations; the combination of the response surface with importance sampling is an effective method for analysing parameter failure in the physical processes of thermodynamic systems.
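The efficiency gain of importance sampling over direct sampling for a small failure probability can be illustrated on a toy problem: estimating P(X > 4) ≈ 3.17 × 10⁻⁵ for a standard normal risk factor by drawing from a proposal shifted into the failure region. This is a generic sketch, not the paper's thermodynamic model; the threshold and sample counts are arbitrary:

```python
import math
import random

def failure_prob_importance(threshold=4.0, reps=20_000, seed=0):
    """Estimate P(X > threshold) for X ~ N(0, 1) by importance sampling:
    draw from a proposal N(threshold, 1) centred on the failure region and
    reweight each hit by the likelihood ratio phi(y) / phi(y - threshold)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        y = rng.gauss(threshold, 1.0)  # proposal sample
        if y > threshold:
            # likelihood ratio of target density to proposal density at y
            total += math.exp(-0.5 * y * y + 0.5 * (y - threshold) ** 2)
    return total / reps
```

Direct sampling would see roughly one failure per 30,000 draws, while the shifted proposal places about half of its draws past the threshold and reweights them, giving a far smaller variance for the same number of samples.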
Cluster Membership Probability: Polarimetric Approach
Medhi, Biman J
2013-01-01
Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...
Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation
Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin
2016-12-01
If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
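The total-probability combination underlying both methods can be sketched generically: the AEP of a given rainfall depth is accumulated over a storm catalogue from each storm's annual arrival probability and its conditional probability of exceeding that depth once transposed over the catchment. Everything below — the catalogue and the exceedance curves — is a made-up illustration, not data from the study:

```python
def aep_total_probability(storms, depth):
    """Total probability theorem: AEP(depth) = sum over storms of
    P(annual arrival) * P(transposed storm exceeds depth)."""
    return sum(arrival * exceed(depth) for arrival, exceed in storms)

# Hypothetical catalogue: (annual arrival probability, conditional exceedance curve)
storms = [
    (0.01,  lambda d: max(0.0, 1.0 - d / 500.0)),  # moderate storm type
    (0.001, lambda d: max(0.0, 1.0 - d / 900.0)),  # rarer, larger storm type
]
```

Scanning depths up to the PMP depth with such a catalogue traces out the rainfall frequency curve whose tail AEP the study estimates.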
Detonation probabilities of high explosives
Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.
1995-07-01
The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain their interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance that a vibration isolation system exceeds the vibration criterion is evaluated. Optimal system parameters (damping and natural frequency) are then derived so that the probability of exceeding the vibration criteria VC-E and VC-D is below 0.04.
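For a zero-mean Gaussian displacement, the exceedance probability against a symmetric vibration criterion follows directly from the error function; a minimal sketch (the standard deviation and criterion values below are illustrative, not taken from the article):

```python
import math

def p_exceed_criterion(sigma, vc):
    """Probability that a zero-mean Gaussian displacement with standard
    deviation `sigma` exceeds the vibration criterion `vc` in magnitude:
    P(|X| > vc) = 1 - erf(vc / (sigma * sqrt(2)))."""
    return 1.0 - math.erf(vc / (sigma * math.sqrt(2.0)))
```

For example, a criterion set at about 2.05 standard deviations corresponds to an exceedance probability near the 0.04 level quoted in the abstract.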
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Hydrodynamics of stratified epithelium: steady state and linearized dynamics
Yeh, Wei-Ting
2015-01-01
A theoretical model for stratified epithelium is presented. The viscoelastic properties of the tissue are assumed to depend on the spatial distribution of proliferative and differentiated cells. Based on this assumption, a hydrodynamic description for tissue dynamics in the long-wavelength, long-time limit is developed, and the analysis reveals important insights into the dynamics of an epithelium close to its steady state. When the proliferative cells occupy a thin region close to the basal membrane, the relaxation rate towards the steady state is enhanced by cell division and cell apoptosis. On the other hand, when the region where proliferative cells reside becomes sufficiently thick, a flow induced by cell apoptosis close to the apical surface could enhance small perturbations. This destabilizing mechanism is general for continuously self-renewing multi-layered tissues; it could be related to the origin of certain tissue morphologies and developmental patterns.
Hydrodynamics of stratified epithelium: Steady state and linearized dynamics
Yeh, Wei-Ting; Chen, Hsuan-Yi
2016-05-01
A theoretical model for stratified epithelium is presented. The viscoelastic properties of the tissue are assumed to be dependent on the spatial distribution of proliferative and differentiated cells. Based on this assumption, a hydrodynamic description of tissue dynamics at the long-wavelength, long-time limit is developed, and the analysis reveals important insights into the dynamics of an epithelium close to its steady state. When the proliferative cells occupy a thin region close to the basal membrane, the relaxation rate towards the steady state is enhanced by cell division and cell apoptosis. On the other hand, when the region where proliferative cells reside becomes sufficiently thick, a flow induced by cell apoptosis close to the apical surface enhances small perturbations. This destabilizing mechanism is general for continuously self-renewing multilayered tissues; it could be related to the origin of certain tissue morphologies, tumor growth, and developmental patterning.
Local Radiation MHD Instabilities in Magnetically Stratified Media
Tao, Ted
2011-01-01
We study local radiation magnetohydrodynamic instabilities in static, optically thick, vertically stratified media with constant flux mean opacity. We include the effects of vertical gradients in a horizontal background magnetic field. Assuming rapid radiative diffusion, we use the zero gas pressure limit as an entry point for investigating the coupling between the photon bubble instability and the Parker instability. Apart from factors that depend on wavenumber orientation, the Parker instability exists for wavelengths longer than a characteristic wavelength lambda_{tran}, while photon bubbles exist for wavelengths shorter than lambda_{tran}. The growth rate in the Parker regime is independent of the orientation of the horizontal component of the wavenumber when radiative diffusion is rapid, but the range of Parker-like wavenumbers is extended if there exists strong horizontal shear between field lines (i.e. horizontal wavenumber perpendicular to the magnetic field). Finite gas pressure introduces an additio...
Short-wave vortex instability in stratified flow
Bovard, Luke
2014-01-01
In this paper we investigate a new instability of the Lamb-Chaplygin dipole in a stratified fluid. Through numerical linear stability analysis, a secondary peak in the growth rate emerges at vertical scales about an order of magnitude smaller than the buoyancy scale $L_{b}=U/N$ where $U$ is the characteristic velocity and $N$ is the Brunt-V\\"{a}is\\"{a}l\\"{a} frequency. This new instability exhibits a growth rate that is similar to, and even exceeds, that of the zigzag instability, which has the characteristic length of the buoyancy scale. This instability is investigated for a wide range of Reynolds $Re=2000-20000$ and horizontal Froude numbers $F_{h}=0.05-0.2$, where $F_{h}=U/NR$, $Re=UR/\
Internal combustion engine using premixed combustion of stratified charges
Marriott, Craig D.; Reitz, Rolf D. (Madison, WI)
2003-12-30
During a combustion cycle, a first stoichiometrically lean fuel charge is injected well prior to top dead center, preferably during the intake stroke. This first fuel charge is substantially mixed with the combustion chamber air during subsequent motion of the piston towards top dead center. A subsequent fuel charge is then injected prior to top dead center to create a stratified, locally richer mixture (but still leaner than stoichiometric) within the combustion chamber. The locally rich region within the combustion chamber has sufficient fuel density to autoignite, and its self-ignition serves to activate ignition for the lean mixture existing within the remainder of the combustion chamber. Because the mixture within the combustion chamber is overall premixed and relatively lean, NO.sub.x and soot production are significantly diminished.
A study of stratified gas-liquid pipe flow
Johnson, George W.
2005-07-01
This work includes both theoretical modelling and experimental observations which are relevant to the design of gas condensate transport lines. Multicomponent hydrocarbon gas mixtures are transported in pipes over long distances and at various inclinations. Under certain circumstances, the heavier hydrocarbon components and/or water vapour condense to form one or more liquid phases. Near the desired capacity, the liquid condensate and water are efficiently transported in the form of a stratified flow with a droplet field. During operation, however, the flow rate may be reduced, allowing liquid accumulation which can create serious operational problems due to large amounts of excess liquid being expelled into the receiving facilities during production ramp-up, or even in steady production in severe cases. In particular, liquid tends to accumulate in upward inclined sections due to insufficient drag on the liquid from the gas. To optimize the transport of gas condensates, pipe diameters should be carefully chosen to account for varying flow rates and pressure levels, which are determined through knowledge of the multiphase flow present. It is desirable to have a reliable numerical simulation tool to predict liquid accumulation for various flow rates, pipe diameters and pressure levels, which is not presently accounted for by industrial flow codes. A critical feature of the simulation code would be the ability to predict the transition from small liquid accumulation at high flow rates to large liquid accumulation at low flow rates. A semi-intermittent flow regime of roll waves alternating with a partly backward flowing liquid film has been observed experimentally to occur for a range of gas flow rates. Most of the liquid is transported in the roll waves. The roll wave regime is not well understood and requires fundamental modelling and experimental research. The lack of reliable models for this regime leads to inaccurate prediction of the onset of
Turbulent reconnection of magnetic bipoles in stratified turbulence
Jabbari, Sarah; Mitra, Dhrubaditya; Kleeorin, Nathan; Rogachevskii, Igor
2016-01-01
We consider strongly stratified forced turbulence in a plane-parallel layer with helicity and corresponding large-scale dynamo action in the lower part and nonhelical turbulence in the upper. The magnetic field is found to develop strongly concentrated bipolar structures near the surface. They form elongated bands with a sharp interface between opposite polarities. Unlike earlier experiments with imposed magnetic field, the inclusion of rotation does not strongly suppress the formation of these structures. We perform a systematic numerical study of this phenomenon by varying magnetic Reynolds number, scale separation ratio, and Coriolis number. We also focus on the formation of the current sheet between bipolar regions where reconnection of oppositely oriented field lines occurs. We determine the reconnection rate by measuring either the inflow velocity in the vicinity of the current sheet or by measuring the electric field in the reconnection region. We demonstrate that for small Lundquist number, S1000, the...
Direct simulation of the stably stratified turbulent Ekman layer
Coleman, G. N.; Ferziger, J. H.; Spalart, P. R.
1992-01-01
The Navier-Stokes equations and the Boussinesq approximation were used to compute a 3D time-dependent turbulent flow in the stably stratified Ekman layer over a smooth surface. The simulation data are found to be in very good agreement with atmospheric measurements when nondimensionalized according to Nieuwstadt's local scaling scheme. Results suggest that, when Reynolds number effects are taken into account, the 'constant Froude number' stable layer model (Brost and Wyngaard, 1978) and the 'shearing length' stable layer model (Hunt, 1985) for the dissipation rate of turbulent kinetic energy are both valid. It is concluded that there is good agreement between the direct numerical simulation results and large-eddy simulation results obtained by Mason and Derbyshire (1990).
Inertial modes of non-stratified superfluid neutron stars
Prix, R; Andersson, N
2004-01-01
We present results concerning adiabatic inertial-mode oscillations of non-stratified superfluid neutron stars in Newtonian gravity, using the anelastic and slow-rotation approximations. We consider a simple two-fluid model of a superfluid neutron star, where one fluid consists of the superfluid neutrons and the second fluid contains all the comoving constituents (protons, electrons). The two fluids are assumed to be "free" in the sense that vortex-mediated forces like mutual friction or pinning are absent, but they can be coupled by the equation of state, in particular by entrainment. The stationary background consists of the two fluids rotating uniformly around the same axis with potentially different rotation rates. We study the special cases of co-rotating backgrounds, vanishing entrainment, and the purely toroidal r-modes, analytically. We calculate numerically the eigenfunctions and frequencies of inertial modes in the general case of non co-rotating backgrounds, and study their dependence on the relat...
Magnetorotational instability in stratified, weakly ionised accretion discs
Salmeron, Roberto Aureliano; Salmeron, Raquel; Wardle, Mark
2003-01-01
We present a linear analysis of the vertical structure and growth of the magnetorotational instability in stratified, weakly ionised accretion discs, such as protostellar and quiescent dwarf novae systems. The method includes the effects of the magnetic coupling, the conductivity regime of the fluid and the strength of the magnetic field, which is initially vertical. The conductivity is treated as a tensor and assumed constant with height. We obtained solutions for the structure and growth rate of global unstable modes for different conductivity regimes, strengths of the initial magnetic field and coupling between ionised and neutral components of the fluid. The envelopes of short-wavelength perturbations are determined by the action of competing local growth rates at different heights, driven by the vertical stratification of the disc. Ambipolar diffusion perturbations peak consistently higher above the midplane than modes including Hall conductivity. For weak coupling, perturbations including the Hall effec...
Second order closure for stratified convection: bulk region and overshooting
Biferale, L; Sbragaglia, M; Scagliarini, A; Toschi, F; Tripiccione, R
2011-01-01
The parameterization of small-scale turbulent fluctuations in convective systems and in the presence of strong stratification is a key issue for many applied problems in oceanography, atmospheric science and planetology. In the presence of stratification, one needs to cope with bulk turbulent fluctuations and with inversion regions, where temperature, density, or both develop highly non-linear mean profiles due to the interactions between the turbulent boundary layer and the unmixed (stable) flow above/below it. We present a second order closure able to cope simultaneously with both bulk and boundary layer regions, and we test it against high-resolution state-of-the-art 2D numerical simulations in a convective and stratified belt for values of the Rayleigh number up to Ra = 10^9. Data are taken from a Rayleigh-Taylor system confined by the existence of an adiabatic gradient.
Geospatial techniques for developing a sampling frame of watersheds across a region
Gresswell, Robert E.; Bateman, Doug; Lienkaemper, George; Guy, T.J.
2004-01-01
Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was in relation to the population of watersheds. Geospatial tools provided a relatively inexpensive means to generate the information necessary to develop a statistically robust, probability-based sampling design.
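A stratified random draw with proportional allocation, as used for the 60 watersheds, can be sketched as follows; the stratum names and sizes are illustrative, not the study's actual frame:

```python
import random

def stratified_sample(frame, n_total, seed=0):
    """Stratified random sample with proportional allocation.
    `frame` maps a stratum name to the list of unit ids in that stratum;
    each stratum receives a share of `n_total` proportional to its size."""
    rng = random.Random(seed)
    total = sum(len(units) for units in frame.values())
    sample = {}
    for stratum, units in frame.items():
        n_h = round(n_total * len(units) / total)  # proportional allocation
        sample[stratum] = rng.sample(units, min(n_h, len(units)))
    return sample
```

Within each stratum the draw is simple random sampling without replacement, so design-based estimators for stratified designs apply directly.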
Oxygenation of Stratified Reservoir Using Air Bubble Plume
Schladow, S. G.
2006-12-01
Excess nutrient loading from urban areas and watersheds into lakes and reservoirs increases the content of organic matter, which, through decomposition, requires increased dissolved oxygen (DO). Many eutrophic reservoirs and lakes cannot meet the DO requirement during the stratified season and suffer from hypolimnetic anoxia. As a result, benthic sediment produces anoxic products such as methane, hydrogen sulphide, ammonia, iron, manganese, and phosphorus. In order to address the hypolimnetic anoxia, oxygen is artificially supplied into the reservoir using an aeration system (i.e., bubbler). The most common result of lake/reservoir aeration is to destratify the reservoir so that the water body may completely mix under natural phenomena and remain well oxygenated throughout. Other advantages of destratification are: (1) it allows warm-water fish to inhabit the entire reservoir, (2) it suppresses nutrient release from sediment, and (3) it decreases algal growth by moving the algae into the darker zone. A one-dimensional reservoir-bubbler model is developed and applied to examine the effects of an aeration system on mixing and dissolved oxygen dynamics in the Upper Peirce Reservoir, Singapore. After introduction of the aeration system in the reservoir, it was found that the hypolimnetic DO increased significantly, and the concentrations of algae, soluble manganese and iron were substantially reduced. It is found that the reservoir-bubbler model predicts the mixing (temperature as mixing parameter) and dissolved oxygen concentration in the reservoir with acceptable accuracy. It is shown in terms of bubbler mechanical efficiency (i.e., operating cost) and total DO contribution from the aeration system into the reservoir that the selection of airflow rate per diffuser, air bubble radius, and total number of diffusers are important design criteria for a bubbler system. However, the overall bubbler design also depends on the reservoir size and stratified area of interest, ambient climate, and
Nonlinear gravity-wave interactions in stratified turbulence
Remmel, Mark; Sukhatme, Jai; Smith, Leslie M.
2014-04-01
To investigate the dynamics of gravity waves in stratified Boussinesq flows, a model is derived that consists of all three-gravity-wave-mode interactions (the GGG model), excluding interactions involving the vortical mode. The GGG model is a natural extension of weak turbulence theory that accounts for exact three-gravity-wave resonances. The model is examined numerically by means of random, large-scale, high-frequency forcing. An immediate observation is a robust growth of the so-called vertically sheared horizontal flow (VSHF). In addition, there is a forward transfer of energy and equilibration of the nonzero-frequency (sometimes called "fast") gravity-wave modes. These results show that gravity-wave-mode interactions by themselves are capable of systematic interscale energy transfer in a stratified fluid. Comparing numerical simulations of the GGG model and the full Boussinesq system, for the range of Froude numbers ( Fr) considered (0.05 ≤ Fr ≤ 1), in both systems the VSHF is hardest to resolve. When adequately resolved, VSHF growth is more vigorous in the GGG model. Furthermore, a VSHF is observed to form in milder stratification scenarios in the GGG model than the full Boussinesq system. Finally, fully three-dimensional nonzero-frequency gravity-wave modes equilibrate in both systems and their scaling with vertical wavenumber follows similar power-laws. The slopes of the power-laws obtained depend on Fr and approach -2 (from above) at Fr = 0.05, which is the strongest stratification that can be properly resolved with our computational resources.
Visualization of periodic flows in a continuously stratified fluid.
Bardakov, R.; Vasiliev, A.
2012-04-01
To visualize the flow pattern of a viscous continuously stratified fluid, both experimental and computational methods were developed. Computational procedures were based on exact solutions of the set of fundamental equations. Solutions of the problems of flows produced by a periodically oscillating disk (linear and torsion oscillations) were visualized with high resolution to distinguish the small-scale singular components against the background of strong internal waves. The numerical visualization algorithm can represent both scalar and vector fields, such as velocity, density, pressure, vorticity, and stream function. The effects of source size, buoyancy and oscillation frequency, and kinematic viscosity of the medium were traced in 2D and 3D problem formulations. A precision schlieren instrument was used to visualize the flow pattern produced by linear and torsion oscillations of a strip and a disk in a continuously stratified fluid. Uniform stratification was created by the continuous displacement method. The buoyancy period ranged from 7.5 to 14 s. In the experiments, disks with diameters from 9 to 30 cm and a thickness of 1 mm to 10 mm were used. Different schlieren methods, that is, conventional vertical slit with Foucault knife, vertical slit with filament (Maksoutov's method), and horizontal slit with horizontal grating (natural "rainbow" schlieren method), helped to produce complementary flow patterns. Both internal wave beams and fine flow components were visualized near and far from the source. The intensity of high-gradient envelopes increased in proportion to the amplitude of the source. In domains where envelopes converge, isolated small-scale vortices and extended mushroom-like jets were formed. Experiments have shown that in the case of torsion oscillations the pattern of currents is more complicated than in the case of forced linear oscillations. Comparison with a known theoretical model shows that nonlinear interactions between the regular and singular flow components must be taken
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Innovation and social probable knowledge
Marco Crocco
2000-01-01
In this paper some elements of Keynes's theory of probability are used to understand the process of diffusion of an innovation. Based on work done elsewhere (Crocco 1999, 2000), we argue that this process can be viewed as a process of dealing with the collective uncertainty about how to sort out a technological problem. Expanding the concepts of weight of argument and probable knowledge to deal with this kind of uncertainty, we argue that the concepts of social weight of argument and social prob...
Knowledge typology for imprecise probabilities.
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Maher, Nicole; Muir, Tracey
2014-01-01
This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…
Generating pseudo-random discrete probability distributions
Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica
2015-08-15
The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, …, p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
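One standard route to an unbiased probability vector is to normalize i.i.d. exponential variates, which draws uniformly from the probability simplex (a flat Dirichlet distribution); this is a hedged sketch of that general approach, not necessarily the exact iid, normalization, or trigonometric procedure the article details:

```python
import random

def random_prob_vector(d, rng=None):
    """Draw a probability vector p = (p_1, ..., p_d) uniformly from the
    (d-1)-simplex by normalizing d i.i.d. Exp(1) variates."""
    rng = rng or random.Random()
    e = [rng.expovariate(1.0) for _ in range(d)]
    s = sum(e)
    return [x / s for x in e]
```

Normalizing uniform variates instead would also yield a valid probability vector, but not one distributed uniformly over the simplex, which is why the exponential (Dirichlet) construction is the usual unbiased choice.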
Probability and statistics for particle physics
Mana, Carlos
2017-01-01
This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems to be handled difficult to tackle by other procedures. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distribution, and describes useful cases for researchers in particle physics.The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
PROBABILITY INEQUALITIES FOR SUMS OF INDEPENDENT UNBOUNDED RANDOM VARIABLES
张涤新; 王志诚
2001-01-01
The tail probability inequalities for the sum of independent unbounded random variables on a probability space (Ω, T, P) were studied and a new method was proposed to treat the sum of independent unbounded random variables by truncating the original probability space (Ω, T, P). The probability exponential inequalities for sums of independent unbounded random variables were given. As applications of the results, some interesting examples were given. The examples show that the method proposed in the paper and the results of the paper are quite useful in the study of the large sample properties of the sums of independent unbounded random variables.
Evaluation of sampling strategies to estimate crown biomass
Krishna P Poudel
2015-01-01
Full Text Available Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass in the tree. Crown biomass estimation is useful for different purposes including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies to estimate crown biomass and evaluate the effect of sample size in estimating crown biomass. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with the probability proportional to size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was the best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting unequal numbers of branches per stratum produced approximately similar results to simple random sampling, but it further decreased RMSE when information on branch diameter is used in the design and estimation phases. Conclusions Use of
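Probability-proportional-to-size (PPS) estimation of a total, one common formulation using a Hansen-Hurwitz-type estimator with replacement, can be sketched as follows; the size and value data are illustrative, and the study's exact estimator may differ:

```python
import random

def pps_total_estimate(sizes, values, n, seed=0):
    """PPS sampling with replacement and the Hansen-Hurwitz estimator of
    the population total. `sizes` is the auxiliary size measure (e.g. a
    branch diameter proxy), `values` the quantity of interest (e.g. branch
    biomass); units with larger sizes are drawn with higher probability."""
    rng = random.Random(seed)
    total_size = sum(sizes)
    p = [s / total_size for s in sizes]            # selection probabilities
    draws = rng.choices(range(len(sizes)), weights=p, k=n)
    # Each draw contributes value/probability; averaging gives an unbiased
    # estimate of the population total.
    return sum(values[i] / p[i] for i in draws) / n
```

When the values are exactly proportional to the size measure, every draw contributes the same amount and the estimator recovers the true total with zero variance, which is why a well-chosen size covariate (here, branch diameter) makes PPS efficient.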
Ritter, Lois A., Ed.; Sue, Valerie M., Ed.
2007-01-01
This chapter provides an overview of sampling methods that are appropriate for conducting online surveys. The authors review some of the basic concepts relevant to online survey sampling, present some probability and nonprobability techniques for selecting a sample, and briefly discuss sample size determination and nonresponse bias. Although some…
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.
Exact Probability Distribution versus Entropy
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
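Under a first-order model (independent letters), the average number of guesses described above is simply the probability-weighted rank of each word when words are sorted in decreasing order of probability. A brute-force sketch, only feasible for small alphabets and word lengths (the function name is ours):

```python
from itertools import product

def expected_guesses(letter_probs, length):
    """Average number of guesses when every word of the given length is
    guessed in decreasing order of probability (first-order model with
    independent letters)."""
    word_probs = []
    for letters in product(letter_probs, repeat=length):
        p = 1.0
        for lp in letters:
            p *= lp
        word_probs.append(p)
    word_probs.sort(reverse=True)  # guessing strategy: most probable word first
    return sum(rank * p for rank, p in enumerate(word_probs, start=1))
```

For a uniform two-letter alphabet and words of length 1 this gives 1.5 guesses; skewing the letter probabilities lowers the average, which is why entropy-style bounds track this quantity.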
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Fuzzy Markov chains: uncertain probabilities
2002-01-01
We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
DECOFF Probabilities of Failed Operations
Gintautas, Tomas
A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against standard Alpha...
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to a sparse system, in which the number of balls is smaller than the number of boxes (e.g. the energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
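The equal-configuration postulate in this abstract corresponds to Bose-Einstein counting, under which the occupancy distribution of a single box can be computed exactly. The sketch below is our own illustration of the claim that the empty box (k = 0) is the most probable occupancy.

```python
from math import comb

def box_occupancy_pmf(P, L):
    """Probability that a given box holds exactly k of P indistinguishable
    balls in L distinguishable boxes, when all configurations are equally
    likely. The remaining P-k balls are distributed over the other L-1 boxes."""
    total = comb(P + L - 1, L - 1)  # number of all configurations (stars and bars)
    return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]
```

For a dense case such as P = 20 balls in L = 4 boxes, k = 0 is more probable than the "intuitive" average occupancy k = 5, and the probabilities decay monotonically with k, producing the long-tail distribution the abstract refers to.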
Probability representations of fuzzy systems
LI Hongxing
2006-01-01
In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that the COG (center of gravity) method, a defuzzification technique commonly used in fuzzy systems, is reasonable and is optimal in the mean-square sense. Based on different fuzzy implication operators, several typical probability distributions, such as the Zadeh distribution, the Mamdani distribution, and the Lukasiewicz distribution, are given. These distributions act as "inner kernels" of fuzzy systems. Furthermore, from properties of the probability distributions of fuzzy systems, it is also demonstrated that the CRI method, proposed by Zadeh for constructing fuzzy systems, is basically reasonable and effective. In addition, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of construction of fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a good logical foundation and embodies an idea of optimization of reasoning, it has broad prospects for application.
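On a discretized support, the COG defuzzification the abstract discusses reduces to a membership-weighted mean. A minimal sketch in our own notation, not the paper's:

```python
def cog_defuzzify(xs, memberships):
    """Center-of-gravity defuzzification: the crisp output is the
    membership-weighted mean of the support points."""
    num = sum(x * m for x, m in zip(xs, memberships))
    den = sum(memberships)
    if den == 0:
        raise ValueError("empty fuzzy set: all memberships are zero")
    return num / den
```

Viewing the normalized memberships as a probability distribution, COG returns its mean, which minimizes expected squared error; that is one way to read the abstract's "optimal in the mean-square sense" claim.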
Mo Isotopes Record Destabilization of a Stratified Ocean at the Precambrian-Cambrian Boundary
Wille, M.; Nägler, T. F.; Schröder, S.; Lehmann, B.; Kramers, J. D.
2007-12-01
Here we present Mo isotope signatures in black shales from two sample sets (Ara Group, Oman and Yangtze Platform, China) which were deposited at and shortly after the Precambrian-Cambrian boundary (PC-C). At first view, the overall Mo isotopic signature (δ98/95Mo) of the Early Cambrian black shales is 1.2 permil below that of recent ocean water, similar to the signature found in Mesoproterozoic shales (Arnold et al. 2004), indicating a larger proportion of Mo sedimentation under strongly euxinic conditions compared to recent oceans. Canfield (1998) proposed a chemically stratified ocean with sulfidic deep waters and modestly oxygenated surface waters for the Paleoproterozoic and Mesoproterozoic, and Jiang et al. (2007) reported carbon isotope data from the Ediacaran Yangtze Platform (635-542 Ma) to be consistent with long-term deep-ocean anoxia/euxinia. A stratified ocean therefore provides a plausible scenario to explain our new PC-C Mo isotope data. On closer inspection, a transient Mo isotopic signal following immediately after the PC-C boundary in both sample sets indicates a short but intense global non-steady-state situation. In particular, a short-term, drastic decrease of the Mo ocean inventory to almost zero is required to reconcile the observed Mo isotope data. Combined with the extreme Mo enrichment found in the Chinese sulfide marker bed at the PC-C boundary, this signal has to be explained with a non-uniformitarian Mo scavenging mechanism. We put forward the hypothesis of mixing of oxidized, i.e. Mo-rich, surface waters with upwelling euxinic bottom water masses of the stratified ocean, as H2S is the most efficient Mo scavenging reagent. This scenario not only explains the transient isotopic signal, it may also account for the sudden extinction of the Ediacaran fauna by H2S poisoning. In contrast, mass extinction scenarios like bolide impact, flood basalt eruptions or methane release do not provide a direct explanation for the
Prevalence of masturbation and associated factors in a British national probability survey.
Gerressu, Makeda; Mercer, Catherine H; Graham, Cynthia A; Wellings, Kaye; Johnson, Anne M
2008-04-01
A stratified probability sample survey of the British general population, aged 16 to 44 years, was conducted from 1999 to 2001 (N = 11,161) using face-to-face interviewing and computer-assisted self-interviewing. We used these data to estimate the population prevalence of masturbation, and to identify sociodemographic, sexual behavioral, and attitudinal factors associated with reporting this behavior. Seventy-three percent of men and 36.8% of women reported masturbating in the 4 weeks prior to interview (95% confidence interval 71.5%-74.4% and 35.4%-38.2%, respectively). A number of sociodemographic and behavioral factors were associated with reporting masturbation. Among both men and women, reporting masturbation increased with higher levels of education and social class and was more common among those reporting sexual function problems. For women, masturbation was more likely among those who reported more frequent vaginal sex in the last four weeks, a greater repertoire of sexual activity (such as reporting oral and anal sex), and more sexual partners in the last year. In contrast, the prevalence of masturbation was lower among men reporting more frequent vaginal sex. Both men and women reporting same-sex partner(s) were significantly more likely to report masturbation. Masturbation is a common sexual practice with significant variations in reporting between men and women.
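The confidence intervals quoted above come from a designed survey, so the real analysis would use design-based variance estimates; as a back-of-envelope check only, the simple-random-sampling Wald approximation reproduces intervals of roughly that width.

```python
from math import sqrt

def wald_ci(p_hat, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion,
    ignoring the survey design effect; adequate only as a rough check."""
    se = sqrt(p_hat * (1 - p_hat) / n)
    return (p_hat - z * se, p_hat + z * se)
```

For example, a prevalence of 50% in a subsample of 100 respondents gives an interval of about ±9.8 percentage points; the much larger N here is why the reported intervals are only a point or two wide.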
Extent of atypical hyperplasia stratifies breast cancer risk in 2 independent cohorts of women.
Degnim, Amy C; Dupont, William D; Radisky, Derek C; Vierkant, Robert A; Frank, Ryan D; Frost, Marlene H; Winham, Stacey J; Sanders, Melinda E; Smith, Jeffrey R; Page, David L; Hoskin, Tanya L; Vachon, Celine M; Ghosh, Karthik; Hieken, Tina J; Denison, Lori A; Carter, Jodi M; Hartmann, Lynn C; Visscher, Daniel W
2016-10-01
Women with atypical hyperplasia (AH) on breast biopsy have a substantially increased risk of breast cancer (BC). Here the BC risk for the extent and subtype of AH is reported for 2 separate cohorts. All samples containing AH were included from 2 cohorts of women with benign breast disease (Mayo Clinic and Nashville). Histology review quantified the number of foci of atypical ductal hyperplasia (ADH) and atypical lobular hyperplasia (ALH). The BC risk was stratified for the number of AH foci within AH subtypes. The study included 708 Mayo AH subjects and 466 Nashville AH subjects. In the Mayo cohort, an increasing number of foci of AH was associated with a significant increase in the risk of BC both for ADH (relative risks of 2.61, 5.21, and 6.36 for 1, 2, and ≥3 foci, respectively; P for linear trend = .006) and for ALH (relative risks of 2.56, 3.50, and 6.79 for 1, 2, and ≥3 foci, respectively; P for linear trend = .001). In the Nashville cohort, the relative risks of BC for ADH were 2.70, 5.17, and 15.06 for 1, 2, and ≥3 foci, respectively (P for linear trend < .001); for ALH, the relative risks also increased but not significantly (2.61, 3.48, and 4.02, respectively; P = .148). When the Mayo and Nashville samples were combined, the risk increased significantly for 1, 2, and ≥3 foci: the relative risks were 2.65, 5.19, and 8.94, respectively, for ADH (P < .001) and 2.58, 3.49, and 4.97, respectively, for ALH (P = .001). In 2 independent cohort studies of benign breast disease, the extent of atypia stratified the long-term BC risk for ADH and ALH. Cancer 2016;122:2971-2978. © 2016 American Cancer Society.
Lee, Joseph G L; Shook-Sa, Bonnie E; Bowling, J Michael; Ribisl, Kurt M
2017-06-23
In the United States, tens of thousands of inspections of tobacco retailers are conducted each year. Various sampling choices can reduce travel costs, emphasize enforcement in areas with greater non-compliance, and allow for comparability between states and over time. We sought to develop a model sampling strategy for state tobacco retailer inspections. Using a 2014 list of 10,161 North Carolina tobacco retailers, we compared results from simple random sampling; stratified sampling clustered at the ZIP code level; and stratified sampling clustered at the census tract level. We conducted a simulation of repeated sampling and compared the approaches on their level of precision, coverage, and retailer dispersion. While maintaining an adequate design effect and statistical precision appropriate for a public health enforcement program, both the stratified, clustered ZIP-based and tract-based approaches were feasible. Both ZIP and tract strategies yielded improvements over simple random sampling, with relative improvements, respectively, in average distance between retailers (reduced 5.0% and 1.9%), percent Black residents in sampled neighborhoods (increased 17.2% and 32.6%), percent Hispanic residents in sampled neighborhoods (reduced 2.2% and increased 18.3%), percentage of sampled retailers located near schools (increased 61.3% and 37.5%), and poverty rate in sampled neighborhoods (increased 14.0% and 38.2%). States can make retailer inspections more efficient and targeted with stratified, clustered sampling. Statistically appropriate sampling strategies like these should be considered by states, researchers, and the Food and Drug Administration to improve program impact and allow for comparisons over time and across states. The authors present a model tobacco retailer sampling strategy for promoting compliance and reducing costs that could be used by U.S. states and the Food and Drug Administration (FDA). The design is feasible to implement in North Carolina. Use of
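A stratified, clustered design like the ones compared above can be sketched as a two-stage selection: stratify the retailer list, sample clusters (ZIP codes or tracts) within each stratum, and inspect every retailer in each chosen cluster. This is a generic sketch, and the record field names are hypothetical.

```python
import random
from collections import defaultdict

def stratified_cluster_sample(retailers, clusters_per_stratum, rng=random):
    """Group retailers by stratum and cluster, then sample whole clusters
    within each stratum; all retailers in a sampled cluster enter the
    inspection sample."""
    strata = defaultdict(lambda: defaultdict(list))
    for r in retailers:
        strata[r["stratum"]][r["cluster"]].append(r)
    sample = []
    for clusters in strata.values():
        keys = sorted(clusters)  # stable ordering for reproducible draws
        chosen = rng.sample(keys, min(clusters_per_stratum, len(keys)))
        for key in chosen:
            sample.extend(clusters[key])
    return sample
```

Taking whole clusters is what cuts travel cost (inspectors visit co-located retailers), at the price of a design effect that the simulation in the paper quantifies.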
Random iteration with place dependent probabilities
Kapica, R
2011-01-01
Markov chains arising from the random iteration of functions $S_{\theta}:X\to X$, $\theta \in \Theta$, where $X$ is a Polish space and $\Theta$ is an arbitrary set of indices, are considered. At $x\in X$, $\theta$ is sampled from a distribution $\theta_x$ on $\Theta$, and the $\theta_x$ are different for different $x$. Exponential convergence to a unique invariant measure is proved. This result is applied to the case of random affine transformations on ${\mathbb R}^d$, yielding the existence of exponentially attractive perpetuities with place-dependent probabilities.
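The random iteration with place-dependent probabilities can be made concrete for affine contractions on the line: at each step the map index is drawn from a distribution that depends on the current state. The specific maps and probability function below are our own toy example, not from the paper.

```python
import random

def iterate_place_dependent(maps, prob_fn, x0, steps, rng=random):
    """At state x, sample an index theta with weights prob_fn(x),
    then move to maps[theta](x); repeat for `steps` iterations."""
    x = x0
    for _ in range(steps):
        theta = rng.choices(range(len(maps)), weights=prob_fn(x), k=1)[0]
        x = maps[theta](x)
    return x

# Two affine contractions on [0, 1]; the selection probabilities depend on x.
maps = [lambda x: x / 2, lambda x: x / 2 + 0.5]
prob_fn = lambda x: [1 - x, x]
```

Because both maps are contractions, the chain converges in distribution to a unique invariant measure, and exponentially fast; that convergence rate is the content of the theorem in the abstract.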
Stratified flows with variable density: mathematical modelling and numerical challenges.
Murillo, Javier; Navas-Montilla, Adrian
2017-04-01
Stratified flows appear in a wide variety of fundamental problems in the hydrological and geophysical sciences. They may range from hyperconcentrated floods carrying sediment and causing collapse, landslides and debris flows, to suspended material in turbidity currents where turbulence is a key process. In stratified flows, a variable horizontal density is also present. Depending on the case, density varies according to the volumetric concentration of the different components or species, which can represent transported or suspended materials or soluble substances. Multilayer approaches based on the shallow water equations provide suitable models but are not free from difficulties when moving to the numerical resolution of the governing equations. Considering the variety of temporal and spatial scales, transfer of mass and energy among layers may strongly differ from one case to another. As a consequence, in order to provide accurate solutions, very high order methods of proven quality are demanded. Under these complex scenarios it is necessary to observe that the numerical solution provides the expected order of accuracy but also converges to the physically based solution, which is not an easy task. To this purpose, this work will focus on the use of energy-balanced augmented solvers, in particular the Augmented Roe Flux ADER scheme. References: J. Murillo, P. García-Navarro, Wave Riemann description of friction terms in unsteady shallow flows: Application to water and mud/debris floods. J. Comput. Phys. 231 (2012) 1963-2001. J. Murillo, B. Latorre, P. García-Navarro. A Riemann solver for unsteady computation of 2D shallow flows with variable density. J. Comput. Phys. 231 (2012) 4775-4807. A. Navas-Montilla, J. Murillo, Energy balanced numerical schemes with very high order. The Augmented Roe Flux ADER scheme. Application to the shallow water equations, J. Comput. Phys. 290 (2015) 188-218. A. Navas-Montilla, J. Murillo, Asymptotically and exactly energy balanced augmented flux
Deep silicon maxima in the stratified oligotrophic Mediterranean Sea
Y. Crombet
2011-02-01
Full Text Available The silicon biogeochemical cycle has been studied in the Mediterranean Sea during late summer/early autumn 1999 and summer 2008. The distribution of nutrients, particulate carbon and silicon, fucoxanthin (Fuco), and total chlorophyll-a (TChl-a) was investigated along an eastward gradient of oligotrophy during two cruises (PROSOPE and BOUM) encompassing the entire Mediterranean Sea during the stratified period. At both seasons, surface waters were depleted in nutrients and the nutriclines gradually deepened towards the east, the phosphacline being deepest in the easternmost Levantine basin. Following the nutriclines, parallel deep maxima of biogenic silica (DSM), fucoxanthin (DFM) and TChl-a (DCM) were evidenced during both seasons, with maximal concentrations of 0.45 μmol L^{−1} for BSi, 0.26 μg L^{−1} for Fuco, and 1.70 μg L^{−1} for TChl-a, all measured during summer. Contrary to the DCM, which was a persistent feature in the Mediterranean Sea, the DSM and DFMs were observed in discrete areas of the Alboran Sea, the Algero-Provencal basin, the Ionian Sea and the Levantine basin, indicating that diatoms were able to grow at depth and dominate the DCM under specific conditions. Diatom assemblages were dominated by Chaetoceros spp., Leptocylindrus spp., Pseudonitzschia spp., and the association between large centric diatoms (Hemiaulus hauckii and Rhizosolenia styliformis) and the cyanobacterium Richelia intracellularis was observed at nearly all sites. The diatoms' ability to grow at depth is commonly observed in other oligotrophic regions and could play a major role in ecosystem productivity and carbon export to depth. Contrary to the common view that Si and siliceous phytoplankton are not major components of the Mediterranean biogeochemistry, we suggest here that diatoms, by persisting at depth during the stratified period, could contribute to a
Fishing and the oceanography of a stratified shelf sea
Sharples, Jonathan; Ellis, Jim R.; Nolan, Glenn; Scott, Beth E.
2013-10-01
Fishing vessel position data from the Vessel Monitoring System (VMS) were used to investigate fishing activity in the Celtic Sea, a seasonally-stratifying, temperate region on the shelf of northwest Europe. The spatial pattern of fishing showed that three main areas are targeted: (1) the Celtic Deep (an area of deeper water with fine sediments), (2) the shelf edge, and (3) an area covering several large seabed banks in the central Celtic Sea. Data from each of these regions were analysed to examine the contrasting seasonality of fishing activity, and to highlight where the spring-neap tidal cycle appears to be important to fishing. The oceanographic characteristics of the Celtic Sea were considered alongside the distribution and timing of fishing, illustrating likely contrasts in the underlying environmental drivers of the different fished regions. In the central Celtic Sea, fishing mainly occurred during the stratified period between April and August. Based on evidence provided in other papers of this Special Issue, we suggest that the fishing in this area is supported by (1) a broad increase in primary production caused by lee-waves generated by seabed banks around spring tides driving large supplies of nutrients into the photic zone, and (2) greater concentrations of zooplankton within the region influenced by the seabed banks and elevated primary production. In contrast, while the shelf edge is a site of elevated surface chlorophyll, previous work has suggested that the periodic mixing generated by an internal tide at the shelf edge alters the size-structure of the phytoplankton community which fish larvae from the spawning stocks along the shelf edge are able to exploit. The fishery for Nephrops norvegicus in the Celtic Deep was the only one to show a significant spring-neap cycle, possibly linked to Nephrops foraging outside their burrows less during spring tides. More tentatively, the fishery for Nephrops correlated most strongly with a localised shift in
Probability biases as Bayesian inference
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability in the way that has been observed.
Cluster pre-existence probability
Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)
2011-10-15
The pre-existence probability of the fragments for the complete binary spectrum of different systems such as ⁵⁶Ni, ¹¹⁶Ba, ²²⁶Ra and ²⁵⁶Fm is calculated from the overlapping part of the interaction potential using the WKB approximation. The role of the reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, the pre-existence probability is calculated even for negative Q-value systems. The calculations reveal rich structural information. The calculated results are compared with the values of the preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of the relative separation motion. (orig.)
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Itô differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Sm Transition Probabilities and Abundances
Lawler, J E; Sneden, C; Cowan, J J
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).
Knot probabilities in random diagrams
Cantarella, Jason; Chapman, Harrison; Mastin, Matt
2016-10-01
We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
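The "roughly linear relationship between the log of knot type probability and the log of the frequency rank" is a Zipf-style power law; its exponent can be estimated by an ordinary least-squares fit in log-log space. This is a generic sketch, not the authors' code.

```python
from math import log

def loglog_slope(ranks, freqs):
    """Least-squares slope of log(freq) against log(rank);
    a slope near -1 indicates a Zipf-like distribution."""
    xs = [log(r) for r in ranks]
    ys = [log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

On an exact Zipf sequence (frequency proportional to 1/rank) the fitted slope is exactly -1; on tabulated knot frequencies it would only be approximately linear, as the abstract notes.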
Probability distributions for multimeric systems.
Albert, Jaroslav; Rooman, Marianne
2016-01-01
We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdfs) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising the sum of squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
Biomarker evidence for green and purple sulphur bacteria in a stratified Palaeoproterozoic sea.
Brocks, Jochen J; Love, Gordon D; Summons, Roger E; Knoll, Andrew H; Logan, Graham A; Bowden, Stephen A
2005-10-06
The disappearance of iron formations from the geological record approximately 1.8 billion years (Gyr) ago was the consequence of rising oxygen levels in the atmosphere starting 2.45-2.32 Gyr ago. It marks the end of a 2.5-Gyr period dominated by anoxic and iron-rich deep oceans. However, despite rising oxygen levels and a concomitant increase in marine sulphate concentration, related to enhanced sulphide oxidation during continental weathering, the chemistry of the oceans in the following mid-Proterozoic interval (approximately 1.8-0.8 Gyr ago) probably did not yet resemble our oxygen-rich modern oceans. Recent data indicate that marine oxygen and sulphate concentrations may have remained well below current levels during this period, with one model indicating that anoxic and sulphidic marine basins were widespread, and perhaps even globally distributed. Here we present hydrocarbon biomarkers (molecular fossils) from a 1.64-Gyr-old basin in northern Australia, revealing the ecological structure of mid-Proterozoic marine communities. The biomarkers signify a marine basin with anoxic, sulphidic, sulphate-poor and permanently stratified deep waters, hostile to eukaryotic algae. Phototrophic purple sulphur bacteria (Chromatiaceae) were detected in the geological record based on the new carotenoid biomarker okenane, and they seem to have co-existed with communities of green sulphur bacteria (Chlorobiaceae). Collectively, the biomarkers support mounting evidence for a long-lasting Proterozoic world in which oxygen levels remained well below modern levels.
Asbestos and Probable Microscopic Polyangiitis
George S Rashed Philteos
2004-01-01
Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear antineutrophil cytoplasmic antibody (myeloperoxidase)-positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.
Logic, Probability, and Human Reasoning
2015-01-01
…[3–6], and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip… P.N. Johnson-Laird (Princeton University, Princeton, NJ), Sangeet S. Khemlani, and Geoffrey P. Goodwin.
Probability and statistics: A reminder
Clément Benoit
2013-07-01
Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Objective probability and quantum fuzziness
Mohrhoff, U
2007-01-01
This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In this paper the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...
卓纮畾; 杨超杰; 孔英
2013-01-01
The market believes that combining fundamental analysis and technical analysis strategies can effectively improve analytical accuracy, yet past research on the efficiency of strategy portfolios is relatively scarce. Building on the noise theory of technical analysis and the positive-feedback model of behavioural finance, this paper uses probability theory to prove that, under large-sample properties, a combination of strategies can hardly improve analytical accuracy effectively. The research also indicates that the probability that a combination improves the odds is not a monotonic function of a single strategy's odds; that is, a strategy portfolio's odds may not be effectively improved by including a single strategy with higher odds.
Empirical and Computational Tsunami Probability
Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.
2008-12-01
A key component in assessing the hazard posed by tsunamis is quantification of tsunami likelihood or probability. To determine tsunami probability, one needs to know the distribution of tsunami sizes and the distribution of inter-event times. Both empirical and computational methods can be used to determine these distributions. Empirical methods rely on an extensive tsunami catalog and hence, the historical data must be carefully analyzed to determine whether the catalog is complete for a given runup or wave height range. Where site-specific historical records are sparse, spatial binning techniques can be used to perform a regional, empirical analysis. Global and site-specific tsunami catalogs suggest that tsunami sizes are distributed according to a truncated or tapered power law and inter-event times are distributed according to an exponential distribution modified to account for clustering of events in time. Computational methods closely follow Probabilistic Seismic Hazard Analysis (PSHA), where size and inter-event distributions are determined for tsunami sources, rather than tsunamis themselves as with empirical analysis. In comparison to PSHA, a critical difference in the computational approach to tsunami probabilities is the need to account for far-field sources. The three basic steps in computational analysis are (1) determination of parameter space for all potential sources (earthquakes, landslides, etc.), including size and inter-event distributions; (2) calculation of wave heights or runup at coastal locations, typically performed using numerical propagation models; and (3) aggregation of probabilities from all sources and incorporation of uncertainty. It is convenient to classify two different types of uncertainty: epistemic (or knowledge-based) and aleatory (or natural variability). Correspondingly, different methods have been traditionally used to incorporate uncertainty during aggregation, including logic trees and direct integration. Critical
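The empirical step described above can be sketched in code. As a hedged illustration only (the tapered-Pareto form follows Kagan-style size distributions and the function names are my own; the authors' actual distributions and parameters may differ), combining a tapered power-law size distribution with exponential inter-event times yields a Poissonian exceedance probability:

```python
import math

def tapered_pareto_sf(x, x0, beta, xc):
    """Survivor function of a tapered power law: fraction of tsunamis
    with size >= x, for x >= x0, with corner size xc tapering the tail."""
    return (x0 / x) ** beta * math.exp((x0 - x) / xc)

def exceedance_probability(rate_all, x, x0, beta, xc, T):
    """Probability of at least one tsunami of size >= x in T years,
    assuming exponential (Poissonian) inter-event times."""
    lam = rate_all * tapered_pareto_sf(x, x0, beta, xc)  # rate of events >= x
    return 1.0 - math.exp(-lam * T)
```

Clustering of events in time, which the abstract notes modifies the exponential distribution, would replace the final Poissonian step with a more general renewal model.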
Turbulence comes in bursts in stably stratified flows
Rorai, C; Pouquet, A
2013-01-01
There is a clear distinction between simple laminar and complex turbulent fluids. But in some cases, as for the nocturnal planetary boundary layer, a stable and well-ordered flow can develop intense and sporadic bursts of turbulent activity which disappear slowly in time. This phenomenon is ill-understood and poorly modeled; and yet, it is central to our understanding of weather and climate dynamics. We present here a simple model which shows that in stably stratified turbulence, the stronger bursts can occur when the flow is expected to be more stable. The bursts are generated by a rapid non-linear amplification of energy stored in waves, and are associated with energetic interchanges between vertical velocity and temperature (or density) fluctuations. Direct numerical simulations on grids of 2048^3 points confirm this somewhat paradoxical result of measurably stronger events for more stable flows, displayed not only in the temperature and vertical velocity derivatives, but also in the amplitude of the field...
DNS of stratified spatially-developing turbulent thermal boundary layers
Araya, Guillermo; Castillo, Luciano; Jansen, Kenneth
2012-11-01
Direct numerical simulations (DNS) of spatially-developing turbulent thermal boundary layers under stratification are performed. It is well known that the transport phenomena of the flow are significantly affected by buoyancy, particularly in urban environments where stable and unstable atmospheric boundary layers are encountered. In the present investigation, the Dynamic Multi-scale approach by Araya et al. (JFM, 670, 2011) for turbulent inflow generation is extended to thermally stratified boundary layers. Furthermore, the proposed Dynamic Multi-scale approach is based on the original rescaling-recycling method by Lund et al. (1998). The two major improvements are: (i) the utilization of two different scaling laws in the inner and outer parts of the boundary layer to better absorb external conditions such as inlet Reynolds numbers, streamwise pressure gradients, buoyancy effects, etc.; (ii) the implementation of a Dynamic approach to compute scaling parameters from the flow solution without the need for empirical correlations as in Lund et al. (1998). Numerical results are shown for ZPG flows at high momentum thickness Reynolds numbers (~ 3,000) and a comparison with experimental data is also carried out.
Stratified patterns of divorce: Earnings, education, and gender
Amit Kaplan
2015-05-01
Full Text Available Background: Despite evidence that divorce has become more prevalent among weaker socioeconomic groups, knowledge about the stratification aspects of divorce in Israel is lacking. Moreover, although scholarly debate recognizes the importance of stratificational positions with respect to divorce, less attention has been given to the interactions between them. Objective: Our aim is to examine the relationship between social inequality and divorce, focusing on how household income, education, employment stability, relative earnings, and the intersection between them affect the risk of divorce in Israel. Methods: The data are derived from combined census files for 1995-2008, annual administrative employment records from the National Insurance Institute and the Tax Authority, and data from the Civil Registry of Divorce. We used a series of discrete-time event-history analysis models for marital dissolution. Results: Couples in lower socioeconomic positions had a higher risk of divorce in Israel. Higher education in general, and homogamy in terms of higher education (both spouses have degrees) in particular, decreased the risk of divorce. The wife's relative earnings had a differential effect on the likelihood of divorce, depending on household income: a wife who outearned her husband increased the log odds of divorce more in the upper tertiles than in the lower tertile. Conclusions: Our study shows that divorce indeed has a stratified pattern and that weaker socioeconomic groups experience the highest levels of divorce. Gender inequality within couples intersects with the household's economic and educational resources.
Self-Knowledge and Risk in Stratified Medicine.
Hordern, Joshua
2017-04-01
This article considers why and how self-knowledge is important to communication about risk and behaviour change by arguing for four claims. First, it is doubtful that genetic knowledge should properly be called 'self-knowledge' when its ordinary effects on self-motivation and behaviour change seem so slight. Second, temptations towards a reductionist, fatalist, construal of persons' futures through a 'molecular optic' should be resisted. Third, any plausible effort to change people's behaviour must engage with cultural self-knowledge, values and beliefs, catalysed by the communication of genetic risk. For example, while a Judaeo-Christian notion of self-knowledge is distinctively theological, people's self-knowledge is plural in its insight and sources. Fourth, self-knowledge is found in compassionate, if tense, communion which yields freedom from determinism even amidst suffering. Stratified medicine thus offers a newly precise kind of humanising health care through societal solidarity with the riskiest. However, stratification may also mean that molecularly unstratified, 'B' patients' experience involves accentuated suffering and disappointment, a concern requiring further research.
Stratified Flow Past a Hill: Dividing Streamline Concept Revisited
Leo, Laura S.; Thompson, Michael Y.; Di Sabatino, Silvana; Fernando, Harindra J. S.
2016-06-01
The Sheppard formula (Q J R Meteorol Soc 82:528-529, 1956) for the dividing streamline height H_s assumes a uniform velocity U_∞ and a constant buoyancy frequency N for the approach flow towards a mountain of height h, and takes the form H_s/h = 1 − F, where F = U_∞/(Nh). We extend this solution to a logarithmic approach-velocity profile with constant N. An analytical solution is obtained for H_s/h in terms of Lambert-W functions, which also suggests alternative scaling for H_s/h. A `modified' logarithmic velocity profile is proposed for stably stratified atmospheric boundary-layer flows. A field experiment designed to observe H_s is described, which utilized instrumentation from the spring field campaign of the Mountain Terrain Atmospheric Modeling and Observations (MATERHORN) Program. Multiple releases of smoke at F ≈ 0.3-0.4 support the new formulation, notwithstanding the limited success of the experiments due to logistical constraints. No dividing streamline is discerned for F ≈ 10, since, if present, it is too close to the foothill. Flow separation and vortex shedding are observed in this case. The proposed modified logarithmic profile is in reasonable agreement with experimental observations.
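For the uniform-velocity case, Sheppard's estimate is simple enough to sketch directly; the function below is a minimal illustration (the function name and the clamping at zero are my own, not from the paper):

```python
def dividing_streamline_height(U_inf, N, h):
    """Sheppard (1956): H_s/h = 1 - F with F = U_inf / (N h).
    Flow approaching below H_s goes around the hill; flow above it goes over."""
    F = U_inf / (N * h)
    return max(0.0, h * (1.0 - F))  # F >= 1: no dividing streamline exists
```

At the observed F ≈ 0.3-0.4 this gives H_s/h ≈ 0.6-0.7, while at F ≈ 10 it returns zero, consistent with no discernible dividing streamline.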
Large eddy simulation of unsteady lean stratified premixed combustion
Duwig, C. [Division of Fluid Mechanics, Department of Energy Sciences, Lund University, SE 221 00 Lund (Sweden); Fureby, C. [Division of Weapons and Protection, Warheads and Propulsion, The Swedish Defense Research Agency, FOI, SE 147 25 Tumba (Sweden)
2007-10-15
Premixed turbulent flame-based technologies are rapidly growing in importance, with applications to modern clean combustion devices for both power generation and aeropropulsion. However, the gain in decreasing harmful emissions might be canceled by rising combustion instabilities. Unwanted unsteady flame phenomena that might even destroy the whole device have been widely reported and are subject to intensive studies. In the present paper, we use unsteady numerical tools for simulating an unsteady and well-documented flame. Computations were performed for nonreacting, perfectly premixed, and stratified premixed cases using two different numerical codes and different large-eddy-simulation-based flamelet models. Nonreacting simulations are shown to agree well with experimental data, with the LES results capturing the mean features (symmetry breaking) as well as the fluctuation level of the turbulent flow. For reacting cases, the uncertainty induced by the time-averaging technique limited the comparisons. Given an estimate of the uncertainty, the numerical results were found to reproduce well the experimental data in terms both of mean flow field and of fluctuation levels. In addition, it was found that despite relying on different assumptions/simplifications, both numerical tools lead to similar predictions, giving confidence in the results. Moreover, we studied the flame dynamics and particularly the response to a periodic pulsation. We found that above a certain excitation level, the flame dynamics change and become rather insensitive to the excitation/instability amplitude. Conclusions regarding the self-growth of thermoacoustic waves were drawn. (author)
Economic evaluation in stratified medicine: methodological issues and challenges
Hans-Joerg eFugel
2016-05-01
Full Text Available Background: Stratified Medicine (SM) is becoming a practical reality with the targeting of medicines by using a biomarker or genetic-based diagnostic to identify the eligible patient sub-population. Like any healthcare intervention, SM interventions have costs and consequences that must be considered by reimbursement authorities with limited resources. Methodological standards and guidelines exist for economic evaluations in clinical pharmacology and are an important component of health technology assessments (HTAs) in many countries. However, these guidelines were initially developed for traditional pharmaceuticals and not for complex interventions with multiple components. This raises the issue of whether these guidelines are adequate for SM interventions or whether new specific guidance and methodology is needed to avoid inconsistencies and contradictory findings when assessing economic value in SM. Objective: This article describes specific methodological challenges when conducting health economic (HE) evaluations for SM interventions and outlines potential modifications to existing evaluation guidelines/principles that would promote consistent economic evaluations for SM. Results/Conclusions: Specific methodological aspects for SM comprise considerations on the choice of comparator, measuring effectiveness and outcomes, appropriate modelling structure, and the scope of sensitivity analyses. Although current HE methodology can be applied to SM, greater complexity requires further methodology development and modifications to the guidelines.
BIPOLAR MAGNETIC SPOTS FROM DYNAMOS IN STRATIFIED SPHERICAL SHELL TURBULENCE
Jabbari, Sarah; Brandenburg, Axel; Kleeorin, Nathan; Mitra, Dhrubaditya; Rogachevskii, Igor, E-mail: sarahjab@kth.se [Nordita, KTH Royal Institute of Technology and Stockholm University, Roslagstullsbacken 23, SE-10691 Stockholm (Sweden)
2015-06-01
Recent work by Mitra et al. (2014) has shown that in strongly stratified forced two-layer turbulence with helicity and corresponding large-scale dynamo action in the lower layer, and nonhelical turbulence in the upper, a magnetic field occurs in the upper layer in the form of sharply bounded bipolar magnetic spots. Here we extend this model to spherical wedge geometry covering the northern hemisphere up to 75° latitude and an azimuthal extent of 180°. The kinetic helicity and therefore also the large-scale magnetic field are strongest at low latitudes. For moderately strong stratification, several bipolar spots form that eventually fill the full longitudinal extent. At early times, the polarity of spots reflects the orientation of the underlying azimuthal field, as expected from Parker’s Ω-shaped flux loops. At late times their tilt changes such that there is a radial field of opposite orientation at different latitudes separated by about 10°. Our model demonstrates the spontaneous formation of spots of sizes much larger than the pressure scale height. Their tendency to produce filling factors close to unity is argued to be reminiscent of highly active stars. We confirm that strong stratification and strong scale separation are essential ingredients behind magnetic spot formation, which appears to be associated with downflows at larger depths.
Stratifying the Risk of Venous Thromboembolism in Otolaryngology
Shuman, Andrew G.; Hu, Hsou Mei; Pannucci, Christopher J.; Jackson, Christopher R.; Bradford, Carol R.; Bahl, Vinita
2015-01-01
Objective The consequences of perioperative venous thromboembolism (VTE) are devastating; identifying patients at risk is an essential step in reducing morbidity and mortality. The utility of perioperative VTE risk assessment in otolaryngology is unknown. This study was designed to risk-stratify a diverse population of otolaryngology patients for VTE events. Study Design Retrospective cohort study. Setting Single-institution academic tertiary care medical center. Subjects and Methods Adult patients presenting for otolaryngologic surgery requiring hospital admission from 2003 to 2010 who did not receive VTE chemoprophylaxis were included. The Caprini risk assessment was retrospectively scored via a validated method of electronic chart abstraction. Primary study variables were Caprini risk scores and the incidence of perioperative venous thromboembolic outcomes. Results A total of 2016 patients were identified. The overall 30-day rate of VTE was 1.3%. The incidence of VTE in patients with a Caprini risk score of 6 or less was 0.5%. For patients with scores of 7 or 8, the incidence was 2.4%. Patients with a Caprini risk score greater than 8 had an 18.3% incidence of VTE and were significantly more likely to develop a VTE when compared to patients with a Caprini risk score less than 8. Conclusion The Caprini risk assessment risk-stratifies otolaryngology patients for 30-day VTE events and allows otolaryngologists to identify patient subgroups who have a higher risk of VTE in the absence of chemoprophylaxis. PMID:22261490
Mixing efficiency of turbulent patches in stably stratified flows
Garanaik, Amrapalli; Venayagamoorthy, Subhas Karan
2016-11-01
A key quantity that is essential for estimating the turbulent diapycnal (irreversible) mixing in stably stratified flow is the mixing efficiency Rf*, which is a measure of the amount of turbulent kinetic energy that is irreversibly converted into background potential energy. In particular, there is an ongoing debate in the oceanographic mixing community regarding the utility of the buoyancy Reynolds number (Reb), particularly with regard to how mixing efficiency and diapycnal diffusivity vary with Reb. Specifically, is there a universal relationship between the intensity of turbulence and the strength of the stratification that supports an unambiguous description of mixing efficiency based on Reb? The focus of the present study is to investigate the variability of Rf* by considering oceanic turbulence data obtained from microstructure profiles in conjunction with data from laboratory experiments and DNS. Field data analysis was done by identifying turbulent patches using the Thorpe sorting method for potential density. The analysis clearly shows that high mixing efficiencies can persist at high buoyancy Reynolds numbers. This contradicts previous studies, which predict that mixing efficiency should decrease universally for Reb greater than O(100). Funded by NSF and ONR.
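The patch-identification step can be illustrated with a minimal Thorpe-sorting sketch. This is a simplified, assumption-laden version (pure-Python lists, density taken to increase with depth when gravitationally stable; the authors' actual microstructure processing is certainly more involved):

```python
def thorpe_displacements(z, rho):
    """Thorpe displacements for a vertical potential-density profile.
    z: depths (increasing downward); rho: potential density at each depth.
    Sorting rho into a stable order and tracking how far each sample moved
    gives the displacement d_i; runs of nonzero d_i mark turbulent patches."""
    order = sorted(range(len(rho)), key=lambda i: rho[i])  # stable order
    d = [0.0] * len(z)
    for new_pos, old_pos in enumerate(order):
        d[old_pos] = z[new_pos] - z[old_pos]
    return d
```

Within an identified patch, the root-mean-square of these displacements is the Thorpe scale L_T, commonly used as an overturn length scale.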
Simulation and study of stratified flows around finite bodies
Gushchin, V. A.; Matyushin, P. V.
2016-06-01
The flows past a sphere and a square cylinder of diameter d moving horizontally at the velocity U in a linearly density-stratified viscous incompressible fluid are studied. The flows are described by the Navier-Stokes equations in the Boussinesq approximation. Variations in the spatial vortex structure of the flows are analyzed in detail in a wide range of dimensionless parameters (such as the Reynolds number Re = Ud/ ν and the internal Froude number Fr = U/( Nd), where ν is the kinematic viscosity and N is the buoyancy frequency) by applying mathematical simulation (on supercomputers of Joint Supercomputer Center of the Russian Academy of Sciences) and three-dimensional flow visualization. At 0.005 < Fr < 100, the classification of flow regimes for the sphere (for 1 < Re < 500) and for the cylinder (for 1 < Re < 200) is improved. At Fr = 0 (i.e., at U = 0), the problem of diffusion-induced flow past a sphere leading to the formation of horizontal density layers near the sphere's upper and lower poles is considered. At Fr = 0.1 and Re = 50, the formation of a steady flow past a square cylinder with wavy hanging density layers in the wake is studied in detail.
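The two dimensionless parameters used in this regime classification follow directly from the definitions in the abstract; a trivial helper (the function name is mine) makes them explicit:

```python
def flow_regime_numbers(U, d, nu, N):
    """Reynolds number Re = U d / nu (inertia vs. viscosity) and internal
    Froude number Fr = U / (N d) (inertia vs. buoyancy) for a body of
    diameter d towed at speed U through fluid with buoyancy frequency N."""
    return U * d / nu, U / (N * d)
```

The diffusion-induced flow discussed at the end corresponds to the limit U = 0, i.e. Fr = 0.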
Towards Cost-efficient Sampling Methods
Peng, Luo; Yongli, Li; Chong, Wu
2014-01-01
Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small set of vertices with high node degree can capture most of the structural information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method improves on the basis of the stratified random sampling method and...
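The idea of concentrating the sample on high-degree vertices can be sketched as a degree-stratified draw. This is a hedged toy version (the stratum cutoff, sample allocation, and function name are my own assumptions, not the authors' algorithms):

```python
import random

def stratified_degree_sample(degrees, n, frac_high=0.5, top_frac=0.1, seed=0):
    """Toy degree-stratified node sampling: put the top `top_frac` of
    vertices by degree into a 'hub' stratum, then draw a fixed share
    `frac_high` of the n-node sample from it, so high-degree nodes are
    heavily over-represented relative to uniform sampling."""
    rng = random.Random(seed)
    ranked = sorted(range(len(degrees)), key=lambda i: degrees[i], reverse=True)
    k_high = max(1, int(len(degrees) * top_frac))
    high, low = ranked[:k_high], ranked[k_high:]
    n_high = min(len(high), int(n * frac_high))
    return rng.sample(high, n_high) + rng.sample(low, min(len(low), n - n_high))
```

Here half of a 100-node sample would come from the top decile of the degree distribution, mimicking the paper's premise that hubs carry most of the structural information.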
Aldrich, R. C.; Dana, R. W.; Roberts, E. H. (Principal Investigator)
1977-01-01
The author has identified the following significant results. A stratified random sample using LANDSAT band 5 and 7 panchromatic prints resulted in estimates of water in counties with sampling errors less than ±9% (67% probability level). A forest inventory using a four-band LANDSAT color composite resulted in estimates of forest area by counties that were within ±6.7% and ±3.7%, respectively (67% probability level). Estimates of forest area for counties by computer-assisted techniques were within ±21% of operational forest survey figures, and for all counties the difference was only one percent. Correlations of airborne terrain reflectance measurements with LANDSAT radiance verified a linear atmospheric model with an additive (path radiance) term and a multiplicative (transmittance) term. Coefficients of determination for 28 of the 32 modeling attempts not adversely affected by rain showers occurring between the times of LANDSAT passage and aircraft overflights exceeded 0.83.
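Sampling errors quoted at the 67% probability level correspond to roughly one standard error under a normal approximation. A minimal sketch of the textbook stratified-random-sampling estimator (the strata and values below are illustrative, not from the survey):

```python
import math

def stratified_mean_and_se(strata):
    """Stratified random sampling: population mean estimate and its standard
    error. strata: list of (N_h, values_h) pairs, where N_h is the stratum
    size and values_h the sampled observations from that stratum."""
    N = sum(N_h for N_h, _ in strata)
    mean = sum(N_h * (sum(v) / len(v)) for N_h, v in strata) / N
    var = 0.0
    for N_h, v in strata:
        n_h = len(v)
        ybar = sum(v) / n_h
        s2 = sum((x - ybar) ** 2 for x in v) / (n_h - 1)  # sample variance
        var += (N_h / N) ** 2 * (1 - n_h / N_h) * s2 / n_h  # with fpc
    return mean, math.sqrt(var)
```

A quoted "±9% (67% probability level)" then means the one-standard-error half-width was 9% of the estimate.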
Walton, Karl; Blunt, Liam; Fleming, Leigh
2015-09-01
Mass finishing is amongst the most widely used finishing processes in modern manufacturing, in applications from deburring to edge radiusing and polishing. Processing objectives are varied, ranging from the cosmetic to the functionally critical. One such critical application is the hydraulically smooth polishing of aero engine component gas-washed surfaces. In this, and many other applications, the drive to improve process control and finish tolerance is ever present. Considering its widespread use, mass finishing has seen limited research activity, particularly with respect to surface characterization. The objectives of the current paper are to: characterise the mass finished stratified surface and its development process using areal surface parameters; provide guidance on the optimal parameters and sampling method to characterise this surface type for a given application; and detail the spatial variation in surface topography due to coupon edge shadowing. Blasted and peened square plate coupons in titanium alloy are wet (vibro) mass finished iteratively with increasing duration. Measurement fields are precisely relocated between iterations by fixturing and an image superimposition alignment technique. Surface topography development is detailed with ‘log of process duration’ plots of the ‘areal parameters for scale-limited stratified functional surfaces’ (the Sk family). Characteristic features of the Smr2 plot are seen to map out the processing of peak, core and dale regions in turn. These surface process regions also become apparent in the ‘log of process duration’ plot for Sq, where lower core and dale regions are well modelled by logarithmic functions. Surface finish (Ra or Sa) with mass finishing duration is currently predicted with an exponential model. This model is shown to be limited for the current surface type at a critical range of surface finishes. Statistical analysis provides a group of areal parameters including Vvc, Sq, and Sdq
Jerzy Czerski
2014-02-01
Full Text Available The germination of whole seeds, seeds without coats, and isolated embryos of apple cv. "Antonówka Zwykła" after 90 days of cold stratification was compared with the germination of embryos isolated from non-stratified seeds. They were germinated under a 16-hour photoperiod at 25°C during the day and 20°C during the night. It was found that after 2 weeks whole stratified seeds germinated at 5 per cent, seeds without coats at 25 per cent, and isolated embryos at 98 per cent. Embryos isolated from non-stratified seeds germinated after 2 weeks in the range from 75 to 88 per cent. The results indicate a similar germination ability of embryos isolated from stratified and non-stratified seeds. The seedling populations obtained from embryos of stratified and non-stratified seeds were fully comparable, showing: (1) a wide range of individual differences within the population, (2) a similar number of seedlings in each class of shoot length, (3) a similar morphological habitus in each class of shoot length, and (4) a similar fresh leaf weight and whole-plant increment.
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.