WorldWideScience

Sample records for model probability sets

  1. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probabilistic reasoning tasks. However, the lack of probability aggregates, e.g. {\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  2. Modeling spatial variability of sand-lenses in clay till settings using transition probability and multiple-point geostatistics

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik;

    2010-01-01

    the geology of e.g. a contaminated site, it is not always possible to gather enough information to build a representative geological model. Mapping in analogue geological settings and applying geostatistical tools to simulate spatial variability of heterogeneities can improve ordinary geological models...... that are predicated only on vertical borehole information. This study documents methods to map geological heterogeneity in clay till and ways to calibrate geostatistical models with field observations. A well-exposed cross-section in an excavation pit was used to measure and illustrate the occurrence and distribution...... of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities...

  3. Estimating Route Choice Models from Stochastically Generated Choice Sets on Large-Scale Networks Correcting for Unequal Sampling Probability

    DEFF Research Database (Denmark)

    Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo

    2015-01-01

    is the dependency of the parameter estimates from the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes...... extracted with stochastic route generation. The term is easily applicable to large-scale networks and various environments, given its dependence only on a random number generator and the Dijkstra shortest path algorithm. The implementation for revealed preferences data, which consist of actual route choices...... collected in Cagliari, Italy, shows the feasibility of generating routes stochastically in a high-resolution network and calculating the correction factor. The model estimation with and without correction illustrates how the correction not only improves the goodness of fit but also turns illogical signs...

  4. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
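
    A minimal Python sketch (an editorial illustration, not from the record) of the kind of probability described above: a Monte Carlo estimate of the chance that, among the k smallest pooled order statistics, exactly j come from the first population. The normal populations and sample sizes are illustrative assumptions.

```python
# Monte Carlo check of an order-statistic separation probability: the chance that
# exactly j of the k smallest pooled observations come from population 1.
# The population distributions and sample sizes below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def prob_j_smallest_from_pop1(n1, n2, k, j, n_sim=50_000):
    count = 0
    for _ in range(n_sim):
        x1 = rng.normal(0.0, 1.0, n1)      # population 1: N(0, 1)
        x2 = rng.normal(1.0, 1.0, n2)      # population 2: N(1, 1)
        pooled = np.concatenate([x1, x2])
        from_pop1 = np.concatenate([np.ones(n1, bool), np.zeros(n2, bool)])
        k_smallest = np.argsort(pooled)[:k]
        count += int(from_pop1[k_smallest].sum() == j)
    return count / n_sim

print(prob_j_smallest_from_pop1(n1=10, n2=10, k=5, j=4))
```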

  5. Probability state modeling theory.

    Science.gov (United States)

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  6. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by Typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
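
    A minimal numpy sketch (editorial, not from the paper) of the combination rule in step (v); the three probability rasters are hypothetical placeholders.

```python
# Per-pixel combination step (v): the integrated spatial landslide probability is the
# maximum of the release probability and the product of the impact probability and the
# zonal release probability. The input rasters below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)
shape = (100, 100)                          # hypothetical raster of pixels
p_release = rng.uniform(0.0, 0.2, shape)    # pixel-based release probability, step (ii)
p_impact = rng.uniform(0.0, 0.8, shape)     # impact probability from angle of reach, step (iii)
p_zonal = rng.uniform(0.0, 0.5, shape)      # zonal release probability, step (iv)

p_integrated = np.maximum(p_release, p_impact * p_zonal)   # step (v)
print("mean integrated probability:", round(p_integrated.mean(), 4))
```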

  7. FUZZY SETS THEORY AS THE PART OF PROBABILITY THEORY

    OpenAIRE

    Orlov A. I.

    2013-01-01

    One of the key provisions of system fuzzy interval mathematics is the claim that the theory of fuzzy sets is part of the theory of random sets and thus part of probability theory. The article is devoted to the justification of this statement. A number of theorems are proved showing that fuzzy sets, and the results of operations on them, can be viewed as projections of random sets and of the corresponding operations on them.

  8. PROBABILITY MODEL OF GUNTHER GENERATOR

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is an independent and uniformly distributed 0-1 random variable sequence. This provides the theoretical foundation for why the Gunther generator can avoid the statistical weakness of the output sequence of the stop-and-go generator, and the coincidence between the output sequence and the input sequences of the Gunther generator is analyzed. The conclusions of this paper offer theoretical reference for designers and analysts of clock-controlled generators.
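
    The record proves properties of the output distribution rather than giving code. As an editorial illustration only, the sketch below assumes the alternating-step construction commonly attributed to Günther: a control LFSR decides which of two generating LFSRs is clocked, and the output is the XOR of their most recent output bits. Register lengths, taps and seeds are arbitrary choices for demonstration.

```python
# Illustrative sketch of a Guenther-style (alternating-step) clock-controlled generator.
# All register sizes, taps and seeds below are arbitrary assumptions for demonstration;
# the empirical 0/1 frequencies of the output stream are printed at the end.
from collections import Counter

class LFSR:
    """Small Fibonacci LFSR; `taps` are the state positions XORed into the feedback bit."""
    def __init__(self, state, taps):
        self.state = list(state)
        self.taps = taps
        self.out = self.state[-1]      # most recently produced output bit

    def clock(self):
        self.out = self.state[-1]
        feedback = 0
        for t in self.taps:
            feedback ^= self.state[t]
        self.state = [feedback] + self.state[:-1]
        return self.out

def generator_bits(n):
    control = LFSR([1, 0, 1, 1, 0], taps=[0, 2])
    a = LFSR([1, 1, 0, 1, 0, 0, 1], taps=[0, 1])
    b = LFSR([0, 1, 1, 0, 1, 0, 0, 1, 1], taps=[0, 4])
    bits = []
    for _ in range(n):
        if control.clock():            # control bit selects which LFSR is stepped
            a.clock()
        else:
            b.clock()
        bits.append(a.out ^ b.out)     # output is the XOR of the two current output bits
    return bits

print(Counter(generator_bits(20_000)))
```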

  9. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of the exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained by means of techniques from martingale theory.
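
    A Monte Carlo sketch (editorial, not the paper's analytical method) of a ruin probability in this kind of model: Poisson policy arrivals, p-thinned claim occurrence, a random premium rate and exponential claim sizes. All parameter values are assumptions.

```python
# Monte Carlo estimate of a finite-horizon ruin probability for a surplus process with
# Poisson policy arrivals, p-thinned claim occurrence, a random premium rate and
# exponential claim sizes. All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

def ruin_probability(u0=10.0, lam=1.0, p=0.3, claim_mean=2.0,
                     horizon=100.0, n_sim=10_000):
    ruins = 0
    for _ in range(n_sim):
        c = rng.uniform(0.5, 1.5)                  # random premium income rate for this path
        t, total_claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)        # next policy arrival (Poisson process)
            if t > horizon:
                break
            if rng.random() < p:                   # p-thinning: this policy produces a claim
                total_claims += rng.exponential(claim_mean)
                if u0 + c * t - total_claims < 0:  # surplus can only drop at claim times
                    ruins += 1
                    break
    return ruins / n_sim

print("estimated ruin probability:", ruin_probability())
```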

  10. Sets, Probability and Statistics: The Mathematics of Life Insurance.

    Science.gov (United States)

    Clifford, Paul C.; And Others

    The practical use of such concepts as sets, probability and statistics are considered by many to be vital and necessary to our everyday life. This student manual is intended to familiarize students with these concepts and to provide practice using real life examples. It also attempts to illustrate how the insurance industry uses such mathematic…

  11. Probability Modeling and Thinking: What Can We Learn from Practice?

    Science.gov (United States)

    Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze

    2016-01-01

    Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…

  13. Calculation Model and Simulation of Warship Damage Probability

    Institute of Scientific and Technical Information of China (English)

    TENG Zhao-xin; ZHANG Xu; YANG Shi-xing; ZHU Xiao-ping

    2008-01-01

    The combat efficiency of a mine obstacle is the focus of the present research. Based on the main factors affecting the target warship damage probability, such as the features of mines with maneuverability, the success rate of mine-laying, the hit probability, mine reliability and action probability, a calculation model of the target warship mine-encounter probability is put forward under the assumption that the route selection of target warships follows a uniform distribution and their course follows a normal distribution. A damage probability model of mines with maneuverability against target warships is then set up, and a simulation study showed the model to be highly practical.

  14. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  15. A consistent set of infinite-order probabilities

    NARCIS (Netherlands)

    Atkinson, David; Peijnenburg, Jeanne

    2013-01-01

    Some philosophers have claimed that it is meaningless or paradoxical to consider the probability of a probability. Others have however argued that second-order probabilities do not pose any particular problem. We side with the latter group. On condition that the relevant distinctions are taken into

  16. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  17. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous time risk model with a linear model used to model the claim process. The time is discretized stochastically using the times when claims occur, using Doob's stopping time theorem and martingale inequalities to obtain expressions for the ruin probability as well as both exponential and non-exponential upper bounds for the ruin probability for an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.

  18. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
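
    A minimal sketch (editorial, with synthetic data) of the modelling step described above: fit a logistic regression to building attributes and obtain per-building fire probabilities that could then be mapped. The predictor names are hypothetical, not the study's actual covariates.

```python
# Logistic-regression sketch for building fire probability, fitted on synthetic data.
# The three predictors (building age, floor area, residential-use flag) are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2_000
age = rng.normal(50, 20, n)              # hypothetical building age [years]
area = rng.normal(500, 150, n)           # hypothetical floor area [m2]
residential = rng.integers(0, 2, n)      # hypothetical use indicator
X = np.column_stack([age, area, residential])

# synthetic "fire occurred" labels drawn from an arbitrary true model
true_logit = -4.0 + 0.02 * age + 0.002 * area + 0.8 * residential
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
p_fire = model.predict_proba(X)[:, 1]    # per-building probabilities for the map
print("mean predicted fire probability:", round(p_fire.mean(), 3))
```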

  19. Site occupancy models with heterogeneous detection probabilities

    Science.gov (United States)

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
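
    A small editorial sketch of the zero-inflated binomial mixture likelihood mentioned above, using a Beta mixing distribution for the heterogeneous detection probability p; the detection histories and parameter values are hypothetical.

```python
# Zero-inflated binomial mixture likelihood for site occupancy with heterogeneous
# detection probability p ~ Beta(a, b). Data and parameter values are hypothetical.
import numpy as np
from scipy import stats
from scipy.integrate import quad

def site_probability(y, J, psi, a, b):
    """P(y detections in J visits) with occupancy probability psi and p ~ Beta(a, b)."""
    integrand = lambda p: stats.binom.pmf(y, J, p) * stats.beta.pdf(p, a, b)
    marginal, _ = quad(integrand, 0.0, 1.0)       # integrate the binomial over the mixing law
    return psi * marginal + (1.0 - psi) * (1.0 if y == 0 else 0.0)

detections = [0, 0, 1, 3, 0, 2, 0, 5, 1, 0]       # hypothetical counts out of J visits
J = 5
log_lik = sum(np.log(site_probability(y, J, psi=0.6, a=2.0, b=3.0)) for y in detections)
print("log-likelihood at (psi=0.6, Beta(2, 3)):", round(log_lik, 3))
```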

  20. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  1. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  2. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  3. The Model Confidence Set

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.

    The paper introduces the model confidence set (MCS) and applies it to the selection of models. An MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The M...

  4. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability...

  5. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
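
    An editorial illustration of the core idea under a single-predictor probit latent-variable setup (not the authors' full method): with y* = b·x + e and e ~ N(0,1) independent of x, the correlation between the latent outcome and the predictor follows directly from the probit coefficient and sd(x). The numbers are hypothetical.

```python
# Correlation between a latent outcome y* = b*x + e (e ~ N(0,1)) and the predictor x,
# implied by a single-predictor probit coefficient b. All values are hypothetical.
import numpy as np

def latent_correlation(b, sd_x):
    return b * sd_x / np.sqrt(b ** 2 * sd_x ** 2 + 1.0)

rng = np.random.default_rng(4)
b, sd_x = 0.7, 1.5
x = rng.normal(0.0, sd_x, 500_000)
y_star = b * x + rng.normal(0.0, 1.0, x.size)       # latent outcome under the probit setup

print("derived correlation:  ", round(latent_correlation(b, sd_x), 4))
print("simulated correlation:", round(np.corrcoef(x, y_star)[0, 1], 4))
```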

  6. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  7. Probability properties and fractal properties of statistically recursive sets

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this paper we construct a class of statistically recursive sets K by statistical contraction operators and prove the convergence and measurability of K. Many important sets are special cases of K. We then investigate the statistically self-similar measure (or set) and find some sufficient conditions ensuring that a statistically recursive set is statistically self-similar. We also investigate the distribution P∘K⁻¹; the zero-one laws and the support of P∘K⁻¹ are obtained. Finally, the Hausdorff dimension and the Hausdorff exact measure function of a class of statistically recursive sets constructed by a collection of i.i.d. statistical contraction operators are obtained.

  8. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers the Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, and much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  9. Economic communication model set

    Science.gov (United States)

    Zvereva, Olga M.; Berg, Dmitry B.

    2017-06-01

    This paper details findings from research investigating economic communications using agent-based models. An agent-based model set was engineered to simulate economic communications. Money in the form of internal and external currencies was introduced into the models to support exchanges in communications. Each model is based on the same general concept but has its own peculiarities in algorithm and input data set, since each was engineered to solve a specific problem. Several data sets of different origin were used in the experiments: theoretical sets were estimated on the basis of the static Leontief equilibrium equation, and the real set was constructed from statistical data. During the simulation experiments, the communication process was observed in its dynamics and system macroparameters were estimated. This research confirmed that the combination of an agent-based and a mathematical model can produce a synergetic effect.

  10. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  11. A Quantum Probability Model of Causal Reasoning

    Science.gov (United States)

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  12. A quantum probability model of causal reasoning.

    Science.gov (United States)

    Trueblood, Jennifer S; Busemeyer, Jerome R

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  13. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  14. Interval-valued probability rough set model based on α-dominance relation and its application

    Institute of Scientific and Technical Information of China (English)

    何其慧; 罗文刚; 王翠翠; 孙丽; 毛军军

    2011-01-01

    The interval-valued ordered information system is an extension of the single-valued ordered information system. First, a new definition is introduced in the interval-valued ordered information system: the probability P_ji^a that object x_j dominates object x_i on attribute a. On this basis, the α-dominance relation and dominance classes are defined, and a new probabilistic rough set model based on the α-dominance relation is constructed. The optimal solution for the comprehensive evaluation of a multi-attribute decision-making problem is then obtained through attribute weights derived from relative entropy. Finally, a quantitative analysis of five years of economic development data for the cities along the Yangtze River in Anhui Province is carried out; this example verifies the rationality and effectiveness of the method.

  15. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.

  16. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
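
    A compact editorial sketch of one standard approximate fit discussed in this literature: the naive mean-field inversion of a pairwise (Ising-type) model from the means and covariances of binary ±1 data. The data here are random placeholders, so the inferred couplings come out close to zero.

```python
# Naive mean-field inversion of a pairwise (Ising-type) model from binary +/-1 samples.
# The samples below are independent placeholders, so the inferred couplings are ~0.
import numpy as np

def naive_mean_field_fit(samples):
    """samples: array of shape (T, N) with entries +/-1; returns couplings J and fields h."""
    m = samples.mean(axis=0)               # means <s_i>
    C = np.cov(samples, rowvar=False)      # connected correlation matrix
    J = -np.linalg.inv(C)                  # nMF coupling estimate
    np.fill_diagonal(J, 0.0)               # no self-couplings
    h = np.arctanh(m) - J @ m              # nMF field estimate
    return J, h

rng = np.random.default_rng(5)
samples = np.where(rng.random((5_000, 8)) < 0.5, -1, 1)
J, h = naive_mean_field_fit(samples)
print("couplings shape:", J.shape, " max |J|:", round(float(np.abs(J).max()), 3))
```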

  17. A Unifying Field in Logics: Neutrosophic Logic, Neutrosophic Set, Neutrosophic Probability and Statistics (fourth edition)

    OpenAIRE

    2001-01-01

    The first part of this book gives an introduction to non-standard analysis, which is needed in the next four chapters in order to study the neutrosophics: 1. Neutrosophy - a new branch of philosophy. 2. Neutrosophic Logic - a unifying field in logics. 3. Neutrosophic Set - a unifying field in sets. 4. Neutrosophic Probability - a generalization of classical and imprecise probabilities - and Neutrosophic Statistics.

  18. Sets, Probability and Statistics: The Mathematics of Life Insurance. [Computer Program.] Second Edition.

    Science.gov (United States)

    King, James M.; And Others

    The materials described here represent the conversion of a highly popular student workbook "Sets, Probability and Statistics: The Mathematics of Life Insurance" into a computer program. The program is designed to familiarize students with the concepts of sets, probability, and statistics, and to provide practice using real life examples. It also…

  19. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Jim W.; Lawry, Jonathan

    2004-09-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM.
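
    An editorial sketch of the objects involved, not the paper's IRM itself: a p-box induced by a normal distribution whose mean is only known to lie in an interval, discretized into equal-mass focal intervals in the spirit of the Williamson-Downs approach mentioned above. The interval-valued mean and the normal family are assumptions.

```python
# A p-box (lower/upper CDF pair) for X ~ N(mu, 1) with mu known only to lie in [0, 1],
# discretized into n equal-mass focal intervals (a simple random-set approximation).
import numpy as np
from scipy import stats

mu_lo, mu_hi, sigma = 0.0, 1.0, 1.0
x = np.linspace(-4.0, 5.0, 10)
upper_cdf = stats.norm.cdf(x, loc=mu_lo, scale=sigma)    # upper bound on P(X <= x)
lower_cdf = stats.norm.cdf(x, loc=mu_hi, scale=sigma)    # lower bound on P(X <= x)
print("upper CDF:", np.round(upper_cdf, 3))
print("lower CDF:", np.round(lower_cdf, 3))

n = 10
alphas = (np.arange(n) + 0.5) / n                        # mass 1/n at each level
left = stats.norm.ppf(alphas, loc=mu_lo, scale=sigma)    # inverse of the upper CDF
right = stats.norm.ppf(alphas, loc=mu_hi, scale=sigma)   # inverse of the lower CDF
for a, l, r in zip(alphas, left, right):
    print(f"focal interval at level {a:.2f}: [{l:.2f}, {r:.2f}]  mass 1/{n}")
```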

  20. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps"; it indicates a high degree of likelihood, usually an affirmative inference or judgment based on the present situation.

  1. Extreme Points of the Convex Set of Joint Probability Distributions with Fixed Marginals

    Indian Academy of Sciences (India)

    K R Parthasarathy

    2007-11-01

    By using a quantum probabilistic approach we obtain a description of the extreme points of the convex set of all joint probability distributions on the product of two standard Borel spaces with fixed marginal distributions.

  2. A Set of Axioms for the Utility Theory with Rational Probabilities

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The expected utility theory of Von Neumann-Morgenstern assumes that a preference order is defined for all lotteries (c1, p; c2, 1-p) (c1 with probability p, c2 with probability 1-p) for all real p, 0 ≤ p ≤ 1. But when the probability p is irrational, it is hard to interpret the lottery intuitively. This paper introduces the utility theory of J. C. Shepherdson, which is based on rational probabilities. It then studies the axioms proposed by J. C. Shepherdson and puts forward a set of alternative axioms. Finally, it is shown that both sets of axioms are equivalent.

  3. A probability model for the strength of carbon nanotubes

    Directory of Open Access Journals (Sweden)

    X. Frank Xu

    2014-07-01

    Full Text Available A longstanding controversy exists on the form of the probability distribution for the strength of carbon nanotubes: is it Weibull, lognormal, or something else? We present a theory for CNT strength through integration of weakest link scaling, flaw statistics, and brittle fracture. The probability distribution that arises exhibits multiple regimes, each of which takes the form of a Weibull distribution. Our model not only gives a possible resolution to the debate but provides a way to attain reliable estimates of CNT strength for materials design from practical-sized (non-asymptotic) data sets of CNT strength. Last, the model offers an explanation for the severe underestimation of CNT strength from strength tests of CNT bundles.
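
    An editorial sketch of the single-regime weakest-link Weibull law that underlies models of this kind (the paper's multi-regime distribution is not reproduced); the modulus, scale and reference length below are hypothetical values.

```python
# Single-regime weakest-link Weibull failure probability for a CNT of length L at
# stress s: P_f = 1 - exp(-(L / L0) * (s / s0)**m). Parameter values are hypothetical.
import numpy as np

def weibull_failure_probability(s, L, m=3.0, s0=40.0, L0=1.0):
    """s: stress (e.g. GPa), L: specimen length in units of the reference length L0."""
    return 1.0 - np.exp(-(L / L0) * (s / s0) ** m)

stresses = np.array([10.0, 20.0, 40.0, 60.0])
print(np.round(weibull_failure_probability(stresses, L=2.0), 4))
```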

  4. Level Set Segmentation of Medical Images Based on Local Region Statistics and Maximum a Posteriori Probability

    Directory of Open Access Journals (Sweden)

    Wenchao Cui

    2013-01-01

    Full Text Available This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.

  5. Probability boxes on totally preordered spaces for multivariate modelling

    CERN Document Server

    Troffaes, Matthias C M; 10.1016/j.ijar.2011.02.001

    2011-01-01

    A pair of lower and upper cumulative distribution functions, also called probability box or p-box, is among the most popular models used in imprecise probability theory. They arise naturally in expert elicitation, for instance in cases where bounds are specified on the quantiles of a random variable, or when quantiles are specified only at a finite number of points. Many practical and formal results concerning p-boxes already exist in the literature. In this paper, we provide new efficient tools to construct multivariate p-boxes and develop algorithms to draw inferences from them. For this purpose, we formalise and extend the theory of p-boxes using Walley's behavioural theory of imprecise probabilities, and heavily rely on its notion of natural extension and existing results about independence modeling. In particular, we allow p-boxes to be defined on arbitrary totally preordered spaces, hence thereby also admitting multivariate p-boxes via probability bounds over any collection of nested sets. We focus on t...

  6. Thermal bidirectional gap probability model for row crop canopies and validation

    Institute of Scientific and Technical Information of China (English)

    YAN Guangjian (阎广建); JIANG Lingmei (蒋玲梅); WANG Jindi (王锦地); CHEN Liangfu (陈良富); LI Xiaowen (李小文)

    2003-01-01

    Based on the row structure model of Kimes and the mean gap probability model in a single direction, we develop a bidirectional gap probability model for row crop canopies. A concept of overlap index is introduced in this model to account for the gaps and their correlation between the sun and view directions. Multiangular thermal emission data sets were measured in Shunyi, Beijing, and these data are used for model validation in this paper. By comparison with the Kimes model, which does not consider the gap probability, and with the model considering the gap in the view direction only, it is found that our bidirectional gap probability model fits the field measurements over winter wheat much better.

  7. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set oriented numerics and evolutionary computing, so as to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo in different domains. In summary, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 contribute to this goal.

  8. Probability-summation model of multiple laser-exposure effects.

    Science.gov (United States)

    Menendez, A R; Cheney, F E; Zuclich, J A; Crump, P

    1993-11-01

    A probability-summation model is introduced to provide quantitative criteria for discriminating independent from interactive effects of multiple laser exposures on biological tissue. Data that differ statistically from predictions of the probability-summation model indicate the action of sensitizing (synergistic/positive) or desensitizing (hardening/negative) biophysical interactions. Interactions are indicated when response probabilities vary with changes in the spatial or temporal separation of exposures. In the absence of interactions, probability-summation parsimoniously accounts for "cumulative" effects. Data analyzed using the probability-summation model show instances of both sensitization and desensitization of retinal tissue by laser exposures. Other results are shown to be consistent with probability-summation. The relevance of the probability-summation model to previous laser-bioeffects studies, models, and safety standards is discussed and an appeal is made for improved empirical estimates of response probabilities for single exposures.
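
    The independence baseline named above is easy to state in code (an editorial sketch; the response probabilities are hypothetical): the predicted probability of at least one response to n independent exposures is 1 minus the product of the non-response probabilities.

```python
# Probability-summation prediction for independent exposures: departures from this
# value in data would indicate sensitizing or desensitizing interactions.
import numpy as np

def probability_summation(p_single):
    """Predicted response probability for independent exposures with probabilities p_single."""
    return 1.0 - np.prod(1.0 - np.asarray(p_single))

print(probability_summation([0.10, 0.10, 0.10]))   # 1 - 0.9**3 = 0.271
```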

  9. Development and evaluation of probability density functions for a set of human exposure factors

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.

    1999-06-01

    The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors.

  10. An Empirical Comparison of Probability Models for Dependency Grammar

    CERN Document Server

    Eisner, J

    1997-01-01

    This technical report is an appendix to Eisner (1996): it gives superior experimental results that were reported only in the talk version of that paper. Eisner (1996) trained three probability models on a small set of about 4,000 conjunction-free, dependency-grammar parses derived from the Wall Street Journal section of the Penn Treebank, and then evaluated the models on a held-out test set, using a novel O(n^3) parsing algorithm. The present paper describes some details of the experiments and repeats them with a larger training set of 25,000 sentences. As reported at the talk, the more extensive training yields greatly improved performance. Nearly half the sentences are parsed with no misattachments; two-thirds are parsed with at most one misattachment. Of the models described in the original written paper, the best score is still obtained with the generative (top-down) "model C." However, slightly better models are also explored, in particular, two variants on the comprehension (bottom-up) "model B." The be...

  11. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, the influence of several factors is demonstrated using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
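
    A sketch (editorial, with synthetic data) of the binary logit setup described above; the chosen predictors (loan sum, borrower age, remoteness) and the generating coefficients are assumptions for illustration, and statsmodels is used for the fit.

```python
# Binary logit for the probability of returning a loan, fitted on synthetic data.
# Predictors (loan sum, age, remoteness) and the generating coefficients are assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 1_000
loan_sum = rng.uniform(1_000, 20_000, n)     # given sum of the contract
age = rng.integers(20, 70, n)                # borrower age
remoteness = rng.uniform(0, 300, n)          # distance of borrower's region [km]

X = sm.add_constant(np.column_stack([loan_sum, age, remoteness]))
true_logit = -1.0 + 0.0001 * loan_sum - 0.01 * age + 0.004 * remoteness
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(int)

result = sm.Logit(y, X).fit(disp=0)
print(result.params)                 # fitted coefficients
print(result.predict(X[:5]))         # estimated probabilities for the first five loans
```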

  12. Stochastic population dynamic models as probability networks

    Science.gov (United States)

    Borsuk, M.E.; Lee, D.C.

    2009-01-01

    The dynamics of a population and its response to environmental change depend on the balance of birth, death and age-at-maturity, and there have been many attempts to mathematically model populations based on these characteristics. Historically, most of these models were deterministic, meaning that the results were strictly determined by the equations of the model and...

  13. Probability and Statistics in Sensor Performance Modeling

    Science.gov (United States)

    2010-12-01

    Acoustic or electromagnetic waves are scattered by both objects and turbulent wind, and a version of the Rice-Nakagami model is applied. The statistical models considered include the Gaussian, lognormal, exponential, gamma, and transformed Rice-Nakagami distributions, as well as a discrete model.

  14. Why does Japan use the probability method to set design flood?

    Science.gov (United States)

    Nakamura, S.; Oki, T.

    2015-12-01

    A design flood is a hypothetical flood used to make flood prevention plans. In Japan, a probability method based on precipitation data is used to define the scale of the design flood: for the Tone River, the largest river in Japan, it is 1 in 200 years, for the Shinano River 1 in 150 years, and so on. How to set a reasonable and acceptable design flood in a changing world is an important socio-hydrological issue. The method used to set the design flood varies among countries. The probability method is also used in the Netherlands, but there the base data are water levels or discharge and the probability is 1 in 1250 years (in the fresh water section). On the other hand, the USA and China apply the maximum flood method, which sets the design flood based on the historical or probable maximum flood. This raises the questions: why does the method vary among countries, and why does Japan use the probability method? The purpose of this study is to clarify, based on the literature, the historical process by which the probability method was developed in Japan. In the late 19th century, the concept of "discharge" and modern river engineering were imported by Dutch engineers, and modern flood prevention plans were developed in Japan. In these plans, the design floods were set based on the historical maximum method. The historical maximum method was used until World War 2, but it was then replaced by the probability method because of its limitations under the specific socio-economic situation: (1) budget limitations due to the war and the GHQ occupation, and (2) historical floods (Typhoon Makurazaki in 1945, Typhoon Kathleen in 1947, Typhoon Ione in 1948, and so on) attacked Japan, broke the records of historical maximum discharge in main rivers, and made the flood prevention projects difficult to complete. Japanese hydrologists then imported hydrological probability statistics from the West to take account of
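
    An editorial sketch of the probability method in its simplest form: fit an extreme-value distribution to annual-maximum data and read off the 1-in-T-year quantile as the design value. The synthetic annual maxima and the Gumbel choice are assumptions, not the Japanese procedure in detail.

```python
# Design value from the probability method: fit a Gumbel distribution to annual maxima
# and take the quantile with exceedance probability 1/T. The data here are synthetic.
from scipy import stats

annual_max = stats.gumbel_r.rvs(loc=120.0, scale=35.0, size=80, random_state=7)  # e.g. mm/day

loc, scale = stats.gumbel_r.fit(annual_max)                   # fit Gumbel to annual maxima
T = 200                                                       # return period, e.g. Tone River
design_value = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
print(f"estimated 1-in-{T}-year design value: {design_value:.1f}")
```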

  15. Statistical Analysis of Probability of Detection Hit/Miss Data for Small Data Sets

    Science.gov (United States)

    Harding, C. A.; Hugo, G. R.

    2003-03-01

    This paper examines the validity of statistical methods for determining nondestructive inspection probability of detection (POD) curves from relatively small hit/miss POD data sets. One method published in the literature is shown to be invalid for analysis of POD hit/miss data. Another standard method is shown to be valid only for data sets containing more than 200 observations. An improved method is proposed which allows robust lower 95% confidence limit POD curves to be determined from data sets containing as few as 50 hit/miss observations.

  16. Introducing the Core Probability Framework and Discrete-Element Core Probability Model for efficient stochastic macroscopic modelling

    NARCIS (Netherlands)

    Calvert, S.C.; Taale, H.; Hoogendoorn, S.P.

    2014-01-01

    In this contribution the Core Probability Framework (CPF) is introduced, with the application of the Discrete-Element Core Probability Model (DE-CPM) as a new DNL for dynamic macroscopic modelling of stochastic traffic flow. The model is demonstrated for validation in a test case and for computationa...

  17. A Hierarchical Probability Model of Colon Cancer

    CERN Document Server

    Kelly, Michael

    2010-01-01

    We consider a model of fixed size $N = 2^l$ in which there are $l$ generations of daughter cells and a stem cell. In each generation $i$ there are $2^{i-1}$ daughter cells. At each integral time unit the cells split so that the stem cell splits into a stem cell and generation 1 daughter cell and the generation $i$ daughter cells become two cells of generation $i+1$. The last generation is removed from the population. The stem cell gets first and second mutations at rates $u_1$ and $u_2$ and the daughter cells get first and second mutations at rates $v_1$ and $v_2$. We find the distribution for the time it takes to get two mutations as $N$ goes to infinity and the mutation rates go to 0. We also find the distribution for the location of the mutations. Several outcomes are possible depending on how fast the rates go to 0. The model considered has been proposed by Komarova (2007) as a model for colon cancer.
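
    A Monte Carlo sketch (editorial) of the hierarchy described above: a stem cell plus $l$ generations of daughter cells with synchronous division, the last generation removed each step, and mutation probabilities $u_1, u_2$ (stem) and $v_1, v_2$ (daughters). Treating the rates as per-step probabilities, and the particular parameter values, are simplifying assumptions.

```python
# Monte Carlo estimate of the time until some cell carries two mutations in a
# stem-cell/daughter-cell hierarchy with l generations (2**(i-1) cells in generation i).
# Rates are treated as per-step probabilities; all parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(8)

def time_to_two_mutations(l=6, u1=1e-3, u2=1e-3, v1=1e-2, v2=1e-2, max_t=100_000):
    stem = 0                                                 # mutation count of the stem cell
    gens = [np.zeros(2 ** i, dtype=int) for i in range(l)]   # generation i+1 daughter cells
    for t in range(1, max_t + 1):
        # synchronous division: gen i -> two copies in gen i+1; last generation removed
        gens = [np.array([stem])] + [np.repeat(g, 2) for g in gens[:-1]]
        # mutations this step
        if stem < 2 and rng.random() < (u1 if stem == 0 else u2):
            stem += 1
        for g in gens:
            rates = np.where(g == 0, v1, np.where(g == 1, v2, 0.0))
            g += (rng.random(g.size) < rates).astype(int)
        if stem == 2 or any((g == 2).any() for g in gens):
            return t
    return max_t

times = [time_to_two_mutations() for _ in range(200)]
print("mean time to a double-mutant cell:", round(float(np.mean(times)), 1))
```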

  18. A Probability-Based Hybrid User Model for Recommendation System

    Directory of Open Access Journals (Sweden)

    Jia Hao

    2016-01-01

    Full Text Available With the rapid development of information and communication technology, the available information or knowledge is increasing exponentially, causing the well-known information overload phenomenon. This problem is more serious in product design corporations because over half of the valuable design time is consumed in knowledge acquisition, which greatly extends the design cycle and weakens competitiveness. Therefore, recommender systems have become very important in the product design domain. This research presents a probability-based hybrid user model, which is a combination of collaborative filtering and content-based filtering. The hybrid model utilizes user ratings and item topics or classes, which are available in the domain of product design, to predict the knowledge requirement. A comprehensive analysis of the experimental results shows that the proposed method achieves better performance in most of the parameter settings. This work contributes a probability-based method to the community for implementing recommender systems when only user ratings and item topics are available.

  19. PROBABILITY MODELS FOR OBTAINING NON-NUMERICAL DATA

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-01-01

    The statistics of objects of non-numerical nature (statistics of non-numerical objects, non-numerical data statistics, non-numeric statistics) is the area of mathematical statistics devoted to methods for the analysis of non-numeric data. The basis for applying the results of mathematical statistics are probabilistic-statistical models of real phenomena and processes, the most important (and often the only ones) of which are models for obtaining data. The simplest example of a model for obtaining data is the model of a sample as a set of independent identically distributed random variables. In this article we consider the basic probabilistic models for obtaining non-numeric data, namely the models of dichotomous data, results of paired comparisons, binary relations, ranks, and objects of a general nature. We discuss the various options of probabilistic models and their practical use. For example, the basic probabilistic model of dichotomous data is the Bernoulli vector (Lucian), i.e. a finite sequence of independent Bernoulli trials for which the probabilities of success may differ. The mathematical tools for solving various statistical problems associated with Bernoulli vectors are useful for the analysis of random tolerances; for random sets with independent elements; in processing the results of independent pairwise comparisons; in statistical methods for analyzing the accuracy and stability of technological processes; in the analysis and synthesis of statistical quality control plans (for dichotomous characteristics); in the processing of marketing and sociological questionnaires (with closed questions like "yes" - "no"); and in the processing of socio-psychological and medical data, in particular the responses to psychological tests such as MMPI (used in particular in problems of human resource management) and the analysis of topographic maps (used for the analysis and prediction of the affected areas of technological disasters, distributing corrosion

  20. Probability density function modeling for sub-powered interconnects

    Science.gov (United States)

    Pater, Flavius; Amaricǎi, Alexandru

    2016-06-01

    This paper proposes three mathematical models for the reliability probability density function of interconnects supplied at sub-threshold voltages: spline curve approximations, Gaussian models, and sine interpolation. The proposed analysis aims at determining the most appropriate fit for the switching delay versus probability of correct switching for sub-powered interconnects. We compare the three mathematical models with Monte Carlo simulations of interconnects for 45 nm CMOS technology supplied at 0.25 V.
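
    A small sketch of the kind of fitting compared in this record: Gaussian and smoothing-spline approximations to a density estimated from Monte Carlo delay samples. The samples are synthetic placeholders rather than 45 nm simulation results.

    ```python
    # Illustrative sketch: fit a Gaussian model to Monte Carlo samples of switching
    # delay and compare it with a smoothing-spline approximation of the same density.
    # The samples below are synthetic placeholders, not 45 nm simulation data.
    import numpy as np
    from scipy.stats import norm
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(1)
    delays = rng.normal(loc=1.0, scale=0.15, size=5000)       # hypothetical delay samples (ns)

    mu, sigma = norm.fit(delays)                              # Gaussian model parameters
    hist, edges = np.histogram(delays, bins=50, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    spline = UnivariateSpline(centers, hist, s=0.5)           # smoothing-spline approximation

    xs = np.linspace(delays.min(), delays.max(), 200)
    gauss_pdf = norm.pdf(xs, mu, sigma)
    spline_pdf = np.clip(spline(xs), 0.0, None)
    print(f"Gaussian fit: mu={mu:.3f}, sigma={sigma:.3f}")
    print(f"max |spline - Gaussian| on grid: {np.max(np.abs(spline_pdf - gauss_pdf)):.3f}")
    ```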

  1. EVOLVE : a Bridge between Probability, Set Oriented Numerics and Evolutionary Computation

    CERN Document Server

    Tantar, Alexandru-Adrian; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; Coello, Carlos; Schütze, Oliver; EVOLVE 2011

    2013-01-01

    The aim of this book is to provide a strong theoretical support for understanding and analyzing the behavior of evolutionary algorithms, as well as for creating a bridge between probability, set-oriented numerics and evolutionary computation. The volume encloses a collection of contributions that were presented at the EVOLVE 2011 international workshop, held in Luxembourg, May 25-27, 2011, coming from invited speakers and also from selected regular submissions. The aim of EVOLVE is to unify the perspectives offered by probability, set oriented numerics and evolutionary computation. EVOLVE focuses on challenging aspects that arise at the passage from theory to new paradigms and practice, elaborating on the foundations of evolutionary algorithms and theory-inspired methods merged with cutting-edge techniques that ensure performance guarantee factors. EVOLVE is also intended to foster a growing interest for robust and efficient methods with a sound theoretical background. The chapters enclose challenging theoret...

  2. A Unifying Field in Logics: Neutrosophic Logic. Neutrosophy, Neutrosophic Set, Neutrosophic Probability (fifth edition)

    OpenAIRE

    Smarandache, Florentin

    2006-01-01

    The neutrosophy, neutrosophic set, neutrosophic logic, neutrosophic probability, neutrosophic statistics etc. were introduced by Florentin Smarandache in 1995. 1. Neutrosophy is a new branch of philosophy that studies the origin, nature, and scope of neutralities, as well as their interactions with different ideational spectra. This theory considers every notion or idea together with its opposite or negation and with their spectrum of neutralities ...

  3. On the structure of the set of probable Earth-impact trajectories for the asteroid Apophis in 2036

    Science.gov (United States)

    Ivashkin, V. V.; Stikhno, C. A.; Guo, P.

    2017-08-01

    The probable set of Earth-collision trajectories for the asteroid Apophis in 2036 has been determined. The characteristics of these trajectories and the corresponding probable asteroid impact zone on the Earth have been analyzed.

  4. Review of Literature for Model Assisted Probability of Detection

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Ryan M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crawford, Susan L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lareau, John P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Anderson, Michael T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-09-30

    This is a draft technical letter report for the NRC documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components to improve field NDE performance estimations.

  5. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  6. Reach/frequency for printed media: Personal probabilities or models

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    2000-01-01

    The author evaluates two different ways of estimating reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters of a model for reach/frequency. It is concluded that, in order to prevent bias, ratings per group must be used as reading probabilities. Nevertheless, in most cases, the estimates are still biased compared with panel data, thus overestimating net reach. Models with the same assumptions as with assignment of reading probabilities are presented.
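
    A minimal sketch of the first approach described above, with made-up respondent groups, reading probabilities, and insertion counts: reach and frequency are estimated by simulating exposures from group-level reading probabilities.

    ```python
    # Illustrative sketch: assign reading probabilities to respondent groups and
    # estimate reach/frequency of a two-title media plan by simulation.
    # Groups, probabilities, and insertion counts are hypothetical.
    import numpy as np

    rng = np.random.default_rng(2)
    group_probs = {"young": [0.30, 0.10], "middle": [0.20, 0.25], "senior": [0.05, 0.40]}
    group_sizes = {"young": 400, "middle": 350, "senior": 250}
    insertions = [4, 6]                 # insertions bought in each title

    exposures = []
    for group, probs in group_probs.items():
        n = group_sizes[group]
        # Each respondent sees each insertion independently with the group's reading probability.
        per_title = [rng.binomial(k, p, size=n) for k, p in zip(insertions, probs)]
        exposures.append(np.sum(per_title, axis=0))
    exposures = np.concatenate(exposures)

    reach = np.mean(exposures > 0)                      # share exposed at least once
    avg_frequency = exposures[exposures > 0].mean()     # average contacts among the reached
    print(f"simulated reach: {reach:.1%}, average frequency: {avg_frequency:.2f}")
    ```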

  7. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows uncertainty to be modeled by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  8. Conditional Probabilities in the Excursion Set Theory. Generic Barriers and non-Gaussian Initial Conditions

    CERN Document Server

    De Simone, Andrea; Riotto, Antonio

    2011-01-01

    The excursion set theory, where density perturbations evolve stochastically with the smoothing scale, provides a method for computing the dark matter halo mass function. The computation of the mass function is mapped into the so-called first-passage time problem in the presence of a moving barrier. The excursion set theory is also a powerful formalism to study other properties of dark matter halos such as halo bias, accretion rate, formation time, merging rate and the formation history of halos. This is achieved by computing conditional probabilities with non-trivial initial conditions, and the conditional two-barrier first-crossing rate. In this paper we use the recently-developed path integral formulation of the excursion set theory to calculate analytically these conditional probabilities in the presence of a generic moving barrier, including the one describing the ellipsoidal collapse, and for both Gaussian and non-Gaussian initial conditions. The non-Markovianity of the random walks induced by non-Gaussi...

  9. Probable relationship between partitions of the set of codons and the origin of the genetic code.

    Science.gov (United States)

    Salinas, Dino G; Gallardo, Mauricio O; Osorio, Manuel I

    2014-03-01

    Here we study the distribution of randomly generated partitions of the set of amino acid-coding codons. Some results apply findings from a previous work on the Stirling numbers of the second kind and triplet codes, both to the case of triplet codes having four stop codons, as in the mammalian mitochondrial genetic code, and to hypothetical doublet codes. Extending previous results, it is found that the most probable number of blocks of synonymous codons in a genetic code is similar to the number of amino acids when there are four stop codons, as it could also have been for a primigenious doublet code. The integer partitions associated with patterns of synonymous codons are also studied, and it is shown, for the canonical code, that the standard deviation inside an integer partition is one of the most probable. We think that, in some early epoch, the genetic code might have had a maximum of disorder or entropy, independent of the assignment between codons and amino acids, reaching a state similar to the "code freeze" proposed by Francis Crick. In later stages, deterministic rules may have reassigned codons to amino acids, forming the natural codes, such as the canonical code, but keeping the numerical features describing the set partitions and the integer partitions, like "fossil numbers"; both kinds of partitions concern the set of amino acid-coding codons. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
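
    A small sketch of the combinatorial quantity behind these results: Stirling numbers of the second kind count the partitions of the codon set into non-empty blocks of synonymous codons, and the most probable block count under a uniformly random partition maximizes S(n, k) over k. The codon count used below is only illustrative.

    ```python
    # Illustrative sketch: Stirling numbers of the second kind S(n, k) count the ways
    # to partition a set of n codons into k non-empty blocks of synonymous codons.
    # Under a uniformly random partition, the most probable number of blocks
    # maximizes S(n, k) over k for fixed n.
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def stirling2(n: int, k: int) -> int:
        if k == 0:
            return 1 if n == 0 else 0
        if k > n:
            return 0
        # Recurrence: the n-th codon either starts a new block or joins one of k existing blocks.
        return stirling2(n - 1, k - 1) + k * stirling2(n - 1, k)

    n = 60  # e.g. 64 triplets minus 4 stop codons (illustrative choice)
    most_probable_k = max(range(1, n + 1), key=lambda k: stirling2(n, k))
    print(f"k maximizing S({n}, k): {most_probable_k}")
    ```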

  10. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  11. Probability bounds analysis for nonlinear population ecology models.

    Science.gov (United States)

    Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A

    2015-09-01

    Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals.

  12. The Survival Probability in Generalized Poisson Risk Model

    Institute of Scientific and Technical Information of China (English)

    GONG Ri-zhao

    2003-01-01

    In this paper we generalize the aggregate premium income process of the classical compound Poisson risk model from a constant-rate process to a Poisson process. For both the generalized model and the classical compound Poisson risk model, we derive the survival probability over a finite time period in the case of exponential claim amounts.
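
    For reference, the classical compound Poisson model with exponential claim amounts admits a well-known closed-form infinite-horizon survival probability; the sketch below evaluates it. This is not the finite-time expression derived in the paper, and the parameter values are hypothetical.

    ```python
    # Illustrative sketch (classical textbook result, not the paper's finite-time formula):
    # infinite-horizon survival probability for the compound Poisson risk model with
    # exponential claim amounts. Parameter values are hypothetical.
    import math

    lam = 2.0          # Poisson claim arrival rate
    mean_claim = 1.0   # mean claim amount (exponential distribution)
    c = 2.5            # premium income rate, must exceed lam * mean_claim

    def survival_probability(u: float) -> float:
        """phi(u) = 1 - psi(u), with psi(u) the classical ruin probability."""
        beta = 1.0 / mean_claim
        psi = (lam / (c * beta)) * math.exp(-(beta - lam / c) * u)
        return 1.0 - psi

    for u in (0.0, 1.0, 5.0, 10.0):
        print(f"initial surplus u={u:4.1f}: survival probability {survival_probability(u):.4f}")
    ```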

  13. The estimation of yearly probability gain for seismic statistical model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the method for calculating information gain in stochastic processes presented by Vere-Jones, the relation between information gain and probability gain, which is very common in earthquake prediction, is studied, and the yearly probability gain for seismic statistical models is proposed. The method is applied to the non-stationary Poisson model with whole-process exponential increase and to the stress release model. In addition, a prediction method for the stress release model is obtained based on the inverse-function simulation method for stochastic variables.

  14. Selection of probability based weighting models for Boolean retrieval system

    Energy Technology Data Exchange (ETDEWEB)

    Ebinuma, Y. (Japan Atomic Energy Research Inst., Tokai, Ibaraki. Tokai Research Establishment)

    1981-09-01

    Automatic weighting models based on probability theory were studied to determine whether they can be applied to Boolean search logics that include logical sums. The INIS database was used for searches with one particular search formula. Among sixteen models, three with good ranking performance were selected. These three models were then applied to searches with nine search formulas in the same database. It was found that two of them show slightly better average ranking performance, while the other model, the simplest one, also seems practical.

  15. Center—Distance Continuous Probability Models and the Distance Measure

    Institute of Scientific and Technical Information of China (English)

    郑方; 吴文虎; et al.

    1998-01-01

    In this paper, a new statistical model for speech recognition named the Center-Distance Continuous Probability Model (CDCPM) is described, which is based on the Center-Distance Normal (CDN) distribution. In a CDCPM, the probability transition matrix is omitted, and the observation probability density function (PDF) in each state takes the form of an embedded multiple-model (EMM) based on the Nearest Neighbour rule. Experimental results on two giant real-world Chinese speech databases and a real-world continuous-manner 2000-phrase system show that this model is a powerful one. Also, a distance measure for CDCPMs is proposed which is based on Bayesian minimum classification error (MCE) discrimination.

  16. Approximating model probabilities in Bayesian information criterion and decision-theoretic approaches to model selection in phylogenetics.

    Science.gov (United States)

    Evans, Jason; Sullivan, Jack

    2011-01-01

    A priori selection of models for use in phylogeny estimation from molecular sequence data is increasingly important as the number and complexity of available models increases. The Bayesian information criterion (BIC) and the derivative decision-theoretic (DT) approaches rely on a conservative approximation to estimate the posterior probability of a given model. Here, we extended the DT method by using reversible jump Markov chain Monte Carlo approaches to directly estimate model probabilities for an extended candidate pool of all 406 special cases of the general time reversible + Γ family. We analyzed 250 diverse data sets in order to evaluate the effectiveness of the BIC approximation for model selection under the BIC and DT approaches. Model choice under DT differed between the BIC approximation and direct estimation methods for 45% of the data sets (113/250), and differing model choice resulted in significantly different sets of trees in the posterior distributions for 26% of the data sets (64/250). The model with the lowest BIC score differed from the model with the highest posterior probability in 30% of the data sets (76/250). When the data indicate a clear model preference, the BIC approximation works well enough to result in the same model selection as with directly estimated model probabilities, but a substantial proportion of biological data sets lack this characteristic, which leads to selection of underparametrized models.
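
    A minimal sketch of the BIC approximation to posterior model probabilities discussed above, with made-up log-likelihoods, parameter counts, and alignment length.

    ```python
    # Illustrative sketch of the BIC approximation to posterior model probabilities:
    # P(M_i | data) ~ exp(-BIC_i / 2) / sum_j exp(-BIC_j / 2).
    # Log-likelihoods, parameter counts, and sample size below are hypothetical.
    import numpy as np

    log_lik = np.array([-5231.4, -5228.9, -5227.6])    # maximized log-likelihood per model
    n_params = np.array([5, 7, 10])                    # free parameters per model
    n_sites = 1200                                     # alignment length (sample size)

    bic = -2.0 * log_lik + n_params * np.log(n_sites)
    rel = np.exp(-0.5 * (bic - bic.min()))             # shift by the minimum BIC for stability
    approx_posterior = rel / rel.sum()
    for i, (b, p) in enumerate(zip(bic, approx_posterior)):
        print(f"model {i}: BIC={b:.1f}, approximate posterior probability={p:.3f}")
    ```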

  17. Naive Probability: Model-Based Estimates of Unique Events.

    Science.gov (United States)

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning.

  18. Dependent sets of a family of relations of full measure on a probability space

    Institute of Scientific and Technical Information of China (English)

    Jin-cheng XIONG; Feng TAN; Jie LÜ

    2007-01-01

    For a probability space (X, B, μ) a subfamily F of the σ-algebra B is said to be a regular base if every B ∈ B can be arbitrarily approached by some member of F which contains B, in the sense of measure theory. Assume that {R_γ}_{γ∈Γ} is a countable family of relations of full measure on a probability space (X, B, μ), i.e. for every γ ∈ Γ there is a positive integer s_γ such that R_γ ⊆ X^{s_γ} with μ^{s_γ}(R_γ) = 1. In the present paper we show that if (X, B, μ) has a regular base, the cardinality of which is not greater than the cardinality of the continuum, then there exists a set K ⊆ X with μ*(K) = 1 such that (x_1, ..., x_{s_γ}) ∈ R_γ for any γ ∈ Γ and for any s_γ distinct elements x_1, ..., x_{s_γ} of K, where μ* is the outer measure induced by the measure μ. Moreover, an application of the result mentioned above is given to the dynamical systems determined by the iterates of measure-preserving transformations.

  20. Gap probability - Measurements and models of a pecan orchard

    Science.gov (United States)

    Strahler, Alan H.; Li, Xiaowen; Moody, Aaron; Liu, YI

    1992-01-01

    Measurements and models are compared for gap probability in a pecan orchard. Measurements are based on panoramic photographs with a 50° by 135° view angle made under the canopy looking upwards at regular positions along transects between orchard trees. The gap probability model is driven by geometric parameters at two levels: crown and leaf. Crown-level parameters include the shape of the crown envelope and the spacing of crowns; leaf-level parameters include leaf size and shape, leaf area index, and leaf angle, all as functions of canopy position.
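
    For illustration only, a generic turbid-medium gap-probability approximation (not the crown-level geometric-optical model of this record), with hypothetical leaf area index and leaf projection values.

    ```python
    # Illustrative sketch: a simple turbid-medium gap-probability model,
    # P_gap(theta) = exp(-G(theta) * LAI / cos(theta)),
    # where G is the leaf projection function. This is a generic approximation,
    # not the crown-level geometric model used in the record; parameters are made up.
    import math

    def gap_probability(lai: float, view_zenith_deg: float, g: float = 0.5) -> float:
        """G = 0.5 corresponds to a spherical leaf-angle distribution."""
        theta = math.radians(view_zenith_deg)
        return math.exp(-g * lai / math.cos(theta))

    for zenith in (0, 30, 60):
        print(f"LAI=3, view zenith {zenith:2d} deg: P_gap = {gap_probability(3.0, zenith):.3f}")
    ```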

  1. Survival probability for chaotic particles in a set of area preserving maps

    Science.gov (United States)

    de Oliveira, Juliano A.; da Costa, Diogo R.; Leonel, Edson D.

    2016-11-01

    We found critical exponents for the dynamics of an ensemble of particles described by a family of Hamiltonian mappings by using the formalism of escape rates. The mappings are described by a canonical pair of variables, say action J and angle θ, and the corresponding phase spaces show a large chaotic sea surrounding periodic islands and limited by a set of invariant spanning curves. When a hole is introduced in the action variable, the histogram for the frequency of escape of particles grows rapidly until it reaches a maximum and then decreases towards zero for long enough times. The survival probability of the particles as a function of time is measured, and statistical investigations show it is scaling invariant with respect to γ and time for chaotic orbits in the phase space.

  2. Illustrating Probability through Roulette: A Spreadsheet Simulation Model

    Directory of Open Access Journals (Sweden)

    Kala Chand Seal

    2005-11-01

    Teaching probability can be challenging because the mathematical formulas often are too abstract and complex for students to fully grasp the underlying meaning and effect of the concepts. Games can provide a way to address this issue. For example, the game of roulette can be an exciting application for teaching probability concepts. In this paper, we implement a model of roulette in a spreadsheet that can simulate outcomes of various betting strategies. The simulations can be analyzed to gain better insights into the corresponding probability structures. We use the model to simulate a particular betting strategy known as the bet-doubling, or Martingale, strategy. This strategy is quite popular and is often erroneously perceived as a winning strategy even though probability analysis shows that such a perception is incorrect. The simulation allows us to present the true implications of such a strategy for a player with a limited betting budget and relate the results to the underlying theoretical probability structure. The overall validation of the model, its use for teaching, and its application to the analysis of other types of betting strategies are discussed.
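
    A minimal sketch of the same bet-doubling (Martingale) experiment in code rather than a spreadsheet; the wheel odds, budget, and bet sizes are illustrative.

    ```python
    # Illustrative sketch of the Martingale (bet-doubling) strategy on even-money bets
    # with a limited budget, simulated over many sessions. Figures are illustrative.
    import random

    def martingale_session(budget=1000, base_bet=10, spins=200, p_win=18/37, seed=None):
        """Play even-money bets on a European wheel, doubling the stake after each loss."""
        rng = random.Random(seed)
        bankroll, bet = budget, base_bet
        for _ in range(spins):
            if bet > bankroll:            # cannot cover the doubled bet: stop (practical ruin)
                break
            if rng.random() < p_win:
                bankroll += bet
                bet = base_bet            # reset after a win
            else:
                bankroll -= bet
                bet *= 2                  # double after a loss
        return bankroll

    results = [martingale_session(seed=s) for s in range(10_000)]
    losing_share = sum(r < 1000 for r in results) / len(results)
    print(f"mean final bankroll: {sum(results) / len(results):.1f}")
    print(f"share of sessions ending below the starting budget: {losing_share:.1%}")
    ```

    Because the win probability on a European wheel is below one half, the mean final bankroll falls below the starting budget across many sessions, which is consistent with the record's point that the strategy is not a winning one.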

  3. Predicting the Probability of Lightning Occurrence with Generalized Additive Models

    Science.gov (United States)

    Fabsic, Peter; Mayr, Georg; Simon, Thorsten; Zeileis, Achim

    2017-04-01

    This study investigates the predictability of lightning in complex terrain. The main objective is to estimate the probability of lightning occurrence in the Alpine region during summertime afternoons (12-18 UTC) at a spatial resolution of 64 × 64 km2. Lightning observations are obtained from the ALDIS lightning detection network. The probability of lightning occurrence is estimated using generalized additive models (GAM). GAMs provide a flexible modelling framework to estimate the relationship between covariates and the observations. The covariates, besides spatial and temporal effects, include numerous meteorological fields from the ECMWF ensemble system. The optimal model is chosen based on a forward selection procedure with out-of-sample mean squared error as a performance criterion. Our investigation shows that convective precipitation and mid-layer stability are the most influential meteorological predictors. Both exhibit intuitive, non-linear trends: higher values of convective precipitation indicate higher probability of lightning, and large values of the mid-layer stability measure imply low lightning potential. The performance of the model was evaluated against a climatology model containing both spatial and temporal effects. Taking the climatology model as a reference forecast, our model attains a Brier Skill Score of approximately 46%. The model's performance can be further enhanced by incorporating the information about lightning activity from the previous time step, which yields a Brier Skill Score of 48%. These scores show that the method is able to extract valuable information from the ensemble to produce reliable spatial forecasts of the lightning potential in the Alps.

  4. LAPLACE TRANSFORM OF THE SURVIVAL PROBABILITY UNDER SPARRE ANDERSEN MODEL

    Institute of Scientific and Technical Information of China (English)

    Sun Chuanguang

    2007-01-01

    In this paper a class of risk processes in which claims occur as a renewal process is studied. A closed-form expression for the Laplace transform of the survival probability is given when the claim amount distribution is an Erlang distribution or a mixed Erlang distribution. Expressions for the moments of the time to ruin under the above model are also given.

  5. Reach/frequency for printed media: Personal probabilities or models

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    2000-01-01

    The author evaluates two different ways of estimating reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters of a model for reach/frequency. It is concluded...

  6. Computation of Probabilities in Causal Models of History of Science

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2006-12-01

    The aim of this paper is to investigate the ascription of probabilities in a causal model of an episode in the history of science. The aim of such a quantitative approach is to allow the implementation of the causal model in a computer, to run simulations. As an example, we look at the beginning of the science of magnetism, "explaining", in a probabilistic way and in terms of a single causal model, why the field advanced in China but not in Europe (the difference is due to different prior probabilities of certain cultural manifestations). Given the number of years between the occurrences of two causally connected advances X and Y, one proposes a criterion for stipulating the value p(Y|X) of the conditional probability of an advance Y occurring, given X. Next, one must assume a specific form for the cumulative probability function p(Y|X)(t), which we take to be the time integral of an exponential distribution function, as is done in the physics of radioactive decay. Rules for calculating the cumulative functions for more than two events are mentioned, involving composition, disjunction and conjunction of causes. We also consider the problems involved in supposing that the appearance of events in time follows an exponential distribution, which are a consequence of the fact that a composition of causes does not follow an exponential distribution, but a "hypoexponential" one. We suggest that a gamma distribution function might more adequately represent the appearance of advances.

  7. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    (Divoky et al., 2005). Nevertheless, such events occur, and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). A November 2009 fluvial-coastal flooding of Cork City bringing a €100m loss was one such incident. This event was used by Olbert et al. (2015) to determine processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess the combined effect of multiple-source flooding on flood probability and severity in urban areas and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms and thresholds defining flood severity. The two main physical processes controlling urban flooding, high sea water levels (coastal flooding) and high river flows (fluvial flooding), and their threshold values for which flooding is likely to occur, are considered in this study. The contribution of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows, and ultimately the application of the joint probability method to estimate joint exceedance return periods for combinations of surges, tides and river flows. In the second step, a numerical model of Cork Harbour, MSN_Flood, comprising a cascade of four nested high-resolution models, is used to perform simulations of flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al

  8. A propagation model of computer virus with nonlinear vaccination probability

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi

    2014-01-01

    This paper is intended to examine the effect of vaccination on the spread of computer viruses. For that purpose, a novel computer virus propagation model, which incorporates a nonlinear vaccination probability, is proposed. A qualitative analysis of this model reveals that, depending on the value of the basic reproduction number, either the virus-free equilibrium or the viral equilibrium is globally asymptotically stable. The results of simulation experiments not only demonstrate the validity of our model, but also show the effectiveness of nonlinear vaccination strategies. Through parameter analysis, some effective strategies for eradicating viruses are suggested.
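
    A generic sketch (not the paper's equations) of numerically integrating a compartmental virus-propagation model in which the vaccination probability saturates nonlinearly with prevalence; all rates are made up.

    ```python
    # Illustrative sketch: numerical integration of a generic compartmental computer-virus
    # model in which susceptible nodes are vaccinated with a nonlinear (saturating)
    # probability. The equations and parameters are illustrative, not the paper's.
    import numpy as np
    from scipy.integrate import solve_ivp

    beta, gamma = 0.5, 0.2          # infection and cure rates
    v_max, half_sat = 0.3, 0.4      # nonlinear vaccination: v(I) = v_max * I / (half_sat + I)

    def rhs(t, y):
        s, i, p = y                 # susceptible, infected, protected (vaccinated) fractions
        vacc = v_max * i / (half_sat + i)   # vaccination probability rises with prevalence
        ds = -beta * s * i - vacc * s + 0.05 * p
        di = beta * s * i - gamma * i
        dp = vacc * s + gamma * i - 0.05 * p
        return [ds, di, dp]

    sol = solve_ivp(rhs, (0.0, 200.0), [0.99, 0.01, 0.0])
    s_end, i_end, p_end = sol.y[:, -1]
    print(f"long-run fractions: S={s_end:.3f}, I={i_end:.3f}, P={p_end:.3f}")
    ```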

  9. An Integrated Modeling Framework for Probable Maximum Precipitation and Flood

    Science.gov (United States)

    Gangrade, S.; Rastogi, D.; Kao, S. C.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.

    2015-12-01

    With the increasing frequency and magnitude of extreme precipitation and flood events projected in the future climate, there is a strong need to enhance our modeling capabilities to assess the potential risks to critical energy-water infrastructures such as major dams and nuclear power plants. In this study, an integrated modeling framework is developed through high performance computing to investigate the climate change effects on probable maximum precipitation (PMP) and probable maximum flood (PMF). Multiple historical storms from 1981-2012 over the Alabama-Coosa-Tallapoosa River Basin near the Atlanta metropolitan area are simulated by the Weather Research and Forecasting (WRF) model using the Climate Forecast System Reanalysis (CFSR) forcings. After further WRF model tuning, these storms are used to simulate PMP through moisture maximization at initial and lateral boundaries. A high resolution hydrological model, Distributed Hydrology-Soil-Vegetation Model, implemented at 90m resolution and calibrated by the U.S. Geological Survey streamflow observations, is then used to simulate the corresponding PMF. In addition to the control simulation that is driven by CFSR, multiple storms from the Community Climate System Model version 4 under the Representative Concentration Pathway 8.5 emission scenario are used to simulate PMP and PMF in the projected future climate conditions. The multiple PMF scenarios developed through this integrated modeling framework may be utilized to evaluate the vulnerability of existing energy-water infrastructures with respect to various aspects associated with PMP and PMF.

  10. Probability Distribution Function of Passive Scalars in Shell Models

    Institute of Scientific and Technical Information of China (English)

    LIU Chun-Ping; ZHANG Xiao-Qiang; LIU Yu-Rong; WANG Guang-Rui; HE Da-Ren; CHEN Shi-Gang; ZHU Lu-Jin

    2008-01-01

    A shell-model version of the passive scalar problem is introduced, which is inspired by the model of K. Ohkitani and M. Yakhot [K. Ohkitani and M. Yakhot, Phys. Rev. Lett. 60 (1988) 983; K. Ohkitani and M. Yakhot, Prog. Theor. Phys. 81 (1988) 329]. As in the original problem, the prescribed random velocity field is Gaussian and δ correlated in time. The deterministic differential equations are regarded as nonlinear Langevin equations. Then, the Fokker-Planck equations for the PDF of passive scalars are obtained and solved numerically. In the energy input range (n < 5, where n is the shell number), the probability distribution function (PDF) of passive scalars is near the Gaussian distribution. In the inertial range (5 < n < 16) and the dissipation range (n ≥ 17), the PDF of passive scalars shows obvious intermittency, and the scaling power of the passive scalar is anomalous. The results of numerical simulations are compared with experimental measurements.

  11. A model to assess dust explosion occurrence probability.

    Science.gov (United States)

    Hassan, Junaid; Khan, Faisal; Amyotte, Paul; Ferdous, Refaul

    2014-03-15

    Dust handling poses a potential explosion hazard in many industrial facilities. The consequences of a dust explosion are often severe and similar to a gas explosion; however, its occurrence is conditional on the presence of five elements: combustible dust, ignition source, oxidant, mixing and confinement. Dust explosion researchers have conducted experiments to study the characteristics of these elements and generate data on explosibility. These experiments are often costly but the generated data has a significant scope in estimating the probability of a dust explosion occurrence. This paper attempts to use existing information (experimental data) to develop a predictive model to assess the probability of a dust explosion occurrence in a given environment. The proposed model considers six key parameters of a dust explosion: dust particle diameter (PD), minimum ignition energy (MIE), minimum explosible concentration (MEC), minimum ignition temperature (MIT), limiting oxygen concentration (LOC) and explosion pressure (Pmax). A conditional probabilistic approach has been developed and embedded in the proposed model to generate a nomograph for assessing dust explosion occurrence. The generated nomograph provides a quick assessment technique to map the occurrence probability of a dust explosion for a given environment defined with the six parameters.

  12. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
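
    A simplified sketch inspired by the workflow described above: L1-penalized logistic regression assessed by cross-validated AUC, with a permutation test for significance. It uses a single-level cross-validation rather than the repeated double cross-validation of the study, and the synthetic features and labels stand in for real dose/toxicity data.

    ```python
    # Illustrative sketch: L1-penalized (LASSO-type) logistic regression with
    # cross-validated AUC and a permutation test for its significance.
    # Synthetic data; penalty strength and fold counts are arbitrary choices.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score, StratifiedKFold

    rng = np.random.default_rng(3)
    X = rng.normal(size=(150, 20))                       # hypothetical dosimetric features
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=150) > 0).astype(int)  # toxicity label

    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    observed_auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc").mean()

    # Permutation test: refit on label-shuffled data to build the null distribution of AUC.
    null_aucs = []
    for _ in range(200):
        y_perm = rng.permutation(y)
        null_aucs.append(cross_val_score(model, X, y_perm, cv=cv, scoring="roc_auc").mean())
    p_value = np.mean(np.array(null_aucs) >= observed_auc)
    print(f"cross-validated AUC: {observed_auc:.3f}, permutation p-value: {p_value:.3f}")
    ```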

  13. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  14. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
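
    A small sketch of two of the approaches surveyed in this record: maximum-likelihood fitting of a parametric distribution to data, and a simple conjugate Bayesian update of a prior with new observations. The data and prior choices are hypothetical.

    ```python
    # Illustrative sketch: (a) maximum-likelihood fitting of a parametric distribution
    # to data, and (b) a conjugate Bayesian update of a prior when new data arrive.
    # All data and prior choices are hypothetical.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    # (a) Fit a lognormal distribution to hypothetical permeability-like data by MLE.
    data = rng.lognormal(mean=-2.0, sigma=0.8, size=40)
    shape, loc, scale = stats.lognorm.fit(data, floc=0)
    print(f"MLE lognormal fit: sigma={shape:.3f}, median={scale:.4f}")

    # (b) Bayesian updating: normal prior on a log-parameter, updated with new
    # measurements assuming a known measurement standard deviation.
    prior_mean, prior_sd = -2.5, 1.0
    meas_sd = 0.5
    new_obs = np.log(rng.lognormal(mean=-2.0, sigma=0.5, size=5))
    post_var = 1.0 / (1.0 / prior_sd**2 + len(new_obs) / meas_sd**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + new_obs.sum() / meas_sd**2)
    print(f"posterior on log-parameter: mean={post_mean:.3f}, sd={np.sqrt(post_var):.3f}")
    ```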

  15. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
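
    For comparison with the record's discussion, the classical Kolmogorov-Smirnov test against a fully specified density is a one-liner in SciPy; the draws below are synthetic.

    ```python
    # Illustrative sketch: Kolmogorov-Smirnov test of i.i.d. draws against a fully
    # specified distribution (here the standard normal). The draws are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    draws = rng.standard_t(df=3, size=500)       # actually heavy-tailed, not normal

    ks_stat, ks_p = stats.kstest(draws, "norm")  # compare with the standard normal CDF
    print(f"KS statistic: {ks_stat:.3f}, p-value: {ks_p:.4f}")
    # A small p-value is evidence that the draws do not come from the specified density,
    # though, as the paper notes, CDF-based tests can miss discrepancies in regions
    # where the probability density is small.
    ```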

  16. Integer Set Compression and Statistical Modeling

    DEFF Research Database (Denmark)

    Larsson, N. Jesper

    2014-01-01

    Compression of integer sets and sequences has been extensively studied for settings where elements follow a uniform probability distribution. In addition, methods exist that exploit clustering of elements in order to achieve higher compression performance. In this work, we address the case where enumeration of elements may be arbitrary or random, but where statistics is kept in order to estimate probabilities of elements. We present a recursive subset-size encoding method that is able to benefit from statistics, explore the effects of permuting the enumeration order based on element probabilities...

  17. Probability Model for Data Redundancy Detection in Sensor Networks

    Directory of Open Access Journals (Sweden)

    Suman Kumar

    2009-01-01

    Sensor networks are made of autonomous devices that are able to collect, store, process and share data with other devices. Large sensor networks are often redundant in the sense that the measurements of some nodes can be substituted by other nodes with a certain degree of confidence. This spatial correlation results in wastage of link bandwidth and energy. In this paper, a model for two associated Poisson processes, through which sensors are distributed in a plane, is derived. A probability condition is established for data redundancy among closely located sensor nodes. The model generates a spatial bivariate Poisson process whose parameters depend on the parameters of the two individual Poisson processes and on the distance between the associated points. The proposed model helps in building efficient algorithms for data dissemination in the sensor network. A numerical example is provided investigating the advantage of this model.

  18. Spatial occupancy models for large data sets

    Science.gov (United States)

    Johnson, Devin S.; Conn, Paul B.; Hooten, Mevin B.; Ray, Justina C.; Pond, Bruce A.

    2013-01-01

    Since its development, occupancy modeling has become a popular and useful tool for ecologists wishing to learn about the dynamics of species occurrence over time and space. Such models require presence–absence data to be collected at spatially indexed survey units. However, only recently have researchers recognized the need to correct for spatially induced overdispersion by explicitly accounting for spatial autocorrelation in occupancy probability. Previous efforts to incorporate such autocorrelation have largely focused on logit-normal formulations for occupancy, with spatial autocorrelation induced by a random effect within a hierarchical modeling framework. Although useful, computational time generally limits such an approach to relatively small data sets, and there are often problems with algorithm instability, yielding unsatisfactory results. Further, recent research has revealed a hidden form of multicollinearity in such applications, which may lead to parameter bias if not explicitly addressed. Combining several techniques, we present a unifying hierarchical spatial occupancy model specification that is particularly effective over large spatial extents. This approach employs a probit mixture framework for occupancy and can easily accommodate a reduced-dimensional spatial process to resolve issues with multicollinearity and spatial confounding while improving algorithm convergence. Using open-source software, we demonstrate this new model specification using a case study involving occupancy of caribou (Rangifer tarandus) over a set of 1080 survey units spanning a large contiguous region (108 000 km2) in northern Ontario, Canada. Overall, the combination of a more efficient specification and open-source software allows for a facile and stable implementation of spatial occupancy models for large data sets.

  19. Can quantum probability provide a new direction for cognitive modeling?

    Science.gov (United States)

    Pothos, Emmanuel M; Busemeyer, Jerome R

    2013-06-01

    Classical (Bayesian) probability (CP) theory has led to an influential research tradition for modeling cognitive processes. Cognitive scientists have been trained to work with CP principles for so long that it is hard even to imagine alternative ways to formalize probabilities. However, in physics, quantum probability (QP) theory has been the dominant probabilistic approach for nearly 100 years. Could QP theory provide us with any advantages in cognitive modeling as well? Note first that both CP and QP theory share the fundamental assumption that it is possible to model cognition on the basis of formal, probabilistic principles. But why consider a QP approach? The answers are that (1) there are many well-established empirical findings (e.g., from the influential Tversky, Kahneman research tradition) that are hard to reconcile with CP principles; and (2) these same findings have natural and straightforward explanations with quantum principles. In QP theory, probabilistic assessment is often strongly context- and order-dependent, individual states can be superposition states (that are impossible to associate with specific values), and composite systems can be entangled (they cannot be decomposed into their subsystems). All these characteristics appear perplexing from a classical perspective. However, our thesis is that they provide a more accurate and powerful account of certain cognitive processes. We first introduce QP theory and illustrate its application with psychological examples. We then review empirical findings that motivate the use of quantum theory in cognitive theory, but also discuss ways in which QP and CP theories converge. Finally, we consider the implications of a QP theory approach to cognition for human rationality.

  20. Modelling Soft Error Probability in Firmware: A Case Study

    Directory of Open Access Journals (Sweden)

    DG Kourie

    2012-06-01

    This case study involves an analysis of firmware that controls explosions in mining operations. The purpose is to estimate the probability that external disruptive events (such as electro-magnetic interference) could drive the firmware into a state which results in an unintended explosion. Two probabilistic models are built, based on two possible types of disruptive events: a single spike of interference, and a burst of multiple spikes of interference. The models suggest that the system conforms to the IEC 61508 Safety Integrity Levels, even under very conservative assumptions of operation. The case study serves as a platform for future researchers to build on when probabilistically modelling soft errors in other contexts.

  1. Modeling evolution using the probability of fixation: history and implications.

    Science.gov (United States)

    McCandlish, David M; Stoltzfus, Arlin

    2014-09-01

    Many models of evolution calculate the rate of evolution by multiplying the rate at which new mutations originate within a population by a probability of fixation. Here we review the historical origins, contemporary applications, and evolutionary implications of these "origin-fixation" models, which are widely used in evolutionary genetics, molecular evolution, and phylogenetics. Origin-fixation models were first introduced in 1969, in association with an emerging view of "molecular" evolution. Early origin-fixation models were used to calculate an instantaneous rate of evolution across a large number of independently evolving loci; in the 1980s and 1990s, a second wave of origin-fixation models emerged to address a sequence of fixation events at a single locus. Although origin fixation models have been applied to a broad array of problems in contemporary evolutionary research, their rise in popularity has not been accompanied by an increased appreciation of their restrictive assumptions or their distinctive implications. We argue that origin-fixation models constitute a coherent theory of mutation-limited evolution that contrasts sharply with theories of evolution that rely on the presence of standing genetic variation. A major unsolved question in evolutionary biology is the degree to which these models provide an accurate approximation of evolution in natural populations.
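
    A minimal sketch of the basic origin-fixation calculation described above: the substitution rate is the mutation-origination rate multiplied by a fixation probability, here Kimura's diffusion approximation. Population size, mutation rate, and selection coefficients are made up.

    ```python
    # Illustrative sketch of an origin-fixation rate calculation:
    # substitution rate = (rate of origin of new mutations) x (probability of fixation),
    # using Kimura's diffusion approximation for a new semidominant mutation.
    import math

    def fixation_probability(s: float, n: int) -> float:
        """Kimura's fixation probability for a new mutation at initial frequency 1/(2N)."""
        if abs(s) < 1e-12:
            return 1.0 / (2 * n)                      # neutral limit
        return (1.0 - math.exp(-2.0 * s)) / (1.0 - math.exp(-4.0 * n * s))

    n = 10_000          # diploid effective population size (hypothetical)
    mu = 1e-8           # per-site mutation rate per generation (hypothetical)
    for s in (-1e-4, 0.0, 1e-4):
        k = 2 * n * mu * fixation_probability(s, n)   # origin-fixation substitution rate
        print(f"s={s:+.0e}: substitution rate per site per generation = {k:.3e}")
    ```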

  2. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The model selected results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
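
    A small sketch of the modelling idea (not the authors' data or tuning): estimating the probability of groundwater dependence from water table depth and an aridity index with a random forest, using synthetic training data.

    ```python
    # Illustrative sketch: estimate the probability that a location supports a
    # groundwater dependent ecosystem from water table depth and an aridity index,
    # using a random forest. Training data and the generating rule are synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(6)
    n = 1000
    water_table_depth = rng.uniform(0, 100, n)          # metres below surface (hypothetical)
    aridity_index = rng.uniform(0.05, 1.2, n)           # P/PET (hypothetical)
    # Synthetic rule: GDEs more likely where the water table is shallow and the climate is arid.
    p_true = 1 / (1 + np.exp(0.1 * water_table_depth - 2.0 * (1 - aridity_index)))
    is_gde = rng.uniform(size=n) < p_true

    X = np.column_stack([water_table_depth, aridity_index])
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, is_gde)
    new_sites = np.array([[5.0, 0.2], [60.0, 0.9]])      # [depth m, aridity index]
    probs = clf.predict_proba(new_sites)[:, 1]
    for site, p in zip(new_sites, probs):
        print(f"depth={site[0]:.0f} m, aridity={site[1]:.2f}: P(GDE) = {p:.2f}")
    ```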

  3. Recent Advances in Model-Assisted Probability of Detection

    Science.gov (United States)

    Thompson, R. Bruce; Brasche, Lisa J.; Lindgren, Eric; Swindell, Paul; Winfree, William P.

    2009-01-01

    The increased role played by probability of detection (POD) in structural integrity programs, combined with the significant time and cost associated with the purely empirical determination of POD, provides motivation for alternate means to estimate this important metric of NDE techniques. One approach to make the process of POD estimation more efficient is to complement limited empirical experiments with information from physics-based models of the inspection process or controlled laboratory experiments. The Model-Assisted Probability of Detection (MAPOD) Working Group was formed by the Air Force Research Laboratory, the FAA Technical Center, and NASA to explore these possibilities. Since the 2004 inception of the MAPOD Working Group, 11 meetings have been held in conjunction with major NDE conferences. This paper will review the accomplishments of this group, which includes over 90 members from around the world. Included will be a discussion of strategies developed to combine physics-based and empirical understanding, draft protocols that have been developed to guide application of the strategies, and demonstrations that have been or are being carried out in a number of countries. The talk will conclude with a discussion of future directions, which will include documentation of benefits via case studies, development of formal protocols for engineering practice, as well as a number of specific technical issues.

  4. A cellular automata model with probability infection and spatial dispersion

    Institute of Scientific and Technical Information of China (English)

    Jin Zhen; Liu Quan-Xing; Mainul Haque

    2007-01-01

    In this article, we have proposed an epidemic model based on probability cellular automata theory. The essential mathematical features are analysed with the help of stability theory. We have given an alternative modelling approach for the spatiotemporal system which is more realistic from the practical point of view. A discrete and spatiotemporal approach is shown by using cellular automata theory. It is interesting to note that both the size of the endemic equilibrium and the density of the individuals increase with the neighbourhood size and the infection rate, but the infections decrease with an increase of the recovery rate. The stability of the system around the positive interior equilibrium has been shown by using a suitable Lyapunov function. Finally, a simulation against data for the 2003 SARS outbreak in China and a brief discussion are given.

  5. Probability of detection models for eddy current NDE methods

    Energy Technology Data Exchange (ETDEWEB)

    Rajesh, S.N.

    1993-04-30

    The development of probability of detection (POD) models for a variety of nondestructive evaluation (NDE) methods is motivated by a desire to quantify the variability introduced during the process of testing. Sources of variability involved in eddy current methods of NDE include those caused by variations in liftoff, material properties, probe canting angle, scan format, surface roughness and measurement noise. This thesis presents a comprehensive POD model for eddy current NDE. Eddy current methods of nondestructive testing are used widely in industry to inspect a variety of nonferromagnetic and ferromagnetic materials. The development of a comprehensive POD model is therefore of significant importance. The model incorporates several sources of variability characterized by a multivariate Gaussian distribution and employs finite element analysis to predict the signal distribution. The method of mixtures is then used for estimating optimal threshold values. The research demonstrates the use of a finite element model within a probabilistic framework to predict the spread in the measured signal for eddy current nondestructive methods. Using the signal distributions for various flaw sizes, the POD curves for varying defect parameters have been computed. In contrast to experimental POD models, the cost of generating such curves is very low and complex defect shapes can be handled very easily. The results are also operator independent.

  6. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to have an average density of 1 landslide km-2, i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network in a 'T' shape configuration was chosen, with one road of 1 x 4000 cells (5 m x 20 km) joined by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km2 region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area over the 500 iterations (mean ABL) is about 3000 m
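
    A Monte-Carlo iteration of this kind can be sketched in a few lines: landslide areas are drawn from a heavy-tailed distribution, dropped uniformly over the domain, and counted as blockages whenever they overlap the 'T'-shaped road. The area model and geometry below are simplified stand-ins for the study's three-parameter inverse-gamma distribution and cell array, so the numbers are only indicative.

        import numpy as np

        rng = np.random.default_rng(0)
        CELL = 5.0                       # m per cell
        N_CELLS = 4000                   # 20 km x 20 km domain
        N_LANDSLIDES = 400               # ~1 landslide per km^2

        def sample_landslide_areas(n):
            # Placeholder heavy-tailed area model (tail ~ area^-2.4, rollover near
            # 400 m^2), standing in for the study's inverse-gamma pdf.
            return 400.0 * (1.0 + rng.pareto(1.4, size=n))

        def simulate_once():
            # 'T'-shaped road: one 20 km horizontal road plus a 10 km vertical road.
            xs = rng.uniform(0, N_CELLS * CELL, N_LANDSLIDES)
            ys = rng.uniform(0, N_CELLS * CELL, N_LANDSLIDES)
            radii = np.sqrt(sample_landslide_areas(N_LANDSLIDES) / np.pi)
            road_y = N_CELLS * CELL / 2           # horizontal road at mid-height
            road_x = N_CELLS * CELL / 2           # vertical road in the upper half
            hits_h = np.abs(ys - road_y) < radii
            hits_v = (np.abs(xs - road_x) < radii) & (ys > road_y)
            return int((hits_h | hits_v).sum())

        blocks = [simulate_once() for _ in range(500)]
        print("road blockages per event: min", min(blocks), "max", max(blocks))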

  7. CONFIDENCE LOWER LIMITS FOR RESPONSE PROBABILITIES UNDER THE LOGISTIC RESPONSE MODEL

    Institute of Scientific and Technical Information of China (English)

    TIAN Yubin; LI Guoying; YANG Jie

    2004-01-01

    The lower confidence limits for response probabilities based on binary response data under the logistic response model are considered via a saddlepoint approach. A high-order approximation to the conditional distribution of a statistic for the parameter of interest is obtained, and the lower confidence limits of the response probabilities are then derived. A simulation comparing these lower confidence limits with those obtained from asymptotic normality is conducted. The proposed approximation is applied to two real data sets. Numerical results show that the saddlepoint approximations are much more accurate than the asymptotic normality approximations, especially for small or moderate sample sizes.

  8. The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation

    Science.gov (United States)

    Felder, Guido; Zischg, Andreas; Weingartner, Rolf

    2017-07-01

    Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and are therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling when it comes to PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs, the resulting peak discharges as well as the reliability and the plausibility of the estimates are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.

  9. Dark matter halo merger and accretion probabilities in the excursion set formalism

    CERN Document Server

    Alizadeh, Esfandiar

    2008-01-01

    The merger and accretion probabilities of dark matter halos have so far only been calculated for an infinitesimal time interval. This means that a Monte-Carlo simulation with very small time steps is necessary to find the merger history of a parent halo. In this paper we use the random walk formalism to find the merger and accretion probabilities of halos for a finite time interval. Specifically, we find the number density of halos at an early redshift that will become part of a halo with a specified final mass at a later redshift, given that they underwent $n$ major mergers, $n=0,1,2,...$ . We reduce the problem into an integral equation which we then solve numerically. To ensure the consistency of our formalism we compare the results with Monte-Carlo simulations and find very good agreement. Though we have done our calculation assuming a flat barrier, the more general case can easily be handled using our method. This derivation of finite time merger and accretion probabilities can be used to make more effic...

  10. Detecting Gustatory–Olfactory Flavor Mixtures: Models of Probability Summation

    Science.gov (United States)

    Veldhuizen, Maria G.; Shepard, Timothy G.; Shavit, Adam Y.

    2012-01-01

    Odorants and flavorants typically contain many components. It is generally easier to detect multicomponent stimuli than to detect a single component, through either neural integration or probability summation (PS) (or both). PS assumes that the sensory effects of 2 (or more) stimulus components (e.g., gustatory and olfactory components of a flavorant) are detected in statistically independent channels, that each channel makes a separate decision whether a component is detected, and that the behavioral response depends solely on the separate decisions. Models of PS traditionally assume high thresholds for detecting each component, noise being irrelevant. The core assumptions may be adapted, however, to signal-detection theory, where noise limits detection. The present article derives predictions of high-threshold and signal-detection models of independent-decision PS in detecting gustatory–olfactory flavorants, comparing predictions in yes/no and 2-alternative forced-choice tasks using blocked and intermixed stimulus designs. The models also extend to measures of response times to suprathreshold flavorants. Predictions derived from high-threshold and signal-detection models differ markedly. Available empirical evidence on gustatory–olfactory flavor detection suggests that neither the high-threshold nor the signal-detection versions of PS can readily account for the results, which likely reflect neural integration in the flavor system. PMID:22075720
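
    For the high-threshold version of independent-decision PS, the prediction for a two-component flavorant is simply the complement of both channels missing their component; the one-line illustration below uses made-up detection probabilities.

        # High-threshold probability summation for a gustatory-olfactory mixture:
        # with independent component detection probabilities p_g and p_o, the
        # mixture is detected whenever at least one channel detects its component.
        p_g, p_o = 0.4, 0.5        # illustrative component detection probabilities
        p_mix = 1 - (1 - p_g) * (1 - p_o)
        print(p_mix)               # 0.7 > max(p_g, p_o): a mixture advantage from PS alone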

  11. Low-probability flood risk modeling for New York City.

    Science.gov (United States)

    Aerts, Jeroen C J H; Lin, Ning; Botzen, Wouter; Emanuel, Kerry; de Moel, Hans

    2013-05-01

    The devastating impact of Hurricane Sandy (2012) showed again that New York City (NYC) is one of the most vulnerable cities to coastal flooding around the globe. The low-lying areas in NYC can be flooded by nor'easter storms and North Atlantic hurricanes. The few studies that have estimated potential flood damage for NYC base their damage estimates on only a single, or a few, possible flood events. The objective of this study is to assess the full distribution of hurricane flood risk in NYC. This is done by calculating potential flood damage with a flood damage model that uses many possible storms and surge heights as input. These storms are representative of the low-probability/high-impact flood hazard faced by the city. Exceedance probability-loss curves are constructed under different assumptions about the severity of flood damage. The estimated flood damage to buildings for NYC is between US$59 million and US$129 million per year. The damage caused by a 1/100-year storm surge is within a range of US$2 bn-5 bn, while it is between US$5 bn and 11 bn for a 1/500-year storm surge. An analysis of flood risk in each of the five boroughs of NYC finds that Brooklyn and Queens are the most vulnerable to flooding. This study examines several uncertainties in the various steps of the risk analysis, which result in variations in the flood damage estimates. These uncertainties include: the interpolation of flood depths; the use of different flood damage curves; and the influence of the spectra of characteristics of the simulated hurricanes.
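
    Once many storms have been translated into losses, results of this kind are typically summarised by integrating the exceedance probability-loss curve into an expected annual damage. The sketch below shows that calculation with illustrative numbers of roughly the same order of magnitude as those quoted in the abstract; it is not the study's actual curve.

        import numpy as np

        # Illustrative event losses (US$ bn) and their return periods (years).
        return_periods = np.array([10, 50, 100, 500, 1000], dtype=float)
        losses_bn = np.array([0.2, 1.0, 3.0, 8.0, 12.0])
        exceed_prob = 1.0 / return_periods

        # Expected annual damage = area under the loss vs. exceedance-probability curve.
        order = np.argsort(exceed_prob)
        ead_bn = np.trapz(losses_bn[order], exceed_prob[order])
        print(f"expected annual damage ~ US${ead_bn * 1000:.0f} million/year")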

  12. A fault tree model to assess probability of contaminant discharge from shipwrecks.

    Science.gov (United States)

    Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I

    2014-11-15

    Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation.
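
    The core fault tree arithmetic, assuming independent basic events, reduces to combining probabilities through AND and OR gates. The toy tree below uses hypothetical events and values, not the authors' model, to show how an annual discharge probability is assembled.

        def p_or(*ps):
            """Probability that at least one of several independent basic events occurs."""
            out = 1.0
            for p in ps:
                out *= (1.0 - p)
            return 1.0 - out

        def p_and(*ps):
            """Probability that all of several independent basic events occur."""
            out = 1.0
            for p in ps:
                out *= p
            return out

        # Hypothetical annual probabilities of hazardous events for one wreck.
        p_hull_corrosion  = 0.05
        p_physical_impact = 0.01    # e.g. trawling or anchor strike
        p_tank_not_empty  = 0.60
        # Discharge requires an opening (corrosion OR impact) AND remaining substance.
        p_discharge = p_and(p_or(p_hull_corrosion, p_physical_impact), p_tank_not_empty)
        print(f"annual probability of discharge ~ {p_discharge:.3f}")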

  13. Estimation of State Transition Probabilities: A Neural Network Model

    Science.gov (United States)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
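
    A minimal version of such a prediction-error-driven estimator is a delta-rule update that moves each row of the estimated transition matrix toward the observed next state. The sketch below is an illustrative implementation of that idea, not the authors' exact learning rule.

        import numpy as np

        def update(P, s, s_next, lr=0.1):
            """Move row s of the estimated transition matrix P toward the observed
            transition s -> s_next, scaled by the state prediction error."""
            target = np.zeros(P.shape[1])
            target[s_next] = 1.0
            P[s] += lr * (target - P[s])    # prediction error times learning rate
            return P                        # each row stays on the probability simplex

        rng = np.random.default_rng(1)
        true_P = np.array([[0.8, 0.2], [0.3, 0.7]])
        P_hat = np.full((2, 2), 0.5)
        s = 0
        for _ in range(5000):
            s_next = rng.choice(2, p=true_P[s])
            P_hat = update(P_hat, s, s_next)
            s = s_next
        print(np.round(P_hat, 2))   # hovers near true_P; a smaller lr gives a tighter estimate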

  14. A method to calculate coverage probability from uncertainties in radiotherapy via a statistical shape model.

    Science.gov (United States)

    Price, G J; Moore, C J

    2007-04-07

    In this paper we describe a technique that may be used to model the geometric uncertainties that accrue during the radiotherapy process. Using data from in-treatment cone beam CT scans, we simultaneously analyse non-uniform observer delineation variability and organ motion together with patient set-up errors via the creation of a point distribution model (PDM). We introduce a novel method of generating, from this statistical shape model, a coverage probability matrix that may be used to determine treatment margins and calculate uncertainties in dose. The technique does not assume rigid body motion and can extrapolate shape variability in a statistically meaningful manner. In order to construct the PDM, we generate corresponding surface points over a set of delineations. Correspondences are established at a set of points in parameter space on spherically parameterized and canonically aligned outlines. The method is demonstrated using rectal delineations from serially acquired in-treatment cone beam CT image volumes of a prostate patient (44 image volumes in total), each delineated by a minimum of two observers (maximum six). Two PDMs are constructed, one with set-up errors included and one without. We test the normality assumptions of the PDMs and find the distributions to be Gaussian in nature. The rectal PDM variability is in general agreement with data in the literature. The two resultant coverage probability matrices show differences as expected.

  15. Emptiness and depletion formation probability in spin models with inverse square interaction

    Science.gov (United States)

    Franchini, Fabio; Kulkarni, Manas

    2010-02-01

    We calculate the Emptiness Formation Probability (EFP) in the spin-Calogero Model (sCM) and Haldane-Shastry Model (HSM) using their hydrodynamic description. The EFP is the probability that a region of space is completely void of particles in the ground state of a quantum many body system. We calculate this probability in an instanton approach, by considering the more general problem of an arbitrary depletion of particles (DFP). In the limit of a large depletion region, the probability is dominated by a classical configuration in imaginary time that satisfies a set of boundary conditions, and the action calculated on such a solution gives the EFP/DFP with exponential accuracy. We show that the calculation for the sCM can be elegantly performed by representing the gradientless hydrodynamics of spin particles as a sum of two spin-less Calogero collective field theories in auxiliary variables. Interestingly, the result we find for the EFP can be cast in a form reminiscent of spin-charge separation, which should be violated for a non-linear effect such as this. We also highlight the connections between the sCM, the HSM and the λ=2 spin-less Calogero model from an EFP/DFP perspective.

  16. Model and test in a fungus of the probability that beneficial mutations survive drift

    NARCIS (Netherlands)

    Gifford, D.R.; Visser, de J.A.G.M.; Wahl, L.M.

    2013-01-01

    Determining the probability of fixation of beneficial mutations is critically important for building predictive models of adaptive evolution. Despite considerable theoretical work, models of fixation probability have stood untested for nearly a century. However, recent advances in experimental and theoretical techniques permit the development of models with testable predictions.

  18. Inferring tree causal models of cancer progression with probability raising.

    Directory of Open Access Journals (Sweden)

    Loes Olde Loohuis

    Full Text Available Existing techniques to reconstruct tree models of progression for accumulative processes, such as cancer, seek to estimate causation by combining correlation and a frequentist notion of temporal priority. In this paper, we define a novel theoretical framework called CAPRESE (CAncer PRogression Extraction with Single Edges) to reconstruct such models based on the notion of probabilistic causation defined by Suppes. We consider a general reconstruction setting complicated by the presence of noise in the data due to biological variation, as well as experimental or measurement errors. To improve tolerance to noise we define and use a shrinkage-like estimator. We prove the correctness of our algorithm by showing asymptotic convergence to the correct tree under mild constraints on the level of noise. Moreover, on synthetic data, we show that our approach outperforms the state-of-the-art, that it is efficient even with a relatively small number of samples and that its performance quickly converges to its asymptote as the number of samples increases. For real cancer datasets obtained with different technologies, we highlight biologically significant differences in the progressions inferred with respect to other competing techniques and we also show how to validate conjectured biological relations with progression models.

  19. Blocking probability in the hose-model optical VPN with different number of wavelengths

    Science.gov (United States)

    Roslyakov, Alexander V.

    2017-04-01

    Connection setup with guaranteed quality of service (QoS) in optical virtual private networks (OVPN) is a major goal for network providers. In order to support this, we propose a QoS-based OVPN connection set-up mechanism over a WDM network to the end customer. The proposed WDM network model can be specified in terms of a QoS parameter such as blocking probability. We estimate this QoS parameter based on the hose-model OVPN. In this mechanism, OVPN connections can also be created or deleted according to the availability of wavelengths in the optical path. In this paper we consider the impact of the number of wavelengths on the computation of blocking probability. The goal of the work is to dynamically provide the best OVPN connection during frequent arrivals of connection requests with QoS requirements.
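
    The abstract does not give the blocking formula, but a standard way to relate offered load and the number of wavelengths to blocking probability on a link with full wavelength conversion is the Erlang B recursion, sketched below with an assumed offered load; this is an illustrative model, not necessarily the one used in the paper.

        def erlang_b(offered_load, n_channels):
            """Blocking probability for 'offered_load' Erlangs on 'n_channels'
            wavelengths, computed with the standard recursive Erlang B formula."""
            b = 1.0
            for m in range(1, n_channels + 1):
                b = offered_load * b / (m + offered_load * b)
            return b

        for wavelengths in (8, 16, 32, 64):
            print(wavelengths, round(erlang_b(offered_load=20.0, n_channels=wavelengths), 4))
        # Blocking probability falls sharply as the number of wavelengths grows.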

  20. Model and test in a fungus of the probability that beneficial mutations survive drift.

    Science.gov (United States)

    Gifford, Danna R; de Visser, J Arjan G M; Wahl, Lindi M

    2013-02-23

    Determining the probability of fixation of beneficial mutations is critically important for building predictive models of adaptive evolution. Despite considerable theoretical work, models of fixation probability have stood untested for nearly a century. However, recent advances in experimental and theoretical techniques permit the development of models with testable predictions. We developed a new model for the probability of surviving genetic drift, a major component of fixation probability, for novel beneficial mutations in the fungus Aspergillus nidulans, based on the life-history characteristics of its colony growth on a solid surface. We tested the model by measuring the probability of surviving drift in 11 adapted strains introduced into wild-type populations of different densities. We found that the probability of surviving drift increased with mutant invasion fitness, and decreased with wild-type density, as expected. The model accurately predicted the survival probability for the majority of mutants, yielding one of the first direct tests of the extinction probability of beneficial mutations.
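
    The classical route to such a survival probability is to treat the mutant lineage as a branching process and take one minus its extinction probability, the smallest solution of q = G(q) where G is the offspring probability generating function. The sketch below uses a Poisson(1 + s) offspring distribution as an illustrative stand-in for the life-history model developed in the paper.

        import numpy as np

        def survival_probability(s, n_iter=20000):
            """Probability that a mutant with selective advantage s survives drift,
            modelled as a Galton-Watson process with Poisson(1 + s) offspring."""
            q = 0.0                                   # extinction-probability iterate
            for _ in range(n_iter):
                q = np.exp((1.0 + s) * (q - 1.0))     # fixed-point iteration of q = G(q)
            return 1.0 - q

        for s in (0.01, 0.05, 0.1):
            print(s, round(survival_probability(s), 4))
        # Close to Haldane's classical ~2s approximation for small s.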

  1. Modeling Latin-American stock markets volatility: Varying probabilities and mean reversion in a random level shift model

    Directory of Open Access Journals (Sweden)

    Gabriel Rodríguez

    2016-06-01

    Full Text Available Following Xu and Perron (2014), I applied the extended RLS model to the daily stock market returns of Argentina, Brazil, Chile, Mexico and Peru. This model replaces the constant probability of level shifts for the entire sample with varying probabilities that record periods with extremely negative returns. Furthermore, it incorporates a mean reversion mechanism with which the magnitude and the sign of the level shift component vary in accordance with past level shifts that deviate from the long-term mean. Therefore, four RLS models are estimated: the basic RLS, the RLS with varying probabilities, the RLS with mean reversion, and a combined RLS model with mean reversion and varying probabilities. The results show that the estimated parameters are highly significant, especially those of the mean reversion model. An analysis of ARFIMA and GARCH models is also performed in the presence of level shifts, which shows that once these shifts are taken into account in the modeling, the long memory characteristics and GARCH effects disappear. I also find that the predictive performance of the RLS models is superior to that of classic long-memory models such as the ARFIMA(p,d,q), GARCH and FIGARCH models. The evidence indicates that, with rare exceptions, the RLS models (in all their variants) show the best performance or belong to the 10% of the Model Confidence Set (MCS); only occasionally do the GARCH and ARFIMA models appear to dominate. When volatility is measured by squared returns, the great exception is Argentina, where a dominance of GARCH and FIGARCH models is observed.

  2. Spin foam models as energetic causal sets

    CERN Document Server

    Cortês, Marina

    2014-01-01

    Energetic causal sets are causal sets endowed with a flow of energy-momentum between causally related events. These incorporate a novel mechanism for the emergence of space-time from causal relations. Here we construct a spin foam model which is also an energetic causal set model. This model is closely related to the model introduced by Wieland, and this construction makes use of results developed there. What makes a spin foam model also an energetic causal set is Wieland's identification of new momenta, conserved at events (or four-simplices), whose norms are not masses but the volumes of tetrahedra. This realizes the torsion constraints, which are missing in previous spin foam models and are needed to relate the connection dynamics to those of the metric, as in general relativity. This identification makes it possible to apply the new mechanism for the emergence of space-time to a spin foam model.

  3. A New Probability of Detection Model for Updating Crack Distribution of Offshore Structures

    Institute of Scientific and Technical Information of China (English)

    李典庆; 张圣坤; 唐文勇

    2003-01-01

    There exists model uncertainty in the probability of detection when inspecting ship structures with nondestructive inspection techniques. Based on a comparison of several existing probability of detection (POD) models, a new POD model is proposed for updating the crack size distribution. Furthermore, theoretical derivation shows that most existing POD models are special cases of the new model. The least squares method is adopted for determining the values of the parameters in the new POD model, and the new model is compared with other existing POD models. The results indicate that the new POD model fits the inspection data better. The new POD model is then applied to the problem of crack size updating for offshore structures. The Bayesian updating method is used to analyse the effect of POD models on the posterior distribution of crack size. The results show that different POD models generate different posterior distributions of crack size for offshore structures.
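
    The Bayesian updating step itself is compact: after an inspection that reports no detection, the prior crack-size density is reweighted by the probability of missing a crack of each size, 1 - POD(a). The prior and POD curve below are illustrative choices, not the paper's fitted model.

        import numpy as np

        a = np.linspace(0.01, 20.0, 2000)            # crack size grid (mm)
        prior = np.exp(-a / 2.0)                     # illustrative exponential prior
        prior /= np.trapz(prior, a)

        def pod(a, a50=3.0, slope=1.5):
            """Illustrative log-logistic POD curve: 50% detection at a50 mm."""
            return 1.0 / (1.0 + (a50 / a) ** slope)

        # Bayesian updating after an inspection that reported no crack:
        posterior = prior * (1.0 - pod(a))
        posterior /= np.trapz(posterior, a)
        print("prior mean %.2f mm, posterior mean %.2f mm"
              % (np.trapz(a * prior, a), np.trapz(a * posterior, a)))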

  4. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    Science.gov (United States)

    Baer, P.; Mastrandrea, M.

    2006-12-01

    Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question, what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly

  5. An Improved Model of Attack Probability Prediction System

    Institute of Scientific and Technical Information of China (English)

    WANG Hui; LIU Shufen; ZHANG Xinjia

    2006-01-01

    This paper presents a novel probability generation algorithm to predict attacks from an insider who exploits known system vulnerabilities by executing authorized operations. It differs from most intrusion detection systems (IDSs) because such IDSs are ineffective at resolving threats from authorized insiders. To deter cracker activities, this paper introduces an improved structure of the augmented attack tree and the notion of a "minimal attack tree", and proposes a new generation algorithm for minimal attack trees. This provides a quantitative approach to help system administrators make sound decisions.

  6. Rough set models of Physarum machines

    Science.gov (United States)

    Pancerz, Krzysztof; Schumann, Andrew

    2015-04-01

    In this paper, we consider transition system models of the behaviour of Physarum machines in terms of rough set theory. A Physarum machine, a biological computing device implemented in the plasmodium of Physarum polycephalum (true slime mould), is a natural transition system. In the behaviour of Physarum machines, one can notice some ambiguity in the Physarum motions that prevents exact anticipation of the states of the machines in time. To model this ambiguity, we propose to use rough set models created over transition systems. Rough sets are an appropriate tool to deal with rough (ambiguous, imprecise) concepts in the universe of discourse.

  7. Exponential Bounds for Ruin Probability in Two Moving Average Risk Models with Constant Interest Rate

    Institute of Scientific and Technical Information of China (English)

    Ding Jun YAO; Rong Ming WANG

    2008-01-01

    The authors consider two discrete-time insurance risk models. Two moving average risk models are introduced to model the surplus process, and the probabilities of ruin are examined in models with a constant interest force. Exponential bounds for ruin probabilities over an infinite time horizon are derived by the martingale method.

  8. Ruin probability of the renewal model with risky investment and large claims

    Institute of Scientific and Technical Information of China (English)

    WEI Li

    2009-01-01

    The ruin probability of the renewal risk model with investment strategy for a capital market index is investigated in this paper. For claim sizes with common distribution of extended regular variation, we study the asymptotic behaviour of the ruin probability. As a corollary, we establish a simple asymptotic formula for the ruin probability for the case of Pareto-like claims.

  9. Joint Probability Models of Radiology Images and Clinical Annotations

    Science.gov (United States)

    Arnold, Corey Wells

    2009-01-01

    Radiology data, in the form of images and reports, is growing at a high rate due to the introduction of new imaging modalities, new uses of existing modalities, and the growing importance of objective image information in the diagnosis and treatment of patients. This increase has resulted in an enormous set of image data that is richly annotated…

  11. The coupon collector urn model with unequal probabilities in ecology and evolution.

    Science.gov (United States)

    Zoroa, N; Lesigne, E; Fernández-Sáez, M J; Zoroa, P; Casas, J

    2017-02-01

    The sequential sampling of populations with unequal probabilities and with replacement in a closed population is a recurrent problem in ecology and evolution. Examples range from biodiversity sampling and epidemiology to the estimation of signal repertoire in animal communication. Many of these questions can be reformulated as urn problems, often as special cases of the coupon collector problem, most simply expressed as the number of coupons that must be collected to have a complete set. We aimed to apply the coupon collector model in a comprehensive manner to one example, hosts (balls) being searched (draws) and parasitized (ball colour change) by parasitic wasps, to evaluate the influence of differences in sampling probabilities between items on collection speed. Based on the model of a complete multinomial process over time, we define the distribution, distribution function, expectation and variance of the number of hosts parasitized after a given time, as well as the inverse problem, estimating the sampling effort. We develop the relationship between the risk distribution on the set of hosts and the speed of parasitization and propose a more elegant proof of the weak stochastic dominance among speeds of parasitization, using the concept of Schur convexity and the 'Robin Hood transfer' numerical operation. Numerical examples are provided and a conjecture about strong dominance (an ordering characteristic of random variables) is proposed. The speed at which new items are discovered is a function of the entire shape of the sampling probability distribution. The sole comparison of values of variances is not sufficient to compare speeds associated with different distributions, as generally assumed in ecological studies. © 2017 The Author(s).
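
    For sampling with replacement, the expected number of draws needed to complete the set can be written as an integral over the probability that at least one item is still missing. The sketch below evaluates this classical representation numerically (using SciPy quadrature) for an equal and an unequal probability vector, showing how heterogeneity slows collection; the probability vectors are arbitrary illustrations.

        import numpy as np
        from scipy import integrate

        def expected_draws(p):
            """Expected number of draws (with replacement) until every item with
            sampling probabilities p has been drawn at least once, using the
            classical integral representation of the coupon collector problem."""
            p = np.asarray(p, dtype=float)
            f = lambda t: 1.0 - np.prod(1.0 - np.exp(-p * t))
            value, _ = integrate.quad(f, 0.0, np.inf)
            return value

        equal   = np.full(10, 0.1)
        unequal = np.array([0.28, 0.18, 0.14, 0.10, 0.08, 0.07, 0.06, 0.04, 0.03, 0.02])
        print(expected_draws(equal), expected_draws(unequal))
        # Unequal sampling probabilities slow down completion of the set.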

  12. A methodological proposal to research patients’ demands and pre-test probabilities using paper forms in primary care settings

    Directory of Open Access Journals (Sweden)

    Gustavo Diniz Ferreira Gusso

    2013-04-01

    Full Text Available Objective: The purpose of this study is to present a methodology for assessing patients' demands and calculating pre-test probabilities using paper forms in Primary Care. Method: Most developing countries do not use Electronic Health Records (EHR) in primary care settings. This makes it difficult to access information regarding what occurs within the health center's working process. Basically, there are two methodologies to assess patients' demands and the problems or diagnoses stated by doctors. The first is based on single attendance at each appointment, while the second is based on episodes of care; the latter deals with each problem in a longitudinal manner. The methodology developed in this article followed the approach of confronting the 'reason for the appointment' and 'the problem registered' by doctors. Paper forms were developed taking this concept as central. All appointments were classified by the International Classification of Primary Care (ICPC). Discussion: Even on paper forms, confrontation between 'reason for the appointment' and 'problem registered' is useful for measuring the pre-test probabilities of each problem-based appointment. This approach can be easily reproduced in any health center and enables a better understanding of the population profile. The prevalence of many illnesses and diseases is not known in each setting, and studies conducted in other settings, such as secondary and tertiary care, are not adequate for primary health care. Conclusion: This study offers adequate technology for primary health care workers with the potential to transform each health center into a research-led practice, contributing directly to patient care.

  13. Estimation and asymptotic theory for transition probabilities in Markov Renewal Multi–state models

    NARCIS (Netherlands)

    Spitoni, C.; Verduijn, M.; Putter, H.

    2012-01-01

    In this paper we discuss estimation of transition probabilities for semi-Markov multi-state models. Non-parametric and semi-parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional delta method.

  14. The Probability Distribution Model of Wind Speed over East Malaysia

    Directory of Open Access Journals (Sweden)

    Nurulkamal Masseran

    2013-07-01

    Full Text Available Many studies have found that wind speed is the most significant parameter of wind power. Thus, an accurate determination of the probability distribution of wind speed is an important step before estimating the wind energy potential over a particular region. Utilizing an accurate distribution will minimize the uncertainty in wind resource estimates and improve the site assessment phase of planning. In general, different regions have different wind regimes. Hence, it is reasonable that different wind distributions will be found for different regions. Because it is reasonable to consider that wind regimes vary according to the region of a particular country, nine different statistical distributions have been fitted to the mean hourly wind speed data from 20 wind stations in East Malaysia, for the period from 2000 to 2009. The values of the Kolmogorov-Smirnov statistic, Akaike's Information Criterion, the Bayesian Information Criterion and the R2 correlation coefficient were compared across the distributions to determine the best fit for describing the observed data. A good fit for most of the stations in East Malaysia was found using the Gamma and Burr distributions, though no clear pattern was observed for all regions in East Malaysia. However, the Gamma distribution was a clear fit to the data from all stations in southern Sabah.
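
    The model-selection step described here can be reproduced with standard tools: fit each candidate distribution by maximum likelihood and compare AIC and Kolmogorov-Smirnov statistics. The snippet below does this for three of the candidates on synthetic wind speeds (the station data are not reproduced here), so it is a procedural sketch rather than the study's computation.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        wind = rng.gamma(shape=2.0, scale=2.5, size=5000)   # synthetic hourly means (m/s)

        candidates = {"weibull": stats.weibull_min, "gamma": stats.gamma, "burr": stats.burr12}
        for name, dist in candidates.items():
            params = dist.fit(wind, floc=0)                 # fix the location at zero
            k = len(params) - 1                             # fitted parameters (loc is fixed)
            loglik = np.sum(dist.logpdf(wind, *params))
            aic = 2 * k - 2 * loglik
            ks = stats.kstest(wind, dist.cdf, args=params).statistic
            print(f"{name:8s}  AIC={aic:10.1f}  KS={ks:.4f}")
        # The distribution with the lowest AIC / KS statistic is retained for the site.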

  15. Likelihood analysis of species occurrence probability from presence-only data for modelling species distributions

    Science.gov (United States)

    Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.

    2012-01-01

    1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient R package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities

  16. Modeling the Spectra of Dense Hydrogen Plasmas: Beyond Occupation Probability

    CERN Document Server

    Gomez, T A; Nagayama, T; Kilcrease, D P; Winget, D E

    2016-01-01

    Accurately measuring the masses of white dwarf stars is crucial in many astrophysical contexts (e.g., asteroseismology and cosmochronology). These masses are most commonly determined by fitting a model atmosphere to an observed spectrum; this is known as the spectroscopic method. However, for cases in which more than one method may be employed, there are well known discrepancies between masses determined by the spectroscopic method and those determined by astrometric, dynamical, and/or gravitational-redshift methods. In an effort to resolve these discrepancies, we are developing a new model of hydrogen in a dense plasma that is a significant departure from previous models. Experiments at Sandia National Laboratories are currently underway to validate these new models, and we have begun modifications to incorporate these models into stellar-atmosphere codes.

  17. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models ch...

  18. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD; PREFACE; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  19. Modelling soft error probability in firmware: A case study

    African Journals Online (AJOL)

    A rough and notional schematic of the components involved are supplied in ..... To date, this claim of potential electromagnetic interference is entirely a ... single spike case will illuminate the probabilistic model needed for the bursty case. For.

  20. Modeling the probability of giving birth at health institutions among ...

    African Journals Online (AJOL)

    2014-06-02

    Jun 2, 2014 ... and predict health behaviors in terms of health beliefs. .... Current pregnancy planned (yes*/no). -0.59. 0.49 .... that the last model's overall explanatory strength was .... Health Education: Theory, Research and Practice-3rd.

  1. Aircraft detection based on probability model of structural elements

    Science.gov (United States)

    Chen, Long; Jiang, Zhiguo

    2014-11-01

    Detecting aircraft is important in the field of remote sensing. In past decades, researchers used various approaches to detect aircraft based on classifiers for the aircraft as a whole. However, with the development of high-resolution images, the internal structures of aircraft should now also be taken into consideration. To address this issue, a novel aircraft detection method for satellite images based on a probabilistic topic model is presented. We model aircraft as connected structural elements rather than holistic features. The proposed method contains two major steps: 1) a Cascade-AdaBoost classifier is first used to identify the structural elements of the aircraft; 2) these structural elements are then connected into aircraft, where the relationships between elements are estimated by a hierarchical topic model. The model places strict spatial constraints on structural elements, which makes it possible to distinguish between similar features. The experimental results demonstrate the effectiveness of the approach.

  2. Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.

    Directory of Open Access Journals (Sweden)

    Richard R Stein

    2015-07-01

    Full Text Available Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models in two categories of data types, those with continuous and categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.

  3. Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.

    Science.gov (United States)

    Stein, Richard R; Marks, Debora S; Sander, Chris

    2015-07-01

    Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models in two categories of data types, those with continuous and categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.
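
    For continuous variables, the pairwise maximum-entropy model is a multivariate Gaussian, and the interaction couplings are (up to sign) the off-diagonal entries of the inverse covariance matrix. The toy example below, with synthetic data and a single chain of direct interactions, illustrates how this separates direct couplings from indirect correlations; it is a didactic sketch, not the full inference machinery reviewed in the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic "expression" data with direct interactions x0-x1 and x1-x2; the
        # chain also induces an indirect x0-x2 correlation.
        n = 5000
        x0 = rng.normal(size=n)
        x1 = 0.8 * x0 + 0.6 * rng.normal(size=n)
        x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)
        X = np.column_stack([x0, x1, x2])

        C = np.cov(X, rowvar=False)
        J = -np.linalg.inv(C)              # couplings of the Gaussian max-entropy model
        np.fill_diagonal(J, 0.0)

        print("correlations:\n", np.round(np.corrcoef(X, rowvar=False), 2))
        print("couplings:\n", np.round(J, 2))
        # The indirect 0-2 correlation is large, yet the inferred direct coupling
        # J[0, 2] is near zero -- the point of direct-coupling (max-entropy) inference.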

  4. Probability distribution analysis of observational extreme events and model evaluation

    Science.gov (United States)

    Yu, Q.; Lau, A. K. H.; Fung, J. C. H.; Tsang, K. T.

    2016-12-01

    Earth's surface temperatures in 2015 were the warmest since modern record-keeping began in 1880, according to the latest study. In contrast, cold weather occurred in many regions of China in January 2016 and brought the first snowfall in 67 years to Guangzhou, the capital city of Guangdong province. To understand changes in extreme weather events as well as project their future scenarios, this study uses statistical models to analyse multiple climate data sets. We first use the Granger-causality test to identify the attribution of global mean temperature rise and extreme temperature events to CO2 concentration. The four statistical moments (mean, variance, skewness, kurtosis) of the daily maximum temperature distribution are investigated for global climate observational and reanalysis data (1961-2010) and model data (1961-2100). Furthermore, we introduce a new tail index based on the four moments, which is a more robust index for measuring extreme temperatures. Our results show that the CO2 concentration can provide information about the time series of mean and extreme temperature, but not vice versa. Based on our new tail index, we find that, in addition to the mean and variance, skewness is an important indicator that should be considered when estimating extreme temperature changes and evaluating models. Among the 12 climate model data sets we investigate, the fourth version of the Community Climate System Model (CCSM4) from the National Center for Atmospheric Research performs well on the new index we introduce, which indicates that the model has a substantial capability to project future changes of extreme temperature in the 21st century. The method also shows its ability to measure extreme precipitation/drought events. In the future we will introduce a new diagram to systematically evaluate the performance of the four statistical moments in climate model output; moreover, the human and economic impacts of extreme weather events will also be assessed.

  5. Discrete probability models and methods probability on graphs and trees, Markov chains and random fields, entropy and coding

    CERN Document Server

    Brémaud, Pierre

    2017-01-01

    The emphasis in this book is placed on general models (Markov chains, random fields, random graphs), universal methods (the probabilistic method, the coupling method, the Stein-Chen method, martingale methods, the method of types) and versatile tools (Chernoff's bound, Hoeffding's inequality, Holley's inequality) whose domain of application extends far beyond the present text. Although the examples treated in the book relate to the possible applications, in the communication and computing sciences, in operations research and in physics, this book is in the first instance concerned with theory. The level of the book is that of a beginning graduate course. It is self-contained, the prerequisites consisting merely of basic calculus (series) and basic linear algebra (matrices). The reader is not assumed to be trained in probability since the first chapters give in considerable detail the background necessary to understand the rest of the book.

  6. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  7. The effect of interruption probability in lattice model of two-lane traffic flow with passing

    Science.gov (United States)

    Peng, Guanghan

    2016-11-01

    A new lattice model is proposed by taking into account the interruption probability with passing for a two-lane freeway. The effect of the interruption probability with passing on the linear stability condition and the mKdV equation is investigated through linear stability analysis and nonlinear analysis, respectively. Furthermore, numerical simulation is carried out to study the traffic phenomena resulting from the interruption probability with passing in a two-lane system. The results show that the interruption probability with passing can improve the stability of traffic flow for a low reaction coefficient, while it can destroy the stability of traffic flow for a high reaction coefficient on a two-lane highway.

  8. Finite Time Ruin Probabilities and Large Deviations for Generalized Compound Binomial Risk Models

    Institute of Scientific and Technical Information of China (English)

    Yi Jun HU

    2005-01-01

    In this paper, we extend the classical compound binomial risk model to the case where the premium income process is based on a Poisson process, and is no longer a linear function. For this more realistic risk model, Lundberg type limiting results for the finite time ruin probabilities are derived. Asymptotic behavior of the tail probabilities of the claim surplus process is also investigated.

  9. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research amounts to a search for a basic probability model.

  10. Modelling occupants’ heating set-point prefferences

    DEFF Research Database (Denmark)

    Andersen, Rune Vinther; Olesen, Bjarne W.; Toftum, Jørn

    2011-01-01

    Discrepancies between simulated and actual occupant behaviour can offset the actual energy consumption by several orders of magnitude compared to simulation results. Thus, there is a need to set up guidelines to increase the reliability of forecasts of environmental conditions and energy consumption. Simultaneous measurement of the set-point of thermostatic radiator valves (trv), and indoor and outdoor environment characteristics was carried out in 15 dwellings in Denmark in 2008. Linear regression was used to infer a model of occupants' interactions with trvs. This model could easily...

  11. Numerical method of slope failure probability based on Bishop model

    Institute of Scientific and Technical Information of China (English)

    SU Yong-hua; ZHAO Ming-hua; ZHANG Yue-ying

    2008-01-01

    Based on Bishop's model and by applying the first- and second-order mean deviation method, an approximate method for evaluating the first- and second-order partial derivatives of the performance function was deduced according to numerical analysis theory. After the complicated implicit function of multiple independent variables was simplified to an implicit function of a single independent variable, and the rule for differentiating composite functions was combined with the principle of the mean deviation method, an approximate solution scheme for the implicit performance function was established through a Taylor series expansion, and an iterative procedure for the reliability index was given at the same time. An engineering example was analysed by the method. The result shows that its absolute error is only 0.78% compared with the exact solution.
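
    The general first-order (mean-value) recipe behind such a reliability calculation can be sketched as follows: approximate the mean and standard deviation of the performance function from its numerically estimated partial derivatives at the mean point, and form the reliability index as their ratio. The limit-state function and parameter values below are hypothetical placeholders, not the Bishop-model implementation of the paper.

        import numpy as np
        from scipy.stats import norm

        def g(x):
            """Hypothetical limit-state (performance) function: g > 0 means the slope is safe."""
            cohesion, tan_phi, weight = x
            return (30.0 * cohesion + 0.5 * weight * tan_phi) / (0.4 * weight) - 1.0

        mu    = np.array([25.0, 0.45, 400.0])     # means of the random variables
        sigma = np.array([5.0, 0.05, 40.0])       # standard deviations (assumed independent)

        # First-order approximation: numerical partial derivatives at the mean point.
        eps = 1e-6
        grad = np.array([(g(mu + eps * np.eye(3)[i]) - g(mu - eps * np.eye(3)[i])) / (2 * eps)
                         for i in range(3)])
        mean_g = g(mu)
        std_g = np.sqrt(np.sum((grad * sigma) ** 2))
        beta = mean_g / std_g                     # reliability index
        print(f"reliability index beta = {beta:.2f}, "
              f"failure probability ~ {norm.cdf(-beta):.2e}")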

  12. Local asymptotic behavior of the survival probability of the equilibrium renewal model with heavy tails

    Institute of Scientific and Technical Information of China (English)

    JIANG; Tao; CHEN; Yiqing

    2005-01-01

    Recently, Tang established a local asymptotic relation for the ruin probability in the Cramér-Lundberg risk model. In this short note we extend the corresponding result to the equilibrium renewal risk model.

  13. DEVELOPMENT OF A CRASH RISK PROBABILITY MODEL FOR FREEWAYS BASED ON HAZARD PREDICTION INDEX

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2014-12-01

    Full Text Available This study presents a method for the identification of hazardous situations on freeways. The hazard identification is done using a crash risk probability model. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne (Australia) is selected as a test bed. Two categories of data, i.e. traffic and accident record data, are used for the analysis and modelling. In developing the crash risk probability model, a Hazard Prediction Index is formulated from the differences between traffic parameters and threshold values. Seven different prediction indices are examined and the best one is selected as the crash risk probability model based on prediction error minimisation.

  14. Ruin probability of the renewal model with risky investment and large claims

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The ruin probability of the renewal risk model with investment strategy for a capital market index is investigated in this paper. For claim sizes with common distribution of extended regular variation, we study the asymptotic behaviour of the ruin probability. As a corollary, we establish a simple asymptotic formula for the ruin probability for the case of Pareto-like claims.

  15. THE TRANSITION PROBABILITY MATRIX OF A MARKOV CHAIN MODEL IN AN ATM NETWORK

    Institute of Scientific and Technical Information of China (English)

    YUE Dequan; ZHANG Huachen; TU Fengsheng

    2003-01-01

    In this paper we consider a Markov chain model in an ATM network, which has been studied by Dag and Stavrakakis. On the basis of the iterative formulas obtained by Dag and Stavrakakis, we obtain the explicit analytical expression of the transition probability matrix. It is very simple to calculate the transition probabilities of the Markov chain by these expressions. In addition, we obtain some results about the structure of the transition probability matrix, which are helpful in numerical calculation and theoretical analysis.
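
    Once the transition probability matrix is available in closed form, quantities such as n-step transition probabilities and the stationary distribution follow from standard linear algebra; the small matrix below is purely illustrative and is not the ATM model's actual matrix.

        import numpy as np

        # A small illustrative transition probability matrix (rows sum to one).
        P = np.array([[0.90, 0.10, 0.00],
                      [0.20, 0.70, 0.10],
                      [0.00, 0.30, 0.70]])

        P10 = np.linalg.matrix_power(P, 10)          # 10-step transition probabilities
        eigvals, eigvecs = np.linalg.eig(P.T)
        pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        pi /= pi.sum()                               # stationary distribution
        print(np.round(P10, 3))
        print(np.round(pi, 3))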

  16. General continuous-time Markov model of sequence evolution via insertions/deletions: are alignment probabilities factorable?

    Science.gov (United States)

    Ezawa, Kiyoshi

    2016-08-11

    Insertions and deletions (indels) account for more nucleotide differences between two related DNA sequences than substitutions do, and thus it is imperative to develop a stochastic evolutionary model that enables us to reliably calculate the probability of the sequence evolution through indel processes. Recently, indel probabilistic models are mostly based on either hidden Markov models (HMMs) or transducer theories, both of which give the indel component of the probability of a given sequence alignment as a product of either probabilities of column-to-column transitions or block-wise contributions along the alignment. However, it is not a priori clear how these models are related with any genuine stochastic evolutionary model, which describes the stochastic evolution of an entire sequence along the time-axis. Moreover, currently none of these models can fully accommodate biologically realistic features, such as overlapping indels, power-law indel-length distributions, and indel rate variation across regions. Here, we theoretically dissect the ab initio calculation of the probability of a given sequence alignment under a genuine stochastic evolutionary model, more specifically, a general continuous-time Markov model of the evolution of an entire sequence via insertions and deletions. Our model is a simple extension of the general "substitution/insertion/deletion (SID) model". Using the operator representation of indels and the technique of time-dependent perturbation theory, we express the ab initio probability as a summation over all alignment-consistent indel histories. Exploiting the equivalence relations between different indel histories, we find a "sufficient and nearly necessary" set of conditions under which the probability can be factorized into the product of an overall factor and the contributions from regions separated by gapless columns of the alignment, thus providing a sort of generalized HMM. The conditions distinguish evolutionary models with

  17. Time series modeling of pathogen-specific disease probabilities with subsampled data.

    Science.gov (United States)

    Fisher, Leigh; Wakefield, Jon; Bauer, Cici; Self, Steve

    2017-03-01

    Many diseases arise due to exposure to one of multiple possible pathogens. We consider the situation in which disease counts are available over time from a study region, along with a measure of clinical disease severity, for example, mild or severe. In addition, we suppose a subset of the cases are lab tested in order to determine the pathogen responsible for disease. In such a context, we focus interest on modeling the probabilities of disease incidence given pathogen type. The time course of these probabilities is of great interest, as is the association with time-varying covariates such as meteorological variables. In this setup, a natural Bayesian approach would be based on imputation of the unsampled pathogen information using Markov chain Monte Carlo, but this is computationally challenging. We describe a practical approach to inference that is easy to implement. We use an empirical Bayes procedure in a first step to estimate summary statistics. We then treat these summary statistics as the observed data and develop a Bayesian generalized additive model. We analyze data on hand, foot, and mouth disease (HFMD) in China in which there are two pathogens of primary interest, enterovirus 71 (EV71) and Coxsackie A16 (CA16). We find that both EV71 and CA16 are associated with temperature, relative humidity, and wind speed, with reasonably similar functional forms for both pathogens. The important issue of confounding by time is modeled using a penalized B-spline model with a random effects representation. The level of smoothing is addressed by a careful choice of the prior on the tuning variance. © 2016, The International Biometric Society.

  18. Exact probability distribution for the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise

    Science.gov (United States)

    Calisto, H.; Bologna, M.

    2007-05-01

    We report an exact result for the calculation of the probability distribution of the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise. We study the conditions under which the probability distribution of the Malthus-Verhulst model can exhibit a transition from a unimodal to a bimodal distribution depending on the value of a critical parameter. We also show that the mean value of x(t) in the latter model always approaches the value 1 asymptotically.

  19. Cold and hot cognition: quantum probability theory and realistic psychological modeling.

    Science.gov (United States)

    Corr, Philip J

    2013-06-01

    Typically, human decision making is emotionally "hot" and does not conform to "cold" classical probability (CP) theory. As quantum probability (QP) theory emphasises order, context, superimposition states, and nonlinear dynamic effects, one of its major strengths may be its power to unify formal modeling and realistic psychological theory (e.g., information uncertainty, anxiety, and indecision, as seen in the Prisoner's Dilemma).

  20. The ruin probability of a discrete time risk model under constant interest rate with heavy tails

    NARCIS (Netherlands)

    Tang, Q.

    2004-01-01

    This paper investigates the ultimate ruin probability of a discrete time risk model with a positive constant interest rate. Under the assumption that the gross loss of the company within one year is subexponentially distributed, a simple asymptotic relation for the ruin probability is derived and co

  1. A market model: uncertainty and reachable sets

    Directory of Open Access Journals (Sweden)

    Raczynski Stanislaw

    2015-01-01

    Full Text Available Uncertain parameters are always present in models that include the human factor. In marketing, uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables. Our approach is quite different. The dynamic market with uncertain parameters is treated using differential inclusions, which makes it possible to determine the corresponding reachable sets. This is not a statistical analysis. We are looking for solutions to the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits for the important marketing variables. The modeling method consists of defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As the result we obtain images of the reachable sets where the main control parameter is the share of investment, being a part of the revenue. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.

  2. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  3. Bivariate categorical data analysis using normal linear conditional multinomial probability model.

    Science.gov (United States)

    Sun, Bingrui; Sutradhar, Brajendra

    2015-02-10

    Bivariate multinomial data such as the left and right eyes retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretations of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data.

  4. Simulation modeling of the probability of magmatic disruption of the potential Yucca Mountain Site

    Energy Technology Data Exchange (ETDEWEB)

    Crowe, B.M.; Perry, F.V.; Valentine, G.A. [Los Alamos National Lab., NM (United States); Wallmann, P.C.; Kossik, R. [Golder Associates, Inc., Redmond, WA (United States)

    1993-11-01

    The first phase of risk simulation modeling was completed for the probability of magmatic disruption of a potential repository at Yucca Mountain. E1, the recurrence rate of volcanic events, is modeled using bounds from active basaltic volcanic fields and midpoint estimates of E1. The cumulative probability curves for E1 are generated by simulation modeling using a form of a triangular distribution. The 50% estimates are about 5 to 8 × 10⁻⁶ events yr⁻¹. The simulation modeling shows that the cumulative probability distribution for E1 is more sensitive to the probability bounds than to the midpoint estimates. The E2 (disruption probability) is modeled through risk simulation using a normal distribution and midpoint estimates from multiple alternative stochastic and structural models. The 50% estimate of E2 is 4.3 × 10⁻³. The probability of magmatic disruption of the potential Yucca Mountain site is 2.5 × 10⁻⁸ yr⁻¹. This median estimate decreases to 9.6 × 10⁻⁹ yr⁻¹ if E1 is modified for the structural models used to define E2. The Repository Integration Program was tested to compare releases of a simulated repository (without volcanic events) to releases from time histories which may include volcanic disruptive events. Results show that the performance modeling can be used for sensitivity studies of volcanic effects.
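
    A minimal sketch of the kind of E1 × E2 risk simulation described above, assuming a triangular distribution for the recurrence rate E1 and a normal distribution for the disruption probability E2; the distribution bounds below are placeholders, not the values used in the study.

```python
# Illustrative Monte Carlo in the spirit of the E1 x E2 simulation described
# above; the distribution bounds are placeholders, not the study's values.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# E1: volcanic event recurrence rate (events/yr), triangular distribution
# between assumed lower/upper bounds with a midpoint mode.
e1 = rng.triangular(left=1e-6, mode=5e-6, right=1e-5, size=n)

# E2: probability of repository disruption given an event, normal distribution
# with negative draws clipped to zero.
e2 = np.clip(rng.normal(loc=4e-3, scale=1e-3, size=n), 0.0, None)

disruption_rate = e1 * e2          # annual probability of magmatic disruption
print("median:", np.median(disruption_rate))
print("5%-95%:", np.percentile(disruption_rate, [5, 95]))
```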

  5. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  6. Visualizing a Dusty Plasma Shock Wave via Interacting Multiple-Model Mode Probabilities

    OpenAIRE

    Oxtoby, Neil P.; Ralph, Jason F.; Durniak, Céline; Samsonov, Dmitry

    2011-01-01

    Particles in a dusty plasma crystal disturbed by a shock wave are tracked using a three-mode interacting multiple model approach. Color-coded mode probabilities are used to visualize the shock wave propagation through the crystal.

  7. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  8. Towards smart prosthetic hand: Adaptive probability based skeletal muscle fatigue model.

    Science.gov (United States)

    Kumar, Parmod; Sebastian, Anish; Potluri, Chandrasekhar; Urfer, Alex; Naidu, D; Schoen, Marco P

    2010-01-01

    Skeletal muscle force can be estimated using surface electromyographic (sEMG) signals. Usually, the surface location for the sensors is near the respective muscle motor unit points. Skeletal muscles generate a spatial EMG signal, which causes cross talk between different sEMG signal sensors. In this study, an array of three sEMG sensors is used to capture the information of muscle dynamics in terms of sEMG signals. The recorded sEMG signals are filtered utilizing optimized nonlinear Half-Gaussian Bayesian filter parameters, and the muscle force signal is filtered using a Chebyshev type-II filter. The filter optimization is accomplished using Genetic Algorithms. Three discrete time state-space muscle fatigue models are obtained using system identification and modal transformation for three sets of sensors for a single motor unit. The outputs of these three muscle fatigue models are fused with a probabilistic Kullback Information Criterion (KIC) for model selection. The final fused output is estimated with an adaptive probability of KIC, which provides improved force estimates.

  9. LOGIT and PROBIT Models in the Probability Analysis: Change in the Probability of Death from Celiac Disease Patients

    Directory of Open Access Journals (Sweden)

    Ondřej Šimpach

    2012-12-01

    Full Text Available It is estimated that about 40 000–50 000 people in the Czech Republic suffer from celiac disease, which is a disease of gluten intolerance. At the beginning of the independent Czech Republic, the life expectancy at birth of these people was quite low, because detailed diagnosis of this disease only came from abroad in this period. With increasing age, the probability of death of these people grew faster than that of the total population. The aim of this study is to analyse the probability of death of x-year-old persons during the next five years after the general medical examinations in 1990 and 1995. Both analyses are solved using LOGIT and PROBIT models, and the hypothesis that the probability of death of an x-year-old person suffering from celiac disease decreased a few years after the gaining of new medical knowledge from abroad is confirmed or refuted.

  10. A Discrete SIRS Model with Kicked Loss of Immunity and Infection Probability

    Science.gov (United States)

    Paladini, F.; Renna, I.; Renna, L.

    2011-03-01

    A discrete-time deterministic epidemic model is proposed with the aim of reproducing the behaviour observed in the incidence of real infectious diseases, such as oscillations and irregularities. For this purpose we introduce, in a naïve discrete-time SIRS model, seasonal variability in the loss of immunity and in the infection probability, modelled by sequences of kicks. Restrictive assumptions are made on the parameters of the models, in order to guarantee that the transitions are determined by true probabilities, so that comparisons with stochastic discrete-time previsions can be also provided. Numerical simulations show that the characteristics of real infectious diseases can be adequately modeled.
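
    A minimal sketch of a naive discrete-time SIRS recursion with periodically "kicked" infection and immunity-loss probabilities; the update rule, kick form, and parameter values are illustrative assumptions rather than the paper's exact model.

```python
# Minimal discrete-time SIRS sketch with periodically "kicked" infection and
# immunity-loss probabilities (forms and values are illustrative assumptions).
import numpy as np

N = 10_000                 # population size
S, I, R = N - 10, 10, 0    # initial compartments
beta0, gamma, delta0 = 0.3, 0.1, 0.01   # baseline infection, recovery, immunity-loss probs
period, kick = 52, 0.15    # weekly steps, yearly kick amplitude

history = []
for t in range(520):
    # Seasonal kicks: parameters are briefly raised once per period.
    kicked = (t % period) == 0
    beta = min(beta0 + (kick if kicked else 0.0), 1.0)
    delta = min(delta0 + (kick if kicked else 0.0), 1.0)

    p_inf = min(beta * I / N, 1.0)      # per-susceptible infection probability
    new_inf = p_inf * S
    new_rec = gamma * I
    new_sus = delta * R

    S += new_sus - new_inf
    I += new_inf - new_rec
    R += new_rec - new_sus
    history.append(I)

print("peak infected:", max(history))
```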

  11. A Discrete SIRS Model with Kicked Loss of Immunity and Infection Probability

    Energy Technology Data Exchange (ETDEWEB)

    Paladini, F; Renna, L [Dipartimento di Fisica dell' Universita del Salento, 73100 Lecce (Italy); Renna, I, E-mail: luigi.renna@le.infn.it [ISIR, Universite Pierre et Marie Curie/CNRS, F-75005 Paris (France)

    2011-03-01

    A discrete-time deterministic epidemic model is proposed with the aim of reproducing the behaviour observed in the incidence of real infectious diseases, such as oscillations and irregularities. For this purpose we introduce, in a naive discrete-time SIRS model, seasonal variability in the loss of immunity and in the infection probability, modelled by sequences of kicks. Restrictive assumptions are made on the parameters of the models, in order to guarantee that the transitions are determined by true probabilities, so that comparisons with stochastic discrete-time previsions can be also provided. Numerical simulations show that the characteristics of real infectious diseases can be adequately modeled.

  12. Particle filters for random set models

    CERN Document Server

    Ristic, Branko

    2013-01-01

    “Particle Filters for Random Set Models” presents coverage of state estimation of stochastic dynamic systems from noisy measurements, specifically sequential Bayesian estimation and nonlinear or stochastic filtering. The class of solutions presented in this book is based on the Monte Carlo statistical method. The resulting algorithms, known as particle filters, in the last decade have become one of the essential tools for stochastic filtering, with applications ranging from navigation and autonomous vehicles to bio-informatics and finance. While particle filters have been around for more than a decade, the recent theoretical developments of sequential Bayesian estimation in the framework of random set theory have provided new opportunities which are not widely known and are covered in this book. These recent developments have dramatically widened the scope of applications, from single to multiple appearing/disappearing objects, from precise to imprecise measurements and measurement models. This book...

  13. A two-stage approach in solving the state probabilities of the multi-queue M/G/1 model

    Science.gov (United States)

    Chen, Mu-Song; Yen, Hao-Wei

    2016-04-01

    The M/G/1 model is the fundamental basis of the queueing system in many network systems. Usually, the study of the M/G/1 is limited by the assumption of single queue and infinite capacity. In practice, however, these postulations may not be valid, particularly when dealing with many real-world problems. In this paper, a two-stage state-space approach is devoted to solving the state probabilities for the multi-queue finite-capacity M/G/1 model, i.e. q-M/G/1/Ki with Ki buffers in the ith queue. The state probabilities at departure instants are determined by solving a set of state transition equations. Afterward, an embedded Markov chain analysis is applied to derive the state probabilities with another set of state balance equations at arbitrary time instants. The closed forms of the state probabilities are also presented with theorems for reference. Applications of Little's theorem further present the corresponding results for queue lengths and average waiting times. Simulation experiments have demonstrated the correctness of the proposed approaches.
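
    For orientation, a sketch of the embedded-Markov-chain step for the single-queue, finite-capacity case (M/G/1/K with deterministic service): transition probabilities at departure instants are built from the arrival counts during one service, and the state balance equations are solved as a linear system. This is a simplified stand-in for the multi-queue q-M/G/1/Ki analysis described above.

```python
# Sketch: stationary state probabilities at departure instants for a
# single-queue finite-capacity M/G/1/K with deterministic service, via the
# embedded Markov chain (illustrative; the paper treats the multi-queue case).
import numpy as np
from scipy.stats import poisson

lam, service, K = 0.8, 1.0, 10        # arrival rate, service time, buffer size
a = poisson.pmf(np.arange(K), mu=lam * service)   # P(j arrivals in one service)

P = np.zeros((K, K))                  # states 0..K-1 = customers left behind
for i in range(K):
    start = max(i - 1, 0)             # one customer departs (if any present)
    for j in range(K - 1):
        if j >= start:
            P[i, j] = a[j - start]
    P[i, K - 1] = 1.0 - P[i, :K - 1].sum()   # overflow lumped into the top state

# Solve pi P = pi with sum(pi) = 1 (state balance equations).
A = np.vstack([P.T - np.eye(K), np.ones(K)])
b = np.append(np.zeros(K), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(pi, 4))
```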

  14. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    Science.gov (United States)

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics.
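
    A brief sketch of using the Johnson SU family (available as scipy.stats.johnsonsu) to describe a skewed, heavy-tailed radiance-like quantity; here the parameters are obtained by maximum likelihood on synthetic data rather than by the moment matching used in the paper.

```python
# Sketch: fitting the Johnson SU family (scipy.stats.johnsonsu) to a skewed,
# heavy-tailed stand-in quantity and reading off tail probabilities.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
radiance = rng.lognormal(mean=0.0, sigma=0.6, size=5000)   # stand-in data

a, b, loc, scale = stats.johnsonsu.fit(radiance)
fitted = stats.johnsonsu(a, b, loc=loc, scale=scale)

print("data   skew/kurtosis:", stats.skew(radiance), stats.kurtosis(radiance))
print("fitted skew/kurtosis:", fitted.stats(moments="sk"))
print("P(radiance > 3):", 1 - fitted.cdf(3.0))
```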

  15. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    Science.gov (United States)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log₁₀ transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
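
    A minimal sketch of a zero-inflated Beta likelihood fitted by maximum likelihood to values in [0, 1); this strips out the Bayesian mixed-model machinery (random effects, priors) described above and keeps only the zero-inflation plus Beta structure.

```python
# Sketch: maximum-likelihood fit of a zero-inflated Beta distribution to
# bounded, zero-inflated values; a simplification of the Bayesian mixed
# model described above (no random effects, no priors).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
# Synthetic data: 30% exact zeros, the rest Beta(2, 5) distributed.
y = np.where(rng.random(2000) < 0.3, 0.0, rng.beta(2, 5, size=2000))

def negloglik(params):
    logit_pi, log_a, log_b = params
    pi = 1.0 / (1.0 + np.exp(-logit_pi))      # zero-inflation probability
    a, b = np.exp(log_a), np.exp(log_b)       # Beta shape parameters
    zero = y == 0
    ll_zero = np.log(pi) * zero.sum()
    ll_pos = np.sum(np.log1p(-pi) + stats.beta.logpdf(y[~zero], a, b))
    return -(ll_zero + ll_pos)

res = optimize.minimize(negloglik, x0=np.zeros(3), method="Nelder-Mead")
logit_pi, log_a, log_b = res.x
print("pi =", 1 / (1 + np.exp(-logit_pi)), "a =", np.exp(log_a), "b =", np.exp(log_b))
```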

  16. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  17. Survey on Probability Models of Information Retrieval%信息检索的概率模型

    Institute of Scientific and Technical Information of China (English)

    邢永康; 马少平

    2003-01-01

    The study of mathematical models for information retrieval is an important area in the Information Retrieval community. Because of the uncertainty inherent in IR, probability models based on statistical probability are promising both now and in the future. These models can be classified into classical models and probability network models. Several well-known models are introduced and their shortcomings are pointed out in this paper. We also clarify the relationships among these models and briefly introduce a new model based on the statistical language model.

  18. Probability Models for the Distribution of Copepods in Different Coastal Ecosystems Along the Straits of Malacca

    Science.gov (United States)

    Matias-Peralta, Hazel Monica; Ghodsi, Alireza; Shitan, Mahendran; Yusoff, Fatimah Md.

    Copepods are the most abundant microcrustaceans in the marine waters and are the major food resource for many commercial fish species. In addition, changes in the distribution and population composition of copepods may also serve as an indicator of global climate changes. Therefore, it is important to model the copepod distribution in different ecosystems. Copepod samples were collected from three different ecosystems (seagrass area, cage aquaculture area and coastal waters off shrimp aquaculture farm) along the coastal waters of the Malacca Straits over a one year period. In this study the major statistical analysis consisted of fitting different probability models. This paper highlights the fitting of probability distributions and discusses the adequateness of the fitted models. The usefulness of these fitted models would enable one to make probability statements about the distribution of copepods in three different ecosystems.

  19. Bootstrap imputation with a disease probability model minimized bias from misclassification due to administrative database codes.

    Science.gov (United States)

    van Walraven, Carl

    2017-04-01

    Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized using bootstrap methods to impute condition status using multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
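
    A hedged sketch of the bootstrap-imputation idea: within each bootstrap resample, condition status is imputed as a Bernoulli draw from each patient's model-derived probability, and the quantity of interest (here prevalence) is recomputed; the probabilities and sample below are synthetic.

```python
# Sketch: bootstrap imputation of condition status from model-derived
# probabilities (illustrative; the probability model and data are
# hypothetical, not those of the cited study).
import numpy as np

rng = np.random.default_rng(3)
n = 5000
p_hat = rng.beta(1, 9, size=n)          # stand-in model probabilities of renal failure

boot_prev = []
for _ in range(1000):
    idx = rng.integers(0, n, size=n)    # resample patients with replacement
    imputed = rng.random(n) < p_hat[idx]  # impute status from each patient's probability
    boot_prev.append(imputed.mean())

boot_prev = np.array(boot_prev)
print("prevalence estimate:", boot_prev.mean())
print("95% interval:", np.percentile(boot_prev, [2.5, 97.5]))
```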

  20. Quantum probability and cognitive modeling: some cautions and a promising direction in modeling physics learning.

    Science.gov (United States)

    Franceschetti, Donald R; Gire, Elizabeth

    2013-06-01

    Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.

  1. Evaluation model for service life of dam based on time-varying risk probability

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    For many dam projects in China, the 50-year design lifetime is coming to an end, so it is urgent to study theories and methods for evaluating dam service life. In this paper, firstly, fuzzy-event probability theory and time-varying effect theory are used to analyze the time-varying behaviour of various risk factors during dam operation. A method is proposed to quantify this time variation, and a model describing the fuzzy time-varying risk probability of the dam structure is built. Secondly, information entropy theory is used to analyze the uncertainty relationship between the characteristic value of the membership function and the fuzzy risk probability, and a mathematical method is presented to calculate the time-varying risk probability accordingly. Thirdly, the relation between the time-varying risk probability and service life is discussed. Based on this relation and the acceptable risk probability of dams in China, a method is put forward to evaluate and forecast dam service life. Finally, the proposed theory and method are used to analyze one concrete dam: the dynamic variability and mutation features of the dam risk probability are analyzed, and the remaining service life of this dam is forecast. The obtained results can provide technical support for project management departments in planning engineering treatment measures and reasonably arranging reinforcement costs. The principles in this paper have wide applicability and can be used in risk analysis for slope instability and other fields.

  2. Uncovering the Best Skill Multimap by Constraining the Error Probabilities of the Gain-Loss Model

    Science.gov (United States)

    Anselmi, Pasquale; Robusto, Egidio; Stefanutti, Luca

    2012-01-01

    The Gain-Loss model is a probabilistic skill multimap model for assessing learning processes. In practical applications, more than one skill multimap could be plausible, while none corresponds to the true one. The article investigates whether constraining the error probabilities is a way of uncovering the best skill assignment among a number of…

  3. Processes models, environmental analyses, and cognitive architectures: quo vadis quantum probability theory?

    Science.gov (United States)

    Marewski, Julian N; Hoffrage, Ulrich

    2013-06-01

    A lot of research in cognition and decision making suffers from a lack of formalism. The quantum probability program could help to improve this situation, but we wonder whether it would provide even more added value if its presumed focus on outcome models were complemented by process models that are, ideally, informed by ecological analyses and integrated into cognitive architectures.

  4. Extension of Some Classical Results on Ruin Probability to Delayed Renewal Model

    Institute of Scientific and Technical Information of China (English)

    Chun Su; Tao Jiang; Qi-he Tang

    2002-01-01

    Embrechts and Veraverbeke [2] investigated the renewal risk model and gave a tail equivalence relationship of the ruin probabilities ψ(x) under the assumption that the claim size is heavy-tailed, which is regarded as a classical result in the context of extremal value theory. In this note we extend this result to the delayed renewal risk model.

  5. Aggregate and Individual Replication Probability within an Explicit Model of the Research Process

    Science.gov (United States)

    Miller, Jeff; Schwarz, Wolf

    2011-01-01

    We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by…

  6. Establishment and optimization of project investment risk income models on the basis of probability χ distribution

    Institute of Scientific and Technical Information of China (English)

    吕渭济; 崔巍

    2001-01-01

    In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value and the other kind is proved to have no extreme values.

  7. Establishment and optimization of project investment risk income models on the basis of probability χ distribution

    Institute of Scientific and Technical Information of China (English)

    LU Wei-ji; CUI Wei

    2001-01-01

    In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value and the other kind is proved to have no extreme values.

  8. Supply Responses of the Unemployed: A Probability Model of Reemployment. Final Report.

    Science.gov (United States)

    Toikka, Richard S.

    The subject of this study is the process by which unemployed workers are reemployed after a layoff. A theoretical model of job search is formulated with the asking price based on the number of job vacancies, the distribution of offers, the value placed on nonmarket activity, the cost of search activity, and the interest rate. A probability model of reemployment is…

  9. Aggregate and Individual Replication Probability within an Explicit Model of the Research Process

    Science.gov (United States)

    Miller, Jeff; Schwarz, Wolf

    2011-01-01

    We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by…

  10. Uncovering the Best Skill Multimap by Constraining the Error Probabilities of the Gain-Loss Model

    Science.gov (United States)

    Anselmi, Pasquale; Robusto, Egidio; Stefanutti, Luca

    2012-01-01

    The Gain-Loss model is a probabilistic skill multimap model for assessing learning processes. In practical applications, more than one skill multimap could be plausible, while none corresponds to the true one. The article investigates whether constraining the error probabilities is a way of uncovering the best skill assignment among a number of…

  11. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    Science.gov (United States)

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  12. Wildland fire probabilities estimated from weather model-deduced monthly mean fire danger indices

    Science.gov (United States)

    Haiganoush K. Preisler; Shyh-Chin Chen; Francis Fujioka; John W. Benoit; Anthony L. Westerling

    2008-01-01

    The National Fire Danger Rating System indices deduced from a regional simulation weather model were used to estimate probabilities and numbers of large fire events on monthly and 1-degree grid scales. The weather model simulations and forecasts are ongoing experimental products from the Experimental Climate Prediction Center at the Scripps Institution of Oceanography...

  13. Application of tests of goodness of fit in determining the probability density function for spacing of steel sets in tunnel support system

    Directory of Open Access Journals (Sweden)

    Farnoosh Basaligheh

    2015-12-01

    Full Text Available One of the conventional methods for temporary support of tunnels is to use steel sets with shotcrete. The nature of a temporary support system demands a quick installation of its structures. As a result, the spacing between steel sets is not a fixed amount and can be considered a random variable. Hence, in the reliability analysis of these types of structures, the selection of an appropriate probability distribution function for the spacing of steel sets is essential. In the present paper, the distances between steel sets are collected from a tunnel under construction, and the collected data are used to suggest a proper Probability Distribution Function (PDF) for the spacing of steel sets. The tunnel has two different excavation sections. In this regard, different distribution functions were investigated and three common tests of goodness of fit were used to evaluate each function for each excavation section. Results from all three methods indicate that the Wakeby distribution function can be suggested as the proper PDF for the spacing between the steel sets. It is also noted that, although the probability distribution function for the two tunnel sections is the same, the parameters of the PDF for the individual sections differ from each other.
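
    A small sketch of the distribution-fitting and goodness-of-fit step on synthetic spacing data; the Wakeby distribution itself is not in scipy (it would require an L-moments package such as lmoments3), so common candidates are compared with the Kolmogorov-Smirnov test instead.

```python
# Sketch: comparing candidate distributions for steel-set spacing with a
# goodness-of-fit test (synthetic spacings; the Wakeby distribution is not
# available in scipy and is therefore not among the candidates here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
spacing = rng.gamma(shape=6.0, scale=0.2, size=200)   # stand-in spacing data (m)

candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm, "weibull_min": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(spacing)
    ks_stat, p_value = stats.kstest(spacing, dist.name, args=params)
    print(f"{name:12s}  KS statistic = {ks_stat:.3f}  p-value = {p_value:.3f}")
```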

  14. LOGIT and PROBIT Models in the Probability Analysis: Change in the Probability of Death from Celiac Disease Patients

    OpenAIRE

    Ondřej Šimpach

    2012-01-01

    It is estimated that about 40 000–50 000 people in the Czech Republic suffer from celiac disease, which is a disease of gluten intolerance. At the beginning of the independent Czech Republic, the life expectancy at birth of these people was quite low, because detailed diagnosis of this disease only came from abroad in this period. With increasing age, the probability of death of these people grew faster than that of the total population. The aim of this study is to analyse the probabili...

  15. Translating Climate-Change Probabilities into Impact Risks - Overcoming the Impact- Model Bottleneck

    Science.gov (United States)

    Dettinger, M.

    2008-12-01

    Projections of climate change in response to increasing greenhouse-gas concentrations are uncertain and likely to remain so for the foreseeable future. As more projections become available for analysts, we are increasingly able to characterize the probabilities of obtaining various levels of climate change in current projections. However, the probabilities of most interest in impact assessments are not the probabilities of climate changes, but rather the probabilities (or risks) of various levels and kinds of climate-change impact. These risks can be difficult to estimate even if the climate-change probabilities are well known. The difficulty arises because, frequently, impact models and assessments are computationally demanding or require time-consuming, hands-on expert analysis, so that severe limits are placed on the number of climate-change scenarios for which detailed impacts can be assessed. Estimating the risks of various impacts from the few resulting examples is generally difficult. However, real-world examples from the water-resources sector will be used to show that, by applying several different "derived distributions" approaches for estimating the risks of various impacts from known climate-change probabilities to just a few impact-model simulations, risks can be estimated along with indications of how accurate those impact-risk estimates are. The prospects for a priori selection of a few climate-change scenarios (from a larger ensemble of available projections) that will allow the best, most economical estimates of impact risks will be explored with a simple but real-world example.
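
    A toy version of the "derived distributions" idea: an ensemble representing climate-change probabilities is pushed through an impact curve evaluated from only a few impact-model runs, and the impact risk is read off the resulting distribution; all numbers are illustrative.

```python
# Sketch of a "derived distribution" calculation: climate-change probabilities
# are propagated to impact risk using an impact model run for only a few
# scenarios and interpolated in between (all numbers are illustrative).
import numpy as np

rng = np.random.default_rng(5)
# Ensemble of projected runoff changes (%) representing climate-change probabilities.
runoff_change = rng.normal(loc=-10.0, scale=8.0, size=10_000)

# Expensive impact model evaluated at only a few scenarios (here: a stand-in
# curve giving shortage frequency as a function of runoff change).
scenario_change = np.array([-30.0, -15.0, 0.0, 15.0])
scenario_impact = np.array([0.45, 0.20, 0.08, 0.03])

# Derived distribution of impact: interpolate the few impact-model results
# across the full ensemble of climate changes.
impact = np.interp(runoff_change, scenario_change, scenario_impact)
print("P(shortage frequency > 0.25):", float(np.mean(impact > 0.25)))
```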

  16. Efficient evaluation of small failure probability in high-dimensional groundwater contaminant transport modeling via a two-stage Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Lin, Guang [Department of Mathematics and School of Mechanical Engineering, Purdue University, West Lafayette Indiana USA; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA

    2017-03-01

    In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
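
    A one-dimensional toy version of the two-stage approach, with a polynomial surrogate standing in for the polynomial-chaos surrogate and a simple band around the failure threshold selecting the samples re-evaluated with the "expensive" model; dimensions, thresholds, and model are placeholders.

```python
# Toy two-stage Monte Carlo for a small failure probability: stage 1 screens
# samples with a cheap polynomial surrogate, stage 2 re-evaluates only the
# samples near the failure boundary with the "expensive" model (a 1-D stand-in
# for the high-dimensional transport model).
import numpy as np

rng = np.random.default_rng(6)

def expensive_model(x):                      # stand-in for the transport model
    return np.exp(0.8 * x) + 0.05 * np.sin(5 * x)

threshold = 8.0                              # failure: model output exceeds this
x_train = np.linspace(-3, 3, 15)             # a few expensive training runs
coeffs = np.polyfit(x_train, expensive_model(x_train), deg=4)
surrogate = np.poly1d(coeffs)

x = rng.normal(size=200_000)                 # stage 1: surrogate-based MC
g = surrogate(x)
band = np.abs(g - threshold) < 0.5           # samples close to the boundary

g_corrected = g.copy()                       # stage 2: re-evaluate only the band
g_corrected[band] = expensive_model(x[band])

print("surrogate-only failure prob:", np.mean(g > threshold))
print("two-stage failure prob     :", np.mean(g_corrected > threshold))
print("fraction of expensive calls saved:", 1 - band.mean())
```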

  17. Transition probabilities of health states for workers in Malaysia using a Markov chain model

    Science.gov (United States)

    Samsuddin, Shamshimah; Ismail, Noriszura

    2017-04-01

    The aim of our study is to estimate the transition probabilities of health states for workers in Malaysia who contribute to the Employment Injury Scheme under the Social Security Organization Malaysia using a Markov chain model. Our study uses four states of health (active, temporary disability, permanent disability and death) based on data collected from longitudinal studies of workers in Malaysia over 5 years. The transition probabilities vary by health state, age and gender. The results show that male employees are more likely to have higher transition probabilities to any health state than female employees. The transition probabilities can be used to predict the future health of workers as a function of current age, gender and health state.
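
    A minimal sketch of estimating such a transition probability matrix from observed yearly state sequences by counting transitions and normalising rows; the sequences below are synthetic and stratification by age and gender is omitted.

```python
# Sketch: estimating a 4-state transition probability matrix (active,
# temporary disability, permanent disability, death) from yearly state
# sequences by counting transitions and normalising rows (synthetic data).
import numpy as np

states = ["active", "temporary", "permanent", "death"]
# Each row is one worker's state index observed over 5 consecutive years.
sequences = np.array([
    [0, 0, 1, 0, 0],
    [0, 1, 1, 2, 2],
    [0, 0, 0, 0, 3],
    [0, 0, 0, 0, 0],
])

counts = np.zeros((4, 4))
for seq in sequences:
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1

row_totals = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_totals, out=np.zeros_like(counts), where=row_totals > 0)
P[3, 3] = 1.0      # death is absorbing even if unobserved in the sample
print(np.round(P, 2))
```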

  18. Jamming transitions and the effect of interruption probability in a lattice traffic flow model with passing

    Science.gov (United States)

    Redhu, Poonam; Gupta, Arvind Kumar

    2015-03-01

    A new lattice hydrodynamic model is proposed by considering the interruption probability effect on traffic flow with passing, and it is analyzed both theoretically and numerically. From linear and non-linear stability analysis, the effect of the interruption probability on the phase diagram is investigated and the condition of existence for the kink-antikink soliton solution of the mKdV equation is derived. The stable region is enhanced by the interruption probability, and the jamming transition occurs from uniform flow to kink flow through chaotic flow for higher and intermediate values of the non-interruption effect of passing. It is also observed that there exists a conventional jamming transition between uniform flow and kink flow for lower values of the non-interruption effect of passing. Numerical simulations are carried out and found to be in accordance with the theoretical findings, which confirms that the effect of the interruption probability plays an important role in stabilizing traffic flow when passing is allowed.

  19. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  20. SVM model for estimating the parameters of the probability-integral method of predicting mining subsidence

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hua; WANG Yun-jia; LI Yong-feng

    2009-01-01

    A new mathematical model to estimate the parameters of the probability-integral method for mining subsidence prediction is proposed. Based on least squares support vector machine (LS-SVM) theory, it is capable of improving the precision and reliability of mining subsidence prediction. Many of the geological and mining factors involved are related in a nonlinear way. The new model is based on statistical learning theory (SLT) and empirical risk minimization (ERM) principles. Typical data collected from observation stations were used for the learning and training samples. The calculated results from the LS-SVM model were compared with the prediction results of a back propagation neural network (BPNN) model. The results show that the parameters were more precisely predicted by the LS-SVM model than by the BPNN model. The LS-SVM model was faster in computation and had better generalized performance. It provides a highly effective method for calculating the predicting parameters of the probability-integral method.
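
    A generic LS-SVM regression sketch in which the dual problem reduces to a single linear system with an RBF kernel; this illustrates the LS-SVM mechanics only and is not the paper's calibrated subsidence-parameter model.

```python
# Minimal LS-SVM regression sketch (RBF kernel): the dual solution reduces to
# one linear system. Generic illustration only; not the paper's model.
import numpy as np

rng = np.random.default_rng(7)
X = np.linspace(0, 4, 40)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)

gamma_reg, sigma = 10.0, 0.5            # regularisation weight and kernel width

def rbf(A, B):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

K = rbf(X, X)
n = len(y)
# LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b, alpha]^T = [0, y]^T
A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
              [np.ones((n, 1)), K + np.eye(n) / gamma_reg]])
sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
b, alpha = sol[0], sol[1:]

X_test = np.array([[1.0], [2.5]])
y_pred = rbf(X_test, X) @ alpha + b     # prediction: sum_i alpha_i K(x, x_i) + b
print(y_pred)
```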

  1. A new car-following model with consideration of the traffic interruption probability

    Institute of Scientific and Technical Information of China (English)

    Tang Tie-Qiao; Huang Hai-Jun; Wong S. C.; Jiang Rui

    2009-01-01

    In this paper, we present a new car-following model by taking into account the effects of the traffic interruption probability on the car-following behaviour of the following vehicle. The stability condition of the model is obtained by using the linear stability theory. The modified Korteweg-de Vries (KdV) equation is constructed and solved, and three types of traffic flow in the headway-sensitivity space are classified: stable, metastable, and unstable. Both the analytical and simulation results show that the traffic interruption probability indeed has an influence on driving behaviour, and the consideration of traffic interruption probability in the car-following model could stabilize traffic flow.

  2. Assessing potential modifications of landslide triggering probability based on hydromechanical modelling and regional climate model projections

    Science.gov (United States)

    Peres, David Johnny; Cancelliere, Antonino

    2017-04-01

    Climate change related to uncontrolled greenhouse gas emissions is expected to modify climate characteristics in a harmful way, increasing the frequency of many precipitation-triggered natural hazards, landslides included. In our study we analyse regional climate model (RCM) projections with the aim of assessing potential future modifications of the rainfall event characteristics linked to shallow landslide triggering, such as event duration, total depth, and inter-arrival time. Factors of change of the mean and the variance of these rainfall-event characteristics are exploited to adjust a stochastic rainfall generator aimed at simulating precipitation series likely to occur in the future. Then Monte Carlo simulations, in which the stochastic rainfall generator and a physically based hydromechanical model are coupled, are carried out to estimate the probability of landslide triggering for future time horizons and its changes with respect to current climate conditions. The proposed methodology is applied to the Peloritani region in Sicily, Italy, an area that in the past two decades has experienced several catastrophic shallow and rapidly moving landslide events. Different RCM simulations from the Coordinated Regional Climate Downscaling Experiment (CORDEX) initiative are considered in the application, as well as two different emission scenarios, known as Representative Concentration Pathways: intermediate (RCP 4.5) and high-emissions (RCP 8.5). The estimated modifications of the rainfall event characteristics differ significantly, both in magnitude and in direction (increase/decrease), from one model to another. The RCMs are concordant only in predicting an increase of the mean of the inter-event dry intervals. The variance of rainfall depth exhibits the largest changes (increase or decrease depending on the RCM), and it is the characteristic to which landslide triggering seems to be most sensitive. Some RCMs indicate significant variations of landslide probability due to climate

  3. Modelling the Probability Density Function of IPTV Traffic Packet Delay Variation

    Directory of Open Access Journals (Sweden)

    Michal Halas

    2012-01-01

    Full Text Available This article deals with modelling the probability density function of IPTV traffic packet delay variation. This modelling is useful for efficient de-jitter buffer estimation. When an IP packet travels across a network, it experiences delay and delay variation. This variation is caused by routing, queueing systems and other influences such as the processing delay of the network nodes. To separate these at least three types of delay variation, we need a way to measure each of them separately. This work is aimed at the delay variation caused by queueing systems, which has the main influence on the form of the probability density function.

  4. A generic probability based model to derive regional patterns of crops in time and space

    Science.gov (United States)

    Wattenbach, Martin; Luedtke, Stefan; Redweik, Richard; van Oijen, Marcel; Balkovic, Juraj; Reinds, Gert Jan

    2015-04-01

    Croplands are not only the key to human food supply, they also change the biophysical and biogeochemical properties of the land surface, leading to changes in the water cycle and energy partitioning; they influence soil erosion and substantially contribute to the amount of greenhouse gases entering the atmosphere. The effects of croplands on the environment depend on the type of crop and the associated management, both of which are related to the site conditions, economic boundary settings and the preferences of individual farmers. The method described here is designed to predict the most probable crop to appear at a given location and time. The method uses statistical crop area information at NUTS2 level from EUROSTAT and the Common Agricultural Policy Regionalized Impact Model (CAPRI) as observations. These crops are then spatially disaggregated to the 1 x 1 km grid scale within the region, using the assumption that the probability of a crop appearing at a given location and in a given year depends on a) the suitability of the land for the cultivation of the crop, derived from the MARS Crop Yield Forecast System (MCYFS), and b) expert knowledge of agricultural practices. The latter includes knowledge concerning the feasibility of one crop following another (e.g. a late-maturing crop might leave too little time for the establishment of a winter cereal crop) and the need to combat weed infestations or crop diseases. The model is implemented in R and PostGIS. The quality of the generated crop sequences per grid cell is evaluated on the basis of the statistics reported by the joint EU/CAPRI database. The assessment is given at NUTS2 level using per cent bias as a measure, with a threshold of 15% as the minimum quality. The results clearly indicate that crops with a large relative share within the administrative unit are not as error-prone as crops that occupy only minor parts of the unit. However, roughly 40% still show an absolute per cent bias above the 15% threshold. This

  5. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.

  6. Predictive modeling of subsurface shoreline oil encounter probability from the Exxon Valdez oil spill in Prince William Sound, Alaska.

    Science.gov (United States)

    Nixon, Zachary; Michel, Jacqueline

    2015-04-07

    To better understand the distribution of remaining lingering subsurface oil residues from the Exxon Valdez oil spill (EVOS) along the shorelines of Prince William Sound (PWS), AK, we revised previous modeling efforts to allow spatially explicit predictions of the distribution of subsurface oil. We used a set of pooled field data and predictor variables stored as Geographic Information Systems (GIS) data to generate calibrated boosted tree models predicting the encounter probability of different categories of subsurface oil. The models demonstrated excellent predictive performance as evaluated by cross-validated performance statistics. While the average encounter probabilities at most shoreline locations are low across western PWS, clusters of shoreline locations with elevated encounter probabilities remain in the northern parts of the PWS, as well as more isolated locations. These results can be applied to estimate the location and amount of remaining oil, evaluate potential ongoing impacts, and guide remediation. This is the first application of quantitative machine-learning based modeling techniques in estimating the likelihood of ongoing, long-term shoreline oil persistence after a major oil spill.
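
    A hedged sketch of fitting a gradient-boosted tree classifier to predict encounter probability from shoreline predictors, with cross-validated evaluation; the predictors, their effects and the data are synthetic stand-ins for the GIS-derived variables used in the study.

```python
# Sketch: a gradient-boosted tree classifier for subsurface oil encounter
# probability with cross-validated evaluation (synthetic predictors; the
# actual study used calibrated boosted trees on GIS-derived variables).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n = 2000
# Hypothetical shoreline predictors: exposure index, substrate class, slope.
X = np.column_stack([rng.normal(size=n), rng.integers(0, 4, size=n), rng.random(n)])
logit = -2.5 + 0.8 * X[:, 0] + 0.5 * (X[:, 1] == 2) - 1.5 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))      # encountered / not encountered

model = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.05)
print("cross-validated AUC:", cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())

model.fit(X, y)
print("encounter probabilities (first 5 sites):", model.predict_proba(X[:5])[:, 1])
```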

  7. Use of the AIC with the EM algorithm: A demonstration of a probability model selection technique

    Energy Technology Data Exchange (ETDEWEB)

    Glosup, J.G.; Axelrod M.C. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    The problem of discriminating between two potential probability models, a Gaussian distribution and a mixture of Gaussian distributions, is considered. The focus of our interest is a case where the models are potentially non-nested and the parameters of the mixture model are estimated through the EM algorithm. The AIC, which is frequently used as a criterion for discriminating between non-nested models, is modified to work with the EM algorithm and is shown to provide a model selection tool for this situation. A particular problem involving an infinite mixture distribution known as Middleton's Class A model is used to demonstrate the effectiveness and limitations of this method.
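
    A simplified stand-in for the procedure: fit a single Gaussian and a two-component Gaussian mixture by EM (sklearn's GaussianMixture) and compare their AIC values; the report's modification of the AIC for this setting is not reproduced here.

```python
# Sketch: using AIC to discriminate a single Gaussian from a two-component
# Gaussian mixture fitted by EM; a simplified stand-in for the modified-AIC
# procedure described in the report.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(9)
# Synthetic data drawn from a genuine two-component mixture.
data = np.concatenate([rng.normal(0, 1, 700), rng.normal(4, 0.7, 300)])[:, None]

for k in (1, 2):
    gm = GaussianMixture(n_components=k, random_state=0).fit(data)
    print(f"k = {k}: AIC = {gm.aic(data):.1f}")
# The model with the lower AIC (here the two-component mixture) is preferred.
```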

  8. Ruin Probability and Joint Distributions of Some Actuarial Random Vectors in the Compound Pascal Model

    Institute of Scientific and Technical Information of China (English)

    Xian-min Geng; Shu-chen Wan

    2011-01-01

    The compound negative binomial model, introduced in this paper, is a discrete time version. We discuss the Markov properties of the surplus process, and study the ruin probability and the joint distributions of actuarial random vectors in this model. By the strong Markov property and the mass function of a defective renewal sequence, we obtain explicit expressions for the ruin probability, the finite-horizon ruin probability, the joint distributions of $T$, $U(T-1)$, $|U(T)|$ and $\inf_{0\le n<T_1} U(n)$ (i.e., the time of ruin, the surplus immediately before ruin, the deficit at ruin and the maximal deficit from ruin to recovery), and the distributions of some actuarial random vectors.

  9. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...

  10. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  11. Strong Lensing Probabilities in a Cosmological Model with a Running Primordial Power Spectrum

    CERN Document Server

    Zhang, T J; Yang, Z L; He, X T; Zhang, Tong-Jie; Chen, Da-Ming; Yang, Zhi-Liang; He, Xiang-Tao

    2004-01-01

    The combination of the first-year Wilkinson Microwave Anisotropy Probe (WMAP) data with other finer scale cosmic microwave background (CMB) experiments (CBI and ACBAR) and two structure formation measurements (2dFGRS and Lyman $\\alpha$ forest) suggests a $\\Lambda$CDM cosmological model with a running spectral power index of primordial density fluctuations. Motivated by this new result on the index of the primordial power spectrum, we present the first study of the predicted lensing probabilities of image separation in a spatially flat $\\Lambda$CDM model with a running spectral index (RSI-$\\Lambda$CDM model). It is shown that the RSI-$\\Lambda$CDM model suppresses the predicted lensing probabilities at small splitting angles of less than about 4$^{''}$ compared with those of the standard power-law $\\Lambda$CDM (PL-$\\Lambda$CDM) model.

  12. A semiclassical model for the calculation of nonadiabatic transition probabilities for classically forbidden transitions.

    Science.gov (United States)

    Dang, Phuong-Thanh; Herman, Michael F

    2009-02-01

    A semiclassical surface hopping model is presented for the calculation of nonadiabatic transition probabilities for the case in which the avoided crossing point is in the classically forbidden region. The exact potentials and coupling are replaced with simple functional forms that are fitted to the values, evaluated at the turning point in the classical motion, of the Born-Oppenheimer potentials, the nonadiabatic coupling, and their first few derivatives. For the one-dimensional model considered, reasonably accurate results for transition probabilities are obtained down to around 10^{-10}. The possible extension of this model to many-dimensional problems is discussed. The fact that the model requires only information at the turning point, a point that the trajectories do encounter, would be a significant advantage in many-dimensional problems over Landau-Zener-type models, which require information at the avoided crossing seam, a seam that lies in the forbidden region where the trajectories do not go.

  13. Modelling detection probabilities to evaluate management and control tools for an invasive species

    Science.gov (United States)

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (= 0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days' duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By

  14. Modeling co-occurrence of northern spotted and barred owls: accounting for detection probability differences

    Science.gov (United States)

    Bailey, Larissa L.; Reid, Janice A.; Forsman, Eric D.; Nichols, James D.

    2009-01-01

    Barred owls (Strix varia) have recently expanded their range and now encompass the entire range of the northern spotted owl (Strix occidentalis caurina). This expansion has led to two important issues of concern for management of northern spotted owls: (1) possible competitive interactions between the two species that could contribute to population declines of northern spotted owls, and (2) possible changes in vocalization behavior and detection probabilities of northern spotted owls induced by presence of barred owls. We used a two-species occupancy model to investigate whether there was evidence of competitive exclusion between the two species at study locations in Oregon, USA. We simultaneously estimated detection probabilities for both species and determined if the presence of one species influenced the detection of the other species. Model selection results and associated parameter estimates provided no evidence that barred owls excluded spotted owls from territories. We found strong evidence that detection probabilities differed for the two species, with higher probabilities for northern spotted owls that are the object of current surveys. Non-detection of barred owls is very common in surveys for northern spotted owls, and detection of both owl species was negatively influenced by the presence of the congeneric species. Our results suggest that analyses directed at hypotheses of barred owl effects on demographic or occupancy vital rates of northern spotted owls need to deal adequately with imperfect and variable detection probabilities for both species.

  15. Modelling the regional variability of the probability of high trihalomethane occurrence in municipal drinking water.

    Science.gov (United States)

    Cool, Geneviève; Lebel, Alexandre; Sadiq, Rehan; Rodriguez, Manuel J

    2015-12-01

    The regional variability of the probability of occurrence of high total trihalomethane (TTHM) levels was assessed using multilevel logistic regression models that incorporate environmental and infrastructure characteristics. The models were structured in a three-level hierarchical configuration: samples (first level), drinking water utilities (DWUs, second level) and natural regions, an ecological hierarchical division from the Quebec ecological framework of reference (third level). They considered six independent variables: precipitation, temperature, source type, seasons, treatment type and pH. The average probability of TTHM concentrations exceeding the targeted threshold was 18.1%. The probability was influenced by seasons, treatment type, precipitation and temperature. The variance at all levels was significant, showing that the probability of TTHM concentrations exceeding the threshold is most likely to be similar if located within the same DWU and within the same natural region. However, most of the variance initially attributed to natural regions was explained by treatment types and clarified by spatial aggregation on treatment types. Nevertheless, even after controlling for treatment type, there was still significant regional variability of the probability of TTHM concentrations exceeding the threshold. Regional variability was particularly important for DWUs using chlorination alone since they lack the appropriate treatment required to reduce the amount of natural organic matter (NOM) in source water prior to disinfection. Results presented herein could be of interest to authorities in identifying regions with specific needs regarding drinking water quality and for epidemiological studies identifying geographical variations in population exposure to disinfection by-products (DBPs).

  16. Use of several immunological markers to model the probability of active tuberculosis.

    Science.gov (United States)

    Petruccioli, Elisa; Navarra, Assunta; Petrone, Linda; Vanini, Valentina; Cuzzi, Gilda; Gualano, Gina; Palmieri, Fabrizio; Girardi, Enrico; Goletti, Delia

    2016-10-01

    Blood-based biomarker tests are an attractive alternative to assays that depend on mycobacterial detection for diagnosing tuberculosis. Given several immunological markers, we used logistic regression to model the probability of active tuberculosis in a cohort of patients with active or latent tuberculosis, showing an increased accuracy in distinguishing active from latent tuberculosis.
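
    A minimal sketch of the logistic-regression approach described above; the marker names and data file are hypothetical stand-ins for the cohort data.

```python
# Hypothetical sketch: logistic regression of active vs. latent tuberculosis
# on several immunological markers. Marker names and file are stand-ins.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("tb_cohort.csv")
X = df[["marker_ifng", "marker_ip10", "marker_tnf"]]   # assumed marker columns
y = df["active_tb"]                                    # 1 = active, 0 = latent

clf = LogisticRegression(max_iter=1000)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", auc.mean())

clf.fit(X, y)
p_active = clf.predict_proba(X)[:, 1]                  # per-patient probability
```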

  17. On new cautious structural reliability models in the framework of imprecise probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev; Kozine, Igor

    2010-01-01

    New imprecise structural reliability models are described in this paper. They are developed based on the imprecise Bayesian inference and are imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models. The models are applied to computing cautious structural reliability...... measures when the number of events of interest or observations is very small. The main feature of the models is that prior ignorance is not modelled by a fixed single prior distribution, but by a class of priors which is defined by upper and lower probabilities that can converge as statistical data...

  18. Model of constant probability event and its application in information fusion

    Institute of Scientific and Technical Information of China (English)

    邓勇; 施文康

    2004-01-01

    A model of the constant probability event is constructed rigorously in the event space of PSCEA. It is shown that the numerical-based fusion and the algebraic-based fusion give a consistent result when the weight is regarded as a constant probability event. From the point of view of algebra, we present a novel similarity measure in the product space. Based on the similarity degree, we use a similarity aggregation method to fuse experts' evaluations. We also give a numerical example to illustrate the method.

  19. Uncertainty and Probability in Natural Hazard Assessment and Their Role in the Testability of Hazard Models

    Science.gov (United States)

    Marzocchi, Warner; Jordan, Thomas

    2014-05-01

    Probabilistic assessment has become a widely accepted procedure for quantitatively estimating natural hazards. In essence, probabilities are meant to quantify the ubiquitous and deep uncertainties that characterize the evolution of natural systems. However, notwithstanding the very wide use of the terms 'uncertainty' and 'probability' in natural hazards, the way in which they are linked, how they are estimated and their scientific meaning are far from clear, as evidenced by the last Intergovernmental Panel on Climate Change (IPCC) report and by its subsequent review. The lack of a formal framework to interpret uncertainty and probability coherently has paved the way for some of the strongest critiques of hazard analysis; in fact, it has been argued that most natural hazard analyses are intrinsically 'unscientific'. For example, among the concerns is the use of expert opinion to characterize the so-called epistemic uncertainties; many have argued that such personal degrees of belief cannot be measured and, by implication, cannot be tested. The purpose of this talk is to confront and clarify the conceptual issues associated with the role of uncertainty and probability in natural hazard analysis and the conditions that make a hazard model testable and thus 'scientific'. Specifically, we show that the testability of hazard models requires a suitable taxonomy of uncertainty embedded in a proper logical framework. This taxonomy of uncertainty is composed of aleatory variability, epistemic uncertainty, and ontological error. We discuss their differences, their link with probability, and their estimation using data, models, and subjective expert opinion. We show that these different uncertainties, and the testability of hazard models, can be unequivocally defined only for a well-defined experimental concept, that is, a concept external to the model under test. All these points are illustrated through simple examples related to probabilistic seismic hazard analysis.

  20. Evaluation model for service life of dam based on time-varying risk probability

    Institute of Scientific and Technical Information of China (English)

    SU HuaiZhi; WEN ZhiPing; HU Jiang; WU ZhongRu

    2009-01-01

    For many dam projects in China, the 50-year design lifetime is coming to an end. It is urgent to study the theory and methods needed to evaluate dam service life. In this paper, firstly, the probability theory of fuzzy events and time-varying effect theory are used to analyze the time-variety of various risk factors in the process of dam operation. A method is proposed to quantify this time-variety, and a model describing the fuzzy time-varying risk probability for the dam structure is built. Secondly, information entropy theory is used to analyze the uncertainty relationship between the characteristic value of the membership function and the fuzzy risk probability, and a mathematical method is presented to calculate the time-varying risk probability accordingly. Thirdly, the relation between the time-varying risk probability and the service life is discussed. Based on this relation and the acceptable risk probability of dams in China, a method is put forward to evaluate and forecast dam service life. Finally, the proposed theory and method are used to analyze one concrete dam. The dynamic variability and mutation features of the dam risk probability are analyzed, and the remaining service life of this dam is forecast. The obtained results can provide technical support for the project management department in designing engineering treatment measures and reasonably planning reinforcement costs. The principles in this paper have wide applicability and can be used in risk analysis for slope instability and other fields.

  1. Modeling summer month hydrological drought probabilities in the United States using antecedent flow conditions

    Science.gov (United States)

    Austin, Samuel H.; Nelms, David L.

    2017-01-01

    Climate change raises concern that risks of hydrological drought may be increasing. We estimate hydrological drought probabilities for rivers and streams in the United States (U.S.) using maximum likelihood logistic regression (MLLR). Streamflow data from winter months are used to estimate the chance of hydrological drought during summer months. Daily streamflow data collected from 9,144 stream gages from January 1, 1884 through January 9, 2014 provide hydrological drought streamflow probabilities for July, August, and September as functions of streamflows during October, November, December, January, and February, estimating outcomes 5-11 months ahead of their occurrence. Few drought prediction methods exploit temporal links among streamflows. We find MLLR modeling of drought streamflow probabilities exploits the explanatory power of temporally linked water flows. MLLR models with strong correct classification rates were produced for streams throughout the U.S. One ad hoc test of correct prediction rates of September 2013 hydrological droughts exceeded 90% correct classification. Some of the best-performing models coincide with areas of high concern including the West, the Midwest, Texas, the Southeast, and the Mid-Atlantic. Using hydrological drought MLLR probability estimates in a water management context can inform understanding of drought streamflow conditions, provide warning of future drought conditions, and aid water management decision making.
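
    A sketch of maximum likelihood logistic regression (MLLR) relating summer drought occurrence to winter streamflows, in the spirit of the record above; the gage data frame and column names are hypothetical.

```python
# Hypothetical sketch: maximum likelihood logistic regression (MLLR) of a
# summer drought indicator on winter-month streamflows at one gage.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("gage_monthly_flows.csv")             # one row per year (hypothetical)
X = sm.add_constant(df[["flow_oct", "flow_nov", "flow_dec", "flow_jan", "flow_feb"]])
y = df["summer_drought"]                               # 1 if summer flow fell below the drought threshold

model = sm.Logit(y, X).fit(disp=0)                     # maximum likelihood fit
print(model.summary())

df["p_drought"] = model.predict(X)                     # probability of summer drought
```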

  2. Computational modeling of the effects of amyloid-beta on release probability at hippocampal synapses

    Directory of Open Access Journals (Sweden)

    Armando eRomani

    2013-01-01

    Full Text Available The role of amyloid-beta (Aβ in brain function and in the pathogenesis of Alzheimer’s disease remains elusive. Recent publications reported that an increase in Aβ concentration perturbs pre-synaptic release in hippocampal neurons. In particular, it was shown in vitro that Aβ is an endogenous regulator of synaptic transmission at the CA3-CA1 synapse, enhancing its release probability. How this synaptic modulator influences neuronal output during physiological stimulation patterns, such as those elicited in vivo, is still unknown. Using a realistic model of hippocampal CA1 pyramidal neurons, we first implemented this Aβ-induced enhancement of release probability and validated the model by reproducing the experimental findings. We then demonstrated that this synaptic modification can significantly alter synaptic integration properties in a wide range of physiologically relevant input frequencies (from 5 to 200 Hz. Finally, we used natural input patterns, obtained from CA3 pyramidal neurons in vivo during free exploration of rats in an open field, to investigate the effects of enhanced Aβ on synaptic release under physiological conditions. The model shows that the CA1 neuronal response to these natural patterns is altered in the increased-Aβ condition, especially for frequencies in the theta and gamma ranges. These results suggest that the perturbation of release probability induced by increased Aβ can significantly alter the spike probability of CA1 pyramidal neurons and thus contribute to abnormal hippocampal function during Alzheimer’s disease.

  3. A stochastic-bayesian model for the fracture probability of PWR pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Francisco, Alexandre S.; Duran, Jorge Alberto R., E-mail: afrancisco@metal.eeimvr.uff.br, E-mail: duran@metal.eeimvr.uff.br [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil). Dept. de Engenharia Mecanica

    2013-07-01

    The fracture probability of pressure vessels containing cracks can be obtained by easily understood methodologies that combine a deterministic treatment with statistical methods. However, when more accurate results are required, the methodologies need to be better formulated. This paper presents a new methodology to address this problem. First, a more rigorous methodology is obtained by relating the probability distributions that model crack incidence and non-destructive inspection efficiency through Bayes' theorem. The result is an updated crack incidence distribution. Further, the accuracy of the methodology is improved by using a stochastic model for the crack growth. The stochastic model incorporates the statistical variability of the crack growth process, combining stochastic theory with experimental data. Stochastic differential equations are derived by the randomization of empirical equations. From the solution of this equation, a distribution function related to the crack growth is derived. The fracture probability using both probability distribution functions is in agreement with theory, and presents realistic values for pressure vessels. (author)

  4. Statistic inversion of multi-zone transition probability models for aquifer characterization in alluvial fans

    CERN Document Server

    Zhu, Lin; Gong, Huili; Gable, Carl; Teatini, Pietro

    2015-01-01

    Understanding the heterogeneity arising from the complex architecture of sedimentary sequences in alluvial fans is challenging. This paper develops a statistical inverse framework in a multi-zone transition probability approach for characterizing the heterogeneity in alluvial fans. An analytical solution of the transition probability matrix is used to define the statistical relationships among different hydrofacies and their mean lengths, integral scales, and volumetric proportions. A statistical inversion is conducted to identify the multi-zone transition probability models and estimate the optimal statistical parameters using the modified Gauss-Newton-Levenberg-Marquardt method. The Jacobian matrix is computed by the sensitivity equation method, which results in an accurate inverse solution with quantification of parameter uncertainty. We use the Chaobai River alluvial fan in the Beijing Plain, China, as an example for elucidating the methodology of alluvial fan characterization. The alluvial fan is divided...
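
    A small illustration of the transition-probability formalism used in such facies models, assuming the standard continuous-lag Markov chain relation T(h) = expm(R·h); the rate matrix below is illustrative, not the Chaobai River model.

```python
# Illustrative continuous-lag Markov chain: transition probabilities over a
# vertical lag h follow T(h) = expm(R*h) for a rate matrix R built from mean
# lengths and transition proportions of three hypothetical hydrofacies.
import numpy as np
from scipy.linalg import expm

# Rows/columns: gravel, sand, clay (mean lengths 5 m, 8 m, 12 m); rows sum to 0.
R = np.array([[-1/5.0,    0.6/5.0,  0.4/5.0],
              [ 0.5/8.0,  -1/8.0,   0.5/8.0],
              [ 0.3/12.0,  0.7/12.0, -1/12.0]])

for h in (1.0, 5.0, 20.0):                 # lag in metres
    T = expm(R * h)                        # transition probability matrix
    print(f"h = {h:4.0f} m  P(gravel -> sand) = {T[0, 1]:.3f}  row sums = {T.sum(axis=1).round(6)}")
```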

  5. A Statistical Model for the Prediction of Wind-Speed Probabilities in the Atmospheric Surface Layer

    Science.gov (United States)

    Efthimiou, G. C.; Hertwig, D.; Andronopoulos, S.; Bartzis, J. G.; Coceal, O.

    2016-11-01

    Wind fields in the atmospheric surface layer (ASL) are highly three-dimensional and characterized by strong spatial and temporal variability. For various applications such as wind-comfort assessments and structural design, an understanding of potentially hazardous wind extremes is important. Statistical models are designed to facilitate conclusions about the occurrence probability of wind speeds based on the knowledge of low-order flow statistics. Being particularly interested in the upper tail regions we show that the statistical behaviour of near-surface wind speeds is adequately represented by the Beta distribution. By using the properties of the Beta probability density function in combination with a model for estimating extreme values based on readily available turbulence statistics, it is demonstrated that this novel modelling approach reliably predicts the upper margins of encountered wind speeds. The model's basic parameter is derived from three substantially different calibrating datasets of flow in the ASL originating from boundary-layer wind-tunnel measurements and direct numerical simulation. Evaluating the model based on independent field observations of near-surface wind speeds shows a high level of agreement between the statistically modelled horizontal wind speeds and measurements. The results show that, based on knowledge of only a few simple flow statistics (mean wind speed, wind-speed fluctuations and integral time scales), the occurrence probability of velocity magnitudes at arbitrary flow locations in the ASL can be estimated with a high degree of confidence.
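
    A sketch of the Beta-distribution idea: rescale wind speeds onto [0, 1] by an assumed upper bound, fit a Beta density, and read off upper-tail exceedance probabilities. The data and the bound are illustrative assumptions, not the calibration datasets used in the study.

```python
# Illustrative Beta-distribution tail estimate for near-surface wind speeds:
# rescale speeds to [0, 1] by an assumed upper bound, fit, then take the
# exceedance probability of a threshold. Data are synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
u = rng.weibull(2.0, 5000) * 4.0           # stand-in for measured wind speeds (m/s)
u_max = 1.2 * u.max()                      # assumed upper bound of the support

x = u / u_max                              # rescale onto [0, 1]
a, b, loc, scale = stats.beta.fit(x, floc=0, fscale=1)

threshold = 10.0                           # m/s
p_exceed = stats.beta.sf(threshold / u_max, a, b)
print(f"P(wind speed > {threshold} m/s) ~ {p_exceed:.4f}")
```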

  6. A Joint Specification Test for Response Probabilities in Unordered Multinomial Choice Models

    Directory of Open Access Journals (Sweden)

    Masamune Iwasawa

    2015-09-01

    Full Text Available Estimation results obtained by parametric models may be seriously misleading when the model is misspecified or poorly approximates the true model. This study proposes a test that jointly tests the specifications of multiple response probabilities in unordered multinomial choice models. The test statistic is asymptotically chi-square distributed, consistent against a fixed alternative and able to detect a local alternative approaching the null at a rate slower than the parametric rate. We show that rejection regions can be calculated by a simple parametric bootstrap procedure when the sample size is small. The size and power of the tests are investigated by Monte Carlo experiments.

  7. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
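
    A hedged sketch of a POD curve for a qualitative method, modelled here as a logistic function of log concentration; the spiking levels and detection counts are illustrative, not data from a collaborative study.

```python
# Illustrative POD curve: probability of a positive result modelled as a
# logistic function of log10 concentration, fitted to made-up spiking data.
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])      # spiking levels (e.g. CFU/g)
n_pos = np.array([1, 4, 12, 19, 20])             # positive results observed
n_tot = np.array([20, 20, 20, 20, 20])           # replicates per level

def pod(logc, logc50, slope):
    return 1.0 / (1.0 + np.exp(-slope * (logc - logc50)))

params, _ = curve_fit(pod, np.log10(conc), n_pos / n_tot, p0=[0.0, 2.0])
print("concentration at POD = 0.5:", 10 ** params[0])
```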

  8. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Tucker, W. Troy (Applied Biomathematics, Setauket, NY); Zhang, Jianzhong (Iowa State University, Ames, IA); Ginzburg, Lev (Applied Biomathematics, Setauket, NY); Berleant, Daniel J. (Iowa State University, Ames, IA); Ferson, Scott (Applied Biomathematics, Setauket, NY); Hajagos, Janos (Applied Biomathematics, Setauket, NY); Nelsen, Roger B. (Lewis & Clark College, Portland, OR)

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  9. Forecasting with the Fokker-Planck model: Bayesian setting of parameter

    Science.gov (United States)

    Montagnon, Chris

    2017-04-01

    Using a closed solution to a Fokker-Planck model of a time series, a probability distribution for the next point in the time series is developed. This probability distribution has one free parameter. Various Bayesian approaches to setting this parameter are tested by forecasting some real-world time series. Results show a more than 25% reduction in the '95% point' of the probability distribution (the safety stock required in these real-world situations), versus the conventional ARMA approach, without a significant increase in actuals exceeding this level.

  10. Benchmark data set for wheat growth models

    DEFF Research Database (Denmark)

    Asseng, S; Ewert, F.; Martre, P;

    2015-01-01

    The data set includes a current representative management treatment from detailed, quality-tested sentinel field experiments with wheat from four contrasting environments including Australia, The Netherlands, India and Argentina. Measurements include local daily climate data (solar radiation, max...

  11. Failure Probability Model considering the Effect of Intermediate Principal Stress on Rock Strength

    Directory of Open Access Journals (Sweden)

    Yonglai Zheng

    2015-01-01

    Full Text Available A failure probability model is developed to describe the effect of the intermediate principal stress on rock strength. Each shear plane in a rock sample is considered as a micro-unit. The strengths of these micro-units are assumed to follow a Weibull distribution. The macroscopic strength of the rock sample is then obtained by combining the failure probabilities over all directions. The new model reproduces the typical intermediate principal stress effect that occurs in some true triaxial experiments. Based on the new model, a strength criterion is proposed; it can be regarded as a modified Mohr-Coulomb criterion with a uniformity coefficient. The new strength criterion can quantitatively reflect the effect of the intermediate principal stress on rock strength and matches previously published experimental results better than common strength criteria.

  12. An Imprecise Probability Model for Structural Reliability Based on Evidence and Gray Theory

    Directory of Open Access Journals (Sweden)

    Bin Suo

    2013-01-01

    Full Text Available To avoid the shortcomings and limitations of probabilistic and non-probabilistic reliability models for structural reliability analysis in the case of limited samples for the basic variables, a new imprecise probability model is proposed. A confidence interval with a given confidence level is calculated on the basis of small samples by gray theory, without depending on the distribution of the variable. Basic probability assignments and focal elements are then constructed, and approximation methods for structural reliability based on belief and plausibility functions are proposed for the cases in which the structural limit state function is monotonic and non-monotonic, respectively. The numerical examples show that the new reliability model utilizes all the information included in the small samples and considers both the aleatory and epistemic uncertainties in them; thus it can rationally measure the safety of the structure, and the measurement becomes increasingly accurate as the sample size grows.

  13. MEASURING MODEL FOR BAD LOANS IN BANKS. THE DEFAULT PROBABILITY MODEL.

    Directory of Open Access Journals (Sweden)

    SOCOL ADELA

    2010-12-01

    Full Text Available The banking sectors of the transition countries have progressed remarkably in the last 20 years. In fact, banking in most transition countries has largely shaken off the traumas of the transition era. At the start of the 21st century, banks in these countries look very much like banks elsewhere. That is, they are by no means problem-free, but they are struggling with the same issues as banks in other emerging-market countries under financial crisis conditions. The institutional environment differs considerably among the countries. The goal we set for this article is to examine, in terms of methodology, the most important assessment criteria of a measuring model for bad loans.

  14. Empirical probability model of the cold plasma environment in Jovian inner magnetosphere

    CERN Document Server

    Futaana, Yoshifumi; Roussos, Elias; Trouscott, Pete; Heynderickx, Daniel; Cipriani, Fabrice; Rodgers, David

    2016-01-01

    A new empirical, analytical model of cold plasma (< 10 keV) in the Jovian inner magnetosphere is constructed. Plasmas in this energy range impact surface charging. A new feature of this model is that it predicts each plasma parameter for a specified probability (percentile). The new model was produced as follows. We start from a reference model for each plasma parameter, which was scaled to fit the data of the Galileo plasma spectrometer. The scaled model was then represented as a function of radial distance, magnetic local time, and magnetic latitude, presumably describing the mean states. The deviation of the observed values from the model was then attributed to the variability in the environment, which was accounted for by the percentile at a given location. The input parameters for this model are the spacecraft position and the percentile. The model is intended to be used for the JUICE mission analysis.

  15. Hitchhikers on trade routes: A phenology model estimates the probabilities of gypsy moth introduction and establishment.

    Science.gov (United States)

    Gray, David R

    2010-12-01

    As global trade increases so too does the probability of introduction of alien species to new locations. Estimating the probability of an alien species introduction and establishment following introduction is a necessary step in risk estimation (probability of an event times the consequences, in the currency of choice, of the event should it occur); risk estimation is a valuable tool for reducing the risk of biological invasion with limited resources. The Asian gypsy moth, Lymantria dispar (L.), is a pest species whose consequence of introduction and establishment in North America and New Zealand warrants over US$2 million per year in surveillance expenditure. This work describes the development of a two-dimensional phenology model (GLS-2d) that simulates insect development from source to destination and estimates: (1) the probability of introduction from the proportion of the source population that would achieve the next developmental stage at the destination and (2) the probability of establishment from the proportion of the introduced population that survives until a stable life cycle is reached at the destination. The effect of shipping schedule on the probabilities of introduction and establishment was examined by varying the departure date from 1 January to 25 December by weekly increments. The effect of port efficiency was examined by varying the length of time that invasion vectors (shipping containers and ship) were available for infection. The application of GLS-2d is demonstrated using three common marine trade routes (to Auckland, New Zealand, from Kobe, Japan, and to Vancouver, Canada, from Kobe and from Vladivostok, Russia).

  16. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    Science.gov (United States)

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
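
    A small sketch of the binomial scoring idea: the chance of matching at least k of n theoretical fragment peaks at random yields a survival-function score. The numbers are invented and this is not the ProVerB scoring function itself.

```python
# Illustrative binomial score: probability of matching at least k of n
# theoretical fragment ions by chance, given a per-peak random-match
# probability p. Numbers are invented; not ProVerB's actual scoring function.
import numpy as np
from scipy import stats

n_theoretical = 24     # theoretical fragment ions for a candidate peptide
k_matched = 11         # ions matched within tolerance
p_random = 0.04        # assumed chance of a random peak match

p_value = stats.binom.sf(k_matched - 1, n_theoretical, p_random)   # P(X >= k)
score = -np.log10(p_value)
print(f"binomial p-value = {p_value:.3e}, score = {score:.1f}")
```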

  17. Joint probability density function modeling of velocity and scalar in turbulence with unstructured grids

    CERN Document Server

    Bakosi, J; Boybeyi, Z

    2010-01-01

    In probability density function (PDF) methods a transport equation is solved numerically to compute the time and space dependent probability distribution of several flow variables in a turbulent flow. The joint PDF of the velocity components contains information on all one-point one-time statistics of the turbulent velocity field, including the mean, the Reynolds stresses and higher-order statistics. We developed a series of numerical algorithms to model the joint PDF of turbulent velocity, frequency and scalar compositions for high-Reynolds-number incompressible flows in complex geometries using unstructured grids. Advection, viscous diffusion and chemical reaction appear in closed form in the PDF formulation, thus require no closure hypotheses. The generalized Langevin model (GLM) is combined with an elliptic relaxation technique to represent the non-local effect of walls on the pressure redistribution and anisotropic dissipation of turbulent kinetic energy. The governing system of equations is solved fully...

  18. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    Science.gov (United States)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

    We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999 based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and make various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model. We

  19. Setting Parameters for Biological Models With ANIMO

    NARCIS (Netherlands)

    Schivo, Stefano; Scholma, Jetse; Karperien, Hermanus Bernardus Johannes; Post, Janine Nicole; van de Pol, Jan Cornelis; Langerak, Romanus; André, Étienne; Frehse, Goran

    2014-01-01

    ANIMO (Analysis of Networks with Interactive MOdeling) is a software tool for modeling biological networks, such as signaling, metabolic or gene networks. An ANIMO model is essentially the sum of a network topology and a number of interaction parameters. The topology describes the interactions

  20. Time Series Modeling of Pathogen-Specific Disease Probabilities with Subsampled Data

    OpenAIRE

    Fisher, Leigh; Wakefield, Jon; Bauer, Cici; Self, Steve

    2016-01-01

    Many diseases arise due to exposure to one of multiple possible pathogens. We consider the situation in which disease counts are available over time from a study region, along with a measure of clinical disease severity, for example, mild or severe. In addition, we suppose a subset of the cases are lab tested in order to determine the pathogen responsible for disease. In such a context, we focus interest on modeling the probabilities of disease incidence given pathogen type....

  1. An Understanding of Probability Models

    Institute of Scientific and Technical Information of China (English)

    林诒勋

    2007-01-01

    There are pluralistic understandings of the foundations of probability theory, and disputes between different viewpoints (such as objective versus subjective) have continued throughout history. Nowadays the Kolmogorov axiomatic model is widely accepted in mathematics. However, this model only says that "probability" is a normalized measure, just like length, volume or mass, and is unrelated to random phenomena. In fact, the probability, as well as any normalized physical measure, fulfills the same axiom system, so it is not a precise definition of probability. Another model is presented here, which narrows the gap between the Kolmogorov model and its intuitive background, unifies the various historical definitions, and gives reasonable explanations of some disputed issues.

  2. A new formulation of the probability density function in random walk models for atmospheric dispersion

    DEFF Research Database (Denmark)

    Falk, Anne Katrine Vinther; Gryning, Sven-Erik

    1997-01-01

    In this model for atmospheric dispersion, particles are simulated by the Langevin equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion in Hermite polynomials....... In several previous works where the PDF was expressed this way, further use was hampered by the fact that the PDF takes negative values for a range of velocities. This problem is overcome in the present formulation...

  3. Mapping regional forest fire probability using artificial neural network model in a Mediterranean forest ecosystem

    Directory of Open Access Journals (Sweden)

    Onur Satir

    2016-09-01

    Full Text Available Forest fires are one of the most important factors in environmental risk assessment and the main cause of forest destruction in the Mediterranean region. Forestlands have a number of known benefits, such as decreasing soil erosion and providing wildlife habitats. Additionally, forests are an important player in the carbon cycle and in mitigating climate change impacts. This paper discusses forest fire probability mapping of a Mediterranean forestland using a multiple data assessment technique. An artificial neural network (ANN) method was used to map forest fire probability in the Upper Seyhan Basin (USB) in Turkey. A multi-layer perceptron (MLP) approach based on the back-propagation algorithm was applied to physical, anthropogenic, climate and fire occurrence datasets. The result was validated using relative operating characteristic (ROC) analysis. The coefficient of accuracy of the MLP was 0.83. Landscape features input to the model were assessed statistically to identify the most descriptive factors for forest fire probability mapping using the Pearson correlation coefficient. Landscape features like elevation (R = −0.43), tree cover (R = 0.93) and temperature (R = 0.42) were strongly correlated with forest fire probability in the USB region.

  4. Detection probability models for bacteria, and how to obtain them from heterogeneous spiking data. An application to Bacillus anthracis.

    Science.gov (United States)

    Hedell, Ronny; Stephansson, Olga; Mostad, Petter; Andersson, Mats Gunnar

    2017-01-16

    Efficient and correct evaluation of sampling results with respect to hypotheses about the concentration or distribution of bacteria generally requires knowledge about the performance of the detection method. To assess the sensitivity of the detection method, an experiment is usually performed where the target matrix is spiked (i.e. artificially contaminated) with different concentrations of the bacteria, followed by analyses of the samples using the pre-enrichment method and the analytical detection method of interest. For safety reasons or because of economic or time limits it is not always possible to perform exactly such an experiment, with the desired number of samples. In this paper, we show how heterogeneous data from diverse sources may be combined within a single model to obtain not only estimates of detection probabilities, but also, crucially, uncertainty estimates. We indicate how such results can then be used to obtain optimal conclusions about the presence of bacteria, and illustrate how strongly the sampling results speak in favour of or against contamination. In our example, we consider the case when B. cereus is used as a surrogate for B. anthracis, for safety reasons. The statistical modelling of the detection probabilities and of the growth characteristics of the bacteria types is based on data from four experiments where different matrices of food were spiked with B. anthracis or B. cereus and analysed using plate counts and qPCR. We show how flexible and complex Bayesian models, together with inference tools such as OpenBUGS, can be used to merge information about detection probability curves. Two different modelling approaches, differing in whether the pre-enrichment step and the PCR detection step are modelled separately or together, are applied. The relative importance of the various existing data sets for the detection curves is evaluated and illustrated.

  5. Improvement and extension of the generalized hard-sphere reaction probability model.

    Science.gov (United States)

    Schübler, M A; Petkow, D; Herdrich, G

    2012-04-01

    The GHS (Generalized Hard Sphere)-based standard reaction probability model commonly used in probabilistic particle methods is evaluated. We show that the original model has no general validity with respect to the molecular reaction. Mathematical consistency exists only for reactions with vanishing activation energy. For small energies close to the activation threshold, the individual reaction probability for the special case of associative ionization of atomic nitrogen diverges. This makes the model extremely expensive and non-physical. An improved model is derived, and its implementation is verified on the basis of the aforementioned reaction. Both models converge to the same value at large energies. The relative error of the original model with respect to the new model is independent of the particle pairing and, hence, of the reaction type. The error is smaller than 1% for collision energies in excess of 200 times the activation energy. For typical simulation problems like atmospheric high-enthalpy entry flows (assuming heavy-particle temperatures on the order of 10000 K), the relative error is on the order of 10^5 %.

  6. Probability Prediction in Multistate Survival Models for Patients with Chronic Myeloid Leukaemia

    Institute of Scientific and Technical Information of China (English)

    FANG Ya; Hein Putter

    2005-01-01

    In order to find an appropriate model for a multistate survival experiment, 634 patients with chronic myeloid leukaemia (CML) were selected to illustrate the method of analysis. After transplantation, there were 4 possible situations for a patient: disease free, relapse but still alive, death before relapse, and death after relapse. The last 3 events were considered as treatment failure. The results showed that the risk of death before relapse was higher than that of relapse, especially in the first year after transplantation, with the competing-risks method. The outcome of patients with a relapse time of less than 12 months was much poorer according to the Kaplan-Meier method. Multistate survival models were then developed; building on the competing-risks and Kaplan-Meier analyses, they are more detailed and informative. With the multistate survival models, a further analysis of conditional probability was made for patients who were disease free and still alive at month 12 after transplantation. It was concluded that the 4 possible probabilities can be predicted at any time for an individual patient. The prognoses for relapse (followed by death or not) and for death (before or after relapse) may also be given. Furthermore, the conditional probabilities for patients who were disease free and still alive at a given time after transplantation can be predicted.

  7. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
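
    A minimal sketch of a LASSO-penalized logistic NTCP model evaluated by repeated cross-validation, in the spirit of the comparison above; the predictor names and data file are hypothetical.

```python
# Hypothetical sketch: LASSO-penalized logistic NTCP model with repeated
# cross-validation. Predictor names and the data file are stand-ins.
import pandas as pd
from sklearn.linear_model import LogisticRegressionCV
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

df = pd.read_csv("xerostomia_cohort.csv")
X = df[["mean_dose_parotid", "mean_dose_submandibular", "age", "baseline_score"]]
y = df["xerostomia_6m"]                                # 1 = complication, 0 = none

lasso = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=20, cv=5,
                             scoring="roc_auc", max_iter=5000)
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
auc = cross_val_score(lasso, X, y, cv=cv, scoring="roc_auc")
print("repeated cross-validated AUC:", auc.mean())
```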

  8. Level Set Modeling of Transient Electromigration Grooving

    OpenAIRE

    Khenner, M.; Averbuch, A.; Israeli, M.; Nathan, M; Glickman, E.

    2000-01-01

    A numerical investigation of grain-boundary (GB) grooving by means of the Level Set (LS) method is carried out. GB grooving is emerging as a key element of electromigration drift in polycrystalline microelectronic interconnects, as evidenced by a number of recent studies. The purpose of the present study is to provide an efficient numerical simulation, allowing a parametric study of the effect of key physical parameters (GB and surface diffusivities, grain size, current density, etc) on the e...

  9. Calibrating perceived understanding and competency in probability concepts: A diagnosis of learning difficulties based on Rasch probabilistic model

    Science.gov (United States)

    Mahmud, Zamalia; Porter, Anne; Salikin, Masniyati; Ghani, Nor Azura Md

    2015-12-01

    Students' understanding of probability concepts has been investigated from various perspectives. Competency, on the other hand, is often measured separately in the form of a test structure. This study set out to show that perceived understanding and competency can be calibrated and assessed together using Rasch measurement tools. Forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW, volunteered to participate in the study. Rasch measurement, which is based on a probabilistic model, is used to calibrate the responses from two survey instruments and investigate the interactions between them. Data were captured from the e-learning platform Moodle, where students provided their responses through an online quiz. The study shows that the majority of the students perceived little understanding of conditional and independent events prior to learning about them but tended to demonstrate a slightly higher competency level afterward. Based on the Rasch map, there is an indication of some increase in learning and knowledge about some probability concepts at the end of the two-week lessons on probability.
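
    For reference, the dichotomous Rasch model underlying such calibrations gives the endorsement probability as a logistic function of the difference between person ability and item difficulty; a minimal sketch:

```python
# Dichotomous Rasch model: P(endorse/correct) is a logistic function of the
# difference between person ability (theta) and item difficulty (b).
import numpy as np

def rasch_probability(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# A student slightly above average facing an item of moderate difficulty.
print(rasch_probability(theta=0.5, b=-0.2))   # ~ 0.67
```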

  10. Probability of ventricular fibrillation: allometric model based on the ST deviation

    Directory of Open Access Journals (Sweden)

    Arini Pedro D

    2011-01-01

    Full Text Available Abstract Background Allometry, in general biology, measures the relative growth of a part in relation to the whole living organism. Using reported clinical data, we apply this concept to evaluating the probability of ventricular fibrillation based on electrocardiographic ST-segment deviation values. Methods Data collected by previous reports were used to fit an allometric model in order to estimate ventricular fibrillation probability. Patients presenting with death, myocardial infarction or unstable angina were included to calculate such probability as VFp = δ + β (ST) for three different ST deviations. The coefficients δ and β were obtained as the best fit to the clinical data extended over observational periods of 1, 6, 12 and 48 months from the occurrence of the first reported chest pain accompanied by ST deviation. Results By application of the above equation in log-log representation, the fitting procedure produced the following overall coefficients: average β = 0.46, with a maximum of 0.62 and a minimum of 0.42; average δ = 1.28, with a maximum of 1.79 and a minimum of 0.92. For a 2 mm ST deviation, the full range of predicted ventricular fibrillation probability extended from about 13% at 1 month up to 86% at 4 years after the original cardiac event. Conclusions These results, at least preliminarily, appear acceptable and still call for a full clinical test. The model seems promising, especially if other parameters were taken into account, such as blood cardiac enzyme concentrations, ischemic or infarcted epicardial areas, or ejection fraction. It is concluded, considering these results and a few references found in the literature, that the allometric model shows good predictive practical value to aid medical decisions.
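
    Allometric relations are usually power laws, which become straight lines in log-log coordinates; a minimal least-squares sketch under that assumption, with illustrative (ST, probability) pairs rather than the clinical data:

```python
# Illustrative allometric (power-law) fit in log-log coordinates,
# VFp = delta * ST**beta; the (ST, probability) pairs are invented.
import numpy as np

st = np.array([1.0, 2.0, 3.0])            # ST deviation (mm)
vfp = np.array([0.10, 0.13, 0.17])        # ventricular fibrillation probability

beta, log_delta = np.polyfit(np.log(st), np.log(vfp), 1)
delta = np.exp(log_delta)
print(f"fitted power law: VFp = {delta:.3f} * ST^{beta:.2f}")
```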

  11. Modelling the 2013 North Aegean (Greece) seismic sequence: geometrical and frictional constraints, and aftershock probabilities

    Science.gov (United States)

    Karakostas, Vassilis; Papadimitriou, Eleftheria; Gospodinov, Dragomir

    2014-04-01

    The 2013 January 8 Mw 5.8 North Aegean earthquake sequence took place on one of the ENE-WSW trending parallel dextral strike-slip fault branches in this area, in the continuation of the 1968 large (M = 7.5) rupture. The source mechanism of the main event indicates predominantly strike-slip faulting, in agreement with what is expected from the regional seismotectonics. It was the largest event to have occurred in the area since the establishment of the Hellenic Unified Seismological Network (HUSN), with an adequate number of stations at close distances and full azimuthal coverage, thus providing the chance of an exhaustive analysis of its aftershock sequence. The main shock was followed by a handful of aftershocks with M ≥ 4.0 and tens with M ≥ 3.0. Relocation was performed using the recordings from HUSN and a proper crustal model for the area, along with time corrections at each station relative to the model used. Investigation of the spatial and temporal behaviour of the seismicity revealed possible triggering of adjacent fault segments. Theoretical static stress changes from the main shock give a preliminary explanation for the aftershock distribution away from the main rupture. The off-fault seismicity is perfectly explained if μ > 0.5 and B = 0.0, evidencing high fault friction. In an attempt to forecast occurrence probabilities of the strong events (Mw ≥ 5.0), estimations were performed following the Restricted Epidemic Type Aftershock Sequence (RETAS) model. The identified best-fitting MOF model was used to execute 1-d forecasts for such aftershocks and follow the probability evolution in time during the sequence. Forecasting was also implemented on the basis of a temporal model of aftershock occurrence different from the modified Omori formula (the ETAS model), which resulted in a probability gain (though small) in strong-aftershock forecasting for the beginning of the sequence.

  12. Learning unbelievable marginal probabilities

    CERN Document Server

    Pitkow, Xaq; Miller, Ken D

    2011-01-01

    Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals 'unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...

  13. Probability of detection model for the non-destructive inspection of steam generator tubes of PWRs

    Science.gov (United States)

    Yusa, N.

    2017-06-01

    This study proposes a probability of detection (POD) model to discuss the capability of non-destructive testing methods for the detection of stress corrosion cracks appearing in the steam generator tubes of pressurized water reactors. Three-dimensional finite element simulations were conducted to evaluate eddy current signals due to stress corrosion cracks. The simulations consider an absolute type pancake probe and model a stress corrosion crack as a region with a certain electrical conductivity inside to account for eddy currents flowing across a flaw. The probabilistic nature of a non-destructive test is simulated by varying the electrical conductivity of the modelled stress corrosion cracking. A two-dimensional POD model, which provides the POD as a function of the depth and length of a flaw, is presented together with a conventional POD model characterizing a flaw using a single parameter. The effect of the number of the samples on the PODs is also discussed.
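
    The record above describes a two-parameter POD surface in flaw depth and length. A common, generic way to express such a surface is a hit/miss logistic model; the sketch below fits one to synthetic inspection outcomes and is not the paper's FEM-based formulation; the data, coefficients and variable names are assumptions for illustration.

```python
# Sketch: generic two-parameter hit/miss POD surface, POD(depth, length),
# fitted by logistic regression on synthetic inspection outcomes.
# This illustrates the idea of a 2-D POD model, not the paper's exact method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
depth  = rng.uniform(5, 100, n)    # flaw depth, % of wall thickness (synthetic)
length = rng.uniform(1, 20, n)     # flaw length, mm (synthetic)

# Synthetic "true" detection mechanism used only to generate hit/miss labels
p_true = 1.0 / (1.0 + np.exp(-(0.06 * depth + 0.15 * length - 5.0)))
hit = rng.random(n) < p_true

X = np.column_stack([depth, length])
pod_model = LogisticRegression(max_iter=1000).fit(X, hit)

# Estimated POD for a flaw of 40% depth and 8 mm length
print("POD(40%, 8 mm) =", pod_model.predict_proba([[40.0, 8.0]])[0, 1])
```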

  14. The gap probability model for canopy thermal infrared emission with non-scattering approximation

    Institute of Scientific and Technical Information of China (English)

    牛铮; 柳钦火; 高彦春; 张庆员; 王长耀

    2000-01-01

    To describe canopy emitting thermal radiance precisely and physically is one of the key researches in retrieving land surface temperature (LSI) over vegetation-covered regions by remote sensing technology. This work is aimed at establishing gap probability models to describe the thermal emission characteristics in continuous plant, including the basic model and the sunlit model. They are suitable respectively in the nighttime and in the daytime. The sunlit model is the basic model plus a sunlit correcting item which takes the hot spot effect into account. The researches on the directional distribution of radiance and its relationship to canopy structural parameters, such as the leaf area index (LAI) and leaf angle distribution (LAD), were focused. The characteristics of directional radiance caused by temperature differences among components in canopy, such as those between leaf and soil, and between sunlit leaf or soil and shadowed leaf or soil, were analyzed. A well fitting between experimental data an

  15. A cellular automata model of traffic flow with variable probability of randomization

    Science.gov (United States)

    Zheng, Wei-Fan; Zhang, Ji-Ye

    2015-05-01

    Research on the stochastic behavior of traffic flow is important to understand the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of vehicles before him, and his decision-making process is related to the interactional potential. Compared with the traditional cellular automata model, the modeling is more suitable for the driver’s random decision-making process based on the vehicle and traffic situations in front of him in actual traffic. From the improved model, the fundamental diagram (flow-density relationship) is obtained, and the detailed high-density traffic phenomenon is reproduced through numerical simulation. Project supported by the National Natural Science Foundation of China (Grant Nos. 11172247, 61273021, 61373009, and 61100118).
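
    A minimal sketch of a Nagel-Schreckenberg-style update in which the randomization probability depends on the gap to the vehicle ahead, standing in for the interactional potential described above; the functional form of p(gap) and all parameter values are assumptions, not the paper's specification.

```python
# Sketch: NaSch-type cellular automaton with a gap-dependent randomization
# probability p(gap), a stand-in for the paper's interactional potential.
import numpy as np

L, N, VMAX, STEPS = 200, 40, 5, 100
rng = np.random.default_rng(1)
pos = np.sort(rng.choice(L, N, replace=False))   # vehicle positions on a ring
vel = np.zeros(N, dtype=int)

def p_random(gap, p0=0.5, pmin=0.1):
    # Smaller gap -> stronger interaction -> larger randomization probability (assumed form)
    return pmin + (p0 - pmin) * np.exp(-gap / 3.0)

for _ in range(STEPS):
    gaps = (np.roll(pos, -1) - pos - 1) % L      # empty cells to the vehicle ahead
    vel = np.minimum(vel + 1, VMAX)              # acceleration
    vel = np.minimum(vel, gaps)                  # deceleration (no collisions)
    slow = rng.random(N) < p_random(gaps)        # variable randomization step
    vel = np.maximum(vel - slow, 0)
    pos = (pos + vel) % L                        # movement on the ring

print("density:", N / L, " mean flow (veh/site/step):", vel.mean() * N / L)
```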

  16. Models for setting ATM parameter values

    DEFF Research Database (Denmark)

    Blaabjerg, Søren; Gravey, A.; Romæuf, L.

    1996-01-01

    In ATM networks, a user should negotiate at connection set-up a traffic contract which includes traffic characteristics and requested QoS. The traffic characteristics currently considered are the Peak Cell Rate, the Sustainable Cell Rate, the Intrinsic Burst Tolerance and the Cell Delay Variation...... to Network Interface (UNI) and at subsequent Inter Carrier Interfaces (ICIs), by algorithmic rules based on the Generic Cell Rate Algorithm (GCRA) formalism. Conformance rules are implemented by policing mechanisms that control the traffic submitted by the user and discard excess traffic. It is therefore...

  17. A Novel Probability Model for Suppressing Multipath Ghosts in GPR and TWI Imaging: A Numerical Study

    Directory of Open Access Journals (Sweden)

    Tan Yun-hua

    2015-10-01

    Full Text Available A novel concept for suppressing the problem of multipath ghosts in Ground Penetrating Radar (GPR and Through-Wall Imaging (TWI is presented. Ghosts (i.e., false targets mainly arise from the use of the Born or single-scattering approximations that lead to linearized imaging algorithms; however, these approximations neglect the effect of multiple scattering (or multipath between the electromagnetic wavefield and the object under investigation. In contrast to existing methods of suppressing multipath ghosts, the proposed method models for the first time the reflectivity of the probed objects as a probability function up to a normalized factor and introduces the concept of random subaperture by randomly picking up measurement locations from the entire aperture. Thus, the final radar image is a joint probability distribution that corresponds to radar images derived from multiple random subapertures. Finally, numerical experiments are used to demonstrate the performance of the proposed methodology in GPR and TWI imaging.

  18. Modelling the impact of creep on the probability of failure of a solid oxide fuel cell stack

    DEFF Research Database (Denmark)

    Greco, Fabio; Frandsen, Henrik Lund; Nakajo, Arata

    2014-01-01

    In solid oxide fuel cell (SOFC) technology a major challenge lies in balancing thermal stresses from an inevitable thermal field. The cells are known to creep, changing over time the stress field. The main objective of this study was to assess the influence of creep on the failure probability...... of an SOFC stack. A finite element analysis on a single repeating unit of the stack was performed, in which the influence of the mechanical interactions, the temperature-dependent mechanical properties and creep of the SOFC materials are considered. Moreover, stresses from the thermo-mechanical simulation...... of sintering of the cells have been obtained and were implemented into the model of the single repeating unit. The significance of the relaxation of the stresses by creep in the cell components and its influence on the probability of cell survival was investigated. Finally, the influence of cell size...

  19. An analytical expression for the exit probability of the q-voter model in one dimension

    CERN Document Server

    Timpanaro, André Martin

    2014-01-01

    We present in this paper an approximation that is able to give an analytical expression for the exit probability of the $q$-voter model in one dimension. This expression gives a better fit for the more recent data about simulations in large networks, and as such, departs from the expression $\frac{\rho^q}{\rho^q + (1-\rho)^q}$ found in papers that investigated small networks only. The approximation consists in assuming a large separation on the time scales at which active groups of agents convince inactive ones and the time taken in the competition between active groups. Some interesting findings are that for $q=2$ we still have $\frac{\rho^2}{\rho^2 + (1-\rho)^2}$ as the exit probability and for large values of $q$ the difference between the result and $\frac{\rho^q}{\rho^q + (1-\rho)^q}$ becomes negligible (the difference is maximum for $q=5$ and 6)
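
    A quick numerical check of the reference expression quoted above, $E(\rho) = \rho^q/(\rho^q + (1-\rho)^q)$, which the paper's approximation is compared against; this only evaluates that formula, it does not reproduce the paper's approximation.

```python
# Evaluate the reference exit-probability expression
# E(rho) = rho**q / (rho**q + (1 - rho)**q) for a few values of q.
import numpy as np

def exit_probability(rho, q):
    return rho**q / (rho**q + (1.0 - rho)**q)

rho = np.linspace(0.0, 1.0, 11)
for q in (2, 5, 6):
    print(f"q = {q}:", np.round(exit_probability(rho, q), 3))
```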

  20. Equivalent probability density moments determine equivalent epidemics in an SIRS model with temporary immunity.

    Science.gov (United States)

    Carr, Thomas W

    2017-02-01

    In an SIRS compartment model for a disease we consider the effect of different probability distributions for remaining immune. We show that to first approximation the first three moments of the corresponding probability densities are sufficient to well describe oscillatory solutions corresponding to recurrent epidemics. Specifically, increasing the fraction who lose immunity, increasing the mean immunity time, and decreasing the heterogeneity of the population all favor the onset of epidemics and increase their severity. We consider six different distributions, some symmetric about their mean and some asymmetric, and show that by tuning their parameters such that they have equivalent moments that they all exhibit equivalent dynamical behavior. Copyright © 2016 Elsevier Inc. All rights reserved.
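
    A small illustration of the moment comparison the abstract refers to: two candidate immunity-period distributions (gamma and lognormal, chosen here for illustration) matched in mean and variance, with their first three moments printed side by side. Parameter values are assumptions, not taken from the paper.

```python
# Sketch: compare the first three moments of two immunity-period distributions
# matched in mean and variance, in the spirit of the abstract's moment argument.
import numpy as np
from scipy import stats

mean, var = 8.0, 4.0                          # immunity period mean/variance (illustrative)

# Gamma with the given mean and variance
shape, scale = mean**2 / var, var / mean
gamma_dist = stats.gamma(a=shape, scale=scale)

# Lognormal with the same mean and variance
sigma2 = np.log(1.0 + var / mean**2)
lognorm_dist = stats.lognorm(s=np.sqrt(sigma2), scale=mean * np.exp(-sigma2 / 2))

for name, dist in [("gamma", gamma_dist), ("lognormal", lognorm_dist)]:
    skew = float(dist.stats(moments="s"))     # third standardized moment
    print(f"{name:9s} mean={dist.mean():.2f} var={dist.var():.2f} skew={skew:.2f}")
```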

  1. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

    the following conditions: CP1. µ(U |U) = 1 if U ∈ F ′. CP2 . µ(V1 ∪ V2 |U) = µ(V1 |U) + µ(V2 |U) if V1 ∩ V2 = ∅, U ∈ F ′, and V1, V2 ∈ F . CP3. µ(V |U...µ(V |X)× µ(X |U) if V ⊆ X ⊆ U , U,X ∈ F ′, V ∈ F . Note that it follows from CP1 and CP2 that µ(· |U) is a probability measure on (W,F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vaciously satisfies CP3, since there do not exist distinct sets U and X in F ′ such that U

  2. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Science.gov (United States)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most such studies concern basins in the humid and semi-humid south and east of China, whereas for the inland river basins, which occupy about 35% of the country's area, studies of this kind are scarce, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peak over threshold (POT) method and the Generalized Pareto Distribution (GPD), in which the selection of the threshold and the inherent assumptions for the POT series are elaborated in detail. For comparison, other widely used probability distributions including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma are employed as well. Maximum likelihood estimation is used for the parameters. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, stability features of model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m3/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
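
    A minimal POT/GPD sketch in the spirit of the record above: exceedances over the 340 m3/s threshold quoted in the abstract are fitted with a Generalized Pareto Distribution and used to compute return levels. The daily flow series below is synthetic; the actual Yingluoxia data are not reproduced.

```python
# Sketch: peak-over-threshold fit of a GPD to exceedances over 340 m3/s,
# using a synthetic daily-flow series (31 years, matching 1978-2008 in length).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
flows = rng.gamma(shape=2.0, scale=60.0, size=31 * 365)   # synthetic daily flows (m3/s)

threshold = 340.0
exceedances = flows[flows > threshold] - threshold
c, loc, scale = genpareto.fit(exceedances, floc=0.0)      # location fixed at 0

lam = len(exceedances) / 31.0                             # mean exceedances per year
for T in (10, 50, 100):
    if abs(c) > 1e-6:
        x_T = threshold + (scale / c) * ((lam * T) ** c - 1.0)
    else:                                                 # exponential limit of the GPD
        x_T = threshold + scale * np.log(lam * T)
    print(f"{T:3d}-year return level ~ {x_T:.0f} m3/s")
```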

  3. ON THE PROBABILITY OF K-CONNECTIVITY IN WIRELESS AD HOC NETWORKS UNDER DIFFERENT MOBILITY MODELS

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2010-09-01

    Full Text Available We compare the probability of k-Connectivity of an ad hoc network under the Random Way Point (RWP), City Section and Manhattan mobility models. A network is said to be k-Connected if there exist at least k edge-disjoint paths between any pair of nodes in that network at any given time and velocity. Initially, for each of the three mobility models, the movement of each node in the ad hoc network at a given velocity and time is captured and stored in the Node Movement Database (NMDB). Using the movements in the NMDB, the location of the node at a given time is computed and stored in the Node Location Database (NLDB). A weighted graph is created using the location of the nodes from the NLDB, which is converted into a residual graph. The k-Connectivity of this residual graph is obtained by running the Ford-Fulkerson algorithm on it. The Ford-Fulkerson algorithm computes the maximum flow of a network by recording the flows assigned to different routes from each node to all the other nodes in the network. When run for a particular source-destination (s, d) pair on a residual network graph with unit edge weights as capacity, the maximum flow determined by the Ford-Fulkerson algorithm is the number of edge-disjoint s-d paths on the network graph. Simulations show that the RWP model yields the highest probability of k-Connectivity compared to the City Section and Manhattan mobility models for a majority of the different node densities and velocities considered. Simulation results also show that, for all three mobility models, as the k value increases, the probability of k-Connectivity decreases for a given density and velocity, and as the density increases the probability of k-Connectivity increases.
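
    The k-Connectivity computation described above (number of edge-disjoint paths via a unit-capacity maximum flow) can be illustrated in a few lines with networkx; the random geometric graph below is a stand-in for one snapshot of node locations, and the library's max-flow routine is used instead of a hand-coded Ford-Fulkerson.

```python
# Sketch: count edge-disjoint s-d paths as a unit-capacity max flow,
# on a random geometric graph standing in for one snapshot of node positions.
import networkx as nx

G = nx.random_geometric_graph(30, radius=0.3, seed=4)

# Directed unit-capacity copy: the max s-d flow equals the number of
# edge-disjoint s-d paths in the underlying graph.
D = nx.DiGraph()
for u, v in G.edges():
    D.add_edge(u, v, capacity=1)
    D.add_edge(v, u, capacity=1)

s, d = 0, 29
if nx.has_path(G, s, d):
    k = nx.maximum_flow_value(D, s, d)
    print(f"edge-disjoint paths between {s} and {d}: {k}")
else:
    print("no path between the chosen pair in this snapshot")
```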

  4. A measure on the set of compact Friedmann-Lemaitre-Robertson-Walker models

    Energy Technology Data Exchange (ETDEWEB)

    Roukema, Boudewijn F [Torun Centre for Astronomy, Nicolaus Copernicus University, ul. Gagarina 11, 87-100 Torun (Poland); Blanloeil, Vincent [IRMA, Departement de Mathematiques, Universite de Strasbourg, 7 rue Rene Descartes, 67084 Strasbourg, Cedex (France)

    2010-12-21

    Compact, flat Friedmann-Lemaitre-Robertson-Walker (FLRW) models have recently regained interest as a good fit to the observed cosmic microwave background temperature fluctuations. However, it is generally thought that a globally, exactly flat FLRW model is theoretically improbable. Here, in order to obtain a probability space on the set F of compact, comoving, 3-spatial sections of FLRW models, a physically motivated hypothesis is proposed, using the density parameter Ω as a derived rather than fundamental parameter. We assume that the processes that select the 3-manifold also select a global mass-energy and a Hubble parameter. The requirement that the local and global values of Ω are equal implies a range in Ω that consists of a single real value for any 3-manifold. Thus, the obvious measure over F is the discrete measure. Hence, if the global mass-energy and Hubble parameter are a function of 3-manifold choice among compact FLRW models, then probability spaces parametrized by Ω do not, in general, give a zero probability of a flat model. Alternatively, parametrization by a spatial size parameter, the injectivity radius r_inj, suggests the Lebesgue measure. In this case, the probability space over the injectivity radius implies that flat models occur almost surely (a.s.), in the sense of probability theory, and non-flat models a.s. do not occur.

  5. Translational PK/PD modeling to increase probability of success in drug discovery and early development.

    Science.gov (United States)

    Lavé, Thierry; Caruso, Antonello; Parrott, Neil; Walz, Antje

    In this review we present ways in which translational PK/PD modeling can address opportunities to enhance probability of success in drug discovery and early development. This is achieved by impacting efficacy and safety-driven attrition rates, through increased focus on the quantitative understanding and modeling of translational PK/PD. Application of the proposed principles early in the discovery and development phases is anticipated to bolster confidence of successfully evaluating proof of mechanism in humans and ultimately improve Phase II success. The present review is centered on the application of predictive modeling and simulation approaches during drug discovery and early development, and more specifically of mechanism-based PK/PD modeling. Case studies are presented, focused on the relevance of M&S contributions to real-world questions and the impact on decision making.

  6. Finite element model updating of concrete structures based on imprecise probability

    Science.gov (United States)

    Biswal, S.; Ramaswamy, A.

    2017-09-01

    Imprecise probability based methods are developed in this study for the parameter estimation, in finite element model updating for concrete structures, when the measurements are imprecisely defined. Bayesian analysis using the Metropolis-Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three different cases are considered: (i) imprecision is present in the prior distribution and in the measurements only; (ii) imprecision is present in the parameters of the finite element model and in the measurements only; and (iii) imprecision is present in the prior distribution, in the parameters of the finite element model, and in the measurements. Procedures are also developed for integrating the imprecision in the parameters of the finite element model in the finite element software Abaqus. The proposed methods are then verified against reinforced concrete beams and prestressed concrete beams tested in our laboratory as part of this study.
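
    The standard (precise) Bayesian step that the record above generalizes can be sketched as a random-walk Metropolis-Hastings update of a single stiffness parameter given a noisy measured natural frequency. The single-degree-of-freedom model, prior, and noise level below are illustrative assumptions, not the paper's concrete-structure models or its imprecise-probability extension.

```python
# Sketch: random-walk Metropolis-Hastings for one stiffness parameter k of a
# toy single-degree-of-freedom model, given a noisy measured natural frequency.
import numpy as np

rng = np.random.default_rng(3)
m = 1000.0                       # mass (kg), assumed known
f_meas, sigma = 2.5, 0.05        # measured frequency (Hz) and noise std (illustrative)

def log_post(k):
    if k <= 0:
        return -np.inf
    f_model = np.sqrt(k / m) / (2.0 * np.pi)         # SDOF natural frequency
    log_lik = -0.5 * ((f_meas - f_model) / sigma) ** 2
    log_prior = -0.5 * ((k - 2.5e5) / 1.0e5) ** 2    # weak Gaussian prior on k (assumed)
    return log_lik + log_prior

k, chain = 2.0e5, []
for _ in range(20000):
    prop = k + rng.normal(0.0, 5.0e3)                # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(k):
        k = prop                                     # accept
    chain.append(k)

post = np.array(chain[5000:])                        # discard burn-in
print(f"posterior mean k = {post.mean():.3e} N/m, std = {post.std():.3e}")
```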

  7. Probability density function modeling of scalar mixing from concentrated sources in turbulent channel flow

    CERN Document Server

    Bakosi, J; Boybeyi, Z; 10.1063/1.2803348

    2010-01-01

    Dispersion of a passive scalar from concentrated sources in fully developed turbulent channel flow is studied with the probability density function (PDF) method. The joint PDF of velocity, turbulent frequency and scalar concentration is represented by a large number of Lagrangian particles. A stochastic near-wall PDF model combines the generalized Langevin model of Haworth & Pope with Durbin's method of elliptic relaxation to provide a mathematically exact treatment of convective and viscous transport with a non-local representation of the near-wall Reynolds stress anisotropy. The presence of walls is incorporated through the imposition of no-slip and impermeability conditions on particles without the use of damping or wall-functions. Information on the turbulent timescale is supplied by the gamma-distribution model of van Slooten et al. Two different micromixing models are compared that incorporate the effect of small scale mixing on the transported scalar: the widely used interaction by exchange with th...

  8. A probability model: Tritium release into the coolant of a light water tritium production reactor

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, D N

    1992-04-01

    This report presents a probability model of the total amount of tritium that will be released from a core of tritium target rods into the coolant of a light water reactor during a tritium production cycle. The model relates the total tritium released from a core to the release characteristics of an individual target rod within the core. The model captures total tritium release from two sources: release via target rod breach and release via permeation through the target rod. Specifically, under conservative assumptions about the breach characteristics of a target rod, total tritium released from a core is modeled as a function of the probability of a target breach and the mean and standard deviation of the permeation reduction factor (PRF) of an individual target rod. Two dominant facts emerge from the analysis in this report. First, total tritium release cannot be controlled and minimized solely through the PRF characteristics of a target rod. Tritium release via breach must be abated if acceptable tritium production is to be achieved. Second, PRF values have a saturation point to their effectiveness. Specifically, in the presence of any realistic level of PRF variability, increasing PRF values above approximately 1000 will contribute little to minimizing total tritium release.

  9. Fast Outage Probability Simulation for FSO Links with a Generalized Pointing Error Model

    KAUST Repository

    Issaid, Chaouki Ben

    2017-02-07

    Over the past few years, free-space optical (FSO) communication has gained significant attention. In fact, FSO can provide cost-effective and unlicensed links, with high-bandwidth capacity and low error rate, making it an exciting alternative to traditional wireless radio-frequency communication systems. However, the system performance is affected not only by the presence of atmospheric turbulence, which occurs due to random fluctuations in the air refractive index, but also by the existence of pointing errors. Metrics such as the outage probability, which quantifies the probability that the instantaneous signal-to-noise ratio is smaller than a given threshold, can be used to analyze the performance of this system. In this work, we consider weak and strong turbulence regimes, and we study the outage probability of an FSO communication system under a generalized pointing error model with both a nonzero boresight component and different horizontal and vertical jitter effects. More specifically, we use an importance sampling approach based on the exponential twisting technique to offer fast and accurate results.
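
    A minimal importance-sampling sketch in the spirit of the record above, for a much simpler channel: a lognormal gain (a crude turbulence surrogate, not the paper's generalized pointing-error model). For a Gaussian in the log domain, exponential twisting reduces to a mean shift of the sampling density, which is what the code exploits; all parameter values are assumptions.

```python
# Sketch: estimate a small outage probability P(log-gain < a) by importance
# sampling with a mean-shifted (exponentially twisted) Gaussian, and compare
# against crude Monte Carlo and the exact value.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
mu, sig = 0.0, 0.5             # log-amplitude mean and std (illustrative)
a = mu - 4.0 * sig             # outage event: X = log(gain) < a (rare)
N = 100_000

# Crude Monte Carlo
x = rng.normal(mu, sig, N)
p_mc = np.mean(x < a)

# Importance sampling: shift the sampling mean to the rare-event boundary
y = rng.normal(a, sig, N)
w = norm.pdf(y, mu, sig) / norm.pdf(y, a, sig)   # likelihood ratio weights
p_is = np.mean((y < a) * w)

print(f"exact       = {norm.cdf(a, mu, sig):.3e}")
print(f"crude MC    = {p_mc:.3e}")
print(f"importance  = {p_is:.3e}")
```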

  10. Fixation Probability in a Two-Locus Model by the Ancestral Recombination–Selection Graph

    Science.gov (United States)

    Lessard, Sabin; Kermany, Amir R.

    2012-01-01

    We use the ancestral influence graph (AIG) for a two-locus, two-allele selection model in the limit of a large population size to obtain an analytic approximation for the probability of ultimate fixation of a single mutant allele A. We assume that this new mutant is introduced at a given locus into a finite population in which a previous mutant allele B is already segregating with a wild type at another linked locus. We deduce that the fixation probability increases as the recombination rate increases if allele A is either in positive epistatic interaction with B and allele B is beneficial or in no epistatic interaction with B and then allele A itself is beneficial. This holds at least as long as the recombination fraction and the selection intensity are small enough and the population size is large enough. In particular this confirms the Hill–Robertson effect, which predicts that recombination renders more likely the ultimate fixation of beneficial mutants at different loci in a population in the presence of random genetic drift even in the absence of epistasis. More importantly, we show that this is true from weak negative epistasis to positive epistasis, at least under weak selection. In the case of deleterious mutants, the fixation probability decreases as the recombination rate increases. This supports Muller’s ratchet mechanism to explain the accumulation of deleterious mutants in a population lacking recombination. PMID:22095080

  11. Standard fire behavior fuel models: a comprehensive set for use with Rothermel's surface fire spread model

    Science.gov (United States)

    Joe H. Scott; Robert E. Burgan

    2005-01-01

    This report describes a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model and the relationship of the new set to the original set of 13 fire behavior fuel models. To assist with transition to using the new fuel models, a fuel model selection guide, fuel model crosswalk, and set of fuel model photos are provided.

  12. Investigation of probability theory on Ising models with different four-spin interactions

    Science.gov (United States)

    Yang, Yuming; Teng, Baohua; Yang, Hongchun; Cui, Haijuan

    2017-10-01

    Based on probability theory, two types of three-dimensional Ising models with different four-spin interactions are studied. Firstly, the partition function of the system is calculated by considering the local correlation of spins in a given configuration, and then the properties of the phase transition are quantitatively discussed with the series expansion technique and a numerical method. Meanwhile, the rounding errors in this calculation are analyzed, so that the possible source of error in the calculation based on the mean field theory is pointed out.

  13. Modeling Stress Strain Relationships and Predicting Failure Probabilities For Graphite Core Components

    Energy Technology Data Exchange (ETDEWEB)

    Duffy, Stephen [Cleveland State Univ., Cleveland, OH (United States)

    2013-09-09

    This project will implement inelastic constitutive models that will yield the requisite stress-strain information necessary for graphite component design. Accurate knowledge of stress states (both elastic and inelastic) is required to assess how close a nuclear core component is to failure. Strain states are needed to assess deformations in order to ascertain serviceability issues relating to failure, e.g., whether too much shrinkage has taken place for the core to function properly. Failure probabilities, as opposed to safety factors, are required in order to capture the variability in failure strength in tensile regimes. The current stress state is used to predict the probability of failure. Stochastic failure models will be developed that can accommodate possible material anisotropy. This work will also model material damage (i.e., degradation of mechanical properties) due to radiation exposure. The team will design tools for components fabricated from nuclear graphite. These tools must readily interact with finite element software--in particular, COMSOL, the software algorithm currently being utilized by the Idaho National Laboratory. For the elastic response of graphite, the team will adopt anisotropic stress-strain relationships available in COMSOL. Data from the literature will be utilized to characterize the appropriate elastic material constants.

  14. A model to search for birth probabilities of mammal populations using fertility data

    Directory of Open Access Journals (Sweden)

    JR. Moreira

    Full Text Available A model was constructed to predict monthly birth probabilities using mammalian fertility data. We used a sample of 147 female capybaras (Hydrochoerus hydrochaeris) hunted on a farm on Marajó Island, Brazil. In the model each month was treated as a multinomial with six cells representing the six possible reproductive states (five months' gestation). A hypothesis test was carried out to see whether a cosine curve would fit the birth probabilities. The results offer no support for a seasonal component (F2,9 = 1.84, P = 0.21), whereas results from a direct census do (F3,23 = 87.29, P < 0.01). Some hunting techniques were biased towards killing pregnant females (χ2(1) = 7.2, P < 0.01), thereby spreading reproduction throughout the year (F2,9 = 1.84, P = 0.21). The model remained a powerful predictive tool to be used with mammalian fertility data as long as the data are not biased towards pregnant females.
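
    A minimal harmonic-regression check in the spirit of the cosine-curve hypothesis test described above: a first-harmonic cosine is fitted to monthly birth proportions by least squares. The monthly counts are synthetic, not the Marajó Island data.

```python
# Sketch: least-squares fit of a first-harmonic (cosine) seasonal component
# to monthly birth proportions; the counts below are synthetic.
import numpy as np

births = np.array([10, 12, 9, 11, 14, 15, 13, 12, 10, 9, 11, 12])   # synthetic counts
p = births / births.sum()
m = np.arange(12)

# p_m ~ a0 + a1*cos(2*pi*m/12) + a2*sin(2*pi*m/12)  (phase-free form of a cosine)
X = np.column_stack([np.ones(12),
                     np.cos(2 * np.pi * m / 12),
                     np.sin(2 * np.pi * m / 12)])
coef, _, _, _ = np.linalg.lstsq(X, p, rcond=None)
amplitude = float(np.hypot(coef[1], coef[2]))
print("fitted seasonal amplitude of monthly birth probability:", round(amplitude, 4))
```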

  15. The Probabilities of Orbital-Companion Models for Stellar Radial Velocity Data

    CERN Document Server

    Hou, Fengji; Hogg, David W

    2014-01-01

    The fully marginalized likelihood, or Bayesian evidence, is of great importance in probabilistic data analysis, because it is involved in calculating the posterior probability of a model or re-weighting a mixture of models conditioned on data. It is, however, extremely challenging to compute. This paper presents a geometric-path Monte Carlo method, inspired by multi-canonical Monte Carlo to evaluate the fully marginalized likelihood. We show that the algorithm is very fast and easy to implement and produces a justified uncertainty estimate on the fully marginalized likelihood. The algorithm performs efficiently on a trial problem and multi-companion model fitting for radial velocity data. For the trial problem, the algorithm returns the correct fully marginalized likelihood, and the estimated uncertainty is also consistent with the standard deviation of results from multiple runs. We apply the algorithm to the problem of fitting radial velocity data from HIP 88048 ($\

  16. A multi-state model for wind farms considering operational outage probability

    DEFF Research Database (Denmark)

    Cheng, Lin; Liu, Manjun; Sun, Yuanzhang;

    2013-01-01

    As one of the most important renewable energy resources, wind power has drawn much attention in recent years. The stochastic characteristics of wind speed lead to generation output uncertainties of wind energy conversion system (WECS) and affect power system reliability, especially at high wind power penetration levels. Therefore, a more comprehensive analysis toward WECS as well as an appropriate reliability assessment model are essential for maintaining the reliable operation of power systems. In this paper, the impact of wind turbine outage probability on system reliability is firstly developed by considering the following factors: running time, operating environment, operating conditions, and wind speed fluctuations. A multi-state model for wind farms is also established. Numerical results illustrate that the proposed model can be well applied to power system reliability assessment...

  17. A cellular automata traffic flow model considering the heterogeneity of acceleration and delay probability

    Science.gov (United States)

    Li, Qi-Lang; Wong, S. C.; Min, Jie; Tian, Shuo; Wang, Bing-Hong

    2016-08-01

    This study examines the cellular automata traffic flow model, which considers the heterogeneity of vehicle acceleration and the delay probability of vehicles. Computer simulations are used to identify three typical phases in the model: free-flow, synchronized flow, and wide moving traffic jam. In the synchronized flow region of the fundamental diagram, the low and high velocity vehicles compete with each other and play an important role in the evolution of the system. The analysis shows that there are two types of bistable phases. However, in the original Nagel and Schreckenberg cellular automata traffic model, there are only two kinds of traffic conditions, namely, free-flow and traffic jams. The synchronized flow phase and bistable phase have not been found.

  18. Error Probability Analysis of Free-Space Optical Links with Different Channel Model under Turbulent Condition

    CERN Document Server

    Barua, Bobby; Islam, Md Rezwan

    2012-01-01

    Free space optics (FSO) is a promising solution for the need for very high data rate point-to-point communication. FSO communication technology became popular due to its large bandwidth potential, unlicensed spectrum, excellent security, and quick and inexpensive setup. Unfortunately, atmospheric turbulence-induced fading is one of the main impairments affecting FSO communications. To design a high performance communication link for the atmospheric FSO channel, it is of great importance to characterize the channel with a proper model. In this paper, the modulation format is Q-ary PPM across lasers, with intensity modulation and ideal photodetectors assumed, in order to investigate the most efficient PDF models for FSO communication under turbulent conditions. The performance results are evaluated in terms of the symbol error probability (SEP) for different types of channel models, and the simulation results confirm the analytical findings.

  19. Simultaneous parameter and tolerance optimization of structures via probability-interval mixed reliability model

    DEFF Research Database (Denmark)

    Luo, Yangjun; Wu, Xiaoxiang; Zhou, Mingdong

    2015-01-01

    Both structural sizes and dimensional tolerances strongly influence the manufacturing cost and the functional performance of a practical product. This paper presents an optimization method to simultaneously find the optimal combination of structural sizes and dimensional tolerances. Based on a probability-interval mixed reliability model, the imprecision of design parameters is modeled as interval uncertainties fluctuating within allowable tolerance bounds. The optimization model is defined as to minimize the total manufacturing cost under mixed reliability index constraints, which are further transformed into their equivalent formulations by using the performance measure approach. The optimization problem is then solved with the sequential approximate programming. Meanwhile, a numerically stable algorithm based on the trust region method is proposed to efficiently update the target performance...

  20. THE SURVIVAL PROBABILITY IN FINITE TIME PERIOD IN FULLY DISCRETE RISK MODEL

    Institute of Scientific and Technical Information of China (English)

    Cheng Shixue; Wu Biao

    1999-01-01

    The probabilities of the following events are first discussed in this paper: the insurance company survives to any fixed time k and the surplus at time k equals x≥1. The formulas for calculating such probabilities are deduced through analytical and probabilistic arguments respectively. Finally, other probability laws relating to risk are determined based on the probabilities mentioned above.

  1. Linear Positivity and Virtual Probability

    CERN Document Server

    Hartle, J B

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: 1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and 2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...

  2. A new probability distribution model of turbulent irradiance based on Born perturbation theory

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The subject of the PDF (Probability Density Function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but the theoretical description in the strong and whole turbulence regimes is still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distribution) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which denies the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. So, it is considered that the new model can exactly reflect the Born perturbation theory. Simulated results prove the accuracy of this new model.

  3. Classical signal model reproducing quantum probabilities for single and coincidence detections

    Science.gov (United States)

    Khrennikov, Andrei; Nilsson, Börje; Nordebo, Sven

    2012-05-01

    We present a simple classical (random) signal model reproducing Born's rule. The crucial point of our approach is that the presence of detector's threshold and calibration procedure have to be treated not as simply experimental technicalities, but as the basic counterparts of the theoretical model. We call this approach threshold signal detection model (TSD). The experiment on coincidence detection which was done by Grangier in 1986 [22] played a crucial role in rejection of (semi-)classical field models in favour of quantum mechanics (QM): impossibility to resolve the wave-particle duality in favour of a purely wave model. QM predicts that the relative probability of coincidence detection, the coefficient g(2)(0), is zero (for one photon states), but in (semi-)classical models g(2)(0) >= 1. In TSD the coefficient g(2)(0) decreases as 1/ε_d^2, where ε_d > 0 is the detection threshold. Hence, by increasing this threshold an experimenter can make the coefficient g(2)(0) essentially less than 1. The TSD-prediction can be tested experimentally in new Grangier type experiments presenting a detailed monitoring of dependence of the coefficient g(2)(0) on the detection threshold. Structurally our model has some similarity with the prequantum model of Grossing et al. Subquantum stochasticity is composed of the two counterparts: a stationary process in the space of internal degrees of freedom and the random walk type motion describing the temporal dynamics.

  4. Modeling Multisource-heterogeneous Information Based on Random Set and Fuzzy Set Theory

    Institute of Scientific and Technical Information of China (English)

    WEN Cheng-lin; XU Xiao-bin

    2006-01-01

    This paper presents a new idea, named as modeling multisensor-heterogeneous information, to incorporate the fuzzy logic methodologies with a multisensor-multitarget system under the framework of random set theory. Firstly, based on strong random set and weak random set, the unified form to describe both data (unambiguous information) and fuzzy evidence (uncertain information) is introduced. Secondly, according to signatures of fuzzy evidence, two Bayesian-Markov nonlinear measurement models are proposed to effectively fuse data and fuzzy evidence. Thirdly, by use of "the models-based signature-matching scheme", the operation of the statistics of fuzzy evidence defined as random set can be translated into that of the membership functions of relative point state variables. These works are the basis to construct qualitative measurement models and to fuse data and fuzzy evidence.

  5. Modeling Consideration Sets and Brand Choice Using Artificial Neural Networks

    NARCIS (Netherlands)

    B.L.K. Vroomen (Björn); Ph.H.B.F. Franses (Philip Hans); J.E.M. van Nierop

    2001-01-01

    The concept of consideration sets makes brand choice a two-step process. Households first construct a consideration set which does not necessarily include all available brands, and conditional on this set they make a final choice. In this paper we put forward a parametric econometric model f

  6. Probability Model of Hangzhou Bay Bridge Vehicle Loads Using Weigh-in-Motion Data

    Directory of Open Access Journals (Sweden)

    Dezhang Sun

    2015-01-01

    Full Text Available To study the vehicle load characteristics of bay bridges in China, especially truck loads, we performed a statistical analysis of the vehicle loads on Hangzhou Bay Bridge using more than 3 months of weigh-in-motion data from the site. The results showed that when all the vehicle samples were included in the statistical analysis, the histogram of the vehicles exhibited a multimodal distribution, which could not be fitted successfully by a familiar single probability distribution model. When the truck samples were analyzed, a characteristic multiple-peaked distribution with a main peak was obtained. The probability distribution of all vehicles was fitted using a weighting function with five normal distributions, and the truck loads were modeled by a single normal distribution. The results demonstrated good fits to the histograms. The histograms of different time periods were also analyzed. The results showed that the traffic mainly comprised two-axle small vehicles during the rush hours in the morning and the evening, and the histogram could be fitted approximately using three normal distribution functions. The maximum value distributions of vehicles during the design life of the bay bridge were then predicted by maximum value theory.
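
    The five-component weighting of normal distributions described above can be illustrated with an off-the-shelf Gaussian mixture fit; the synthetic gross-weight sample and its component parameters below are assumptions, not the Hangzhou Bay Bridge WIM records.

```python
# Sketch: five-component Gaussian mixture fitted to synthetic gross vehicle
# weights, mirroring the weighted sum of five normal distributions in the record.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)
weights_kn = np.concatenate([
    rng.normal(20, 4, 6000),     # two-axle light vehicles (kN), synthetic
    rng.normal(80, 15, 2000),    # medium trucks, synthetic
    rng.normal(250, 40, 1200),   # heavy trucks, synthetic
    rng.normal(400, 50, 500),    # fully loaded trucks, synthetic
    rng.normal(550, 60, 300),    # overloaded trucks, synthetic
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=5, random_state=0).fit(weights_kn)
for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"weight={w:.2f}  mean={mu:6.1f} kN  std={np.sqrt(var):5.1f} kN")
```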

  7. Probability distribution of financial returns in a model of multiplicative Brownian motion with stochastic diffusion coefficient

    Science.gov (United States)

    Silva, Antonio

    2005-03-01

    It is well-known that the mathematical theory of Brownian motion was first developed in the Ph.D. thesis of Louis Bachelier for the French stock market before Einstein [1]. In Ref. [2] we studied the so-called Heston model, where the stock-price dynamics is governed by multiplicative Brownian motion with stochastic diffusion coefficient. We solved the corresponding Fokker-Planck equation exactly and found an analytic formula for the time-dependent probability distribution of stock price changes (returns). The formula interpolates between the exponential (tent-shaped) distribution for short time lags and the Gaussian (parabolic) distribution for long time lags. The theoretical formula agrees very well with the actual stock-market data ranging from the Dow-Jones index [2] to individual companies [3], such as Microsoft, Intel, etc. [1] Louis Bachelier, "Théorie de la spéculation," Annales Scientifiques de l'École Normale Supérieure, III-17:21-86 (1900). [2] A. A. Dragulescu and V. M. Yakovenko, "Probability distribution of returns in the Heston model with stochastic volatility," Quantitative Finance 2, 443-453 (2002); Erratum 3, C15 (2003). [cond-mat/0203046] [3] A. C. Silva, R. E. Prange, and V. M. Yakovenko, "Exponential distribution of financial returns at mesoscopic time lags: a new stylized fact," Physica A 344, 227-235 (2004). [cond-mat/0401225]

  8. Probability modeling of the number of positive cores in a prostate cancer biopsy session, with applications.

    Science.gov (United States)

    Serfling, Robert; Ogola, Gerald

    2016-02-10

    Among men, prostate cancer (CaP) is the most common newly diagnosed cancer and the second leading cause of death from cancer. A major issue of very large scale is avoiding both over-treatment and under-treatment of CaP cases. The central challenge is deciding clinical significance or insignificance when the CaP biopsy results are positive but only marginally so. A related concern is deciding how to increase the number of biopsy cores for larger prostates. As a foundation for improved choice of number of cores and improved interpretation of biopsy results, we develop a probability model for the number of positive cores found in a biopsy, given the total number of cores, the volumes of the tumor nodules, and - very importantly - the prostate volume. Also, three applications are carried out: guidelines for the number of cores as a function of prostate volume, decision rules for insignificant versus significant CaP using number of positive cores, and, using prior distributions on total tumor size, Bayesian posterior probabilities for insignificant CaP and posterior median CaP. The model-based results have generality of application, take prostate volume into account, and provide attractive tradeoffs of specificity versus sensitivity. Copyright © 2015 John Wiley & Sons, Ltd.

  9. Local probability model for the Bell correlation based on the statistics of chaotic light and non-commutative processes

    CERN Document Server

    Sica, Louis

    2011-01-01

    As discussed below, Bell's inequalities and experimental results rule out commutative hidden variable models as a basis for Bell correlations, but not necessarily non-commutative probability models. A local probability model is constructed for Bell correlations based on non-commutative operations involving polarizers. As in the entanglement model, the Bell correlation is obtained from a probability calculus without explicit use of deterministic hidden variables. The probability calculus used is associated with chaotic light. Joint wave intensity correlations at spatially separated polarization analyzers are computed using common information originating at the source. When interpreted as photon count rates, these yield quantum mechanical joint probabilities after the contribution of indeterminate numbers of photon pairs greater than one is subtracted out. The formalism appears to give a local account of Bell correlations.

  10. Classical signal model reproducing quantum probabilities for single and coincidence detections

    CERN Document Server

    Khrennikov, Andrei; Nordebo, Sven

    2011-01-01

    We present a simple classical (random) signal model reproducing Born's rule. The crucial point of our approach is that the presence of detector's threshold and calibration procedure have to be treated not as simply experimental technicalities, but as the basic counterparts of the theoretical model. We call this approach threshold signal detection model (TSD). The experiment on coincidence detection which was done by Grangier in 1986 \cite{Grangier} played a crucial role in rejection of (semi-)classical field models in favor of quantum mechanics (QM): impossibility to resolve the wave-particle duality in favor of a purely wave model. QM predicts that the relative probability of coincidence detection, the coefficient $g^{(2)}(0),$ is zero (for one photon states), but in (semi-)classical models $g^{(2)}(0)\geq 1.$ In TSD the coefficient $g^{(2)}(0)$ decreases as $1/{\cal E}_d^2,$ where ${\cal E}_d>0$ is the detection threshold. Hence, by increasing this threshold an experimenter can make the coefficient $g^{(2)}...

  11. The gap probability model for canopy thermal infrared emission with non-scattering approximation

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    To describe canopy emitting thermal radiance precisely and physically is one of the key researches in retrieving land surface temperature (LST) over vegetation-covered regions by remote sensing technology. This work is aimed at establishing gap probability models to describe the thermal emission characteristics in continuous plant, including the basic model and the sunlit model. They are suitable respectively in the nighttime and in the daytime. The sunlit model is the basic model plus a sunlit correcting item which takes the hot spot effect into account. The researches on the directional distribution of radiance and its relationship to canopy structural parameters, such as the leaf area index (LAI) and leaf angle distribution (LAD), were focused. The characteristics of directional radiance caused by temperature differences among components in canopy, such as those between leaf and soil, and between sunlit leaf or soil and shadowed leaf or soil, were analyzed. A well fitting between experimental data and the theoretical calculations shows that the models are able to illustrate the canopy thermal emission generally.

  12. Modeling Portfolio Optimization Problem by Probability-Credibility Equilibrium Risk Criterion

    Directory of Open Access Journals (Sweden)

    Ye Wang

    2016-01-01

    Full Text Available This paper studies the portfolio selection problem in hybrid uncertain decision systems. Firstly the return rates are characterized by random fuzzy variables. The objective is to maximize the total expected return rate. For a random fuzzy variable, this paper defines a new equilibrium risk value (ERV) with credibility level beta and probability level alpha. As a result, our portfolio problem is built as a new random fuzzy expected value (EV) model subject to ERV constraint, which is referred to as EV-ERV model. Under mild assumptions, the proposed EV-ERV model is a convex programming problem. Furthermore, when the possibility distributions are triangular, trapezoidal, and normal, the EV-ERV model can be transformed into its equivalent deterministic convex programming models, which can be solved by general purpose optimization software. To demonstrate the effectiveness of the proposed equilibrium optimization method, some numerical experiments are conducted. The computational results and comparison study demonstrate that the developed equilibrium optimization method is effective to model portfolio selection optimization problem with twofold uncertain return rates.

  13. Consideration sets, intentions and the inclusion of "don't know" in a two-stage model for voter choice

    NARCIS (Netherlands)

    Paap, R; van Nierop, E; van Heerde, HJ; Wedel, M; Franses, PH; Alsem, KJ

    2005-01-01

    We present a statistical model for voter choice that incorporates a consideration set stage and final vote intention stage. The first stage involves a multivariate probit (MVP) model to describe the probabilities that a candidate or a party gets considered. The second stage of the model is a

  14. A Generalized Rough Set Modeling Method for Welding Process

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Modeling is essential, significant and difficult for the quality and shaping control of arc welding process. A generalized rough set based modeling method was brought forward and a dynamic predictive model for pulsed gas tungsten arc welding (GTAW) was obtained by this modeling method. The results show that this modeling method can well acquire knowledge in welding and satisfy the real life application. In addition, the results of comparison between classic rough set model and back-propagation neural network model respectively are also satisfying.

  15. Building self-consistent, short-term earthquake probability (STEP) models: improved strategies and calibration procedures

    Directory of Open Access Journals (Sweden)

    Damiano Monelli

    2010-11-01

    Full Text Available We present here two self-consistent implementations of a short-term earthquake probability (STEP) model that produces daily seismicity forecasts for the area of the Italian national seismic network. Both implementations combine a time-varying and a time-invariant contribution, for which we assume that the instrumental Italian earthquake catalog provides the best information. For the time-invariant contribution, the catalog is declustered using the clustering technique of the STEP model; the smoothed seismicity model is generated from the declustered catalog. The time-varying contribution is what distinguishes the two implementations: (1) for one implementation (STEP-LG), the original model parameterization and estimation is used; (2) for the other (STEP-NG), the mean abundance method is used to estimate aftershock productivity. In the STEP-NG implementation, earthquakes with magnitude up to ML = 6.2 are expected to be less productive compared to the STEP-LG implementation, whereas larger earthquakes are expected to be more productive. We have retrospectively tested the performance of these two implementations and applied likelihood tests to evaluate their consistencies with observed earthquakes. Both of these implementations were consistent with the observed earthquake data in space: STEP-NG performed better than STEP-LG in terms of forecast rates. More generally, we found that testing earthquake forecasts issued at regular intervals does not test the full power of clustering models, and future experiments should allow for more frequent forecasts starting at the times of triggering events.

  16. A low false negative filter for detecting rare bird species from short video segments using a probable observation data set-based EKF method.

    Science.gov (United States)

    Song, Dezhen; Xu, Yiliang

    2010-09-01

    We report a new filter to assist the search for rare bird species. Since a rare bird only appears in front of a camera with very low occurrence (e.g., less than ten times per year) for very short duration (e.g., less than a fraction of a second), our algorithm must have a very low false negative rate. We verify the bird body axis information with the known bird flying dynamics from the short video segment. Since a regular extended Kalman filter (EKF) cannot converge due to high measurement error and limited data, we develop a novel probable observation data set (PODS)-based EKF method. The new PODS-EKF searches the measurement error range for all probable observation data that ensures the convergence of the corresponding EKF in short time frame. The algorithm has been extensively tested using both simulated inputs and real video data of four representative bird species. In the physical experiments, our algorithm has been tested on rock pigeons and red-tailed hawks with 119 motion sequences. The area under the ROC curve is 95.0%. During the one-year search of ivory-billed woodpeckers, the system reduces the raw video data of 29.41 TB to only 146.7 MB (reduction rate 99.9995%).

  17. An extended macro traffic flow model accounting for multiple optimal velocity functions with different probabilities

    Science.gov (United States)

    Cheng, Rongjun; Ge, Hongxia; Wang, Jufeng

    2017-08-01

    Because the maximum velocity and safe headway distance of different vehicles are not exactly the same, an extended macro model of traffic flow that considers multiple optimal velocity functions with probabilities is proposed in this paper. By means of linear stability theory, the new model's linear stability condition considering multiple-probability optimal velocities is obtained. The KdV-Burgers equation is derived through nonlinear analysis to describe the propagating behavior of the traffic density wave near the neutral stability line. Numerical simulations of the influence of multiple maximum velocities and multiple safety distances on the model's stability and traffic capacity are carried out. The cases of two different maximum speeds with the same safe headway distance, two different safe headway distances with the same maximum speed, and two different maximum velocities combined with two different time-gaps are all explored by numerical simulation. The first case demonstrates that when the proportion of vehicles with a larger vmax increases, the traffic tends to become unstable, which also means that sudden acceleration and braking are not conducive to traffic stability and more easily result in stop-and-go phenomena. The second case shows that when the proportion of vehicles with greater safety spacing increases, the traffic tends to be unstable, which also means that overly cautious assumptions or weak driving skill are not conducive to traffic stability. The last case indicates that an increase of the maximum speed is not conducive to traffic stability, while a reduction of the safe headway distance is conducive to traffic stability. Numerical simulation shows that mixed driving and traffic diversion have no effect on the traffic capacity when the traffic density is low or heavy. Numerical results also show that mixed driving should be chosen to increase the traffic capacity when the traffic density is lower, while traffic diversion should be chosen to increase the traffic capacity when

  18. Modeling and mapping the probability of occurrence of invasive wild pigs across the contiguous United States.

    Directory of Open Access Journals (Sweden)

    Meredith L McClure

    Full Text Available Wild pigs (Sus scrofa), also known as wild swine, feral pigs, or feral hogs, are one of the most widespread and successful invasive species around the world. Wild pigs have been linked to extensive and costly agricultural damage and present a serious threat to plant and animal communities due to their rooting behavior and omnivorous diet. We modeled the current distribution of wild pigs in the United States to better understand the physiological and ecological factors that may determine their invasive potential and to guide future study and eradication efforts. Using national-scale wild pig occurrence data reported between 1982 and 2012 by wildlife management professionals, we estimated the probability of wild pig occurrence across the United States using a logistic discrimination function and environmental covariates hypothesized to influence the distribution of the species. Our results suggest the distribution of wild pigs in the U.S. was most strongly limited by cold temperatures and availability of water, and that they were most likely to occur where potential home ranges had higher habitat heterogeneity, providing access to multiple key resources including water, forage, and cover. High probability of occurrence was also associated with frequent high temperatures, up to a high threshold. However, this pattern is driven by pigs' historic distribution in warm climates of the southern U.S. Further study of pigs' ability to persist in cold northern climates is needed to better understand whether low temperatures actually limit their distribution. Our model highlights areas at risk of invasion as those with habitat conditions similar to those found in pigs' current range that are also near current populations. This study provides a macro-scale approach to generalist species distribution modeling that is applicable to other generalist and invasive species.

  19. PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Bovy, Jo; Hogg, David W.; Weaver, Benjamin A. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Myers, Adam D. [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States); Hennawi, Joseph F. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); McMahon, Richard G. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Schiminovich, David [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Sheldon, Erin S. [Brookhaven National Laboratory, Upton, NY 11973 (United States); Brinkmann, Jon [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States); Schneider, Donald P., E-mail: jo.bovy@nyu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States)

    2012-04-10

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques (which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data) and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS, 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
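
    As a rough illustration of the density-based classification rule, the sketch below fits ordinary Gaussian mixtures to synthetic quasar and star colors and converts the class-conditional densities into a quasar probability with a prior. This is only a simplified stand-in: the actual method uses extreme deconvolution, which additionally models per-object flux uncertainties and the redshift dimension.

```python
# Simplified stand-in for density-based quasar classification: Gaussian
# mixtures fit to synthetic colour data, combined with a class prior.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
quasar_colors = rng.normal([0.2, 0.1], 0.3, size=(5000, 2))    # synthetic u-g, g-r
star_colors = rng.normal([1.0, 0.5], 0.25, size=(20000, 2))

gmm_q = GaussianMixture(n_components=5, random_state=0).fit(quasar_colors)
gmm_s = GaussianMixture(n_components=5, random_state=0).fit(star_colors)

def quasar_probability(colors, prior_q=0.2):
    """P(quasar | colors) from class-conditional densities and an assumed prior."""
    log_fq = gmm_q.score_samples(colors) + np.log(prior_q)
    log_fs = gmm_s.score_samples(colors) + np.log(1 - prior_q)
    return 1.0 / (1.0 + np.exp(log_fs - log_fq))

print(quasar_probability(np.array([[0.1, 0.0], [1.1, 0.6]])).round(3))
```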

  20. Photometric redshifts and quasar probabilities from a single, data-driven generative model

    Energy Technology Data Exchange (ETDEWEB)

    Bovy, Jo [New York Univ. (NYU), NY (United States); Myers, Adam D. [Univ. of Wyoming, Laramie, WY (United States); Max Planck Inst. for Medical Research, Heidelberg (Germany); Hennawi, Joseph F. [Max Planck Inst. for Medical Research, Heidelberg (Germany); Hogg, David W. [Max Planck Inst. for Medical Research, Heidelberg (Germany); New York Univ. (NYU), NY (United States); McMahon, Richard G. [Univ. of Cambridge (United Kingdom); Schiminovich, David [Columbia Univ., New York, NY (United States); Sheldon, Erin S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brinkmann, Jon [Apache Point Observatory and New Mexico State Univ., Sunspot, NM (United States); Schneider, Donald P. [Pennsylvania State Univ., University Park, PA (United States); Weaver, Benjamin A. [New York Univ. (NYU), NY (United States)

    2012-03-20

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.

  1. Ruin probability in a risk model with variable premium intensity and risky investments

    Directory of Open Access Journals (Sweden)

    Yuliya Mishura

    2015-01-01

    the infinite-horizon ruin probability. To this end, we allow the surplus process to explode and investigate the question concerning the probability of explosion of the surplus process between claim arrivals.

  2. Relativistic calculations of charge transfer probabilities in U92+ - U91+(1s) collisions using the basis set of cubic Hermite splines

    CERN Document Server

    Maltsev, I A; Tupitsyn, I I; Shabaev, V M; Kozhedub, Y S; Plunien, G; Stoehlker, Th

    2013-01-01

    A new approach for solving the time-dependent two-center Dirac equation is presented. The method is based on using the finite basis set of cubic Hermite splines on a two-dimensional lattice. The Dirac equation is treated in the rotating reference frame. The collision of U92+ (as a projectile) and U91+ (as a target) is considered at energy E_lab=6 MeV/u. The charge transfer probabilities are calculated for different values of the impact parameter. The obtained results are compared with the previous calculations [I. I. Tupitsyn et al., Phys. Rev. A 82, 042701 (2010)], where a method based on atomic-like Dirac-Sturm orbitals was employed. This work can provide a new tool for the investigation of quantum electrodynamics effects in heavy-ion collisions near the supercritical regime.

  3. Compound nucleus formation probability PCN defined within the dynamical cluster-decay model

    Science.gov (United States)

    Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.

    2015-01-01

    Within the dynamical cluster-decay model (DCM), the compound nucleus fusion/formation probability PCN is defined for the first time, and its variation with CN excitation energy E* and fissility parameter χ is studied. In the DCM, the (total) fusion cross section σfusion is the sum of the compound nucleus (CN) and noncompound nucleus (nCN) decay processes, each calculated as a dynamical fragmentation process. The CN cross section σCN is constituted of the evaporation residues (ER) and fusion-fission (ff), including the intermediate mass fragments (IMFs), each calculated for all contributing decay fragments (A1, A2) in terms of their formation and barrier penetration probabilities P0 and P. The nCN cross section σnCN is determined as the quasi-fission (qf) process, where P0=1 and P is calculated for the entrance channel nuclei. The calculations are presented for six different target-projectile combinations of CN mass A~100 to superheavy, at various center-of-mass energies, with the effects of deformations and orientations of the nuclei included. The interesting results are that PCN=1 for complete fusion, but PCN <1 or ≪1 due to the nCN contribution, depending strongly on both E* and χ.

  4. Compound nucleus formation probability PCN defined within the dynamical cluster-decay model

    Directory of Open Access Journals (Sweden)

    Chopra Sahila

    2015-01-01

    Full Text Available Within the dynamical cluster-decay model (DCM), the compound nucleus fusion/formation probability PCN is defined for the first time, and its variation with CN excitation energy E* and fissility parameter χ is studied. In the DCM, the (total) fusion cross section σfusion is the sum of the compound nucleus (CN) and noncompound nucleus (nCN) decay processes, each calculated as a dynamical fragmentation process. The CN cross section σCN is constituted of the evaporation residues (ER) and fusion-fission (ff), including the intermediate mass fragments (IMFs), each calculated for all contributing decay fragments (A1, A2) in terms of their formation and barrier penetration probabilities P0 and P. The nCN cross section σnCN is determined as the quasi-fission (qf) process, where P0=1 and P is calculated for the entrance channel nuclei. The calculations are presented for six different target-projectile combinations of CN mass A~100 to superheavy, at various center-of-mass energies, with the effects of deformations and orientations of the nuclei included. The interesting results are that PCN=1 for complete fusion, but PCN <1 or ≪1 due to the nCN contribution, depending strongly on both E* and χ.

  5. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
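
    The basic observed statistic is simple enough to show directly: the sketch below (with illustrative counts, not data from the study) computes the POI as the proportion of replicates identified, together with an exact Clopper-Pearson confidence interval.

```python
# Minimal sketch of the probability-of-identification (POI) statistic with a
# Clopper-Pearson confidence interval.  The counts are illustrative only.
from scipy.stats import beta

def poi_with_ci(identified, replicates, alpha=0.05):
    poi = identified / replicates
    lower = beta.ppf(alpha / 2, identified, replicates - identified + 1) if identified > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, identified + 1, replicates - identified) if identified < replicates else 1.0
    return poi, (lower, upper)

# e.g. 27 of 30 replicates of a target material identified
print(poi_with_ci(27, 30))
```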

  6. The IIASA set of energy models: Its design and application

    Science.gov (United States)

    Basile, P. S.; Agnew, M.; Holzl, A.; Kononov, Y.; Papin, A.; Rogner, H. H.; Schrattenholzer, L.

    1980-12-01

    The models studied include an accounting framework type energy demand model, a dynamic linear programming energy supply and conversion system model, an input-output model, a macroeconomic model, and an oil trade gaming model. They are incorporated in an integrated set for long-term, global analyses. This set makes use of a highly iterative process for energy scenario projections and analyses. Each model is quite simple and straightforward in structure; a great deal of human judgement is necessary in applying the set. The models are applied to study two alternative energy scenarios for a coming fifty year period. Examples are presented revealing the wealth of information that can be obtained from multimodel techniques. Details are given for several models (equations employed, assumptions made, data used).

  7. Enterprise Projects Set Risk Element Transmission Chaotic Genetic Model

    Directory of Open Access Journals (Sweden)

    Cunbin Li

    2012-08-01

    Full Text Available In order to study the projects set risk transfer process and improve risk management efficiency in project management, this paper combines chaos theory and genetic algorithms and puts forward an enterprise projects set risk element transmission chaotic genetic model. By mixing the logistic chaos map and the Chebyshev chaos map, a hybrid chaotic mapping system is constructed. The steps of adopting the hybrid chaos mapping for genetic operations include projects set initialization, calculation of fitness, selection, crossover and mutation operators, fitness adjustment, and condition judgment. The results show that the model can simulate the enterprise projects set risk transmission process very well, and it also provides a basis for enterprise managers to make decisions.
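
    A small sketch of a hybrid chaotic sequence of this kind (mixing the logistic and Chebyshev maps) is given below; the specific mixing rule, alternating the two maps and rescaling to (0, 1), is an assumption for illustration rather than the paper's exact construction.

```python
# Illustrative hybrid chaotic sequence: alternate the logistic map and the
# Chebyshev map, rescaling the Chebyshev output back into (0, 1).
import math

def logistic_map(x, r=4.0):
    return r * x * (1.0 - x)           # chaotic for r = 4 on (0, 1)

def chebyshev_map(x, k=4):
    return math.cos(k * math.acos(x))  # chaotic on [-1, 1]

def hybrid_chaos(n, x0=0.37):
    seq, x = [], x0
    for i in range(n):
        if i % 2 == 0:
            x = logistic_map(x)                              # stays in (0, 1)
        else:
            x = (chebyshev_map(2.0 * x - 1.0) + 1.0) / 2.0   # rescaled to (0, 1)
        seq.append(x)
    return seq

print(hybrid_chaos(5))
```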

  8. Optimal models with maximizing probability of first achieving target value in the preceding stages

    Institute of Scientific and Technical Information of China (English)

    林元烈; 伍从斌; 康波大

    2003-01-01

    Decision makers often face the need of a performance guarantee with some sufficiently high probability. Such problems can be modelled using a discrete time Markov decision process (MDP) with a probability criterion for first achieving a target value. The objective is to find a policy that maximizes the probability of the total discounted reward exceeding a target value in the preceding stages. We show that our formulation cannot be described by former models with standard criteria. We provide the properties of the objective functions, optimal value functions and optimal policies. An algorithm for computing the optimal policies for the finite horizon case is given. In this stochastic stopping model, we prove that there exists an optimal deterministic and stationary policy and that the optimality equation has a unique solution. Using perturbation analysis, we approximate general models and prove the existence of an ε-optimal policy for finite state spaces. We give an example for the reliability of satellite systems using the above theory. Finally, we extend these results to more general cases.
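
    As a toy illustration of the probability criterion itself (not the paper's optimization algorithm), the sketch below uses Monte Carlo simulation to estimate the probability that the total discounted reward of a fixed stationary policy exceeds a target value on a small, made-up MDP.

```python
# Toy illustration: Monte Carlo estimate of P(total discounted reward over N
# stages exceeds a target) for a fixed stationary policy on a made-up MDP.
import numpy as np

rng = np.random.default_rng(2)
P = np.array([[[0.8, 0.2], [0.3, 0.7]],      # transition probs under action 0
              [[0.5, 0.5], [0.1, 0.9]]])     # and under action 1
R = np.array([[1.0, 0.0], [2.0, 0.5]])       # reward R[action, state]
policy = [0, 1]                              # fixed action in each state

def prob_exceeds_target(target, horizon=20, beta=0.9, n_sim=20000, s0=0):
    hits = 0
    for _ in range(n_sim):
        s, total, disc = s0, 0.0, 1.0
        for _ in range(horizon):
            a = policy[s]
            total += disc * R[a, s]
            disc *= beta
            s = rng.choice(2, p=P[a, s])
        hits += total > target
    return hits / n_sim

print(prob_exceeds_target(target=8.0))
```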

  9. Global climate change model natural climate variation: Paleoclimate data base, probabilities and astronomic predictors

    Energy Technology Data Exchange (ETDEWEB)

    Kukla, G.; Gavin, J. [Columbia Univ., Palisades, NY (United States). Lamont-Doherty Geological Observatory

    1994-05-01

    This report was prepared at the Lamont-Doherty Geological Observatory of Columbia University at Palisades, New York, under subcontract to Pacific Northwest Laboratory; it is part of a larger project of global climate studies which supports site characterization work required for the selection of a potential high-level nuclear waste repository and forms part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work under the PASS Program is currently focusing on the proposed site at Yucca Mountain, Nevada, and is under the overall direction of the Yucca Mountain Project Office, US Department of Energy, Las Vegas, Nevada. The final results of the PNL project will provide input to global atmospheric models designed to test specific climate scenarios which will be used in the site-specific modeling work of others. The primary purpose of the data bases compiled and of the astronomic predictive models is to aid in the estimation of the probabilities of future climate states. The results will be used by two other teams working on the global climate study under contract to PNL; they are located at the University of Maine in Orono, Maine, and the Applied Research Corporation in College Station, Texas. This report presents the results of the third year's work on the global climate change models and the data bases describing past climates.

  10. An extended SMLD approach for presumed probability density function in flamelet combustion model

    CERN Document Server

    Coclite, Alessandro; De Palma, Pietro; Cutrone, Luigi

    2013-01-01

    This paper provides an extension of the standard flamelet progress variable (FPV) approach for turbulent combustion, applying the statistically most likely distribution (SMLD) framework to the joint PDF of the mixture fraction, Z, and the progress variable, C. In this way one does not need to make any assumption about the statistical correlation between Z and C and about the behaviour of the mixture fraction, as required in previous FPV models. In fact, for state-of-the-art models, with the assumption of very fast chemistry, Z is widely accepted to behave as a passive scalar characterized by a $\beta$-distribution function. Instead, the model proposed here evaluates the most probable joint distribution of Z and C without any assumption on their behaviour and provides an effective tool to verify the adequacy of widely used hypotheses, such as their statistical independence. The model is validated versus three well-known test cases, namely, the Sandia flames. The results are compared with those obtained by ...

  11. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    Science.gov (United States)

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically-motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point process model can be described by the steady rise of a state variable from the ground state to failure threshold as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is ~2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
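
    Since the BPT distribution coincides with the inverse Gaussian distribution with mean μ and coefficient of variation α, its hazard and conditional event probabilities can be evaluated directly; the sketch below does so with scipy, using illustrative values of μ and α rather than the Parkfield parameters.

```python
# Sketch: BPT = inverse Gaussian with mean mu and coefficient of variation
# alpha; in scipy this is invgauss(shape=alpha**2, scale=mu/alpha**2).
import numpy as np
from scipy.stats import invgauss

mu, alpha = 25.0, 0.5            # illustrative mean recurrence time (yr), aperiodicity

dist = invgauss(alpha**2, scale=mu / alpha**2)
t = np.array([10.0, 25.0, 50.0, 100.0])
hazard = dist.pdf(t) / dist.sf(t)        # instantaneous failure rate of survivors
print(hazard.round(4), 2.0 / mu)         # hazard approaches ~2/mu for alpha = 0.5

# conditional probability of an event in the next 30 yr, given 20 yr elapsed
elapsed, window = 20.0, 30.0
p_cond = (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)
print(round(p_cond, 3))
```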

  12. A software for the estimation of binding parameters of biochemical equilibria based on statistical probability model.

    Science.gov (United States)

    Fisicaro, E; Braibanti, A; Sambasiva Rao, R; Compari, C; Ghiozzi, A; Nageswara Rao, G

    1998-04-01

    An algorithm is proposed for the estimation of binding parameters for the interaction of biologically important macromolecules with smaller ones from electrometric titration data. The mathematical model is based on the representation of equilibria in terms of probability concepts of statistical molecular thermodynamics. The refinement of equilibrium concentrations of the components and estimation of binding parameters (log site constant and cooperativity factor) is performed using singular value decomposition, a chemometric technique which overcomes the general obstacles due to near singularity. The present software is validated with a number of biochemical systems of varying number of sites and cooperativity factors. The effect of random errors of realistic magnitude in experimental data is studied using the simulated primary data for some typical systems. The safe area within which approximate binding parameters ensure convergence has been reported for the non-self starting optimization algorithms.

  13. Analytical computation of the magnetization probability density function for the harmonic 2D XY model

    CERN Document Server

    Palma, G

    2009-01-01

    The probability density function (PDF) of some global average quantity plays a fundamental role in critical and highly correlated systems. We explicitly compute this quantity as a function of the magnetization for the two-dimensional XY model in its harmonic approximation. Numerical simulations and perturbative results have shown a Gumbel-like shape of the PDF, in spite of the fact that the average magnetization is not an extreme variable. Our analytical result allows one to test both perturbative analytical expansions and numerical computations performed previously. Perfect agreement is found for the first moments of the PDF. Also, for large volumes and in the high-temperature limit, the distribution becomes Gaussian, as it should be. In the low-temperature regime its numerical evaluation is compatible with a Gumbel distribution.

  14. Weibull Probability Model for Fracture Strength of Aluminium (1101)-Alumina Particle Reinforced Metal Matrix Composite

    Institute of Scientific and Technical Information of China (English)

    A.Suresh Babu; V.Jayabalan

    2009-01-01

    In recent times, conventional materials have been replaced by metal matrix composites (MMCs) due to their high specific strength and modulus. Strength reliability, one of the key factors restricting wider use of composite materials in various applications, is commonly characterized by the Weibull strength distribution function. In the present work, statistical analysis of the strength data of 15% volume alumina particle (mean size 15 μm) reinforced aluminum alloy (1101 grade alloy) fabricated by the stir casting method was carried out using the Weibull probability model. Twelve tension tests were performed according to ASTM B577 standards and, from the test data, the corresponding Weibull distribution was obtained. Finally, the reliability of the composite behavior in terms of its fracture strength was presented to ensure the reliability of composites for suitable applications. An important implication of the present study is that the Weibull distribution describes the experimentally measured strength data more appropriately.
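
    A sketch of the corresponding two-parameter Weibull fit is shown below, using synthetic strength values standing in for the twelve measured strengths (units assumed to be MPa).

```python
# Illustrative two-parameter Weibull fit to fracture-strength data
# (synthetic values; MPa assumed).
import numpy as np
from scipy.stats import weibull_min

strengths = np.array([182., 195., 201., 188., 176., 210.,
                      199., 185., 192., 205., 179., 197.])

# location fixed at zero -> two-parameter Weibull
shape, loc, scale = weibull_min.fit(strengths, floc=0)
print(f"Weibull modulus (shape) = {shape:.1f}, characteristic strength = {scale:.1f}")

# reliability (survival probability) at an assumed design stress of 170 MPa
print(weibull_min.sf(170.0, shape, loc=0, scale=scale))
```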

  15. Probability modeling for robustness of multivariate LQG designing based on ship lateral motion

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The robustness of LQG design for the lateral motion of a ship is discussed for the case in which the ship's hydrodynamic parameters fluctuate randomly around their nominal values according to a proportional distribution. For a given ship state at a speed of 18 kn and a course of 45° under sea state 5, with the hydrodynamic parameters fluctuating randomly within ranges of ±10%, ±20%, and ±30%, the robustness of the multivariate LQG design is analyzed by applying probability modeling of the relative control effect. The simulation results show that when the hydrodynamic parameters fluctuate, the relative control effect of the LQG design follows a normal distribution, and its mean value shows no remarkable change compared with the case without parameter perturbation.

  16. Improving normal tissue complication probability models: the need to adopt a "data-pooling" culture.

    Science.gov (United States)

    Deasy, Joseph O; Bentzen, Søren M; Jackson, Andrew; Ten Haken, Randall K; Yorke, Ellen D; Constine, Louis S; Sharma, Ashish; Marks, Lawrence B

    2010-03-01

    Clinical studies of the dependence of normal tissue response on dose-volume factors are often confusingly inconsistent, as the QUANTEC reviews demonstrate. A key opportunity to accelerate progress is to begin storing high-quality datasets in repositories. Using available technology, multiple repositories could be conveniently queried, without divulging protected health information, to identify relevant sources of data for further analysis. After obtaining institutional approvals, data could then be pooled, greatly enhancing the capability to construct predictive models that are more widely applicable and better powered to accurately identify key predictive factors (whether dosimetric, image-based, clinical, socioeconomic, or biological). Data pooling has already been carried out effectively in a few normal tissue complication probability studies and should become a common strategy.

  17. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on the statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2), and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in the search for the best-fit distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict among criterion results for selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of air pollutant data in Kuala Lumpur.
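
    The distribution-selection step can be sketched as follows: fit the four candidate distributions to a pollutant series and rank them by AIC and the Kolmogorov-Smirnov statistic. The data are synthetic and only two criteria are used here, whereas the paper combines five criteria through a weight-of-ranks scheme.

```python
# Sketch of candidate-distribution fitting and ranking (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
api = rng.gamma(shape=4.0, scale=15.0, size=365)     # synthetic daily sub-index

candidates = {
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(api)
    loglik = np.sum(dist.logpdf(api, *params))
    aic = 2 * len(params) - 2 * loglik
    ks = stats.kstest(api, dist.cdf, args=params).statistic
    results[name] = (aic, ks)

for name, (aic, ks) in sorted(results.items(), key=lambda kv: kv[1][0]):
    print(f"{name:12s}  AIC={aic:8.1f}  KS={ks:.3f}")
```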

  18. IMPROVING NORMAL TISSUE COMPLICATION PROBABILITY MODELS: THE NEED TO ADOPT A “DATA-POOLING” CULTURE

    Science.gov (United States)

    Deasy, Joseph O.; Bentzen, Søren M.; Jackson, Andrew; Ten Haken, Randall K.; Yorke, Ellen D.; Constine, Louis S.; Sharma, Ashish; Marks, Lawrence B.

    2010-01-01

    Clinical studies of the dependence of normal tissue response on dose-volume factors are often confusingly inconsistent, as the QUANTEC reviews demonstrate. A key opportunity to accelerate progress is to begin storing high-quality datasets in repositories. Using available technology, multiple repositories could be conveniently queried, without divulging protected health information, to identify relevant sources of data for further analysis. After obtaining institutional approvals, data could then be pooled, greatly enhancing the capability to construct predictive models that are more widely applicable and better powered to accurately identify key predictive factors (whether dosimetric, image-based, clinical, socioeconomic, or biological). Data pooling has already been carried out effectively in a few normal tissue complication probability studies and should become a common strategy. PMID:20171511

  19. A generative probability model of joint label fusion for multi-atlas based brain segmentation.

    Science.gov (United States)

    Wu, Guorong; Wang, Qian; Zhang, Daoqiang; Nie, Feiping; Huang, Heng; Shen, Dinggang

    2014-08-01

    Automated labeling of anatomical structures in medical images is very important in many neuroscience studies. Recently, patch-based labeling has been widely investigated to alleviate the possible misalignment when registering atlases to the target image. However, the weights used for label fusion from the registered atlases are generally computed independently and thus lack the capability of preventing ambiguous atlas patches from contributing to the label fusion. More critically, these weights are often calculated based only on simple patch similarity, thus not necessarily providing an optimal solution for label fusion. To address these limitations, we propose a generative probability model to describe the procedure of label fusion in a multi-atlas scenario, with the goal of labeling each point in the target image by the best representative atlas patches that also have the largest unanimity in labeling the underlying point correctly. Specifically, a sparsity constraint is imposed upon the label fusion weights in order to select a small number of atlas patches that best represent the underlying target patch, thus reducing the risk of including misleading atlas patches. The labeling unanimity among atlas patches is achieved by exploring their dependencies, where we model these dependencies as the joint probability of each pair of atlas patches in correctly predicting the labels, by analyzing the correlation of their morphological error patterns and also the labeling consensus among atlases. The patch dependencies are further recursively updated based on the latest labeling results to correct possible labeling errors, which falls within the Expectation-Maximization (EM) framework. To demonstrate the labeling performance, we have comprehensively evaluated our patch-based labeling method on whole brain parcellation and hippocampus segmentation. Promising labeling results have been achieved in comparison to conventional patch-based labeling methods.

  20. A bottleneck model of set-specific capture.

    Science.gov (United States)

    Moore, Katherine Sledge; Weissman, Daniel H

    2014-01-01

    Set-specific contingent attentional capture is a particularly strong form of capture that occurs when multiple attentional sets guide visual search (e.g., "search for green letters" and "search for orange letters"). In this type of capture, a potential target that matches one attentional set (e.g. a green stimulus) impairs the ability to identify a temporally proximal target that matches another attentional set (e.g. an orange stimulus). In the present study, we investigated whether set-specific capture stems from a bottleneck in working memory or from a depletion of limited resources that are distributed across multiple attentional sets. In each trial, participants searched a rapid serial visual presentation (RSVP) stream for up to three target letters (T1-T3) that could appear in any of three target colors (orange, green, or lavender). The most revealing findings came from trials in which T1 and T2 matched different attentional sets and were both identified. In these trials, T3 accuracy was lower when it did not match T1's set than when it did match, but only when participants failed to identify T2. These findings support a bottleneck model of set-specific capture in which a limited-capacity mechanism in working memory enhances only one attentional set at a time, rather than a resource model in which processing capacity is simultaneously distributed across multiple attentional sets.

  1. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this has often been claimed in the literature, none of the proofs the author encountered is valid.

  2. Estuarine Back-barrier Shoreline and Beach Sandline Change Model Skill and Predicted Probabilities: Long-term sandline change

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline...

  3. Estuarine Back-barrier Shoreline and Beach Sandline Change Model Skill and Predicted Probabilities: Event-driven backshore shoreline change

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline...

  4. Estuarine Back-barrier Shoreline and Beach Sandline Change Model Skill and Predicted Probabilities: Event-driven beach sandline change

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline...

  5. Ensemble forecasting of sub-seasonal to seasonal streamflow by a Bayesian joint probability modelling approach

    Science.gov (United States)

    Zhao, Tongtiegang; Schepen, Andrew; Wang, Q. J.

    2016-10-01

    The Bayesian joint probability (BJP) modelling approach is used operationally to produce seasonal (three-month-total) ensemble streamflow forecasts in Australia. However, water resource managers are calling for more informative sub-seasonal forecasts. Taking advantage of BJP's capability of handling multiple predictands, ensemble forecasting of sub-seasonal to seasonal streamflows is investigated for 23 catchments around Australia. Using antecedent streamflow and climate indices as predictors, monthly forecasts are developed for the three-month period ahead. Forecast reliability and skill are evaluated for the period 1982-2011 using a rigorous leave-five-years-out cross validation strategy. BJP ensemble forecasts of monthly streamflow volumes are generally reliable in ensemble spread. Forecast skill, relative to climatology, is positive in 74% of cases in the first month, decreasing to 57% and 46% respectively for streamflow forecasts for the final two months of the season. As forecast skill diminishes with increasing lead time, the monthly forecasts approach climatology. Seasonal forecasts accumulated from monthly forecasts are found to be similarly skilful to forecasts from BJP models based on seasonal totals directly. The BJP modelling approach is demonstrated to be a viable option for producing ensemble time-series sub-seasonal to seasonal streamflow forecasts.
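
    A sketch of the skill evaluation step is given below: the sample-based CRPS of an ensemble forecast is compared against that of a climatology ensemble to form a skill score. The numbers are synthetic, and the operational BJP verification additionally uses leave-five-years-out cross validation.

```python
# Sketch of CRPS-based skill relative to climatology (synthetic numbers).
import numpy as np

def crps_ensemble(ensemble, obs):
    """Sample-based CRPS: E|X - y| - 0.5 E|X - X'|."""
    ensemble = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(ensemble - obs))
    term2 = 0.5 * np.mean(np.abs(ensemble[:, None] - ensemble[None, :]))
    return term1 - term2

rng = np.random.default_rng(4)
obs = 120.0                                    # observed monthly streamflow (GL)
forecast = rng.lognormal(np.log(110), 0.25, 500)
climatology = rng.lognormal(np.log(90), 0.6, 500)

skill = 1.0 - crps_ensemble(forecast, obs) / crps_ensemble(climatology, obs)
print(f"CRPS skill score relative to climatology: {skill:.2f}")
```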

  6. Probability-changing cluster algorithm for two-dimensional XY and clock models

    Science.gov (United States)

    Tomita, Yusuke; Okabe, Yutaka

    2002-05-01

    We extend the newly proposed probability-changing cluster (PCC) Monte Carlo algorithm to the study of systems with the vector order parameter. Wolff's idea of the embedded cluster formalism is used for assigning clusters. The Kosterlitz-Thouless (KT) transitions for the two-dimensional (2D) XY and q-state clock models are studied by using the PCC algorithm. Combined with the finite-size scaling analysis based on the KT form of the correlation length, ξ~exp(c/√(T/TKT-1)), we determine the KT transition temperature and the decay exponent η as TKT=0.8933(6) and η=0.243(4) for the 2D XY model. We investigate two transitions of the KT type for the 2D q-state clock models with q=6,8,12 and confirm the prediction of η=4/q² at T1, the low-temperature critical point between the ordered and XY-like phases, systematically.

  7. Fate modelling of chemical compounds with incomplete data sets

    DEFF Research Database (Denmark)

    Birkved, Morten; Heijungs, Reinout

    2011-01-01

    , and to provide simplified proxies for the more complicated “real” model relationships. In the presented study two approaches for the reduction of the data demand associated with characterization of chemical emissions in USEtox™ are tested: The first approach yields a simplified set of mode-of-entry-specific meta-models with a data demand of approx. 63% (5/8) of the USEtox™ characterization model. The second yields a simplified set of mode-of-entry-specific meta-models with a data demand of 75% (6/8) of the original model. The results of the study indicate that it is possible to simplify characterization models and lower the data demand of these models applying the presented approach. The results further indicate that the second approach, relying on 75% of the original data set, provides the meta-model sets which best mimic the original model. An overall trend observed from the 75% data demand meta-model sets

  8. A new level set model for multimaterial flows

    Energy Technology Data Exchange (ETDEWEB)

    Starinshak, David P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Karni, Smadar [Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Mathematics; Roe, Philip L. [Univ. of Michigan, Ann Arbor, MI (United States). Dept. of AerospaceEngineering

    2014-01-08

    We present a new level set model for representing multimaterial flows in multiple space dimensions. Instead of associating a level set function with a specific fluid material, the function is associated with a pair of materials and the interface that separates them. A voting algorithm collects sign information from all level sets and determines material designations. M(M−1)/2 level set functions might be needed to represent a general M-material configuration; problems of practical interest use far fewer functions, since not all pairs of materials share an interface. The new model is less prone to producing indeterminate material states, i.e. regions claimed by more than one material (overlaps) or no material at all (vacuums). It outperforms existing material-based level set models without the need for reinitialization schemes, thereby avoiding additional computational costs and preventing excessive numerical diffusion.

  9. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  10. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  11. Hypothyroidism after primary radiotherapy for head and neck squamous cell carcinoma: Normal tissue complication probability modeling with latent time correction

    DEFF Research Database (Denmark)

    Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren

    2013-01-01

    To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC) with adjustment for latency and clinical risk factors.

  12. Location identification for indoor instantaneous point contaminant source by probability-based inverse Computational Fluid Dynamics modeling.

    Science.gov (United States)

    Liu, X; Zhai, Z

    2008-02-01

    Indoor pollution jeopardizes human health and welfare and may even cause serious morbidity and mortality under extreme conditions. To effectively control and improve indoor environment quality requires immediate interpretation of pollutant sensor readings and accurate identification of indoor pollution history and source characteristics (e.g., source location and release time). This procedure is complicated by non-uniform and dynamic indoor contaminant dispersion behaviors as well as diverse sensor network distributions. This paper introduces a probability-concept-based inverse modeling method that is able to identify the source location for an instantaneous point source placed in an enclosed environment with a known source release time. The study presents the mathematical models that address three different sensing scenarios: sensors without concentration readings, sensors with spatial concentration readings, and sensors with temporal concentration readings. The paper demonstrates the inverse modeling method and algorithm with two case studies: air pollution in an office space and in an aircraft cabin. The predictions were successfully verified against the forward simulation settings, indicating good capability of the method in finding indoor pollutant sources. The research lays a solid ground for further study of the method for more complicated indoor contamination problems. The method developed can help track indoor contaminant source locations with limited sensor outputs. This will ensure an effective and prompt execution of building control strategies and thus achieve a healthy and safe indoor environment. The method can also assist the design of optimal sensor networks.

  13. Fuzzy Partition Models for Fitting a Set of Partitions.

    Science.gov (United States)

    Gordon, A. D.; Vichi, M.

    2001-01-01

    Describes methods for fitting a fuzzy consensus partition to a set of partitions of the same set of objects. Describes and illustrates three models defining median partitions and compares these methods to an alternative approach to obtaining a consensus fuzzy partition. Discusses interesting differences in the results. (SLD)

  14. A Comprehensive Propagation Prediction Model Comprising Microfacet Based Scattering and Probability Based Coverage Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    A. S. M. Zahid Kausar

    2014-01-01

    Full Text Available Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part of designing a wireless network. In this paper, an accelerated technique of three-dimensional ray tracing is presented, where rough surface scattering is included to make the ray tracing technique more accurate. Here, the rough surface scattering is represented by microfacets, for which it becomes possible to compute the scattering field in all possible directions. New optimization techniques, like dual quadrant skipping (DQS) and closest object finder (COF), are implemented for fast characterization of wireless communications and for making the ray tracing technique more efficient. In conjunction with the ray tracing technique, a probability based coverage optimization algorithm is included to form a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting the unnecessary objects for ray tracing using the DQS technique and by decreasing the ray-object intersection time using the COF technique. On the other hand, the coverage optimization algorithm is based on probability theory, which finds the minimum number of transmitters and their corresponding positions in order to achieve optimal indoor wireless coverage. Both the space and time complexities of the proposed algorithm improve on those of existing algorithms. For the verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by experimental results.

  15. Process Setting Models for the Minimization of Costs Defectives ...

    African Journals Online (AJOL)

    The economy of production controls all manufacturing activities. In the ...

  16. The Mathematical Concept of Set and the 'Collection' Model.

    Science.gov (United States)

    Fischbein, Efraim; Baltsan, Madlen

    1999-01-01

    Hypothesizes that various misconceptions held by students with regard to the mathematical set concept may be explained by the initial collection model. Study findings confirm the hypothesis. (Author/ASK)

  17. Robust optimization of aircraft weapon delivery trajectory using probability collectives and meta-modeling

    Institute of Scientific and Technical Information of China (English)

    Wang Nan; Shen Lincheng; Liu Hongfu; Chen Jing; Hu Tianjiang

    2013-01-01

    Conventional trajectory optimization techniques have been challenged by their inability to handle threats with irregular shapes and the tendency to be sensitive to control variations of the aircraft. Aiming to overcome these difficulties, this paper presents an alternative approach for trajectory optimization, where the problem is formulated into a parametric optimization of the maneuver variables under a tactics template framework. To reduce the size of the problem, global sensitivity analysis (GSA) is performed to identify the less-influential maneuver variables. The probability collectives (PC) algorithm, which is well-suited to discrete and discontinuous optimization, is applied to solve the trajectory optimization problem. The robustness of the trajectory is assessed through multiple sampling around the chosen values of the maneuver variables. Meta-models based on radial basis functions (RBF) are created for evaluations of the means and deviations of the problem objectives and constraints. To guarantee the approximation accuracy, the meta-models are adaptively updated during optimization. The proposed approach is demonstrated on a typical air-ground attack mission scenario. Results reveal that the proposed approach is capable of generating robust and optimal trajectories with both accuracy and efficiency.

  18. Modeling probability of additional cases of natalizumab-associated JCV sero-negative progressive multifocal leukoencephalopathy.

    Science.gov (United States)

    Carruthers, Robert L; Chitnis, Tanuja; Healy, Brian C

    2014-05-01

    JCV serologic status is used to determine PML risk in natalizumab-treated patients. Given two cases of natalizumab-associated PML in JCV sero-negative patients and two publications that question the false negative rate of the JCV serologic test, clinicians may question whether our understanding of PML risk is adequate. Given that there is no gold standard for diagnosing previous JCV exposure, the test characteristics of the JCV serologic test are unknowable. We propose a model of PML risk in JCV sero-negative natalizumab patients. Using the numbers of JCV sero-positive and -negative patients from a study of PML risk by JCV serologic status (sero-positive: 13,950 and sero-negative: 11,414), we apply a range of sensitivities and specificities in order to calculate the number of JCV-exposed but JCV sero-negative patients (false negatives). We then apply a range of rates of developing PML in sero-negative patients to calculate the expected number of PML cases. By using the binomial function, we calculate the probability of a given number of JCV sero-negative PML cases. With this model, one has a means to establish a threshold number of JCV sero-negative natalizumab-associated PML cases at which it is improbable that our understanding of PML risk in JCV sero-negative patients is adequate.
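
    A hedged sketch of this kind of calculation is given below, with illustrative values for the test sensitivity and the PML rate among JCV-exposed patients; the mapping from sensitivity to the number of false negatives assumes, for simplicity, that every sero-positive patient is truly exposed.

```python
# Illustrative sketch (numbers of patients from the cited risk study; the
# sensitivity and PML rate are assumptions, not values from the paper).
from scipy.stats import binom

n_seropos, n_seroneg = 13950, 11414     # sero-positive / sero-negative counts

def p_at_least(observed_cases, sensitivity=0.97, pml_rate_exposed=1 / 1000.0):
    # Assume specificity ~ 1, so all sero-positives are truly exposed; then the
    # total exposed population is n_seropos / sensitivity and the remainder are
    # false negatives hiding in the sero-negative group.
    exposed_total = n_seropos / sensitivity
    false_negatives = exposed_total - n_seropos
    expected_cases = false_negatives * pml_rate_exposed
    n = int(round(false_negatives))
    # P(at least `observed_cases` PML cases among the false negatives)
    return expected_cases, binom.sf(observed_cases - 1, n, pml_rate_exposed)

print(p_at_least(observed_cases=2))
```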

  19. Characteristics of the probability function for three random-walk models of reaction--diffusion processes

    Energy Technology Data Exchange (ETDEWEB)

    Musho, M.K.; Kozak, J.J.

    1984-10-01

    A method is presented for calculating exactly the relative width (σ²)^(1/2)/⟨n⟩, the skewness γ₁, and the kurtosis γ₂ characterizing the probability distribution function for three random-walk models of diffusion-controlled processes. For processes in which a diffusing coreactant A reacts irreversibly with a target molecule B situated at a reaction center, three models are considered. The first is the traditional one of an unbiased, nearest-neighbor random walk on a d-dimensional periodic/confining lattice with traps; the second involves the consideration of unbiased, non-nearest-neighbor (i.e., variable-step-length) walks on the same d-dimensional lattice; and the third deals with the case of a biased, nearest-neighbor walk on a d-dimensional lattice (wherein a walker experiences a potential centered at the deep trap site of the lattice). Our method, which has been described in detail elsewhere (P. A. Politowicz and J. J. Kozak, Phys. Rev. B 28, 5549 (1983)), is based on the use of group theoretic arguments within the framework of the theory of finite Markov processes.
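
    For orientation, the sketch below estimates the same three shape descriptors by brute-force Monte Carlo for the simplest of the three models (an unbiased nearest-neighbour walk on a small periodic 2D lattice with one deep trap); the paper's method instead computes them exactly via finite Markov process theory.

```python
# Monte Carlo sketch (not the exact Markov-chain method of the paper): walk
# lengths to absorption on a periodic 2D lattice with a single trap at (0, 0).
import numpy as np
from scipy.stats import skew, kurtosis

L, n_walks = 7, 20000                 # 7 x 7 periodic lattice
rng = np.random.default_rng(5)
steps = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

lengths = []
for _ in range(n_walks):
    x, y = rng.integers(L), rng.integers(L)   # random start site
    n = 0
    while (x, y) != (0, 0):                   # walk until trapped
        dx, dy = steps[rng.integers(4)]
        x, y = (x + dx) % L, (y + dy) % L
        n += 1
    lengths.append(n)

lengths = np.asarray(lengths, dtype=float)
rel_width = lengths.std() / lengths.mean()    # relative width of the distribution
print(rel_width, skew(lengths), kurtosis(lengths))   # gamma_1, gamma_2 (excess)
```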

  20. Model assisted probability of detection for a guided waves based SHM technique

    Science.gov (United States)

    Memmolo, V.; Ricci, F.; Maio, L.; Boffa, N. D.; Monaco, E.

    2016-04-01

    Guided wave (GW) Structural Health Monitoring (SHM) allows the health of aerostructures to be assessed thanks to its great sensitivity to the appearance of delaminations and/or debondings. Due to the several complexities affecting wave propagation in composites, an efficient GW SHM system requires effective quantification associated with a rigorous statistical evaluation procedure. The Probability of Detection (POD) approach is a commonly accepted measurement method to quantify NDI results, and it can be effectively extended to an SHM context. However, it requires a very complex setup arrangement and many coupons. When a rigorous correlation with measurements is adopted, Model Assisted POD (MAPOD) is an efficient alternative to classic methods. This paper is concerned with the identification of small emerging delaminations in composite structural components. An ultrasonic GW tomography focused on impact damage detection in composite plate-like structures, recently developed by the authors, is investigated, providing the basis for a more complex MAPOD analysis. Experimental tests carried out on a typical wing composite structure demonstrated the effectiveness of the modeling approach for detecting damage with the tomographic algorithm. Environmental disturbances, which affect signal waveforms and consequently damage detection, are considered by simulating mathematical noise in the modeling stage. A statistical method is used for an effective decision-making procedure. A Damage Index approach is implemented as the metric to interpret the signals collected from a distributed sensor network, and a subsequent graphic interpolation is carried out to reconstruct the damage appearance. Model validation and first reliability assessment results are provided, in view of system performance quantification and optimization.

  1. Fuzzy GML Modeling Based on Vague Soft Sets

    Directory of Open Access Journals (Sweden)

    Bo Wei

    2017-01-01

    Full Text Available The Open Geospatial Consortium (OGC) Geography Markup Language (GML) explicitly represents geographical spatial knowledge in text mode. All kinds of fuzzy problems will inevitably be encountered in spatial knowledge expression, and for expressions in text mode this fuzziness is even broader. Describing and representing fuzziness in GML therefore seems necessary. Three kinds of fuzziness in GML can be found: element fuzziness, chain fuzziness, and attribute fuzziness. Both element fuzziness and chain fuzziness reflect fuzziness between GML elements, and therefore the representation of chain fuzziness can be replaced by the representation of element fuzziness in GML. On the basis of vague soft set theory, two kinds of modeling, vague soft set GML Document Type Definition (DTD) modeling and vague soft set GML schema modeling, are proposed for fuzzy modeling in the GML DTD and GML schema, respectively. Five elements or pairs, associated with vague soft sets, are introduced. Then, the DTDs and the schemas of the five elements are correspondingly designed and presented according to their different chains and different fuzzy data types. While the introduction of the five elements or pairs is the basis of vague soft set GML modeling, the corresponding DTD and schema modifications are key to the implementation of the modeling. The establishment of vague soft set GML enables GML to represent fuzziness and solves the problem of the lack of fuzzy information expression in GML.

  2. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1, ν1) ≤ (μ2, ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I=[0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  3. Developing an Integrated Set of Production Planning and Control Models

    OpenAIRE

    Wang, Hui

    2012-01-01

    This paper proposes an integrated set of production planning and control models that can be applied in the Push system (Make-to-stock). The integrated model include forecasting, aggregate planning, materials requirements planning, inventory control, capacity planning and scheduling. This integrated model solves the planning issues via three levels, which include strategic level, tactical level and operational level. The model obtains the optimal production plan for each product type in each p...

  4. Modeling of Plutonium Ionization Probabilities for Use in Nuclear Forensic Analysis by Resonance Ionization Mass Spectrometry

    Science.gov (United States)

    2016-12-01

    material and chemical composition within the sample. This data can then be included in analysis by law enforcement and intelligence agencies to...

  5. Ruin probability with claims modeled by a stationary ergodic stable process

    NARCIS (Netherlands)

    Mikosch, T; Samorodnitsky, G

    2000-01-01

    For a random walk with negative drift we study the exceedance probability (ruin probability) of a high threshold. The steps of this walk (claim sizes) constitute a stationary ergodic stable process. We study how ruin occurs in this situation and evaluate the asymptotic behavior of the ruin probability.

  6. The Finite-time Ruin Probability for the Jump-Diffusion Model with Constant Interest Force

    Institute of Scientific and Technical Information of China (English)

    Tao Jiang; Hai-feng Yan

    2006-01-01

    In this paper, we consider the finite-time ruin probability for the jump-diffusion Poisson process. Under the assumptions that the claim sizes are subexponentially distributed and that the interest force is constant, we obtain an asymptotic formula for the finite-time ruin probability. The results we obtain extend the

  7. Social Networks and Choice Set Formation in Discrete Choice Models

    Directory of Open Access Journals (Sweden)

    Bruno Wichmann

    2016-10-01

    Full Text Available The discrete choice literature has evolved from the analysis of a choice of a single item from a fixed choice set to the incorporation of a vast array of more complex representations of preferences and choice set formation processes into choice models. Modern discrete choice models include rich specifications of heterogeneity, multi-stage processing for choice set determination, dynamics, and other elements. However, discrete choice models still largely represent socially isolated choice processes —individuals are not affected by the preferences of choices of other individuals. There is a developing literature on the impact of social networks on preferences or the utility function in a random utility model but little examination of such processes for choice set formation. There is also emerging evidence in the marketplace of the influence of friends on choice sets and choices. In this paper we develop discrete choice models that incorporate formal social network structures into the choice set formation process in a two-stage random utility framework. We assess models where peers may affect not only the alternatives that individuals consider or include in their choice sets, but also consumption choices. We explore the properties of our models and evaluate the extent of “errors” in assessment of preferences, economic welfare measures and market shares if network effects are present, but are not accounted for in the econometric model. Our results shed light on the importance of the evaluation of peer or network effects on inclusion/exclusion of alternatives in a random utility choice framework.

  8. Robust non-rigid point set registration using student's-t mixture model.

    Directory of Open Access Journals (Sweden)

    Zhiyong Zhou

    Full Text Available The Student's-t mixture model, which is heavy-tailed and more robust than the Gaussian mixture model, has recently received great attention in image processing. In this paper, we propose a robust non-rigid point set registration algorithm using the Student's-t mixture model. Specifically, first, we consider the alignment of two point sets as a probability density estimation problem and treat one point set as Student's-t mixture model centroids. Then, we fit the Student's-t mixture model centroids to the other point set which is treated as data. Finally, we get the closed-form solutions of registration parameters, leading to a computationally efficient registration algorithm. The proposed algorithm is especially effective for addressing the non-rigid point set registration problem when significant amounts of noise and outliers are present. Moreover, fewer registration parameters have to be set manually for our algorithm compared to the popular coherent point drift (CPD) algorithm. We have compared our algorithm with other state-of-the-art registration algorithms on both 2D and 3D data with noise and outliers, where our non-rigid registration algorithm showed accurate results and outperformed the other algorithms.

  9. A probabilistic framework for microarray data analysis: fundamental probability models and statistical inference.

    Science.gov (United States)

    Ogunnaike, Babatunde A; Gelmi, Claudio A; Edwards, Jeremy S

    2010-05-21

    Gene expression studies generate large quantities of data with the defining characteristic that the number of genes (whose expression profiles are to be determined) exceeds the number of available replicates by several orders of magnitude. Standard spot-by-spot analysis still seeks to extract useful information for each gene on the basis of the number of available replicates, and thus plays to the weakness of microarrays. On the other hand, because of the data volume, treating the entire data set as an ensemble and developing theoretical distributions for these ensembles provides a framework that plays instead to the strength of microarrays. We present theoretical results showing that, under reasonable assumptions, the distribution of microarray intensities follows the Gamma model, with the biological interpretations of the model parameters emerging naturally. We subsequently establish that for each microarray data set, the fractional intensities can be represented as a mixture of Beta densities, and develop a procedure for using these results to draw statistical inference regarding differential gene expression. We illustrate the results with experimental data from gene expression studies on Deinococcus radiodurans following DNA damage using cDNA microarrays.
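
    A small illustrative sketch, using synthetic data rather than the Deinococcus radiodurans measurements, of the two ensemble-level steps described above: a method-of-moments Gamma fit to the pooled intensities and a Beta fit to the fractional intensities.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Synthetic two-channel "microarray" intensities; Gamma-distributed as the ensemble
        # model suggests. Shapes and scales are illustrative, not estimates from real data.
        red = rng.gamma(shape=2.0, scale=500.0, size=10_000)
        green = rng.gamma(shape=2.0, scale=480.0, size=10_000)

        # Method-of-moments Gamma fit to the pooled ensemble of intensities.
        pooled = np.concatenate([red, green])
        mean, var = pooled.mean(), pooled.var()
        shape_hat, scale_hat = mean**2 / var, var / mean
        print(f"Gamma fit to the intensity ensemble: shape = {shape_hat:.2f}, scale = {scale_hat:.0f}")

        # Fractional intensities x = red / (red + green); if both channels are Gamma with a
        # common scale, x is Beta-distributed, which motivates the Beta-mixture step.
        frac = red / (red + green)
        a_hat, b_hat, _, _ = stats.beta.fit(frac, floc=0.0, fscale=1.0)
        print(f"Beta fit to the fractional intensities: a = {a_hat:.2f}, b = {b_hat:.2f}")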

  10. A Novel Adaptive Conditional Probability-Based Predicting Model for User’s Personality Traits

    Directory of Open Access Journals (Sweden)

    Mengmeng Wang

    2015-01-01

    Full Text Available With the pervasive increase in social media use, the explosion of user-generated data provides a potentially very rich source of information, which plays an important role in helping online researchers understand users’ behaviors deeply. Since users’ personality traits are the driving force behind their behaviors, in this paper, along with social network features, we first extract linguistic features, emotional statistical features, and topic features from users’ Facebook status updates, followed by quantifying the importance of features via the Kendall correlation coefficient. Then, on the basis of weighted features and dynamically updated thresholds of personality traits, we deploy a novel adaptive conditional probability-based predicting model which considers prior knowledge of correlations between user’s personality traits to predict user’s Big Five personality traits. In the experimental work, we explore the existence of correlations between user’s personality traits which provides better theoretical support for our proposed method. Moreover, on the same Facebook dataset, compared to other methods, our method can achieve an F1-measure of 80.6% when taking into account correlations between user’s personality traits, and there is an impressive improvement of 5.8% over other approaches.
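
    A minimal sketch of the feature-weighting step described above, with entirely synthetic features and trait scores: each feature's importance is taken as its absolute Kendall rank correlation with the trait and normalized into a weight.

        import numpy as np
        from scipy.stats import kendalltau

        rng = np.random.default_rng(2)

        # Synthetic feature matrix (e.g. linguistic, emotional and topic features per user)
        # and one trait score per user; the data and resulting weights are purely illustrative.
        n_users, n_features = 200, 5
        X = rng.normal(size=(n_users, n_features))
        trait = X @ np.array([0.6, 0.3, 0.0, -0.2, 0.1]) + rng.normal(scale=0.5, size=n_users)

        # Feature importance via the absolute Kendall rank correlation with the trait,
        # normalized into weights, mirroring the weighted-feature idea in the abstract.
        taus = np.array([kendalltau(X[:, j], trait)[0] for j in range(n_features)])
        weights = np.abs(taus) / np.abs(taus).sum()
        for j, (tau, w) in enumerate(zip(taus, weights)):
            print(f"feature {j}: tau = {tau:+.3f}   weight = {w:.3f}")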

  11. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  12. Typhoon disaster zoning and prevention criteria——A double layer nested multi-objective probability model and its application

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    For prevention and mitigation of typhoon disasters in China, in this paper a double layer nested multi-objective probability model of typhoon disaster zoning and prevention criteria is proposed. The multivariate compound extreme value distribution (MCEVD) is used to predict the joint probability of seven typhoon characteristics and corresponding typhoon-induced disasters. Predicted results can be used for both typhoon disaster zoning and corresponding prevention criteria along the China coast.

  13. Typhoon disaster zoning and prevention criteria——A double layer nested multiobjective probability model and its application

    Institute of Scientific and Technical Information of China (English)

    LIU DeFu; PANG Liang; XIE BoTao; WU YuanKang

    2008-01-01

    For prevention and mitigation of typhoon disasters in China, in this paper a double layer nested multi-objective probability model of typhoon disaster zoning and prevention criteria is proposed. The multivariate compound extreme value distribution (MCEVD) is used to predict the joint probability of seven typhoon characteristics and corresponding typhoon-induced disasters. Predicted results can be used for both typhoon disaster zoning and corresponding prevention criteria along the China coast.

  14. Modeling tumor control probability for spatially inhomogeneous risk of failure based on clinical outcome data.

    Science.gov (United States)

    Lühr, Armin; Löck, Steffen; Jakobi, Annika; Stützer, Kristin; Bandurska-Luque, Anna; Vogelius, Ivan Richter; Enghardt, Wolfgang; Baumann, Michael; Krause, Mechthild

    2017-07-01

    Objectives of this work are (1) to derive a general clinically relevant approach to model tumor control probability (TCP) for spatially variable risk of failure and (2) to demonstrate its applicability by estimating TCP for patients planned for photon and proton irradiation. The approach divides the target volume into sub-volumes according to retrospectively observed spatial failure patterns. The product of all sub-volume TCPi values reproduces the observed TCP for the total tumor. The derived formalism provides for each target sub-volume i the tumor control dose (D50,i) and slope (γ50,i) parameters at 50% TCPi. For a simultaneous integrated boost (SIB) prescription for 45 advanced head and neck cancer patients, TCP values for photon and proton irradiation were calculated and compared. The target volume was divided into gross tumor volume (GTV), surrounding clinical target volume (CTV), and elective CTV (CTVE). The risk of a local failure in each of these sub-volumes was taken from the literature. Convenient expressions for D50,i and γ50,i were provided for the Poisson and the logistic model. Comparable TCP estimates were obtained for photon and proton plans of the 45 patients using the sub-volume model, despite notably higher dose levels (on average +4.9%) in the low-risk CTVE for photon irradiation. In contrast, assuming a homogeneous dose response in the entire target volume resulted in TCP estimates contradicting clinical experience (the highest failure rate in the low-risk CTVE) and differing substantially between photon and proton irradiation. The presented method is of practical value for three reasons: It (a) is based on empirical clinical outcome data; (b) can be applied to non-uniform dose prescriptions as well as different tumor entities and dose-response models; and (c) is provided in a convenient compact form. The approach may be utilized to target spatial patterns of local failures observed in patient cohorts by prescribing different doses to
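
    A schematic numeric sketch of the sub-volume idea: the total TCP is the product of the sub-volume TCP_i values, each given here by a standard logistic dose-response in (D50,i, γ50,i). The parameter and dose values are invented for illustration and are not the fitted values from the paper.

        def tcp_logistic(dose, d50, gamma50):
            """Standard logistic dose-response: TCP(D) = 1 / (1 + (D50/D)**(4*gamma50))."""
            return 1.0 / (1.0 + (d50 / dose) ** (4.0 * gamma50))

        # Target sub-volumes with hypothetical control parameters and prescribed doses (Gy);
        # none of these numbers are the values derived in the paper.
        subvolumes = {
            "GTV":  {"d50": 65.0, "gamma50": 2.0, "dose": 70.0},
            "CTV":  {"d50": 55.0, "gamma50": 1.8, "dose": 60.0},
            "CTVE": {"d50": 40.0, "gamma50": 1.5, "dose": 54.0},
        }

        # Total TCP is the product of the sub-volume TCP_i values, as in the sub-volume formalism.
        tcp_total = 1.0
        for name, p in subvolumes.items():
            tcp_i = tcp_logistic(p["dose"], p["d50"], p["gamma50"])
            tcp_total *= tcp_i
            print(f"{name:5s} TCP_i = {tcp_i:.3f}")
        print(f"total TCP = {tcp_total:.3f}")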

  15. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  16. Model checking abstract state machines with answer set programming

    OpenAIRE

    2006-01-01

    Answer Set Programming (ASP) is a logic programming paradigm that has been shown to be a useful tool in various application areas due to its expressive modelling language. These application areas include Bounded Model Checking (BMC). BMC is a verification technique recognized for its strong ability to find errors in computer systems. To apply BMC, a system needs to be modelled in a formal specification language, such as the widely used formalism of Abstract State Machines (ASMs). In ...

  17. Ruin Probabilities in Cox Risk Models with Two Dependent Classes of Business

    Institute of Scientific and Technical Information of China (English)

    Jun Yi GUO; Kam C. YUEN; Ming ZHOU

    2007-01-01

    In this paper we consider risk processes with two classes of business in which the two claim-number processes are dependent Cox processes. We first assume that the two claim-number processes have a two-dimensional Markovian intensity. Under this assumption, we not only study the sum of the two individual risk processes but also investigate the two-dimensional risk process formed by considering the two individual processes separately. For each of the two risk processes we derive an expression for the ruin probability, and then construct an upper bound for the ruin probability. We next assume that the intensity of the two claim-number processes follows a Markov chain. In this case, we examine the ruin probability of the sum of the two individual risk processes. Specifically, a differential system for the ruin probability is derived and numerical results are obtained for exponential claim sizes.

  18. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known, the resulting...... uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects...... by probability distributions is readily done by means of Monte Carlo simulation. Calculation of non-monotonic functions of possibility distributions is done within the theoretical framework of fuzzy intervals, but straightforward application of fuzzy arithmetic in general results in overestimation of interval...

  19. Probability Criterion for a Dynamic Financial Model with Short-Selling Allowed

    Institute of Scientific and Technical Information of China (English)

    韩其恒; 唐万生; 李光泉

    2003-01-01

    The probability criterion has practical significance: investment decisions under it are determined by the expected discounted wealth. In a complete, standard financial market with short-selling allowed, this paper investigates investment decision-making under the probability criterion. The upper limit of the criterion function is obtained, and the corresponding discounted wealth process and hedging portfolio process are provided. Finally, an illustrative example of a one-dimensional constant-coefficient financial market is given.

  20. Evaluating the effect of corridors and landscape heterogeneity on dispersal probability: a comparison of three spatially explicit modelling approaches

    DEFF Research Database (Denmark)

    Jepsen, J. U.; Baveco, J. M.; Topping, C. J.

    2004-01-01

    or populations in space given a specific configuration of habitat patches. We evaluated how the choice of model influenced predictions regarding the effect of patch- and corridor configuration on dispersal probabilities and the number of successful immigrants of a simulated small mammal. Model results were...... analysed both at the level of the entire habitat network and at the level of individual patches....

  1. Modeller subjectivity in estimating pesticide parameters for leaching models using the same laboratory data set

    NARCIS (Netherlands)

    Boesten, J.J.T.I.

    2000-01-01

    User-dependent subjectivity in the process of testing pesticide leaching models is relevant because it may result in wrong interpretation of model tests. About 20 modellers used the same data set to test pesticide leaching models (one or two models per modeller). The data set included laboratory stu

  2. Modeller subjectivity in estimating pesticide parameters for leaching models using the same laboratory data set

    NARCIS (Netherlands)

    Boesten, J.J.T.I.

    2000-01-01

    User-dependent subjectivity in the process of testing pesticide leaching models is relevant because it may result in wrong interpretation of model tests. About 20 modellers used the same data set to test pesticide leaching models (one or two models per modeller). The data set included laboratory

  3. An experimental methodology for a fuzzy set preference model

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. The combination of complex aggregate

  4. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Full Text Available Abstract Background The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (> 15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.
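
    A small sketch of evaluating a two-component bivariate Student's-t mixture density for positional-error vectors, of the kind found to fit geocoding errors well; the weights, scale matrices and degrees of freedom below are illustrative, not the fitted values from the study (requires SciPy >= 1.6 for multivariate_t).

        import numpy as np
        from scipy.stats import multivariate_t

        # Two-component bivariate t mixture for positional errors (metres east, metres north).
        # Weights, scale (shape) matrices and degrees of freedom are illustrative only.
        components = [
            {"w": 0.7, "loc": [0.0, 0.0], "shape": [[50.0**2, 0.0], [0.0, 50.0**2]],   "df": 3.0},
            {"w": 0.3, "loc": [0.0, 0.0], "shape": [[400.0**2, 0.0], [0.0, 400.0**2]], "df": 2.0},
        ]

        def mixture_pdf(xy):
            """Density of the t mixture at the points xy, an array of shape (n, 2)."""
            return sum(c["w"] * multivariate_t.pdf(xy, loc=c["loc"], shape=c["shape"], df=c["df"])
                       for c in components)

        errors = np.array([[10.0, -5.0], [150.0, 80.0], [900.0, -600.0]])   # example error vectors
        for e, p in zip(errors, mixture_pdf(errors)):
            print(f"error {e} m -> mixture density {p:.3e}")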

  5. Optimisation-Based Solution Methods for Set Partitioning Models

    DEFF Research Database (Denmark)

    Rasmussen, Matias Sevel

    The scheduling of crew, i.e. the construction of work schedules for crew members, is often not a trivial task, but a complex puzzle. The task is complicated by rules, restrictions, and preferences. Therefore, manual solutions as well as solutions from standard software packages are not always sufficient with respect to solution quality and solution time. Enhancement of the overall solution quality as well as the solution time can be of vital importance to many organisations. The fields of operations research and mathematical optimisation deal with mathematical modelling of difficult scheduling problems (among other topics). The fields also deal with the development of sophisticated solution methods for these mathematical models. This thesis describes the set partitioning model which has been widely used for modelling crew scheduling problems. Integer properties for the set partitioning model are shown...

  6. Setting of Agricultural Insurance Premium Rate and the Adjustment Model

    Institute of Scientific and Technical Information of China (English)

    HUANG Ya-lin

    2012-01-01

    First, using the law of large numbers, I analyze the principles for setting agricultural insurance premium rates, taking the setting of the adult sow premium rate as a case study, and conclude that with the continuous promotion of agricultural insurance, the increase in the types of agricultural insurance and the increase in the number of the insured, the premium rate should also be adjusted in a timely manner. Then, on the basis of Bayes’ theorem, I adjust and calibrate the claim frequency and the average claim in order to correctly adjust the agricultural insurance premium rate, taking the case of forest insurance for the premium rate adjustment analysis. In setting and adjusting agricultural insurance premium rates, in order to bring the expected results close to the real results, it is necessary to apply probability estimates over a large number of risk units and to focus on the establishment of an agricultural risk database so that premium rates can be adjusted in a timely manner.

  7. Fate modelling of chemical compounds with incomplete data sets

    DEFF Research Database (Denmark)

    Birkved, Morten; Heijungs, Reinout

    2011-01-01

    in an approximate way. The idea is that not all data needed in a multi-media fate and exposure model are completely independent and equally important, but that there are physical-chemical and biological relationships between sets of chemical properties. A statistical model is constructed to underpin this assumption......, and to provide simplified proxies for the more complicated “real” model relationships. In the presented study two approaches for the reduction of the data demand associated with characterization of chemical emissions in USEtox™ are tested: The first approach yields a simplified set of mode of entry specific meta......-models with a data demand of approx. 63% (5/8) of the USEtox™ characterization model. The second yields a simplified set of mode of entry specific meta-models with a data demand of 75% (6/8) of the original model. The results of the study indicate that it is possible to simplify characterization models and lower

  8. Backward probability model using multiple observations of contamination to identify groundwater contamination sources at the Massachusetts Military Reservation

    Science.gov (United States)

    Neupauer, R. M.; Wilson, J. L.

    2005-02-01

    Backward location and travel time probability density functions characterize the possible former locations (or the source location) of contamination that is observed in an aquifer. For an observed contaminant particle, the backward location probability density function (PDF) describes its position at a fixed time prior to sampling, and the backward travel time probability density function describes the amount of time required for the particle to travel to the sampling location from a fixed upgradient position. The backward probability model has been developed for a single observation of contamination (e.g., Neupauer and Wilson, 1999). In practical situations, contamination is sampled at multiple locations and times, and these additional data provide information that can be used to better characterize the former position of contamination. Through Bayes' theorem we combine the individual PDFs for each observation to obtain a PDF for multiple observations that describes the possible source locations or release times of all observed contaminant particles, assuming they originated from the same instantaneous point source. We show that the multiple-observation probability density function is the normalized product of the single-observation PDFs. The additional information available from multiple observations reduces the variances of the source location and travel time probability density functions and improves the characterization of the contamination source. We apply the backward probability model to a trichloroethylene (TCE) plume at the Massachusetts Military Reservation (MMR). We use four TCE samples distributed throughout the plume to obtain single-observation and multiple-observation location and travel time PDFs in three dimensions. These PDFs provide information about the possible sources of contamination. Under the assumptions that the existing MMR model is properly calibrated and the conceptual model is correct, the results confirm the two suspected sources of
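
    A one-dimensional toy sketch of the combination step described above: single-observation backward location PDFs are multiplied pointwise and renormalized, and the spread of the combined PDF shrinks below that of any single observation. The Gaussian single-observation PDFs are placeholders; in the study they come from adjoint transport simulations, not from Gaussians.

        import numpy as np
        from scipy.stats import norm

        # Discretized 1-D domain (e.g. distance upgradient of the sampling wells, in metres).
        x = np.linspace(0.0, 2000.0, 4001)
        dx = x[1] - x[0]

        # Hypothetical single-observation backward location PDFs for four samples.
        single_pdfs = [norm.pdf(x, loc=mu, scale=sd)
                       for mu, sd in [(800, 150), (850, 120), (780, 180), (820, 140)]]

        # Multiple-observation PDF: normalized pointwise product of the single-observation PDFs.
        combined = np.prod(single_pdfs, axis=0)
        combined /= combined.sum() * dx

        mean = (x * combined).sum() * dx
        std = np.sqrt(((x - mean) ** 2 * combined).sum() * dx)
        print(f"combined source-location estimate: mean = {mean:.0f} m, std = {std:.0f} m")
        print("smallest single-observation std = 120 m (the product is sharper than any one PDF)")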

  9. The Numerical Modeling of Transient Regimes of Diesel Generator Sets

    Directory of Open Access Journals (Sweden)

    Cristian Roman

    2010-07-01

    Full Text Available This paper deals with the numerical modeling of a diesel generator set used as a main energy source in isolated areas and as a back-up energy source in the case of renewable energy systems. The numerical models are developed using the Matlab/Simulink software package and they prove to be a powerful tool for the computer-aided design of complex hybrid power systems. Several operation regimes of the equipment are studied. The numerical study is completed with experimental measurements on a Kipor-type diesel-electric generator set.

  10. Using Set Model for Learning Addition of Integers

    Directory of Open Access Journals (Sweden)

    Umi Puji Lestari

    2015-07-01

    Full Text Available This study aims to investigate how the set model can help fourth-grade students' understanding of addition of integers. The study was carried out with 23 students and a teacher of class IVC of SD Iba Palembang in January 2015. The study is design research that also promotes PMRI as the underlying design context and activity. Results showed that the use of set models, packaged in activities of recording financial transactions with two-color chips and a card game, can help students understand the concept of the zero pair, addition with same-colored chips, and the cancellation strategy.

  11. Shape-intensity prior level set combining probabilistic atlas and probability map constrains for automatic liver segmentation from abdominal CT images.

    Science.gov (United States)

    Wang, Jinke; Cheng, Yuanzhi; Guo, Changyong; Wang, Yadong; Tamura, Shinichi

    2016-05-01

    We propose a fully automatic 3D segmentation framework to segment the liver in abdominal CT images for challenging cases with low contrast to adjacent organs and the presence of pathologies. First, all of the atlases are weighted in the selected training datasets by calculating the similarities between the atlases and the test image to dynamically generate a subject-specific probabilistic atlas for the test image. The most likely liver region of the test image is further determined based on the generated atlas. A rough segmentation is obtained by a maximum a posteriori classification of the probability map, and the final liver segmentation is produced by a shape-intensity prior level set in the most likely liver region. Our method is evaluated and demonstrated on 25 test CT datasets from our partner site, and its results are compared with two state-of-the-art liver segmentation methods. Moreover, our performance results on 10 MICCAI test datasets are submitted to the organizers for comparison with the other automatic algorithms. Using the 25 test CT datasets, average symmetric surface distance is [Formula: see text] mm (range 0.62-2.12 mm), root mean square symmetric surface distance error is [Formula: see text] mm (range 0.97-3.01 mm), and maximum symmetric surface distance error is [Formula: see text] mm (range 12.73-26.67 mm) by our method. Our method on 10 MICCAI test data sets ranks 10th among all 47 automatic algorithms on the site as of July 2015. Quantitative results, as well as qualitative comparisons of segmentations, indicate that our method is a promising tool to improve the efficiency of both techniques. The applicability of the proposed method to some challenging clinical problems and the segmentation of the liver are demonstrated with good results on both quantitative and qualitative experiments. This study suggests that the proposed framework can be good enough to replace the time-consuming and tedious slice-by-slice manual

  12. Localization of compact invariant sets of the Lorenz' 1984 model

    Science.gov (United States)

    Starkov, K. E.

    In 1984 E. Lorenz published a paper [1] in which he proposed "the simplest possible general circulation model": $\dot{x} = -y^2 - z^2 - ax + aF$, $\dot{y} = xy - bxz - y + G$, $\dot{z} = bxy + xz - z$, which is referred to as the Lorenz' 1984 model. The existence of chaos was shown in [1, 2] for different values of the parameters. Dynamical studies of this system were carried out in papers [1, 2], [3], [4]. This paper is devoted to the study of the localization problem for compact invariant sets of the Lorenz' 1984 model with the help of an approach elaborated in papers of Krishchenko and Starkov, see e.g. [5]. This problem is an important topic in the study of the dynamics of a chaotic system because of the interest in the long-time behavior of the system. In this work we establish that all compact invariant sets of the Lorenz' 1984 model are contained in the set $\{x \le F;\ x^2 + y^2 + z^2 \le \eta^2 = [2(a+2)F^2 + 3G^2 + 2G\sqrt{aF^2 + G^2}]/4\}$. Further, we improve this localization by refining the bound $\eta$ with the help of additional localization sets. By applying cylindrical coordinates to the Lorenz' 1984 model we derive yet another localization set of the form $\{y^2 + z^2 \le G^2(1 + b^{-2})\exp(4\pi b^{-1})\}$. Finally, we discuss how to improve the final localization set and consider one example.
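
    A quick numerical check of the quoted spherical localization bound, integrating the Lorenz' 1984 system with the commonly used parameter values a = 0.25, b = 4, F = 8, G = 1 (assumed here; the bound itself holds for general parameters).

        import numpy as np
        from scipy.integrate import solve_ivp

        # Commonly used Lorenz' 1984 parameter values (an assumption for this check).
        a, b, F, G = 0.25, 4.0, 8.0, 1.0

        def lorenz84(t, s):
            x, y, z = s
            return [-y**2 - z**2 - a * x + a * F,
                    x * y - b * x * z - y + G,
                    b * x * y + x * z - z]

        # eta^2 from the localization set quoted in the abstract.
        eta2 = (2.0 * (a + 2.0) * F**2 + 3.0 * G**2 + 2.0 * G * np.sqrt(a * F**2 + G**2)) / 4.0

        sol = solve_ivp(lorenz84, (0.0, 500.0), [1.0, 1.0, 1.0], max_step=0.01)
        r2 = (sol.y ** 2).sum(axis=0)
        print(f"max of x^2+y^2+z^2 on the trajectory = {r2.max():.1f}   vs   eta^2 = {eta2:.1f}")
        print(f"max of x on the trajectory = {sol.y[0].max():.2f}   vs   bound x <= F = {F}")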

  13. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  14. Application of the probability-based covering algorithm model in text classification

    Institute of Scientific and Technical Information of China (English)

    ZHOU; Ying

    2009-01-01

    The probability-based covering algorithm (PBCA) is a new algorithm based on probability distribution. It decides, by voting, the class of the tested samples on the border of the coverage area, based on the probability of training samples. When using the original covering algorithm (CA), many tested samples that are located on the border of the coverage cannot be classified by the spherical neighborhood gained. The network structure of PBCA is a mixed structure composed of both a feed-forward network and a feedback network. By using this method of adding some heterogeneous samples and enlarging the coverage radius, it is possible to decrease the number of rejected samples and improve the rate of recognition accuracy. Relevant computer experiments indicate that the algorithm improves the study precision and achieves reasonably good results in text classification.

  15. Use of ELVIS II platform for random process modelling and analysis of its probability density function

    Science.gov (United States)

    Maslennikova, Yu. S.; Nugmanov, I. S.

    2016-08-01

    The problem of probability density function estimation for a random process is one of the most common in practice. There are several methods to solve this problem. The presented laboratory work uses methods of mathematical statistics to detect patterns in the realization of a random process. On the basis of ergodic theory, we construct an algorithm for estimating the univariate probability density function of a random process. Correlation analysis of realizations is applied to estimate the necessary sample size and observation time. Hypothesis testing for two probability distributions (normal and Cauchy) is applied to the experimental data using the χ2 criterion. To facilitate understanding and clarity of the problem solved, we use the ELVIS II platform and the LabVIEW software package, which allow us to make the necessary calculations, display the results of the experiment and, most importantly, control the experiment. At the same time students are introduced to the LabVIEW software package and its capabilities.
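
    A compact sketch of the same workflow in NumPy/SciPy instead of LabVIEW on the ELVIS II platform: estimate the density with a histogram and compare fitted normal and Cauchy hypotheses with a χ2 statistic; the data are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        sample = rng.normal(loc=0.0, scale=1.0, size=5000)   # synthetic realization of the process

        # Histogram-based estimate of the univariate probability density function.
        counts, edges = np.histogram(sample, bins=30)

        def chi_square_stat(dist):
            """Chi-square statistic of the observed bin counts against a fitted distribution."""
            params = dist.fit(sample)
            expected = len(sample) * np.diff(dist.cdf(edges, *params))
            mask = expected > 5                       # common rule of thumb for the chi^2 test
            return ((counts[mask] - expected[mask]) ** 2 / expected[mask]).sum()

        print(f"chi^2 statistic vs fitted normal distribution: {chi_square_stat(stats.norm):.1f}")
        print(f"chi^2 statistic vs fitted Cauchy distribution: {chi_square_stat(stats.cauchy):.1f}")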

  16. An analytical model for evaluating outage and handover probability of cellular wireless networks

    CERN Document Server

    Decreusefond, Laurent; Vu, Than-Tung

    2010-01-01

    We consider stochastic cellular networks where base station locations form a homogeneous Poisson point process and each mobile is attached to the base station that provides the best mean signal power. The mobile is in outage if the SINR falls below some threshold. The handover decision has to be made if the mobile is in outage for several time slots. The outage probability and the handover probability are evaluated taking into account the effects of path loss, shadowing, Rayleigh fast fading, the frequency reuse factor and conventional beamforming. The main assumption is that the Rayleigh fast fading changes in each time slot while the other network components remain static during the period of study.
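
    A simplified Monte Carlo sketch of the outage probability for a mobile at the origin in a Poisson field of base stations with power-law path loss and Rayleigh fading; shadowing, frequency reuse and beamforming from the paper are omitted, and all parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)

        def outage_probability(lam=1e-5, radius=5000.0, alpha=3.5, sinr_db=0.0,
                               noise=1e-13, n_trials=5000):
            """Fraction of trials in which the SINR at the origin falls below the threshold.
            Base stations form a homogeneous PPP of intensity lam (per m^2) in a disc, the
            mobile attaches to the strongest mean-power (nearest) station, and every link
            has unit-mean Rayleigh (exponential) power fading."""
            threshold = 10.0 ** (sinr_db / 10.0)
            outages = 0
            for _ in range(n_trials):
                n_bs = rng.poisson(lam * np.pi * radius**2)
                if n_bs == 0:
                    outages += 1
                    continue
                r = radius * np.sqrt(rng.random(n_bs))      # uniform points in the disc
                fading = rng.exponential(1.0, n_bs)         # Rayleigh power fading
                power = fading * r ** (-alpha)              # received powers
                serving = np.argmin(r)                      # best mean power = nearest BS
                interference = power.sum() - power[serving]
                sinr = power[serving] / (interference + noise)
                outages += sinr < threshold
            return outages / n_trials

        print(f"estimated outage probability: {outage_probability():.3f}")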

  17. A new level set model for cell image segmentation

    Science.gov (United States)

    Ma, Jing-Feng; Hou, Kai; Bao, Shang-Lian; Chen, Chun

    2011-02-01

    In this paper we first determine three phases of cell images: background, cytoplasm and nucleolus, according to the general physical characteristics of cell images, and then develop a variational model, based on these characteristics, to segment the nucleolus and cytoplasm from their relatively complicated backgrounds. In addition, the information obtained by preprocessing the cell images with the OTSU algorithm is used to initialize the level set function in the model, which can speed up the segmentation and gives satisfactory results in cell image processing.

  18. A new level set model for cell image segmentation

    Institute of Scientific and Technical Information of China (English)

    Ma Jing-Feng; Hou Kai; Bao Shang-Lian; Chen Chun

    2011-01-01

    In this paper we first determine three phases of cell images: background, cytoplasm and nucleolus, according to the general physical characteristics of cell images, and then develop a variational model, based on these characteristics, to segment the nucleolus and cytoplasm from their relatively complicated backgrounds. In addition, the information obtained by preprocessing the cell images with the OTSU algorithm is used to initialize the level set function in the model, which can speed up the segmentation and gives satisfactory results in cell image processing.

  19. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  20. The bilateral trade model in a discrete setting

    NARCIS (Netherlands)

    Flesch, J.; Schröder, M.J.W.; Vermeulen, A.J.

    2013-01-01

    We consider a bilateral trade model in which both players have a finite number of possible valuations. The seller's valuation and the buyer's valuation for the object are private information, but the independent beliefs about these valuations are common knowledge. In this setting, we provide a

  1. Modelling fruit set, fruit growth and dry matter partitioning

    NARCIS (Netherlands)

    Marcelis, L.F.M.; Heuvelink, E.

    1999-01-01

    This paper discusses how fruit set, fruit growth and dry matter partitioning can be simulated by models where sink strength (assimilate demand) and source strength (assimilate supply) are the key variables. Although examples are derived from experiments on fruit vegetables such as tomato, sweet pepp

  2. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance and this in turn requires access to high quality experimental data. Three sets of accurate emp...

  3. A preference-based multiple-source rough set model

    NARCIS (Netherlands)

    M.A. Khan; M. Banerjee

    2010-01-01

    We propose a generalization of Pawlak’s rough set model for the multi-agent situation, where information from an agent can be preferred over that of another agent of the system while deciding membership of objects. Notions of lower/upper approximations are given which depend on the knowledge base of

  4. Numerical renormalization group study of probability distributions for local fluctuations in the Anderson-Holstein and Holstein-Hubbard models.

    Science.gov (United States)

    Hewson, Alex C; Bauer, Johannes

    2010-03-24

    We show that information on the probability density of local fluctuations can be obtained from a numerical renormalization group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density ρ(x) for the displacement x of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation as a function of the coupling strength. The method is extended to the infinite dimensional Holstein-Hubbard model using dynamical mean field theory. We use this approach to compare the probability densities for the displacement of the local oscillator in the normal, antiferromagnetic and charge ordered phases.

  5. A fuzzy set preference model for market share analysis

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share

  6. Random forest models for the probable biological condition of streams and rivers in the USA

    Science.gov (United States)

    The National Rivers and Streams Assessment (NRSA) is a probability based survey conducted by the US Environmental Protection Agency and its state and tribal partners. It provides information on the ecological condition of the rivers and streams in the conterminous USA, and the ex...

  7. A Computational Model of Word Segmentation from Continuous Speech Using Transitional Probabilities of Atomic Acoustic Events

    Science.gov (United States)

    Rasanen, Okko

    2011-01-01

    Word segmentation from continuous speech is a difficult task that is faced by human infants when they start to learn their native language. Several studies indicate that infants might use several different cues to solve this problem, including intonation, linguistic stress, and transitional probabilities between subsequent speech sounds. In this…

  8. Twenty-four hour predictions of the solar wind speed peaks by the probability distribution function model

    Science.gov (United States)

    Bussy-Virat, C. D.; Ridley, A. J.

    2016-10-01

    Abrupt transitions from slow to fast solar wind represent a concern for the space weather forecasting community. They may cause geomagnetic storms that can eventually affect systems in orbit and on the ground. Therefore, the probability distribution function (PDF) model was improved to predict enhancements in the solar wind speed. New probability distribution functions allow for the prediction of the peak amplitude and the time to the peak while providing an interval of uncertainty on the prediction. It was found that 60% of the positive predictions were correct, while 91% of the negative predictions were correct, and 20% to 33% of the peaks in the speed were found by the model. This represents a considerable improvement upon the first version of the PDF model. A direct comparison with the Wang-Sheeley-Arge model shows that the PDF model is quite similar, except that it leads to fewer false positive predictions and misses fewer events, especially when the peak reaches very high speeds.
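
    A small sketch of the verification metrics quoted above (fraction of positive and negative predictions that are correct, and fraction of observed peaks found), computed from an invented set of event/prediction pairs.

        import numpy as np

        # Hypothetical event/prediction pairs (1 = speed-enhancement peak, 0 = no peak);
        # the values are invented purely to illustrate how the quoted metrics are computed.
        observed  = np.array([1, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 0])
        predicted = np.array([1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0])

        tp = np.sum((predicted == 1) & (observed == 1))
        fp = np.sum((predicted == 1) & (observed == 0))
        tn = np.sum((predicted == 0) & (observed == 0))
        fn = np.sum((predicted == 0) & (observed == 1))

        print(f"fraction of positive predictions that are correct: {tp / (tp + fp):.2f}")
        print(f"fraction of negative predictions that are correct: {tn / (tn + fn):.2f}")
        print(f"fraction of observed peaks found by the model:     {tp / (tp + fn):.2f}")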

  9. Developing a Mathematical Model for Scheduling and Determining Success Probability of Research Projects Considering Complex-Fuzzy Networks

    Directory of Open Access Journals (Sweden)

    Gholamreza Norouzi

    2015-01-01

    Full Text Available In project management context, time management is one of the most important factors affecting project success. This paper proposes a new method to solve research project scheduling problems (RPSP) containing Fuzzy Graphical Evaluation and Review Technique (FGERT) networks. Through the deliverables of this method, a proper estimation of project completion time (PCT) and success probability can be achieved. So algorithms were developed to cover all features of the problem based on three main parameters “duration, occurrence probability, and success probability.” These developed algorithms were known as PR-FGERT (Parallel and Reversible-Fuzzy GERT networks). The main provided framework includes simplifying the network of project and taking regular steps to determine PCT and success probability. Simplifications include (1) equivalent making of parallel and series branches in fuzzy network considering the concepts of probabilistic nodes, (2) equivalent making of delay or reversible-to-itself branches and impact of changing the parameters of time and probability based on removing related branches, (3) equivalent making of simple and complex loops, and (4) an algorithm that was provided to resolve no-loop fuzzy network, after equivalent making. Finally, the performance of models was compared with existing methods. The results showed proper and real performance of models in comparison with existing methods.

  10. Producing a Set of Models for the Iron Homeostasis Network

    Directory of Open Access Journals (Sweden)

    Nicolas Mobilia

    2013-08-01

    Full Text Available This paper presents a method for modeling biological systems which combines formal techniques on intervals, numerical simulations and satisfaction of Signal Temporal Logic (STL) formulas. The main modeling challenge addressed by this approach is the large uncertainty in the values of the parameters due to the experimental difficulties of getting accurate biological data. This method considers intervals for each parameter and a formal description of the expected behavior of the model. In a first step, it produces reduced intervals of possible parameter values. Then, by performing a systematic search in these intervals, it defines sets of parameter values used in the next step. This procedure aims at finding a sub-space where the model robustly behaves as expected. We apply this method to the modeling of the cellular iron homeostasis network in erythroid progenitors. The produced model describes explicitly the regulation mechanism which acts at the translational level.

  11. A mesoscopic network model for permanent set in crosslinked elastomers

    Energy Technology Data Exchange (ETDEWEB)

    Weisgraber, T H; Gee, R H; Maiti, A; Clague, D S; Chinn, S; Maxwell, R S

    2009-01-29

    A mesoscopic computational model for polymer networks and composites is developed as a coarse-grained representation of the composite microstructure. Unlike more complex molecular dynamics simulations, the model only considers the effects of crosslinks on mechanical behavior. The elastic modulus, which depends only on the crosslink density and parameters in the bond potential, is consistent with rubber elasticity theory, and the network response satisfies the independent network hypothesis of Tobolsky. The model, when applied to a commercial filled silicone elastomer, quantitatively reproduces the experimental permanent set and stress-strain response due to changes in the crosslinked network from irradiation.

  12. Modelling spatial vagueness based on type-2 fuzzy set

    Institute of Scientific and Technical Information of China (English)

    DU Guo-ning; ZHU Zhong-ying

    2006-01-01

    The modelling and formal characterization of spatial vagueness plays an increasingly important role in the implementation of Geographic Information System (GIS). The concepts involved in spatial objects of GIS have been investigated and acknowledged as being vague and ambiguous. Models and methods which describe and handle fuzzy or vague (rather than crisp or determinate) spatial objects will be more necessary in GIS. This paper proposes a new method for modelling spatial vagueness based on type-2 fuzzy sets, which is distinguished from the traditional type-1 fuzzy methods and more suitable for describing and implementing the vague concepts and objects in GIS.

  13. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  14. A probability model for enterotoxin production of Bacillus cereus as a function of pH and temperature.

    Science.gov (United States)

    Ding, Tian; Wang, Jun; Park, Myoung-Su; Hwang, Cheng-An; Oh, Deog-Hwan

    2013-02-01

    Bacillus cereus is frequently isolated from a variety of foods, including vegetables, dairy products, meats, and other raw and processed foods. The bacterium is capable of producing an enterotoxin and emetic toxin that can cause severe nausea, vomiting, and diarrhea. The objectives of this study were to assess and model the probability of enterotoxin production of B. cereus in a broth model as affected by the broth pH and storage temperature. A three-strain mixture of B. cereus was inoculated in tryptic soy broth adjusted to pH 5.0, 6.0, 7.2, 8.0, and 8.5, and the samples were stored at 15, 20, 25, 30, and 35°C for 24 h. A total of 25 combinations of pH and temperature, each with 10 samples, were tested. The presence of enterotoxin in broth was assayed using a commercial test kit. The probabilities of positive enterotoxin production in 25 treatments were fitted with a logistic regression to develop a probability model to describe the probability of toxin production as a function of pH and temperature. The resulting model showed that the probabilities of enterotoxin production of B. cereus in broth increased as the temperature increased and/or as the broth pH approached 7.0. The model described the experimental data satisfactorily and identified the boundary of pH and temperature for the production of enterotoxin. The model could provide information for assessing the food poisoning risk associated with enterotoxins of B. cereus and for the selection of product pH and storage temperature for foods to reduce the hazards associated with B. cereus.
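
    A hedged sketch of fitting this kind of logistic probability model on a pH x temperature design with 10 samples per condition; the binary toxin outcomes below are simulated from an assumed response surface, so the fitted coefficients are purely illustrative and not the paper's.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)

        # Full factorial design: 5 pH levels x 5 storage temperatures, 10 broth samples each.
        pH   = np.array([5.0, 6.0, 7.2, 8.0, 8.5])
        temp = np.array([15.0, 20.0, 25.0, 30.0, 35.0])
        grid = np.array([(p, t) for p in pH for t in temp])

        # Assumed "true" surface: toxin production more likely near pH 7 and at higher temperature.
        true_p = 1.0 / (1.0 + np.exp(-(-6.0 - 1.5 * (grid[:, 0] - 7.0) ** 2 + 0.3 * grid[:, 1])))
        X = np.repeat(grid, 10, axis=0)
        y = rng.random(len(X)) < np.repeat(true_p, 10)

        # Logistic model for the probability of enterotoxin production, quadratic in pH.
        features = np.column_stack([X[:, 0], (X[:, 0] - 7.0) ** 2, X[:, 1]])
        model = LogisticRegression(max_iter=1000).fit(features, y)

        query = np.array([[7.2, (7.2 - 7.0) ** 2, 30.0]])
        print(f"estimated P(toxin | pH 7.2, 30 C) ~= {model.predict_proba(query)[0, 1]:.2f}")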

  15. External validation of a normal tissue complication probability model for radiation-induced hypothyroidism in an independent cohort

    DEFF Research Database (Denmark)

    Rønjom, Marianne F; Brink, Carsten; Bentzen, Søren M

    2015-01-01

    BACKGROUND: A normal tissue complication probability (NTCP) model for radiation-induced hypothyroidism (RIHT) was previously derived in patients with squamous cell carcinoma of the head and neck (HNSCC) discerning thyroid volume (Vthyroid), mean thyroid dose (Dmean), and latency as predictive...

  16. Path Loss, Shadow Fading, and Line-Of-Sight Probability Models for 5G Urban Macro-Cellular Scenarios

    DEFF Research Database (Denmark)

    Sun, Shu; Thomas, Timothy; Rappaport, Theodore S.

    2015-01-01

    This paper presents key parameters including the line-of-sight (LOS) probability, large-scale path loss, and shadow fading models for the design of future fifth generation (5G) wireless communication systems in urban macro-cellular (UMa) scenarios, using the data obtained from propagation measure...

  17. Modeling the effect of water activity, pH, and temperature on the probability of enterotoxin production by Staphylococcus aureus

    Science.gov (United States)

    Staphylococcus aureus is a foodborne pathogen widespread in the environment and found in various food products. This pathogen can produce enterotoxins that cause illnesses in humans. The objectives of this study were to develop a probability model of S. aureus enterotoxin production as affected by w...

  18. Optimal selection for BRCA1 and BRCA2 mutation testing using a combination of 'easy to apply' probability models.

    NARCIS (Netherlands)

    Bodmer, D.; Ligtenberg, M.J.L.; Hout, A.H. van der; Gloudemans, S.; Ansink, K.; Oosterwijk-Wakka, J.C.; Hoogerbrugge-van der Linden, N.

    2006-01-01

    To establish an efficient, reliable and easy to apply risk assessment tool to select families with breast and/or ovarian cancer patients for BRCA mutation testing, using available probability models. In a retrospective study of 263 families with breast and/or ovarian cancer patients, the utility of

  19. Optimal selection for BRCA1 and BRCA2 mutation testing using a combination of ' easy to apply ' probability models

    NARCIS (Netherlands)

    Bodmer, D.; Ligtenberg, M. J. L.; van der Hout, A. H.; Gloudemans, S.; Ansink, K.; Oosterwijk, J. C.; Hoogerbrugge, N.

    2006-01-01

    To establish an efficient, reliable and easy to apply risk assessment tool to select families with breast and/or ovarian cancer patients for BRCA mutation testing, using available probability models. In a retrospective study of 263 families with breast and/or ovarian cancer patients, the utility of

  20. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  1. Managing Information Uncertainty in Wave Height Modeling for the Offshore Structural Analysis through Random Set

    Directory of Open Access Journals (Sweden)

    Keqin Yan

    2017-01-01

    Full Text Available This chapter presents a reliability study for an offshore jacket structure with emphasis on the features of nonconventional modeling. Firstly, a random set model is formulated for modeling the random waves in an ocean site. Then, a jacket structure is investigated in a pushover analysis to identify the critical wave direction and key structural elements. This is based on the ultimate base shear strength. The selected probabilistic models are adopted for the important structural members and the wave direction is specified in the weakest direction of the structure for a conservative safety analysis. The wave height model is processed in a P-box format when it is used in the numerical analysis. The models are applied to find the bounds of the failure probabilities for the jacket structure. The propagation of this wave model to the uncertainty in results is investigated in both an interval analysis and Monte Carlo simulation. The results are compared in context of information content and numerical accuracy. Further, the failure probability bounds are compared with the conventional probabilistic approach.
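
    A schematic sketch of how interval-valued (P-box style) input uncertainty brackets a failure probability: a Weibull wave-height model whose scale parameter is only known to lie in an interval is pushed through a simple capacity check, and the two interval endpoints bound the result. All numbers are illustrative and unrelated to the jacket structure studied.

        import numpy as np

        # Weibull wave-height model with an interval-valued scale parameter (illustrative;
        # the study builds the random set from site data and a pushover analysis).
        shape = 1.8
        scale_interval = (2.5, 3.5)      # metres
        capacity = 12.0                  # wave height at which the ultimate base shear is reached
        n_waves = 1000                   # number of waves in the reference storm

        def failure_probability(scale):
            """P(max of n_waves i.i.d. Weibull wave heights exceeds the capacity)."""
            p_single = np.exp(-(capacity / scale) ** shape)      # Weibull survival function
            return 1.0 - (1.0 - p_single) ** n_waves

        # The failure probability is monotone in the scale parameter, so evaluating the two
        # interval endpoints brackets the result, as in an interval analysis.
        p_low, p_high = (failure_probability(s) for s in scale_interval)
        print(f"failure probability bounds: [{p_low:.2e}, {p_high:.2e}]")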

  2. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  3. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  4. Conceptual and Statistical Issues Regarding the Probability of Default and Modeling Default Risk

    Directory of Open Access Journals (Sweden)

    Emilia TITAN

    2011-03-01

    Full Text Available In today’s rapidly evolving financial markets, risk management offers different techniques in order to implement an efficient system against market risk. Probability of default (PD) is an essential part of business intelligence and customer relation management systems in financial institutions. Recent studies indicate that underestimating this important component, and also the loss given default (LGD), might threaten the stability and smooth running of the financial markets. From the perspective of risk management, the result of predictive accuracy of the estimated probability of default is more valuable than the standard binary classification: credible or non-credible clients. The Basle II Accord recognizes the methods of reducing credit risk and also PD and LGD as important components of the advanced Internal Rating Based (IRB) approach.

  5. Dynamic Proportional Reinsurance and Approximations for Ruin Probabilities in the Two-Dimensional Compound Poisson Risk Model

    Directory of Open Access Journals (Sweden)

    Yan Li

    2012-01-01

    Full Text Available We consider dynamic proportional reinsurance in a two-dimensional compound Poisson risk model. The optimization is in the sense of minimizing the ruin probability, defined as the probability that the sum of the subportfolios is ruined. Via the Hamilton-Jacobi-Bellman approach we find a candidate for the optimal value function and prove the verification theorem. In addition, we obtain the Lundberg bounds and the Cramér-Lundberg approximation for the ruin probability and show that as the capital tends to infinity, the optimal strategies converge to the asymptotically optimal constant strategies. The asymptotic value can be found by maximizing the adjustment coefficient.
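
    As an illustration of the quantity being optimized, the following minimal Monte Carlo sketch (not the paper's Hamilton-Jacobi-Bellman solution) estimates the finite-horizon ruin probability of the aggregated surplus of two compound Poisson subportfolios under a fixed proportional-reinsurance retention; all parameter values are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    def ruin_probability(u=10.0, b=0.7, lam=(1.0, 0.5), claim_mean=(1.0, 2.0),
                         premium_rate=3.0, horizon=50.0, n_paths=5000):
        """Estimate P(ruin before `horizon`) for the aggregated surplus
        U(t) = u + b*premium_rate*t - b*(S1(t) + S2(t)), where S1, S2 are
        compound Poisson with exponential claims and b is the retained
        proportion under proportional reinsurance (simplified premium rule)."""
        total_rate = lam[0] + lam[1]
        p1 = lam[0] / total_rate                      # chance a claim comes from line 1
        ruined = 0
        for _ in range(n_paths):
            t, surplus = 0.0, u
            while True:
                dt = rng.exponential(1.0 / total_rate)
                if t + dt > horizon:
                    break
                t += dt
                surplus += b * premium_rate * dt      # premium income over the interval
                mean = claim_mean[0] if rng.random() < p1 else claim_mean[1]
                surplus -= b * rng.exponential(mean)  # retained part of the claim
                if surplus < 0:
                    ruined += 1
                    break
        return ruined / n_paths

    print(ruin_probability())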

  6. A mechanical model for predicting the probability of osteoporotic hip fractures based in DXA measurements and finite element simulation

    Directory of Open Access Journals (Sweden)

    López Enrique

    2012-11-01

    Full Text Available Abstract Background Osteoporotic hip fractures represent a major cause of disability, loss of quality of life and even mortality among the elderly population. Decisions on drug therapy are based on the assessment of risk factors for fracture, from BMD measurements. The combination of biomechanical models with clinical studies could better estimate bone strength and support the specialists in their decision. Methods A model to assess the probability of fracture, based on Damage and Fracture Mechanics, has been developed, evaluating the mechanical magnitudes involved in the fracture process from clinical BMD measurements. The model is intended for simulating the degenerative process in the skeleton, with the consequent loss of bone mass and hence the decrease of its mechanical resistance, which enables fracture due to different traumatisms. Clinical studies were chosen, both in non-treatment conditions and receiving drug therapy, and fitted to specific patients according to their actual BMD measures. The predictive model is applied in an FE simulation of the proximal femur. The fracture zone would be determined according to the loading scenario (sideways fall, impact, accidental loads, etc.), using the mechanical properties of bone obtained from the evolutionary model corresponding to the considered time. Results BMD evolution in untreated patients and in those under different treatments was analyzed. Evolutionary curves of fracture probability were obtained from the evolution of mechanical damage. The evolutionary curve of the untreated group of patients presented a marked increase of the fracture probability, while the curves of patients under drug treatment showed variable decreased risks, depending on the therapy type. Conclusion The FE model allowed us to obtain detailed maps of damage and fracture probability, identifying high-risk local zones at the femoral neck and intertrochanteric and subtrochanteric areas, which are the typical locations of

  7. A probability model for evaluating the bias and precision of influenza vaccine effectiveness estimates from case-control studies.

    Science.gov (United States)

    Haber, M; An, Q; Foppa, I M; Shay, D K; Ferdinands, J M; Orenstein, W A

    2015-05-01

    As influenza vaccination is now widely recommended, randomized clinical trials are no longer ethical in many populations. Therefore, observational studies on patients seeking medical care for acute respiratory illnesses (ARIs) are a popular option for estimating influenza vaccine effectiveness (VE). We developed a probability model for evaluating and comparing bias and precision of estimates of VE against symptomatic influenza from two commonly used case-control study designs: the test-negative design and the traditional case-control design. We show that when vaccination does not affect the probability of developing non-influenza ARI then VE estimates from test-negative design studies are unbiased even if vaccinees and non-vaccinees have different probabilities of seeking medical care against ARI, as long as the ratio of these probabilities is the same for illnesses resulting from influenza and non-influenza infections. Our numerical results suggest that in general, estimates from the test-negative design have smaller bias compared to estimates from the traditional case-control design as long as the probability of non-influenza ARI is similar among vaccinated and unvaccinated individuals. We did not find consistent differences between the standard errors of the estimates from the two study designs.
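
    A small numerical sketch of the kind of calculation such a probability model supports (all probabilities below are invented for illustration, not taken from the paper): expected test-negative-design cell counts yield an unbiased VE estimate when the care-seeking ratio is the same for influenza and non-influenza ARI.

    # Illustrative sketch (assumed parameter values, not the authors' full model):
    # expected test-negative-design cell counts and the resulting VE estimate.
    def tnd_ve(ve_true=0.6, p_flu=0.10, p_nonflu_vax=0.20, p_nonflu_unvax=0.20,
               vax_coverage=0.45, care_flu_vax=0.5, care_flu_unvax=0.4,
               care_nonflu_vax=0.5, care_nonflu_unvax=0.4, n=1_000_000):
        """VE estimate = 1 - odds ratio of vaccination among influenza-positive
        versus influenza-negative medically attended ARI patients."""
        nv, nu = n * vax_coverage, n * (1 - vax_coverage)
        # medically attended influenza ARI
        a = nv * p_flu * (1 - ve_true) * care_flu_vax      # vaccinated, flu-positive
        c = nu * p_flu * care_flu_unvax                    # unvaccinated, flu-positive
        # medically attended non-influenza ARI
        b = nv * p_nonflu_vax * care_nonflu_vax            # vaccinated, flu-negative
        d = nu * p_nonflu_unvax * care_nonflu_unvax        # unvaccinated, flu-negative
        return 1 - (a / b) / (c / d)

    print(round(tnd_ve(), 3))   # recovers 0.6 when care-seeking ratios match across causes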

  8. Setting development goals using stochastic dynamical system models

    Science.gov (United States)

    Nicolis, Stamatios C.; Bali Swain, Ranjula; Sumpter, David J. T.

    2017-01-01

    The Millennium Development Goals (MDG) programme was an ambitious attempt to encourage a globalised solution to important but often-overlooked development problems. The programme led to wide-ranging development but it has also been criticised for unrealistic and arbitrary targets. In this paper, we show how country-specific development targets can be set using stochastic, dynamical system models built from historical data. In particular, we show that the MDG target of two-thirds reduction of child mortality from 1990 levels was infeasible for most countries, especially in sub-Saharan Africa. At the same time, the MDG targets were not ambitious enough for fast-developing countries such as Brazil and China. We suggest that model-based setting of country-specific targets is essential for the success of global development programmes such as the Sustainable Development Goals (SDG). This approach should provide clear, quantifiable targets for policymakers. PMID:28241057

  9. Affective Computing Model for the Set Pair Users on Twitter

    Directory of Open Access Journals (Sweden)

    Chunying Zhang

    2013-01-01

    Full Text Available Affective computing is the computation of sentiment, how sentiment is generated, and the aspects that affect sentiment. However, different factors often make users' sentiment expression uncertain. Today Twitter, as a real-time and timely information medium, has become a better vehicle for users to express their own sentiment. Therefore, in view of the diverse forms in which Twitter information expresses sentiment, this paper constructs an affective computing model based on set pair theory, starting from the differences in how tweets are constituted, to analyze and compute user sentiment. It analyzes the positive, negative and uncertain emotions of users for a single tweet from multiple angles, including text, emoticons and picture information, consolidates the weights of the various parts of the emotional information, and builds a hierarchical set pair affective computing model for Twitter users, in order to offer more useful data support for relevant departments and businesses.

  10. Setting development goals using stochastic dynamical system models.

    Science.gov (United States)

    Ranganathan, Shyam; Nicolis, Stamatios C; Bali Swain, Ranjula; Sumpter, David J T

    2017-01-01

    The Millennium Development Goals (MDG) programme was an ambitious attempt to encourage a globalised solution to important but often-overlooked development problems. The programme led to wide-ranging development but it has also been criticised for unrealistic and arbitrary targets. In this paper, we show how country-specific development targets can be set using stochastic, dynamical system models built from historical data. In particular, we show that the MDG target of two-thirds reduction of child mortality from 1990 levels was infeasible for most countries, especially in sub-Saharan Africa. At the same time, the MDG targets were not ambitious enough for fast-developing countries such as Brazil and China. We suggest that model-based setting of country-specific targets is essential for the success of global development programmes such as the Sustainable Development Goals (SDG). This approach should provide clear, quantifiable targets for policymakers.

  11. Mathematical Modelling with Fuzzy Sets of Sustainable Tourism Development

    OpenAIRE

    Nenad Stojanović

    2011-01-01

    In the first part of the study we introduce fuzzy sets that correspond to comparative indicators for measuring sustainable development of tourism. In the second part of the study it is shown, on the base of model created, how one can determine the value of sustainable tourism development in protected areas based on the following established groups of indicators: to assess the economic status, to assess the impact of tourism on the social component, to assess the impact of tourism on cultural ...

  12. Comparison of the Mortality Probability Admission Model III, National Quality Forum, and Acute Physiology and Chronic Health Evaluation IV hospital mortality models: implications for national benchmarking*.

    Science.gov (United States)

    Kramer, Andrew A; Higgins, Thomas L; Zimmerman, Jack E

    2014-03-01

    To examine the accuracy of the original Mortality Probability Admission Model III, ICU Outcomes Model/National Quality Forum modification of Mortality Probability Admission Model III, and Acute Physiology and Chronic Health Evaluation IVa models for comparing observed and risk-adjusted hospital mortality predictions. Retrospective paired analyses of day 1 hospital mortality predictions using three prognostic models. Fifty-five ICUs at 38 U.S. hospitals from January 2008 to December 2012. Among 174,001 intensive care admissions, 109,926 met model inclusion criteria and 55,304 had data for mortality prediction using all three models. None. We compared patient exclusions and the discrimination, calibration, and accuracy for each model. Acute Physiology and Chronic Health Evaluation IVa excluded 10.7% of all patients, ICU Outcomes Model/National Quality Forum 20.1%, and Mortality Probability Admission Model III 24.1%. Discrimination of Acute Physiology and Chronic Health Evaluation IVa was superior with area under receiver operating curve (0.88) compared with Mortality Probability Admission Model III (0.81) and ICU Outcomes Model/National Quality Forum (0.80). Acute Physiology and Chronic Health Evaluation IVa was better calibrated (lowest Hosmer-Lemeshow statistic). The accuracy of Acute Physiology and Chronic Health Evaluation IVa was superior (adjusted Brier score = 31.0%) to that for Mortality Probability Admission Model III (16.1%) and ICU Outcomes Model/National Quality Forum (17.8%). Compared with observed mortality, Acute Physiology and Chronic Health Evaluation IVa overpredicted mortality by 1.5% and Mortality Probability Admission Model III by 3.1%; ICU Outcomes Model/National Quality Forum underpredicted mortality by 1.2%. Calibration curves showed that Acute Physiology and Chronic Health Evaluation performed well over the entire risk range, unlike the Mortality Probability Admission Model and ICU Outcomes Model/National Quality Forum models. Acute
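
    For readers unfamiliar with the three kinds of comparison reported here, the sketch below computes discrimination (area under the ROC curve), a Hosmer-Lemeshow-style decile calibration table, and the Brier score on synthetic predictions; it does not use the study's data or its adjusted Brier definition.

    import numpy as np
    from sklearn.metrics import roc_auc_score, brier_score_loss

    rng = np.random.default_rng(3)
    p_model = rng.beta(2, 8, size=5000)                 # predicted hospital mortality
    died = rng.random(5000) < p_model                   # outcomes consistent with the model

    print("AUC   :", round(roc_auc_score(died, p_model), 3))
    print("Brier :", round(brier_score_loss(died, p_model), 3))

    # Hosmer-Lemeshow-style comparison of observed vs predicted mortality by risk decile
    deciles = np.quantile(p_model, np.linspace(0, 1, 11))
    bins = np.clip(np.digitize(p_model, deciles[1:-1]), 0, 9)
    for b in range(10):
        m = bins == b
        print(f"decile {b}: predicted {p_model[m].mean():.3f}  observed {died[m].mean():.3f}")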

  13. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  14. Grading the probabilities of credit default risk for Malaysian listed companies by using the KMV-Merton model

    Science.gov (United States)

    Anuwar, Muhammad Hafidz; Jaffar, Maheran Mohd

    2017-08-01

    This paper provides an overview of the assessment of credit risk specific to banks. In finance, risk is a term that reflects the potential for financial loss. The risk of default on a loan may increase when a company does not make a payment on that loan when the time comes. Hence, this framework analyses the KMV-Merton model to estimate the probabilities of default for Malaysian listed companies. In this way, banks can verify the ability of companies to meet their loan commitments in order to avoid bad investments and financial losses. The model has been applied to all Malaysian listed companies in Bursa Malaysia to estimate their credit default probabilities, and the results are compared with the ratings given by the rating agency RAM Holdings Berhad as a check against reality. The significance of this study is that a credit risk grade for Malaysian listed companies is proposed using the KMV-Merton model.
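
    The core of the KMV-Merton calculation is the distance to default mapped through the standard normal CDF; the sketch below shows this textbook form with hypothetical inputs (the paper's calibration of asset value and volatility from equity data is not reproduced).

    # A minimal sketch of the Merton-style default probability behind the
    # KMV-Merton approach (inputs are hypothetical, not from the paper's data).
    from math import log, sqrt
    from scipy.stats import norm

    def merton_pd(asset_value, debt, mu, sigma, horizon=1.0):
        """Probability that asset value falls below the debt level at `horizon`,
        assuming geometric Brownian motion for the firm's assets."""
        dd = (log(asset_value / debt) + (mu - 0.5 * sigma**2) * horizon) / (sigma * sqrt(horizon))
        return norm.cdf(-dd)          # probability of default

    print(merton_pd(asset_value=120.0, debt=100.0, mu=0.08, sigma=0.25))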

  15. Reachable set modeling and engagement analysis of exoatmospheric interceptor

    Institute of Scientific and Technical Information of China (English)

    Chai Hua; Liang Yangang; Chen Lei; Tang Guojin

    2014-01-01

    A novel reachable set (RS) model is developed within a framework of exoatmospheric interceptor engagement analysis. The boost phase steering scheme and trajectory distortion mechanism of the interceptor are firstly explored. A mathematical model of the distorted RS is then formulated through a dimension–reduction analysis. By treating the outer boundary of the RS on the sphere surface as a spherical convex hull, two relevant theorems are proposed and the RS envelope is depicted by the computational geometry theory. Based on the RS model, the algorithms of intercept window analysis and launch parameters determination are proposed, and numerical simulations are carried out for interceptors with different energy or launch points. Results show that the proposed method can avoid intensive on-line computation and provide an accurate and effective approach for interceptor engagement analysis. The suggested RS model also serves as a ready reference to other related problems such as interceptor effectiveness evaluation and platform disposition.

  16. Reachable set modeling and engagement analysis of exoatmospheric interceptor

    Directory of Open Access Journals (Sweden)

    Chai Hua

    2014-12-01

    Full Text Available A novel reachable set (RS) model is developed within a framework of exoatmospheric interceptor engagement analysis. The boost phase steering scheme and trajectory distortion mechanism of the interceptor are firstly explored. A mathematical model of the distorted RS is then formulated through a dimension–reduction analysis. By treating the outer boundary of the RS on the sphere surface as a spherical convex hull, two relevant theorems are proposed and the RS envelope is depicted by the computational geometry theory. Based on the RS model, the algorithms of intercept window analysis and launch parameters determination are proposed, and numerical simulations are carried out for interceptors with different energy or launch points. Results show that the proposed method can avoid intensive on-line computation and provide an accurate and effective approach for interceptor engagement analysis. The suggested RS model also serves as a ready reference to other related problems such as interceptor effectiveness evaluation and platform disposition.

  17. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to the bivariate dust storm definition, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
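
    A hedged sketch of a copula-based joint return period for the "both variables exceeded" case; the Gumbel copula, its parameter, and the event frequency below are assumptions for illustration, not the distribution fitted in the study.

    import numpy as np

    def gumbel_copula(u, v, theta):
        return np.exp(-((-np.log(u))**theta + (-np.log(v))**theta)**(1.0 / theta))

    def joint_return_period_and(u, v, theta, events_per_year=4.0):
        """Return period (years) of events exceeding both marginal quantiles u, v."""
        p_exceed_both = 1.0 - u - v + gumbel_copula(u, v, theta)
        return 1.0 / (events_per_year * p_exceed_both)

    # e.g. the 0.95 quantiles of maximum wind speed and storm duration
    print(joint_return_period_and(0.95, 0.95, theta=2.0))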

  18. Statistical model for degraded DNA samples and adjusted probabilities for allelic drop-out

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2012-01-01

    DNA samples found at a scene of crime or obtained from the debris of a mass disaster accident are often subject to degradation. When using the STR DNA technology, the DNA profile is observed via a so-called electropherogram (EPG), where the alleles are identified as signal peaks above a certain... In this paper, we present a method for measuring the degree of degradation of a sample and demonstrate how to incorporate this in estimating the probability of allelic drop-out. This is done by extending an existing method derived for non-degraded samples. The performance of the methodology is evaluated using...

  19. Taylor–Socolar Hexagonal Tilings as Model Sets

    Directory of Open Access Journals (Sweden)

    Jeong-Yup Lee

    2012-12-01

    Full Text Available The Taylor–Socolar tilings are regular hexagonal tilings of the plane but are distinguished in being comprised of hexagons of two colors in an aperiodic way. We place the Taylor–Socolar tilings into an algebraic setting, which allows one to see them directly as model sets and to understand the corresponding tiling hull along with its generic and singular parts. Although the tilings were originally obtained by matching rules and by substitution, our approach sets the tilings into the framework of a cut and project scheme and studies how the tilings relate to the corresponding internal space. The centers of the entire set of tiles of one tiling form a lattice Q in the plane. If X_Q denotes the set of all Taylor–Socolar tilings with centers on Q, then X_Q forms a natural hull under the standard local topology of hulls and is a dynamical system for the action of Q. The Q-adic completion Q̄ of Q is a natural factor of X_Q and the natural mapping X_Q → Q̄ is bijective except at a dense set of points of measure 0 in Q̄. We show that X_Q consists of three LI classes under translation. Two of these LI classes are very small, namely countable Q-orbits in X_Q. The other is a minimal dynamical system, which maps surjectively to Q̄ and which is variously 2 : 1, 6 : 1, and 12 : 1 at the singular points. We further develop the formula of what determines the parity of the tiles of a tiling in terms of the coordinates of its tile centers. Finally we show that the hull of the parity tilings can be identified with the hull X_Q; more precisely the two hulls are mutually locally derivable.

  20. A Hot Spots Ignition Probability Model for Low-Velocity Impacted Explosive Particles Based on the Particle Size and Distribution

    Directory of Open Access Journals (Sweden)

    Hong-fu Guo

    2017-01-01

    Full Text Available Particle size and distribution play an important role in ignition. The size and distribution of the cyclotetramethylene tetranitramine (HMX particles were investigated by Laser Particle Size Analyzer Malvern MS2000 before experiment and calculation. The mean size of particles is 161 μm. Minimum and maximum sizes are 80 μm and 263 μm, respectively. The distribution function is like a quadratic function. Based on the distribution of micron scale explosive particles, a microscopic model is established to describe the process of ignition of HMX particles under drop weight. Both temperature of contact zones and ignition probability of powder explosive can be predicted. The calculated results show that the temperature of the contact zones between the particles and the drop weight surface increases faster and higher than that of the contact zones between two neighboring particles. For HMX particles, with all other conditions being kept constant, if the drop height is less than 0.1 m, ignition probability will be close to 0. When the drop heights are 0.2 m and 0.3 m, the ignition probability is 0.27 and 0.64, respectively, whereas when the drop height is more than 0.4 m, ignition probability will be close to 0.82. In comparison with experimental results, the two curves are reasonably close to each other, which indicates our model has a certain degree of rationality.

  1. Measuring and Modeling Fault Density for Plume-Fault Encounter Probability Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, P.D.; Oldenburg, C.M.; Nicot, J.-P.

    2011-05-15

    Emission of carbon dioxide from fossil-fueled power generation stations contributes to global climate change. Storage of this carbon dioxide within the pores of geologic strata (geologic carbon storage) is one approach to mitigating the climate change that would otherwise occur. The large storage volume needed for this mitigation requires injection into brine-filled pore space in reservoir strata overlain by cap rocks. One of the main concerns of storage in such rocks is leakage via faults. In the early stages of site selection, site-specific fault coverages are often not available. This necessitates a method for using available fault data to develop an estimate of the likelihood of injected carbon dioxide encountering and migrating up a fault, primarily due to buoyancy. Fault population statistics provide one of the main inputs to calculate the encounter probability. Previous fault population statistics work is shown to be applicable to areal fault density statistics. This result is applied to a case study in the southern portion of the San Joaquin Basin, with the result that a carbon dioxide plume from a previously planned injection had a 3% chance of encountering a fully seal-offsetting fault.

  2. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...

  3. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  4. A systematic approach to selecting the best probability models for annual maximum rainfalls - A case study using data in Ontario (Canada)

    Science.gov (United States)

    Nguyen, Truong-Huy; El Outayek, Sarah; Lim, Sun Hee; Nguyen, Van-Thanh-Van

    2017-10-01

    Many probability distributions have been developed to model the annual maximum rainfall series (AMS). However, there is no general agreement as to which distribution should be used due to the lack of a suitable evaluation method. This paper hence presents a general procedure for assessing systematically the performance of ten commonly used probability distributions in rainfall frequency analyses based on their descriptive as well as predictive abilities. This assessment procedure relies on an extensive set of graphical and numerical performance criteria to identify the most suitable models that could provide the most accurate and most robust extreme rainfall estimates. The proposed systematic assessment approach has been shown to be more efficient and more robust than the traditional model selection method based on only limited goodness-of-fit criteria. To test the feasibility of the proposed procedure, an illustrative application was carried out using 5-min, 1-h, and 24-h annual maximum rainfall data from a network of 21 raingages located in the Ontario region in Canada. Results have indicated that the GEV, GNO, and PE3 models were the best models for describing the distribution of daily and sub-daily annual maximum rainfalls in this region. The GEV distribution, however, was preferred to the GNO and PE3 because it was based on a more solid theoretical basis for representing the distribution of extreme random variables.
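
    A brief sketch of the kind of candidate-model comparison described above (synthetic annual maxima, plain maximum likelihood and AIC only); the paper's full set of descriptive and predictive criteria is not reproduced here.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # synthetic annual-maximum rainfall series (mm), 60 "years"
    ams = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=12.0, size=60, random_state=rng)

    candidates = {"GEV": stats.genextreme, "Gumbel": stats.gumbel_r, "Normal": stats.norm}
    for name, dist in candidates.items():
        params = dist.fit(ams)                       # maximum-likelihood fit
        aic = 2 * len(params) - 2 * np.sum(dist.logpdf(ams, *params))
        q100 = dist.ppf(1 - 1.0 / 100, *params)      # 100-year quantile
        print(f"{name:7s} AIC={aic:7.1f}  100-yr estimate={q100:6.1f} mm")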

  5. A NEW DEFORMABLE MODEL USING LEVEL SETS FOR SHAPE SEGMENTATION

    Institute of Scientific and Technical Information of China (English)

    He Ning; Zhang Peng; Lu Ke

    2009-01-01

    In this paper, we present a new deformable model for shape segmentation, which makes two modifications to the original level set implementation of deformable models. The modifications are motivated by difficulties that we have encountered in applying deformable models to segmentation of medical images. The level set algorithm has some advantages over the classical snake deformable models. However, it could develop large gaps in the boundary and holes within the objects. Such boundary gaps and holes of objects can cause inaccurate segmentation that requires manual correction. The proposed method in this paper possesses an inherent property to detect gaps and holes within the object with a single initial contour and also does not require specific initialization. The first modification is to replace the edge detector by some area constraint, and the second modification utilizes weighted length constraint to regularize the curve under evolution. The proposed method has been applied to both synthetic and real images with promising results.

  6. OL-DEC-MDP Model for Multiagent Online Scheduling with a Time-Dependent Probability of Success

    Directory of Open Access Journals (Sweden)

    Cheng Zhu

    2014-01-01

    Full Text Available Focusing on the on-line multiagent scheduling problem, this paper considers the time-dependent probability of success and processing duration and proposes an OL-DEC-MDP (opportunity loss-decentralized Markov Decision Processes) model to include opportunity loss in scheduling decisions to improve overall performance. The success probability of job processing as well as the process duration is dependent on the time at which the processing is started. The probability of completing the assigned job by an agent would be higher when the process is started earlier, but the opportunity loss could also be high due to the longer engaging duration. As a result, the OL-DEC-MDP model introduces a reward function considering the opportunity loss, which is estimated based on the prediction of the upcoming jobs by a sampling method on the job arrivals. Heuristic strategies are introduced in computing the best starting time for an incoming job by each agent, and an incoming job will always be scheduled to the agent with the highest reward among all agents with their best starting policies. The simulation experiments show that the OL-DEC-MDP model will improve the overall scheduling performance compared with models not considering opportunity loss in a heavy-loading environment.

  7. Probability of Inconsistencies in Theory Revision: A multi-agent model for updating logically interconnected beliefs under bounded confidence

    CERN Document Server

    Wenmackers, Sylvia; Douven, Igor

    2014-01-01

    We present a model for studying communities of epistemically interacting agents who update their belief states by averaging (in a specified way) the belief states of other agents in the community. The agents in our model have a rich belief state, involving multiple independent issues which are interrelated in such a way that they form a theory of the world. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating (in the given way). To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. It is shown that, under the assumptions of our model, an agent always has a probability of less than 2% of ending up in an inconsistent belief state. Moreover, this probability can be made arbitrarily small by increasing the number of independent issues the agents have to judge or by increasing the group size. A real-world situation to which this model applies is a group of experts participating in a Delp...

  8. A Graph Model for calculating the probability for a moving cyclic disturbance interacting at a particular spatial position

    CERN Document Server

    Brown, D

    2003-01-01

    The analysis follows an earlier paper - Brown (2003) - which analysed a moving disturbance using a directed cyclic graph defined as Interrelated Fluctuating Entities (IFEs) of /STATE/, /SPACE/, /alphaTIME/, /betaTIME/. This paper provides a statistical analysis of the alternative positions in space and state of an IFE for a defined total time magnitude. The probability for a freely moving entity interacting in a particular spatial position is calculated and a formulation is derived for the minimum locus of uncertainty in position and momentum. The model has proven amenable to computer modelling (the assistance of University College London Computer Science department is gratefully acknowledged). A computer model is available on request.

  9. Improving Ranking Using Quantum Probability

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of false alarm and the same parameter estimation data. As quantum probability provided more effective detectors than classical probability within domains other than data management, we conjecture that a system that can implement subspace-based detectors will be more effective than a system which implements set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.

  10. Increasing the Probability of Fault Detection in Non Perfect Inspection Model of Delay Time Analysis with Compromise on Inspection Time

    Directory of Open Access Journals (Sweden)

    Babu Chellappachetty

    2014-09-01

    Full Text Available This study separates the real inspection content (soft portion) within the total maintenance inspection activity and attempts to repeat it some additional number of times during actual inspection. The effect of repeating the soft portion on inspection related time, fault detection probability and the consequence variable of down time per unit time is analyzed. Statistical tests show that the inspection time and the probability of fault detection have nearly the same rate of influence on the consequence variable, though in opposite directions. A factor ω is introduced to account for the proportion of the soft portion over the maintenance inspection time. As the number of repetitions of the soft portion is increased for a given value of ω, it is found from analysis that the new set of inspection time and probability of fault detection improves downtime per unit time until an optimum number of repetitions is reached. Improvement is better as the value of ω is on the lower side. The practitioner should take this possibility of a repeatable soft portion of the maintenance inspection time into account while estimating these two input parameters when employing the delay time methodology as a preventive maintenance strategy.

  11. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  12. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...

  13. Setting time limits on tests

    NARCIS (Netherlands)

    Linden, van der Wim J.

    2011-01-01

    It is shown how the time limit on a test can be set to control the probability of a test taker running out of time before completing it. The probability is derived from the item parameters in the lognormal model for response times. Examples of curves representing the probability of running out of ti
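
    A rough numerical sketch of the idea (hypothetical item parameters, not calibrated ones): simulate total test time under a lognormal response-time model and pick the limit as the quantile that caps the probability of running out of time at 5%.

    import numpy as np

    rng = np.random.default_rng(2)
    alpha = np.full(40, 1.5)          # item time-discrimination (precision), assumed
    beta = rng.normal(4.0, 0.3, 40)   # item time intensity, log-seconds, assumed
    tau = 0.0                         # speed of a typical test taker

    # log T_i ~ Normal(beta_i - tau, 1/alpha_i); total time = sum of item times
    sims = np.exp(rng.normal(beta - tau, 1.0 / alpha, size=(100_000, 40))).sum(axis=1)
    limit = np.quantile(sims, 0.95)
    print(f"time limit ≈ {limit/60:.1f} minutes; "
          f"P(run out) ≈ {np.mean(sims > limit):.3f}")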

  14. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    Science.gov (United States)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominate the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and in defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies) where the true nature of slope probability distribution was not discernible at baselines smaller than a kilometer[2,3,5]. Accordingly, historical modeling of lunar surface slopes probability distributions for applications such as in scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution[1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon[7,8], slopes in lunar craters can now be obtained at baselines as low as 6-meters allowing unprecedented multi-scale (multiple baselines) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines[9]. In this work, we extend the analysis from a probability distribution modeling point-of-view with NAC DEMs to characterize the slope statistics for the floors and walls for the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was

  15. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  16. Preference Mining Using Neighborhood Rough Set Model on Two Universes

    Science.gov (United States)

    2016-01-01

    Preference mining plays an important role in e-commerce and video websites for enhancing user satisfaction and loyalty. Some classical methods are not available for the cold-start problem when the user or the item is new. In this paper, we propose a new model, called parametric neighborhood rough set on two universes (NRSTU), to describe the user and item data structures. Furthermore, the neighborhood lower approximation operator is used for defining the preference rules. Then, we provide the means for recommending items to users by using these rules. Finally, we give an experimental example to show the details of NRSTU-based preference mining for cold-start problem. The parameters of the model are also discussed. The experimental results show that the proposed method presents an effective solution for preference mining. In particular, NRSTU improves the recommendation accuracy by about 19% compared to the traditional method. PMID:28044074

  17. Preference Mining Using Neighborhood Rough Set Model on Two Universes.

    Science.gov (United States)

    Zeng, Kai

    2016-01-01

    Preference mining plays an important role in e-commerce and video websites for enhancing user satisfaction and loyalty. Some classical methods are not available for the cold-start problem when the user or the item is new. In this paper, we propose a new model, called parametric neighborhood rough set on two universes (NRSTU), to describe the user and item data structures. Furthermore, the neighborhood lower approximation operator is used for defining the preference rules. Then, we provide the means for recommending items to users by using these rules. Finally, we give an experimental example to show the details of NRSTU-based preference mining for cold-start problem. The parameters of the model are also discussed. The experimental results show that the proposed method presents an effective solution for preference mining. In particular, NRSTU improves the recommendation accuracy by about 19% compared to the traditional method.

  18. Modeling cellular deformations using the level set formalism

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2008-07-01

    Full Text Available Abstract Background Many cellular processes involve substantial shape changes. Traditional simulations of these cell shape changes require that grids and boundaries be moved as the cell's shape evolves. Here we demonstrate that accurate cell shape changes can be recreated using level set methods (LSM, in which the cellular shape is defined implicitly, thereby eschewing the need for updating boundaries. Results We obtain a viscoelastic model of Dictyostelium cells using micropipette aspiration and show how this viscoelastic model can be incorporated into LSM simulations to recreate the observed protrusion of cells into the micropipette faithfully. We also demonstrate the use of our techniques by simulating the cell shape changes elicited by the chemotactic response to an external chemoattractant gradient. Conclusion Our results provide a simple but effective means of incorporating cellular deformations into mathematical simulations of cell signaling. Such methods will be useful for simulating important cellular events such as chemotaxis and cytokinesis.

  19. A semi-empirical model to predict the probability of capture of buoyant particles by a cylindrical collector through capillarity

    Science.gov (United States)

    Peruzzo, Paolo; Pietro Viero, Daniele; Defina, Andrea

    2016-11-01

    The seeds of many aquatic plants, as well as many propagulae and larvae, are buoyant and transported at the water surface. These particles are therefore subject to surface tension, which may enhance their capture by emergent vegetation through capillary attraction. In this work, we develop a semi-empirical model that predicts the probability that a floating particle is retained by plant stems and branches piercing the water surface, due to capillarity, against the drag force exerted by the flowing water. Specific laboratory experiments are also performed to calibrate and validate the model.

  20. Developing an Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note however that due to contact resistance electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.
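
    Purely as an illustration of fitting a shorting-probability-versus-voltage curve (the data points below are made up, not the experiment's measurements), one could fit a logistic model as follows.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(v, v50, k):
        # probability of contact-resistance breakdown (short) at voltage v
        return 1.0 / (1.0 + np.exp(-k * (v - v50)))

    voltage = np.array([1, 2, 5, 8, 12, 15, 20, 25], dtype=float)
    shorted = np.array([0.02, 0.05, 0.20, 0.45, 0.70, 0.85, 0.95, 0.98])  # hypothetical fractions

    (v50, k), _ = curve_fit(logistic, voltage, shorted, p0=[10.0, 0.5])
    print(f"estimated 50% shorting voltage ≈ {v50:.1f} V, slope ≈ {k:.2f}")
    print(f"P(short | 9 V) ≈ {logistic(9.0, v50, k):.2f}")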

  1. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    Science.gov (United States)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials--including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software

  2. PROBABILITY MODEL AND SOLUTION ON EARTHQUAKE EFFECTS COMBINATION IN ALONG WIND RESISTANT DESIGN OF TALL-FLEXIBLE BUILDINGS

    Institute of Scientific and Technical Information of China (English)

    HONG Xiao-jian; GU Ming

    2006-01-01

    A model for the combination of earthquake effects in the wind resistant design of high-rise flexible structures is proposed in accordance with the probability method. Based on the Turkstra criteria, the stochastic characteristics of wind velocity, earthquake ground acceleration and the occurrence probability of the excitations are taken into account, and the combination of earthquake effects in structural wind resistant design is then analyzed with a convolution approach. The results indicate that for tall flexible buildings whose lateral force is governed by wind loading, the maximum lateral load verification for wind resistant design combined with earthquake effects may be more unfavorable than that for earthquake resistant design involving wind effects.

  3. Model-based prognostics for batteries which estimates useful life and uses a probability density function

    Data.gov (United States)

    National Aeronautics and Space Administration — This invention develops a mathematical model to describe battery behavior during individual discharge cycles as well as over its cycle life. The basis for the form...

  4. DEVELOPMENT PROBABILITY-LINGUISTIC MODELS VULNERABILITY ASSESSMENT OF AVIATION SECURITY IMPORTANT TECHNICAL FACILITIES

    National Research Council Canada - National Science Library

    2016-01-01

    ... are justified, and the problem of assessing the protected object's vulnerability is formulated. The main advantage of the developed model is the extensive opportunities it provides for formalizing diverse information on the security status of the object...

  5. Model-based prognostics for batteries which estimates useful life and uses a probability density function

    Science.gov (United States)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor)

    2012-01-01

    This invention develops a mathematical model to describe battery behavior during individual discharge cycles as well as over its cycle life. The basis for the form of the model has been linked to the internal processes of the battery and validated using experimental data. Effects of temperature and load current have also been incorporated into the model. Subsequently, the model has been used in a Particle Filtering framework to make predictions of remaining useful life for individual discharge cycles as well as for cycle life. The prediction performance was found to be satisfactory as measured by performance metrics customized for prognostics for a sample case. The work presented here provides initial steps towards a comprehensive health management solution for energy storage devices.

  6. Detection of the optic disc in fundus images by combining probability models.

    Science.gov (United States)

    Harangi, Balazs; Hajdu, Andras

    2015-10-01

    In this paper, we propose a combination method for the automatic detection of the optic disc (OD) in fundus images based on ensembles of individual algorithms. We have studied and adapted some of the state-of-the-art OD detectors and finally organized them into a complex framework in order to maximize the accuracy of the localization of the OD. The detection of the OD can be considered as a single-object detection problem. This object can be localized with high accuracy by several algorithms extracting single candidates for the center of the OD and the final location can be defined using a single majority voting rule. To include more information to support the final decision, we can use member algorithms providing more candidates which can be ranked based on the confidence ordered by the algorithms. In this case, a spatial weighted graph is defined where the candidates are considered as its nodes, and the final OD position is determined in terms of finding a maximum-weighted clique. Now, we examine how to apply in our ensemble-based framework all the accessible information supplied by the member algorithms by making them return confidence values for each image pixel. These confidence values inform us about the probability that a given pixel is the center point of the object. We apply axiomatic and Bayesian approaches, as in the case of aggregation of judgments of experts in decision and risk analysis, to combine these confidence values. According to our experimental study, the accuracy of the localization of OD increases further. Besides single localization, this approach can be adapted for the precise detection of the boundary of the OD. Comparative experimental results are also given for several publicly available datasets.

  7. Nonparametric estimation of transition probabilities in the non-Markov illness-death model: A comparative study.

    Science.gov (United States)

    de Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2015-06-01

    Multi-state models are often used for modeling complex event history data. In these models the estimation of the transition probabilities is of particular interest, since they allow for long-term predictions of the process. These quantities have been traditionally estimated by the Aalen-Johansen estimator, which is consistent if the process is Markov. Several non-Markov estimators have been proposed in the recent literature, and their superiority with respect to the Aalen-Johansen estimator has been proved in situations in which the Markov condition is strongly violated. However, the existing estimators have the drawback of requiring that the support of the censoring distribution contains the support of the lifetime distribution, which is not often the case. In this article, we propose two new methods for estimating the transition probabilities in the progressive illness-death model. Some asymptotic results are derived. The proposed estimators are consistent regardless the Markov condition and the referred assumption about the censoring support. We explore the finite sample behavior of the estimators through simulations. The main conclusion of this piece of research is that the proposed estimators are much more efficient than the existing non-Markov estimators in most cases. An application to a clinical trial on colon cancer is included. Extensions to progressive processes beyond the three-state illness-death model are discussed.
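
    To make the target quantity concrete, the toy sketch below estimates the healthy-to-dead transition probability by simple empirical proportions on complete, uncensored simulated data; handling censoring and non-Markov behaviour, which is the article's actual contribution, is not attempted here.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 20_000
    t_illness = rng.exponential(5.0, n)                 # time healthy -> ill
    t_death_healthy = rng.exponential(10.0, n)          # time healthy -> dead
    ill = t_illness < t_death_healthy
    t_ill = np.where(ill, t_illness, np.inf)
    t_death = np.where(ill, t_illness + rng.exponential(3.0, n), t_death_healthy)

    def p13(s, t):
        """P(dead at time t | healthy at time s), by empirical proportion."""
        at_risk = (t_ill > s) & (t_death > s)           # still healthy at time s
        return np.mean(t_death[at_risk] <= t)

    print(f"P(dead at t=8 | healthy at s=2) ≈ {p13(2.0, 8.0):.3f}")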

  8. A study of quantum mechanical probabilities in the classical Hodgkin-Huxley model.

    Science.gov (United States)

    Moradi, N; Scholkmann, F; Salari, V

    2015-03-01

    The Hodgkin-Huxley (HH) model is a powerful model to explain different aspects of spike generation in excitable cells. However, the HH model was proposed in 1952 when the real structure of the ion channel was unknown. It is now common knowledge that in many ion-channel proteins the flow of ions through the pore is governed by a gate, comprising a so-called "selectivity filter" inside the ion channel, which can be controlled by electrical interactions. The selectivity filter (SF) is believed to be responsible for the selection and fast conduction of particular ions across the membrane of an excitable cell. Other (generally larger) parts of the molecule such as the pore-domain gate control the access of ions to the channel protein. In fact, two types of gates are considered here for ion channels: the "external gate", which is the voltage sensitive gate, and the "internal gate" which is the selectivity filter gate (SFG). Some quantum effects are expected in the SFG due to its small dimensions, which may play an important role in the operation of an ion channel. Here, we examine parameters in a generalized model of HH to see whether any parameter affects the spike generation. Our results indicate that the previously suggested semi-quantum-classical equation proposed by Bernroider and Summhammer (BS) agrees strongly with the HH equation under different conditions and may even provide a better explanation in some cases. We conclude that the BS model can refine the classical HH model substantially.
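
    For reference, a compact sketch of the classical HH equations that the generalized model builds on (standard textbook parameters; the quantum-corrected Bernroider-Summhammer variant discussed in the paper is not implemented here).

    import numpy as np

    def hh_spike(i_ext=10.0, dt=0.01, t_max=50.0):
        """Forward-Euler integration of the classical Hodgkin-Huxley equations."""
        gna, gk, gl = 120.0, 36.0, 0.3          # maximal conductances, mS/cm^2
        ena, ek, el = 50.0, -77.0, -54.387      # reversal potentials, mV
        v, m, h, n = -65.0, 0.05, 0.6, 0.32     # resting state
        trace = []
        for _ in range(int(t_max / dt)):
            am = 0.1 * (v + 40) / (1 - np.exp(-(v + 40) / 10))
            bm = 4.0 * np.exp(-(v + 65) / 18)
            ah = 0.07 * np.exp(-(v + 65) / 20)
            bh = 1.0 / (1 + np.exp(-(v + 35) / 10))
            an = 0.01 * (v + 55) / (1 - np.exp(-(v + 55) / 10))
            bn = 0.125 * np.exp(-(v + 65) / 80)
            m += dt * (am * (1 - m) - bm * m)
            h += dt * (ah * (1 - h) - bh * h)
            n += dt * (an * (1 - n) - bn * n)
            i_ion = gna * m**3 * h * (v - ena) + gk * n**4 * (v - ek) + gl * (v - el)
            v += dt * (i_ext - i_ion)            # membrane capacitance C = 1 uF/cm^2
            trace.append(v)
        return np.array(trace)

    print(f"peak membrane potential: {hh_spike().max():.1f} mV")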

  9. Information behavior versus communication: application models in multidisciplinary settings

    Directory of Open Access Journals (Sweden)

    Cecília Morena Maria da Silva

    2015-05-01

    Full Text Available This paper deals with information behavior as support for models of communication design in the areas of Information Science, Librarianship and Music. The proposed communication models are based on the models of Tubbs and Moss (2003) and Garvey and Griffith (1972), as adapted by Hurd (1996) and Wilson (1999). The questions that arose are: (i) what informational skills are required of librarians who act as mediators in the scholarly communication process, and what is the informational behavior of users in the educational environment?; (ii) what are the needs of music-related researchers, and how do they produce, seek, use and access the scientific knowledge of their area?; and (iii) how do the contexts involved in scientific collaboration processes influence the scientific production of the information science field in Brazil? The article includes a literature review on information behavior and its place in scholarly communication, considering the influence of the context and/or situation of the objects involved in the motivating issues. The hypothesis is that user information behavior in different contexts and situations influences the definition of a scientific communication model. Finally, it is concluded that the same concept or set of concepts can be used from different perspectives, thus reaching different results.

  10. A Rough Set Bounded Spatially Constrained Asymmetric Gaussian Mixture Model for Image Segmentation.

    Science.gov (United States)

    Ji, Zexuan; Huang, Yubo; Sun, Quansen; Cao, Guo; Zheng, Yuhui

    2017-01-01

    Accurate image segmentation is an important issue in image processing, where Gaussian mixture models play an important part and have been proven effective. However, most Gaussian mixture model (GMM) based methods suffer from one or more limitations, such as limited noise robustness, over-smoothed segmentations, and a lack of flexibility to fit the data. In order to address these issues, in this paper we propose a rough set bounded asymmetric Gaussian mixture model with spatial constraint for image segmentation. First, based on our previous work, where each cluster is characterized by three automatically determined rough-fuzzy regions, we partition the target image into three rough regions with two adaptively computed thresholds. Second, a new bounded indicator function is proposed to determine the bounded support regions of the observed data. The bounded indicator and the posterior probability of a pixel belonging to each sub-region are estimated with respect to the rough region where the pixel lies. Third, to further reduce over-smoothing in the segmentations, two novel prior factors are proposed that incorporate the spatial information among neighborhood pixels; they are constructed from the prior and posterior probabilities of the within- and between-clusters and take the spatial direction into account. We compare our algorithm to state-of-the-art segmentation approaches on both synthetic and real images to demonstrate the superior performance of the proposed algorithm.

  11. KFUPM-KAUST Red Sea model: Digital viscoelastic depth model and synthetic seismic data set

    KAUST Repository

    Al-Shuhail, Abdullatif A.

    2017-06-01

    The Red Sea is geologically interesting due to its unique structures and abundant mineral and petroleum resources, yet no digital geologic models or synthetic seismic data of the Red Sea are publicly available for testing algorithms to image and analyze the area's interesting features. This study compiles a 2D viscoelastic model of the Red Sea and calculates a corresponding multicomponent synthetic seismic data set. The models and data sets are made publicly available for download. We hope this effort will encourage interested researchers to test their processing algorithms on this data set and model and share their results publicly as well.

  12. Birth-death models and coalescent point processes: the shape and probability of reconstructed phylogenies.

    Science.gov (United States)

    Lambert, Amaury; Stadler, Tanja

    2013-12-01

    Forward-in-time models of diversification (i.e., speciation and extinction) produce phylogenetic trees that grow "vertically" as time goes by. Pruning the extinct lineages out of such trees leads to natural models for reconstructed trees (i.e., phylogenies of extant species). Alternatively, reconstructed trees can be modelled by coalescent point processes (CPPs), where trees grow "horizontally" by the sequential addition of vertical edges. Each new edge starts at some random speciation time and ends at the present time; speciation times are drawn from the same distribution independently. CPPs lead to extremely fast computation of tree likelihoods and simulation of reconstructed trees. Their topology always follows the uniform distribution on ranked tree shapes (URT). We characterize which forward-in-time models lead to URT reconstructed trees and among these, which lead to CPP reconstructed trees. We show that for any "asymmetric" diversification model in which speciation rates only depend on time and extinction rates only depend on time and on a non-heritable trait (e.g., age), the reconstructed tree is CPP, even if extant species are incompletely sampled. If rates additionally depend on the number of species, the reconstructed tree is (only) URT (but not CPP). We characterize the common distribution of speciation times in the CPP description, and discuss incomplete species sampling as well as three special model cases in detail: (1) the extinction rate does not depend on a trait; (2) rates do not depend on time; (3) mass extinctions may happen additionally at certain points in the past. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Argumentation, Dialogue Theory, and Probability Modeling: Alternative Frameworks for Argumentation Research in Education

    Science.gov (United States)

    Nussbaum, E. Michael

    2011-01-01

    Toulmin's model of argumentation, developed in 1958, has guided much argumentation research in education. However, argumentation theory in philosophy and cognitive science has advanced considerably since 1958. There are currently several alternative frameworks of argumentation that can be useful for both research and practice in education. These…

  14. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    Science.gov (United States)

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  15. Probability density function shape sensitivity in the statistical modeling of turbulent particle dispersion

    Science.gov (United States)

    Litchford, Ron J.; Jeng, San-Mou

    1992-01-01

    The performance of a recently introduced statistical transport model for turbulent particle dispersion is studied here for rigid particles injected into a round turbulent jet. Both uniform and isosceles triangle pdfs are used. The statistical sensitivity to parcel pdf shape is demonstrated.

  17. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    Science.gov (United States)

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification for the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome.
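
    For orientation, the textbook linear birth-death special case (no diversification, no HGT) already shows how such extinction probabilities arise: with per-element gain rate λ and loss rate μ, the eventual extinction probability of a family is

```latex
% Classical linear birth-death extinction probability (textbook special case,
% not the full birth-death-diversification model analyzed in the paper).
q_1 =
\begin{cases}
  1,           & \lambda \le \mu,\\
  \mu/\lambda, & \lambda > \mu,
\end{cases}
\qquad
q_n = q_1^{\,n} \quad \text{for a family founded by } n \text{ independent elements.}
```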

  18. Modeling daily flowering probabilities: expected impact of climate change on Japanese cherry phenology.

    Science.gov (United States)

    Allen, Jenica M; Terres, Maria A; Katsuki, Toshio; Iwamoto, Kojiro; Kobori, Hiromi; Higuchi, Hiroyoshi; Primack, Richard B; Wilson, Adam M; Gelfand, Alan; Silander, John A

    2014-04-01

    Understanding the drivers of phenological events is vital for forecasting species' responses to climate change. We developed flexible Bayesian survival regression models to assess a 29-year, individual-level time series of flowering phenology from four taxa of Japanese cherry trees (Prunus spachiana, Prunus × yedoensis, Prunus jamasakura, and Prunus lannesiana), from the Tama Forest Cherry Preservation Garden in Hachioji, Japan. Our modeling framework used time-varying (chill and heat units) and time-invariant (slope, aspect, and elevation) factors. We found limited differences among taxa in sensitivity to chill, but earlier flowering taxa, such as P. spachiana, were more sensitive to heat than later flowering taxa, such as P. lannesiana. Using an ensemble of three downscaled regional climate models under the A1B emissions scenario, we projected shifts in flowering timing by 2100. Projections suggest that each taxon will flower about 30 days earlier on average by 2100, with 2-6 days greater uncertainty around the species mean flowering date. Dramatic shifts in the flowering times of cherry trees may have implications for economically important cultural festivals in Japan and East Asia. The survival models used here provide a mechanistic modeling approach and are broadly applicable to any time-to-event phenological data, such as plant leafing, bird arrival time, and insect emergence. The ability to explicitly quantify uncertainty, examine phenological responses on a fine time scale, and incorporate conditions leading up to an event may provide future insight into phenologically driven changes in carbon balance and ecological mismatches of plants and pollinators in natural populations and horticultural crops.

  19. Mathematical Modelling with Fuzzy Sets of Sustainable Tourism Development

    Directory of Open Access Journals (Sweden)

    Nenad Stojanović

    2011-10-01

    Full Text Available In the first part of the study we introduce fuzzy sets that correspond to comparative indicators for measuring the sustainable development of tourism. In the second part of the study it is shown, on the basis of the model created, how one can determine the value of sustainable tourism development in protected areas from the following established groups of indicators: indicators to assess the economic status, to assess the impact of tourism on the social component, to assess the impact of tourism on cultural identity, to assess the environmental conditions, and to assess tourist satisfaction, all using fuzzy logic. It is also shown how to test the confidence in the rules by which, according to experts, appropriate decisions can be made in order to protect the biodiversity of protected areas.

  20. Gray Cerebrovascular Image Skeleton Extraction Algorithm Using Level Set Model

    Directory of Open Access Journals (Sweden)

    Jian Wu

    2010-06-01

    Full Text Available The ambiguity and complexity of medical cerebrovascular images make the skeletons obtained by conventional skeleton algorithms discontinuous, sensitive at weak edges, poorly robust and prone to burrs. This paper proposes a cerebrovascular image skeleton extraction algorithm based on the level set model, using a Euclidean distance field and an improved gradient vector flow to obtain two different energy functions. The first energy function controls the acquisition of the topological nodes from which the skeleton curve starts. The second energy function controls the extraction of the skeleton surface. The algorithm avoids the locating and classifying of the skeleton connection points that guide the skeleton extraction. Because all of its parameters are obtained by analysis and reasoning, no manual intervention is needed.

  1. Modeling of Kidney Hemodynamics: Probability-Based Topology of an Arterial Network

    DEFF Research Database (Denmark)

    Postnov, Dmitry; Marsh, Donald; Postnov, D.E.;

    2016-01-01

    Through regulation of the extracellular fluid volume, the kidneys provide important long-term regulation of blood pressure. At the level of the individual functional unit (the nephron), pressure and flow control involves two different mechanisms that both produce oscillations. The nephrons...... are arranged in a complex branching structure that delivers blood to each nephron and, at the same time, provides a basis for an interaction between adjacent nephrons. The functional consequences of this interaction are not understood, and at present it is not possible to address this question experimentally......CT) data we develop an algorithm for generating the renal arterial network. We then introduce a mathematical model describing blood flow dynamics and nephron to nephron interaction in the network. The model includes an implementation of electrical signal propagation along a vascular wall. Simulation...

  2. AN EFFECTIVE MODEL TO EVALUATE BLOCKING PROBABILITY OF TIME-SLOTTED OPTICAL BURST SWITCHED NETWORKS

    Institute of Scientific and Technical Information of China (English)

    Yang Zongkai; Ou Liang; Tan Xiansi

    2006-01-01

    Time-slotted optical burst switched (TS-OBS) networks are a potential technique to support IP over Wavelength Division Multiplexing (WDM) by introducing Time Division Multiplexing (TDM) channels into Optical Burst Switching (OBS) technology. This paper presents a framework to evaluate the blocking performance of time-slotted OBS networks with multi-fiber wavelength channels. The proposed model is efficient not only for single-class traffic, such as individual circuit-switched or best-effort traffic, but also for mixed multi-class traffic. The effectiveness of the proposed model is validated by simulation results. The study shows that the blocking performance of a multi-fiber TS-OBS network is acceptable for future Internet services.
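
    As a baseline reference (not the multi-fiber, multi-class TS-OBS model proposed in the paper), the classical Erlang-B formula gives the blocking probability of a single link with m channels offered E erlangs of traffic; the standard numerically stable recursion is sketched below.

```python
def erlang_b(offered_load: float, channels: int) -> float:
    """Classical Erlang-B blocking probability via the standard recursion.

    Baseline reference only; not the multi-fiber TS-OBS model of the paper.
    """
    b = 1.0                      # B(E, 0) = 1
    for k in range(1, channels + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Example: 16 wavelength channels offered 10 erlangs of traffic.
print(f"blocking probability ~ {erlang_b(10.0, 16):.4f}")
```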

  3. A Comprehensive Propagation Prediction Model Comprising Microfacet Based Scattering and Probability Based Coverage Optimization Algorithm

    OpenAIRE

    A. S. M. Zahid Kausar; Ahmed Wasif Reza; Lau Chun Wo; Harikrishnan Ramiah

    2014-01-01

    Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part in designing wireless network. In this paper, an accelerated technique of three-dimensional ray tracing is presented, where rough surface scattering is included for making a more accurate ray tracing technique. Here, the rough surface scattering is represented...

  4. A Comprehensive Propagation Prediction Model Comprising Microfacet Based Scattering and Probability Based Coverage Optimization Algorithm

    OpenAIRE

    Kausar, A. S. M. Zahid; Reza, Ahmed Wasif; Wo, Lau Chun; Ramiah, Harikrishnan

    2014-01-01

    Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part in designing wireless network. In this paper, an accelerated technique of three-dimensional ray tracing is presented, where rough surface scattering is included for making a more accurate ray tracing technique. Here, the rough surface scattering is represented ...

  5. Effect of Data Model Approach In State Probability Analysis Of Multi-Level Queue Scheduling

    Directory of Open Access Journals (Sweden)

    Diwakar Shukla

    2010-07-01

    Full Text Available In the uniprocessor environment, the number of jobs arriving at the CPU at a given time can be very large, which causes a long waiting queue. When conflicts arise due to shared resources, instruction overlap or logical errors, a deadlock state appears in which further processing of jobs is blocked completely. As the scheduler jumps from one job to another in order to perform the processing, a transition mechanism appears. This paper presents a general structure of the transition scenario for the functioning of a CPU scheduler in the presence of a deadlock condition in a multilevel queue scheduling setup. A data-model-based Markov chain model is proposed to study the transition phenomenon, and a general class of scheduling schemes is designed. Some specific and well-known schemes are treated as particular cases and are compared within the setup of the model through a proposed deadlock-waiting index measure. A simulation study is performed to evaluate the comparative merits of specific schemes belonging to the designed class with the help of varying values of α and d.
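
    A minimal sketch of the kind of computation such a Markov chain model of the scheduler involves: given a transition matrix over scheduler states (the matrix below is hypothetical, not taken from the paper), the long-run state probabilities are the stationary distribution, obtainable by power iteration.

```python
import numpy as np

# Hypothetical 3-state scheduler chain (e.g. running, waiting, deadlocked);
# the transition matrix is illustrative only, not taken from the paper.
P = np.array([[0.80, 0.15, 0.05],
              [0.50, 0.40, 0.10],
              [0.30, 0.20, 0.50]])

pi = np.full(3, 1.0 / 3.0)      # start from the uniform distribution
for _ in range(1000):           # power iteration: pi <- pi @ P
    pi = pi @ P
print("long-run state probabilities:", np.round(pi, 4))
```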

  6. Accuracies of the empirical theories of the escape probability based on Eigen model and Braun model compared with the exact extension of Onsager theory.

    Science.gov (United States)

    Wojcik, Mariusz; Tachiya, M

    2009-03-14

    This paper deals with the exact extension of the original Onsager theory of the escape probability to the case of finite recombination rate at nonzero reaction radius. The empirical theories based on the Eigen model and the Braun model, which are applicable in the absence and presence of an external electric field, respectively, are based on a wrong assumption that both recombination and separation processes in geminate recombination follow exponential kinetics. The accuracies of the empirical theories are examined against the exact extension of the Onsager theory. The Eigen model gives the escape probability in the absence of an electric field, which is different by a factor of 3 from the exact one. We have shown that this difference can be removed by operationally redefining the volume occupied by the dissociating partner before dissociation, which appears in the Eigen model as a parameter. The Braun model gives the escape probability in the presence of an electric field, which is significantly different from the exact one over the whole range of electric fields. Appropriate modification of the original Braun model removes the discrepancy at zero or low electric fields, but it does not affect the discrepancy at high electric fields. In all the above theories it is assumed that recombination takes place only at the reaction radius. The escape probability in the case when recombination takes place over a range of distances is also calculated and compared with that in the case of recombination only at the reaction radius.
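
    For context, the zero-field Onsager escape probability referred to above, for an initial separation r_0 and instantaneous recombination at contact, has the textbook form below; the paper's contribution is the exact extension to a finite recombination rate at a nonzero reaction radius.

```latex
% Zero-field Onsager escape probability for initial separation r_0
% (textbook limit; the paper extends this to finite recombination rates).
\varphi(r_0) = \exp\!\left(-\frac{r_c}{r_0}\right),
\qquad
r_c = \frac{e^2}{4\pi \varepsilon \varepsilon_0 k_B T}.
```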

  7. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
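
    A minimal simulation sketch of the "probability theory plus noise" idea, with hypothetical parameters rather than the authors' fitted values: the conditional probability P(A|B) is estimated as a ratio of event counts over remembered instances, each of which is misread with some small probability d.

```python
import numpy as np

# "Probability theory plus noise" sketch: P(A|B) is estimated as a ratio of
# counts, with each stored instance misread with probability d.
# All parameters are hypothetical, not the authors' fitted values.
rng = np.random.default_rng(1)
pA_given_B, pA_given_notB, pB, d, n = 0.7, 0.3, 0.4, 0.1, 5000

B = rng.random(n) < pB
A = np.where(B, rng.random(n) < pA_given_B, rng.random(n) < pA_given_notB)

flip = lambda x: np.where(rng.random(n) < d, ~x, x)   # random read errors
A_noisy, B_noisy = flip(A), flip(B)

estimate = (A_noisy & B_noisy).sum() / B_noisy.sum()
print(f"true P(A|B) = {pA_given_B}, noisy frequency estimate = {estimate:.3f}")
```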

  8. Normal Tissue Complication Probability Modeling of Radiation-Induced Hypothyroidism After Head-and-Neck Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Bakhshandeh, Mohsen [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Hashemi, Bijan, E-mail: bhashemi@modares.ac.ir [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Mahdavi, Seied Rabi Mehdi [Department of Medical Physics, Faculty of Medical Sciences, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Nikoofar, Alireza; Vasheghani, Maryam [Department of Radiation Oncology, Hafte-Tir Hospital, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kazemnejad, Anoshirvan [Department of Biostatistics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of)

    2013-02-01

    Purpose: To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Methods and Materials: Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with α/β = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Results: Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D50 estimated from the models was approximately 44 Gy. Conclusions: The implemented
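
    A minimal sketch of the dose handling described above: physical doses are converted to 2 Gy/fraction equivalents (EQD2, α/β = 3 Gy) and a probit-type mean-dose NTCP is evaluated. D50 ≈ 44 Gy follows the abstract; the slope parameter m and the example DVH are placeholders, not fitted values.

```python
import numpy as np
from math import erf, sqrt

def eqd2(total_dose_gy, n_fractions, alpha_beta=3.0):
    """Convert a physical dose to its 2 Gy/fraction equivalent (LQ model)."""
    d = total_dose_gy / n_fractions          # dose per fraction
    return total_dose_gy * (d + alpha_beta) / (2.0 + alpha_beta)

def ntcp_mean_dose(mean_eqd2, d50=44.0, m=0.25):
    """Probit-type mean-dose NTCP; d50 from the abstract, m is a placeholder."""
    t = (mean_eqd2 - d50) / (m * d50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))  # standard normal CDF

# Hypothetical thyroid DVH: dose bins (Gy, delivered in 30 fractions) and the
# fraction of the gland receiving each dose.
doses = np.array([20.0, 35.0, 50.0, 60.0])
vols = np.array([0.10, 0.30, 0.40, 0.20])
mean_eqd2 = float(np.sum(vols * eqd2(doses, n_fractions=30)))
print(f"mean EQD2 = {mean_eqd2:.1f} Gy, NTCP ~ {ntcp_mean_dose(mean_eqd2):.2f}")
```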

  9. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  10. Minimal axiom group of similarity-based rough set model

    Institute of Scientific and Technical Information of China (English)

    DAI Jian-hua; PAN Yun-he

    2006-01-01

    Rough set axiomatization is one aspect of rough set study that characterizes rough set theory using dependable and minimal axiom groups. Thus, rough set theory can be studied by logic and axiom-system methods. The classical rough set theory is based on an equivalence relation, but rough set theory based on a similarity relation has wide applications in the real world. To characterize similarity-based rough set theory, an axiom group named S, consisting of 3 axioms, is proposed. The reliability of the axiom group, which shows that the characterization of rough set theory based on a similarity relation is rational, is proved. Simultaneously, the minimality of the axiom group, which requires that each axiom is an equation and is independent, is proved. The axiom group is helpful for studying rough set theory by logic and axiom-system methods.

  11. On New Cautious Structural Reliability Models in the Framework of imprecise Probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev V.; Kozine, Igor

    2010-01-01

    both aleatory (stochastic) and epistemic uncertainty and the flexibility with which information can be represented. The previous research of the authors related to generalizing structural reliability models to imprecise statistical measures is summarized in Utkin & Kozine (2002) and Utkin (2004...... the above mentioned inputs do not exist and the analyst has only some judgments or measurements (observations) of values of stress and strength. How to utilize this available information for computing the structural reliability and what to do if the number of judgments or measurements is very small...

  12. Transition probabilities and interacting boson-fermion model description of positive parity states in 117Sb

    Science.gov (United States)

    Lobach, Yu. N.; Bucurescu, D.

    1998-09-01

    The Doppler shift attenuation method was used to determine lifetimes in the picosecond region for excited states of 117Sb populated with the (α,2nγ) reaction at Eα=27.2 MeV. Interacting boson-fermion model calculations explain reasonably well the main features of the positive parity levels known up to about 2.5 MeV excitation. The mixing of the lowest one-quasiparticle 9/2+ state with the intruder (2p-1h) 9/2+ state, as well as the quadrupole deformation of the intruder band are also discussed.

  13. Exact valence bond entanglement entropy and probability distribution in the XXX spin chain and the Potts model.

    Science.gov (United States)

    Jacobsen, J L; Saleur, H

    2008-02-29

    We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L ≫ 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy S_VB = N_c ln 2 = (4 ln 2 / π²) ln L, disproving a recent conjecture that this should be related to the von Neumann entropy, and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.

  14. Finite-size scaling of the magnetization probability density for the critical Ising model in slab geometry

    Science.gov (United States)

    Lopes Cardozo, David; Holdsworth, Peter C. W.

    2016-04-01

    The magnetization probability density in d = 2 and 3 dimensional Ising models in slab geometry of volume L_∥^(d-1) × L_⊥ is computed through Monte Carlo simulation at the critical temperature and zero magnetic field. The finite-size scaling of this distribution and its dependence on the system aspect ratio ρ = L_⊥/L_∥ and on the boundary conditions are discussed. In the limiting case ρ → 0 of a macroscopically large slab (L_∥ ≫ L_⊥) the distribution is found to scale as a Gaussian function for all tested system sizes and boundary conditions.

  15. CS-dependent response probability in an auditory masked-detection task: considerations based on models of Pavlovian conditioning.

    Science.gov (United States)

    Mason, Christine R; Idrobo, Fabio; Early, Susan J; Abibi, Ayome; Zheng, Ling; Harrison, J Michael; Carney, Laurel H

    2003-05-01

    Experimental studies were performed using a Pavlovian-conditioned eyeblink response to measure detection of a variable-sound-level tone (T) in a fixed-sound-level masking noise (N) in rabbits. Results showed an increase in the asymptotic probability of conditioned responses (CRs) to the reinforced TN trials and a decrease in the asymptotic rate of eyeblink responses to the non-reinforced N presentations as a function of the sound level of the T. These observations are consistent with expected behaviour in an auditory masked detection task, but they are not consistent with predictions from a traditional application of the Rescorla-Wagner or Pearce models of associative learning. To implement these models, one typically considers only the actual stimuli and reinforcement on each trial. We found that by considering perceptual interactions and concepts from signal detection theory, these models could predict the CS dependence on the sound level of the T. In these alternative implementations, the animals' response probabilities were used as a guide in making assumptions about the "effective stimuli".

  16. Knot probabilities in random diagrams

    Science.gov (United States)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.

  17. Computation of steady-state probability distributions in stochastic models of cellular networks.

    Directory of Open Access Journals (Sweden)

    Mark Hallen

    2011-10-01

    Full Text Available Cellular processes are "noisy". In each cell, concentrations of molecules are subject to random fluctuations due to the small numbers of these molecules and to environmental perturbations. While noise varies with time, it is often measured at steady state, for example by flow cytometry. When interrogating aspects of a cellular network by such steady-state measurements of network components, a key need is to develop efficient methods to simulate and compute these distributions. We describe innovations in stochastic modeling coupled with approaches to this computational challenge: first, an approach to modeling intrinsic noise via solution of the chemical master equation, and second, a convolution technique to account for contributions of extrinsic noise. We show how these techniques can be combined in a streamlined procedure for evaluation of different sources of variability in a biochemical network. Evaluation and illustrations are given in analysis of two well-characterized synthetic gene circuits, as well as a signaling network underlying the mammalian cell cycle entry.
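
    As a sampling-based complement to the chemical-master-equation approach described above (not the authors' method), the steady-state distribution of a simple birth-death gene-expression model can be approximated with the Gillespie stochastic simulation algorithm; the rates below are hypothetical.

```python
import numpy as np

# Minimal Gillespie SSA for a birth-death gene-expression model
# (production rate k, per-molecule degradation rate g); hypothetical rates,
# offered as a sampling-based alternative to solving the CME directly.
rng = np.random.default_rng(2)
k, g = 10.0, 1.0
t, t_end, x = 0.0, 500.0, 0
samples = []
while t < t_end:
    a1, a2 = k, g * x            # propensities: production, degradation
    a0 = a1 + a2
    t += rng.exponential(1.0 / a0)
    if rng.random() < a1 / a0:
        x += 1                   # production event
    else:
        x -= 1                   # degradation event
    if t > 50.0:                 # discard burn-in before recording
        samples.append(x)
print(f"steady-state mean ~ {np.mean(samples):.2f} (theory: k/g = {k / g})")
```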

  18. Void probability as a function of the void's shape and scale-invariant models. [in studies of spacial galactic distribution

    Science.gov (United States)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  19. Survival under uncertainty an introduction to probability models of social structure and evolution

    CERN Document Server

    Volchenkov, Dimitri

    2016-01-01

    This book introduces and studies a number of stochastic models of subsistence, communication, social evolution and political transition that will allow the reader to grasp the role of uncertainty as a fundamental property of our irreversible world. At the same time, it aims to bring about a more interdisciplinary and quantitative approach across very diverse fields of research in the humanities and social sciences. Through the examples treated in this work – including anthropology, demography, migration, geopolitics, management, and bioecology, among other things – evidence is gathered to show that volatile environments may change the rules of the evolutionary selection and dynamics of any social system, creating a situation of adaptive uncertainty, in particular, whenever the rate of change of the environment exceeds the rate of adaptation. Last but not least, it is hoped that this book will contribute to the understanding that inherent randomness can also be a great opportunity – for social systems an...

  20. Binary logistic regression modelling: Measuring the probability of relapse cases among drug addict

    Science.gov (United States)

    Ismail, Mohd Tahir; Alias, Siti Nor Shadila

    2014-07-01

    For many years Malaysia has faced drug addiction issues. The most serious case is the relapse phenomenon among treated drug addicts (drug addicts who have undergone the rehabilitation programme at the Narcotic Addiction Rehabilitation Centre, PUSPEN). Thus, the main objective of this study is to find the most significant factors that contribute to relapse. Binary logistic regression analysis was employed to model the relationship between the independent variables (predictors) and the dependent variable. The dependent variable is the status of the drug addict: relapse (Yes, coded as 1) or not (No, coded as 0). The predictors involved are age, age at first taking drugs, family history, education level, family crisis, community support and self-motivation. The total sample size is 200, with the data provided by AADK (National Antidrug Agency). The findings of the study revealed that age and self-motivation are statistically significant for relapse cases.
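
    A minimal sketch of the kind of binary logistic regression fit described above, using synthetic data in place of the AADK records (which are not public); the covariates and coefficients below are illustrative assumptions only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Binary logistic regression for relapse (1) vs no relapse (0), fitted on
# synthetic data standing in for the AADK records; all numbers are illustrative.
rng = np.random.default_rng(3)
n = 200
age = rng.normal(30, 8, n)
self_motivation = rng.normal(5, 2, n)                 # hypothetical 0-10 scale
logit = -4.0 + 0.12 * age - 0.35 * self_motivation
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([age, self_motivation])
model = LogisticRegression().fit(X, y)
print("coefficients (age, self-motivation):", np.round(model.coef_[0], 3))
print("relapse probability for age=25, motivation=8:",
      round(model.predict_proba([[25.0, 8.0]])[0, 1], 3))
```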

  1. Hierarchical set of models to estimate soil thermal diffusivity

    Science.gov (United States)

    Arkhangelskaya, Tatiana; Lukyashchenko, Ksenia

    2016-04-01

    Soil thermal properties significantly affect the land-atmosphere heat exchange rates. Intra-soil heat fluxes depend both on temperature gradients and on soil thermal conductivity. Soil temperature changes due to energy fluxes are determined by soil specific heat. Thermal diffusivity is equal to thermal conductivity divided by volumetric specific heat and reflects both the soil's ability to transfer heat and its ability to change temperature when heat is supplied or withdrawn. The higher the soil thermal diffusivity, the thicker the soil/ground layer in which diurnal and seasonal temperature fluctuations are registered and the smaller the temperature fluctuations at the soil surface. Thermal diffusivity vs. moisture dependencies for loams, sands and clays of the East European Plain were obtained using the unsteady-state method. Thermal diffusivity of different soils differed greatly, and for a given soil it could vary by 2, 3 or even 5 times depending on soil moisture. The shapes of the thermal diffusivity vs. moisture dependencies were different: peak curves were typical for sandy soils and sigmoid curves were typical for loamy and especially for compacted soils. The lowest thermal diffusivities and the smallest range of their variability with soil moisture were obtained for clays with high humus content. A hierarchical set of models will be presented, allowing an estimate of soil thermal diffusivity from available data on soil texture, moisture, bulk density and organic carbon. When developing these models, the first step was to parameterize the experimental thermal diffusivity vs. moisture dependencies with a 4-parameter function; the next step was to obtain regression formulas to estimate the function parameters from available data on basic soil properties; the last step was to evaluate the accuracy of the suggested models using independent data on soil thermal diffusivity. The simplest models were based on soil bulk density and organic carbon data and provided different

  2. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  3. Development of a statistical model for the determination of the probability of riverbank erosion in a Meditteranean river basin

    Science.gov (United States)

    Varouchakis, Emmanouil; Kourgialas, Nektarios; Karatzas, George; Giannakis, Georgios; Lilli, Maria; Nikolaidis, Nikolaos

    2014-05-01

    Riverbank erosion affects the river morphology and the local habitat and results in riparian land loss and damage to property and infrastructure, ultimately weakening flood defences. An important issue concerning riverbank erosion is the identification of the areas vulnerable to erosion, as it allows for predicting changes and assists with stream management and restoration. One way to predict the areas vulnerable to erosion is to determine the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological and/or hydrological variables that prevent or stimulate erosion. A statistical model for evaluating the probability of erosion, based on a series of independent local variables and using logistic regression, is developed in this work. The main variables affecting erosion are vegetation index (stability), the presence or absence of meanders, bank material (classification), stream power, bank height, river bank slope, riverbed slope, cross section width and water velocities (Luppi et al. 2009). In statistics, logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable, e.g. a binary response, based on one or more predictor variables (continuous or categorical). The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function. Logistic regression measures the relationship between a categorical dependent variable and, usually, one or several continuous independent variables by converting the dependent variable to probability scores. Then, a logistic regression is formed, which predicts success or failure of a given binary variable (e.g. 1 = "presence of erosion" and 0 = "no erosion") for any value of the independent variables. The regression coefficients are estimated by using maximum likelihood estimation. The erosion occurrence probability can be calculated in conjunction with the model deviance regarding
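
    The model referred to above is standard binary logistic regression: writing x_1, ..., x_k for the local predictors (vegetation index, bank material, stream power, bank height, and so on), the erosion probability takes the form

```latex
% Standard binary logistic regression for the erosion probability.
P(\text{erosion} \mid x_1,\dots,x_k)
  = \frac{1}{1 + \exp\!\big[-(\beta_0 + \beta_1 x_1 + \dots + \beta_k x_k)\big]},
\qquad
\log\frac{P}{1-P} = \beta_0 + \beta_1 x_1 + \dots + \beta_k x_k,
```

    with the coefficients β estimated by maximum likelihood, as stated in the abstract.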

  4. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    Science.gov (United States)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.

  5. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    Science.gov (United States)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2017-02-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
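
    For reference, the two-parameter Weibull probability density that shapes the seasonal pollen potential has the standard form f(x; k, λ) = (k/λ)(x/λ)^(k-1) exp[-(x/λ)^k] for x ≥ 0; the sketch below evaluates it with placeholder parameters, not the values calibrated in the study.

```python
import numpy as np

# Two-parameter Weibull probability density f(x; k, lam);
# placeholder parameters, not the values calibrated for the pollen model.
def weibull_pdf(x, k, lam):
    x = np.asarray(x, dtype=float)
    out = (k / lam) * (x / lam) ** (k - 1) * np.exp(-((x / lam) ** k))
    return np.where(x >= 0, out, 0.0)

days = np.arange(0, 60, 10)      # e.g. days since season onset (illustrative)
print(np.round(weibull_pdf(days, k=2.0, lam=25.0), 4))
```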

  6. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.

  7. Hidden Variables or Positive Probabilities?

    CERN Document Server

    Rothman, T; Rothman, Tony

    2001-01-01

    Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...

  8. Large-eddy simulation/probability density function modeling of local extinction and re-ignition in Sandia Flame E

    Science.gov (United States)

    Wang, Haifeng; Popov, Pavel; Hiremath, Varun; Lantz, Steven; Viswanathan, Sharadha; Pope, Stephen

    2010-11-01

    A large-eddy simulation (LES)/probability density function (PDF) code is developed and applied to the study of local extinction and re-ignition in Sandia Flame E. The modified Curl mixing model is used to account for the sub-filter scalar mixing; the ARM1 mechanism is used for the chemical reaction; and the in-situ adaptive tabulation (ISAT) algorithm is used to accelerate the chemistry calculations. Calculations are performed on different grids to study the resolution requirement for this flame. Then, with sufficient grid resolution, full-scale LES/PDF calculations are performed to study the flame characteristics and the turbulence-chemistry interactions. Sensitivity to the mixing frequency model is explored in order to understand the behavior of sub-filter scalar mixing in the context of LES. The simulation results are compared to the experimental data to demonstrate the capability of the code. Comparison is also made to previous RANS/PDF simulations.

  9. Modelling turbulence effects in wildland fire propagation by the randomized level-set method

    CERN Document Server

    Pagnini, Gianni

    2014-01-01

    Turbulence is of paramount importance in wildland fire propagation since it randomly transports the hot air mass that can pre-heat and then ignite the area ahead of the fire. This contributes to giving a random character to the firefront position, together with other phenomena such as fire spotting, vegetation distribution (patchiness), gaseous combustion fluctuations and small-scale terrain elevation changes. Here only turbulence is considered. The level-set method is used to numerically describe the evolution of the fireline contour, which is assumed to have a random motion because of turbulence. The progression of the combustion process is then described by a level-set contour distributed according to a weight function given by the probability density function of the air particles in turbulent motion. From the comparison between the ordinary and the randomized level-set methods, it emerges that the proposed modelling approach is suitable for simulating a moving firefront fed by the ground fuel and dri...

  10. SU-E-T-144: Bayesian Inference of Local Relapse Data Using a Poisson-Based Tumour Control Probability Model

    Energy Technology Data Exchange (ETDEWEB)

    La Russa, D [The Ottawa Hospital Cancer Centre, Ottawa, ON (Canada)

    2015-06-15

    Purpose: The purpose of this project is to develop a robust method of parameter estimation for a Poisson-based TCP model using Bayesian inference. Methods: Bayesian inference was performed using the PyMC3 probabilistic programming framework written in Python. A Poisson-based TCP regression model that accounts for clonogen proliferation was fit to observed rates of local relapse as a function of equivalent dose in 2 Gy fractions for a population of 623 stage-I non-small-cell lung cancer patients. The Slice Markov Chain Monte Carlo sampling algorithm was used to sample the posterior distributions, and was initiated using the maximum of the posterior distributions found by optimization. The calculation of TCP with each sample step required integration over the free parameter α, which was performed using an adaptive 24-point Gauss-Legendre quadrature. Convergence was verified via inspection of the trace plot and posterior distribution for each of the fit parameters, as well as with comparisons of the most probable parameter values with their respective maximum likelihood estimates. Results: Posterior distributions for α, the standard deviation of α (σ), the average tumour cell-doubling time (Td), and the repopulation delay time (Tk), were generated assuming α/β = 10 Gy, and a fixed clonogen density of 10^7 cm^-3. Posterior predictive plots generated from samples from these posterior distributions are in excellent agreement with the observed rates of local relapse used in the Bayesian inference. The most probable values of the model parameters also agree well with maximum likelihood estimates. Conclusion: A robust method of performing Bayesian inference of TCP data using a complex TCP model has been established.
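
    A minimal sketch of one common form of the Poisson-based TCP model with clonogen proliferation; the exact parameterisation and the integration over a population distribution of α used in the abstract are not reproduced here, and all numbers are placeholders.

```python
import numpy as np

# One common form of the Poisson TCP model with clonogen proliferation;
# parameterisation and values are placeholders, not those fitted in the study.
def poisson_tcp(D, d, T, alpha=0.3, alpha_beta=10.0, N0=1e7, Td=3.0, Tk=21.0):
    """D: total dose (Gy), d: dose per fraction (Gy), T: overall time (days)."""
    log_sf = -alpha * D * (1.0 + d / alpha_beta)        # LQ cell kill
    log_sf += np.log(2.0) * max(T - Tk, 0.0) / Td       # repopulation after Tk
    return float(np.exp(-N0 * np.exp(log_sf)))

# Example: 60 Gy in 2 Gy fractions delivered over 40 days.
print(f"TCP ~ {poisson_tcp(D=60.0, d=2.0, T=40.0):.3f}")
```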

  11. HMM-ModE – Improved classification using profile hidden Markov models by optimising the discrimination threshold and modifying emission probabilities with negative training sequences

    Directory of Open Access Journals (Sweden)

    Nandi Soumyadeep

    2007-03-01

    Full Text Available Abstract. Background: Profile Hidden Markov Models (HMMs) are statistical representations of protein families derived from patterns of sequence conservation in multiple alignments and have been used in identifying remote homologues with considerable success. These conservation patterns arise from fold-specific signals, shared across multiple families, and function-specific signals unique to the families. The availability of sequences pre-classified according to their function permits the use of negative training sequences to improve the specificity of the HMM, both by optimizing the threshold cutoff and by modifying emission probabilities to minimize the influence of fold-specific signals. A protocol to generate family-specific HMMs is described that first constructs a profile HMM from an alignment of the family's sequences and then uses this model to identify sequences belonging to other classes that score above the default threshold (false positives). Ten-fold cross validation is used to optimise the discrimination threshold score for the model. The advent of fast multiple alignment methods enables the use of the profile alignments to align the true and false positive sequences, and the resulting alignments are used to modify the emission probabilities in the original model. Results: The protocol, called HMM-ModE, was validated on a set of sequences belonging to six sub-families of the AGC family of kinases. These sequences have an average sequence similarity of 63% among the group, though each sub-group has a different substrate specificity. The optimisation of the discrimination threshold, by scoring negative sequences against the model, improves specificity in test cases from an average of 21% to 98%. Further discrimination by the HMM after modifying model probabilities using negative training sequences is provided in a few cases, the average specificity rising to 99%. Similar improvements were obtained with a sample of G-Protein coupled receptors

  12. Deduction and Validation of an Eulerian-Eulerian Model for Turbulent Dilute Two-Phase Flows by Means of the Phase Indicator Function Disperse Elements Probability Density Function

    Institute of Scientific and Technical Information of China (English)

    Santiago Lain; Ricardo Aliod

    2000-01-01

    A statistical formalism overcoming some conceptual and practical difficulties arising in existing two-phase flow (2PHF) mathematical modelling has been applied to propose a model for dilute 2PHF turbulent flows. Phase interaction terms with a clear physical meaning enter the equations and the formalism provides some guidelines for the avoidance of closure assumptions or the rational approximation of these terms. Continuous phase averaged continuity, momentum, turbulent kinetic energy and turbulence dissipation rate equations have been rigorously and systematically obtained in a single step. These equations display a structure similar to that for single-phase flows. It is also assumed that dispersed phase dynamics is well described by a probability density function (pdf) equation and Eulerian continuity, momentum and fluctuating kinetic energy equations for the dispersed phase are deduced. An extension of the standard k-ε turbulence model for the continuous phase is used. A gradient transport model is adopted for the dispersed phase fluctuating fluxes of momentum and kinetic energy at the non-colliding, large inertia limit. This model is then used to predict the behaviour of three axisymmetric turbulent jets of air laden with solid particles varying in size and concentration. Qualitative and quantitative numerical predictions compare reasonably well with the three different sets of experimental results, studying the influence of particle size, loading ratio and flow confinement velocity.

  13. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting with personal preference, Savage [3] constructs a foundational theory of probability, proceeding from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  14. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    The mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a fuzzy-number probability set was given. Going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with the fuzzy distribution function and the fuzzy probability distribution sequence of the RVFP, were put forward. The fuzzy probability resolution theorem, with the closing operation of fuzzy probability, was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. The entire mathematical description of the RVFP is closed under the fuzzy probability operations; as a result, the foundation for perfecting fuzzy probability operation methods is laid.

  15. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    Science.gov (United States)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, together with their associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating procedure, a novel approach to obtaining the coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which provides lower and upper bounds for quantifying and propagating the uncertainty of structural parameters. Structural health monitoring data from the Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency exceeds 89%, even without accounting for the average absolute value of the relative errors, and the CDF of the fitted normal distribution coincides well with the measured frequencies of the Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.
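
    A minimal illustration of the P-box idea, not the paper's validation procedure: when a structural parameter is only known to lie in an interval, each value in that interval induces a different CDF for the predicted modal frequency, and the pointwise envelope of those CDFs gives the lower and upper bounds. The model relation, parameter ranges, and sample sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def predicted_frequency(stiffness, samples=5000):
    """Hypothetical forward model: frequency ~ Normal(2*sqrt(k), 2% c.o.v.)."""
    mean = 2.0 * np.sqrt(stiffness)            # placeholder relation
    return rng.normal(mean, 0.02 * mean, size=samples)

def p_box(k_low, k_high, grid, n_interval=20):
    """Pointwise lower/upper CDF bounds when the stiffness is interval-valued."""
    cdfs = []
    for k in np.linspace(k_low, k_high, n_interval):
        freq = np.sort(predicted_frequency(k))
        cdfs.append(np.searchsorted(freq, grid) / freq.size)
    cdfs = np.array(cdfs)
    return cdfs.min(axis=0), cdfs.max(axis=0)

grid = np.linspace(1.5, 3.5, 200)              # candidate frequencies, Hz
lower, upper = p_box(k_low=1.0, k_high=1.4, grid=grid)
# An overlap-ratio-style check would then compare the band [lower, upper]
# with the empirical CDF of the measured modal frequencies.
print(lower[100], upper[100])
```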

  16. Setting the agenda: Different strategies of a Mass Media in a model of cultural dissemination

    Science.gov (United States)

    Pinto, Sebastián; Balenzuela, Pablo; Dorso, Claudio O.

    2016-09-01

    Day by day, people exchange opinions about news with relatives, friends, and coworkers. In most cases, they get informed about a given issue by reading newspapers, listening to the radio, or watching TV, i.e., through a Mass Media (MM). However, the Media can boost the perceived importance of a given news item by devoting newspaper pages or TV airtime to it. In this sense, we say that the Media has the power to "set the agenda", i.e., it decides which news items are important and which are not. On the other hand, the Media can learn people's concerns through, for instance, websites or blogs where they express their opinions, and it can then use this information in order to be more appealing to an increasing number of people. In this work, we study different scenarios in an agent-based model of cultural dissemination in which a given Mass Media has a specific purpose: to set a particular topic of discussion and impose its point of view on as many social agents as it can. We model this by giving the Media a fixed feature, representing its point of view on the topic of discussion, while it tries to attract new consumers by taking advantage of feedback mechanisms, represented by adaptive features. We explore different strategies that the Media can adopt in order to increase its affinity with potential consumers and thus the probability of succeeding in imposing this particular topic.
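
    A minimal sketch of such a setup follows, assuming an Axelrod-style interaction rule and a Media vector whose first feature is fixed while the remaining features adapt to the most common traits in the population; the parameter names and the feedback rule are illustrative assumptions, not the paper's exact model.

```python
import random

F, Q, N = 5, 10, 100     # features per agent, traits per feature, number of agents
B = 0.3                  # probability that an agent interacts with the Media

random.seed(42)
agents = [[random.randrange(Q) for _ in range(F)] for _ in range(N)]
media = [0] * F          # feature 0 is the Media's fixed point of view

def update_media():
    """Feedback mechanism: adaptive features copy the most common trait."""
    for f in range(1, F):
        traits = [a[f] for a in agents]
        media[f] = max(set(traits), key=traits.count)

def interaction_step():
    agent = random.choice(agents)
    partner = media if random.random() < B else random.choice(agents)
    shared = [f for f in range(F) if agent[f] == partner[f]]
    differing = [f for f in range(F) if agent[f] != partner[f]]
    # Axelrod rule: interact with probability equal to the cultural overlap.
    if differing and random.random() < len(shared) / F:
        f = random.choice(differing)
        agent[f] = partner[f]

for step in range(50000):
    if step % 100 == 0:
        update_media()
    interaction_step()

adopters = sum(a[0] == media[0] for a in agents)
print(f"{adopters}/{N} agents share the Media's fixed point of view")
```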

  17. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  18. Modeling probability and additive summation for detection across multiple mechanisms under the assumptions of signal detection theory.

    Science.gov (United States)

    Kingdom, Frederick A A; Baldwin, Alex S; Schmidtmann, Gunnar

    2015-01-01

    Many studies have investigated how multiple stimuli combine to reach threshold. There are, broadly speaking, two ways this can occur: additive summation (AS), where inputs from the different stimuli add together in a single mechanism, or probability summation (PS), where different stimuli are detected independently by separate mechanisms. PS is traditionally modeled under high threshold theory (HTT); however, tests have shown that HTT is incorrect and that signal detection theory (SDT) is the better framework for modeling summation. Modeling the equivalent of PS under SDT is, however, relatively complicated, leading many investigators to use Monte Carlo simulations for the predictions. We derive formulas that employ numerical integration to predict the proportion correct for detecting multiple stimuli assuming PS under SDT, for situations in which the stimuli are either equal or unequal in strength. Both formulas are general purpose, calculating performance for forced-choice tasks with M alternatives and n stimuli in Q monitored mechanisms, each subject to a non-linear transducer with exponent τ. We show how the probability (and additive) summation formulas can be used to simulate psychometric functions, which, when fitted with Weibull functions, make signature predictions for how thresholds and psychometric-function slopes vary as a function of τ, n, and Q. We also show how the formulas can be fitted directly to real psychometric functions using data from a binocular summation experiment, how estimates of τ can be obtained, and how one can test whether binocular summation conforms more to PS or to AS. The methods described here can be readily applied using software functions newly added to the Palamedes toolbox.
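
    The following sketch, written under stated assumptions rather than as the authors' exact derivation, computes the proportion correct for probability summation under SDT with a max-rule observer: n equal-strength stimuli among Q monitored mechanisms, an M-alternative forced-choice task, and a power-law transducer d′ = g·x^τ (the gain g and the example parameter values are illustrative).

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def pc_probability_summation(dprime, n, Q, M):
    """Proportion correct when the maximum of the target interval's Q mechanisms
    (n of them carrying signal of strength d') must exceed the maxima of the
    M-1 noise intervals."""
    def target_max_density(t):
        # density of the maximum of n signal + (Q - n) noise mechanisms
        return (n * norm.pdf(t - dprime) * norm.cdf(t - dprime) ** (n - 1)
                * norm.cdf(t) ** (Q - n)
                + (Q - n) * norm.pdf(t) * norm.cdf(t - dprime) ** n
                * norm.cdf(t) ** max(Q - n - 1, 0))

    def integrand(t):
        # all Q * (M - 1) mechanisms in the noise intervals must fall below t
        return target_max_density(t) * norm.cdf(t) ** (Q * (M - 1))

    pc, _ = quad(integrand, -10.0, 10.0)
    return pc

def psychometric_function(intensities, g=1.0, tau=2.0, n=2, Q=2, M=2):
    """Proportion correct vs. stimulus intensity with transducer d' = g * x**tau."""
    return np.array([pc_probability_summation(g * x ** tau, n, Q, M)
                     for x in intensities])

# Sanity check: n = Q = 1, M = 2 reduces to the standard 2AFC result Phi(d'/sqrt(2)).
print(pc_probability_summation(dprime=1.0, n=1, Q=1, M=2), norm.cdf(1.0 / np.sqrt(2)))
print(psychometric_function(np.linspace(0.2, 1.4, 4)))
```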

  19. Models of Music Therapy Intervention in School Settings

    Science.gov (United States)

    Wilson, Brian L., Ed.

    2002-01-01

    This completely revised 2nd edition, edited by Brian L. Wilson, addresses both theoretical issues and practical applications of music therapy in educational settings. Its 17 chapters, written by a variety of authors, each deal with a different setting or issue. It is a valuable resource for demonstrating the efficacy of music therapy to school…

  20. Model assembly for estimating cell surviving fraction for both targeted and nontargeted effects based on microdosimetric probability densities.

    Directory of Open Access Journals (Sweden)

    Tatsuhiko Sato

    Full Text Available We here propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies at two scales, the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of the anti-apoptotic protein Bcl-2, known to occur frequently in human cancer, was also considered by introducing the concept of the adaptive response into the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fractions of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin-resistance gene-expressing HeLa cells) irradiated with microbeams or broadbeams of energetic heavy ions, as well as of WI-38 normal human fibroblasts irradiated with an X-ray microbeam. The model assembly reproduced very well the experimentally determined surviving fraction over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly will be worth incorporating into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given the critical roles of frequent Bcl-2 overexpression and of the nontargeted effect in estimating therapeutic outcomes and harmful effects of such advanced therapeutic modalities.
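
    For orientation, here is a minimal sketch of the standard microdosimetric kinetic relation, a simpler relative of the DSMK model rather than the DSMK model itself: −ln S = (α₀ + β·z₁D)·D + β·D², where z₁D is the dose-mean specific energy per event in a domain. The parameter values used are placeholders, not those of the cited study.

```python
import numpy as np

def surviving_fraction(dose_gy, z1d_gy, alpha0=0.13, beta=0.05):
    """Surviving fraction from the microdosimetric-kinetic relation
    -ln S = (alpha0 + beta * z1D) * D + beta * D**2, with D and z1D in Gy.
    The track-structure (LET) dependence enters only through z1D here."""
    alpha = alpha0 + beta * z1d_gy
    return np.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

doses = np.linspace(0.0, 8.0, 9)
for z1d in (0.2, 1.0, 3.0):             # low-, mid-, high-LET-like domain values
    sf = surviving_fraction(doses, z1d)
    print(f"z1D = {z1d:.1f} Gy:", np.round(sf, 3))
```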