WorldWideScience

Sample records for cumulative probability distributions

  1. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
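
    CUMBIN itself (written in C) is not reproduced in this record; a minimal Python sketch of the same calculation, with hypothetical function names, might look like:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), by direct summation of the pmf."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def k_out_of_n_reliability(k, n, p):
    """Probability that at least k of n components, each of reliability p, work."""
    return 1.0 - binom_cdf(k - 1, n, p)

# Example: a 2-out-of-3 system with component reliability 0.9
r = k_out_of_n_reliability(2, 3, 0.9)   # ~0.972
```

Direct summation is adequate for small n; a production routine like CUMBIN would also have to guard against overflow and underflow for large inputs.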

  2. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the future research on forecasting of the production process.

  3. Cumulative Poisson Distribution Program

    Science.gov (United States)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for X (sup2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
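
    The record's point about preventing overflow and underflow can be illustrated by building each Poisson term recursively instead of evaluating factorials; the function names below are mine, not CUMPOIS's:

```python
from math import exp

def poisson_cdf(k, lam):
    """P(N <= k) for N ~ Poisson(lam); each term is built from the previous
    one, so no factorial is ever formed and no term overflows."""
    term = exp(-lam)            # P(N = 0)
    total = term
    for i in range(1, k + 1):
        term *= lam / i         # P(N = i) from P(N = i - 1)
        total += term
    return total

def gamma_cdf_integer_shape(x, n):
    """CDF of Gamma(shape n, scale 1): P(Gamma <= x) = 1 - P(Poisson(x) <= n - 1),
    the relation the record uses for gamma and chi-square CDFs."""
    return 1.0 - poisson_cdf(n - 1, x)
```

For shape n = 1 this reduces to the exponential CDF 1 - e^(-x), a quick sanity check.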

  4. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, 𝒮_st, p_st) for stochastic uncertainty, a probability space (S_su, 𝒮_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, 𝒮_st, p_st) and (S_su, 𝒮_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  5. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  6. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, 𝒮_st, p_st) for stochastic uncertainty, a probability space (S_su, 𝒮_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, 𝒮_st, p_st) and (S_su, 𝒮_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
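
    The complementary cumulative distribution functions (CCDFs) discussed here answer questions of the form "what is the probability that a release exceeds x?". A minimal empirical sketch, with hypothetical normalized release values:

```python
def ccdf(samples, x):
    """Empirical complementary CDF: fraction of sampled values exceeding x."""
    return sum(1 for s in samples if s > x) / len(samples)

releases = [0.001, 0.02, 0.05, 0.3, 1.2]   # hypothetical normalized releases
p_exceed = ccdf(releases, 1.0)             # fraction exceeding a limit of 1.0
```

In a real PA the samples would come from propagating the stochastic probability space through the system model; the five values above are placeholders.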

  7. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations, each referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density and to the probability that the minimal cut set is down at time t given that it was down at time t′ (t′ ≤ t). The limitations on the applicability of the method are also discussed. It is concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)

  8. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  9. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    Energy Technology Data Exchange (ETDEWEB)

    O' Rourke, Patrick Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-27

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  10. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    Cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling.

  11. The distribution function of a probability measure on a space with a fractal structure

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Granero, M.A.; Galvez-Rodriguez, J.F.

    2017-07-01

    In this work we show how to define a probability measure with the help of a fractal structure. One of the keys of this approach is to use the completion of the fractal structure. Then we use the theory of a cumulative distribution function on a Polish ultrametric space and describe it in this context. Finally, with the help of fractal structures, we prove that a function satisfying the properties of a cumulative distribution function on a Polish ultrametric space is a cumulative distribution function with respect to some probability measure on the space. (Author)

  12. On the probability distribution of the stochastic saturation scale in QCD

    International Nuclear Information System (INIS)

    Marquet, C.; Soyez, G.; Xiao Bowen

    2006-01-01

    It was recently noticed that high-energy scattering processes in QCD have a stochastic nature. An event-by-event scattering amplitude is characterised by a saturation scale which is a random variable. The statistical ensemble of saturation scales formed with all the events is distributed according to a probability law whose cumulants have been recently computed. In this work, we obtain the probability distribution from the cumulants. We prove that it can be considered as Gaussian over a large domain that we specify, and our results are confirmed by numerical simulations.

  13. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1991-11-01

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; these conclusions are based on the possible occurrence of three events and the presence of one feature: the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." The report presents calculations of the WIPP system's overall probability distribution for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events and features.
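
    The sixteen scenario classes arise as the 2^4 combinations of the four postulated events and features. A sketch of the enumeration, with made-up occurrence probabilities and an independence assumption used purely for illustration:

```python
from itertools import product

# Hypothetical occurrence probabilities for the four events/features,
# assumed independent for illustration only.
EVENTS = {"attempted boreholes": 0.5,
          "mining alters ground-water regime": 0.3,
          "water-withdrawal wells": 0.2,
          "brine pocket": 0.4}

scenario_probs = {}
for combo in product([True, False], repeat=len(EVENTS)):
    prob = 1.0
    for occurs, p in zip(combo, EVENTS.values()):
        prob *= p if occurs else 1.0 - p
    scenario_probs[combo] = prob

assert len(scenario_probs) == 16                       # 2**4 scenario classes
assert abs(sum(scenario_probs.values()) - 1.0) < 1e-12 # classes partition the space
```

Each scenario class would then contribute its conditional release CCDF, weighted by its probability, to the overall distribution.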

  14. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
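
    The Kolmogorov-Smirnov discrepancy between an empirical CDF and a specified CDF can be computed in a few lines; this sketch implements the standard one-sample statistic that the paper's proposed tests complement, not the new tests themselves:

```python
def ks_statistic(draws, cdf):
    """Kolmogorov-Smirnov statistic: the largest gap between the empirical
    CDF of the draws and a specified theoretical CDF."""
    xs = sorted(draws)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # check the gap just before and just after each jump of the empirical CDF
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

# Four draws tested against the Uniform(0, 1) CDF
d = ks_statistic([0.1, 0.4, 0.5, 0.9], lambda x: x)   # 0.25
```

As the abstract notes, this statistic can miss localized density discrepancies that get smoothed over in the CDF.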

  15. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    Science.gov (United States)

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution, using as data points either the lower, mid or upper bounds of the sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); we then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations, of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic), from the original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations, ranging from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates.

  16. Testing of Software Routine to Determine Deviate and Cumulative Probability: ModStandardNormal Version 1.0

    International Nuclear Information System (INIS)

    A.H. Monib

    1999-01-01

    The purpose of this calculation is to document that the software routine ModStandardNormal Version 1.0, a Visual Fortran 5.0 module, provides correct results for a normal distribution up to five significant figures (three significant figures at the function tails) for a specified range of input parameters. The software routine may be used for quality-affecting work. Two types of output are generated in ModStandardNormal: a deviate, x, given a cumulative probability, p, between 0 and 1; and a cumulative probability, p, given a deviate, x, between -8 and 8. This calculation supports Performance Assessment, under Technical Product Development Plan TDP-EBS-MD-000006 (Attachment I, DIRS 3), and is written in accordance with the AP-3.12Q Calculations procedure (Attachment I, DIRS 4).
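
    The two outputs described (a cumulative probability from a deviate, and a deviate from a cumulative probability) can be sketched in Python with the error function and bisection; this is an illustration, not the Visual Fortran routine:

```python
from math import erf, sqrt

def std_normal_cdf(x):
    """Cumulative probability p for a deviate x of the standard normal."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def std_normal_deviate(p):
    """Deviate x for a cumulative probability p, by bisection on [-8, 8]
    (the same deviate range the routine documents)."""
    lo, hi = -8.0, 8.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if std_normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

Bisection is slow but numerically robust; a production routine would typically use a rational approximation to the inverse CDF instead.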

  17. A Case Series of the Probability Density and Cumulative Distribution of Laryngeal Disease in a Tertiary Care Voice Center.

    Science.gov (United States)

    de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander

    2017-11-01

    To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice stratum, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.

  18. Using Fuzzy Probability Weights in Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Užga-Rebrovs Oļegs

    2016-12-01

    In recent years, rapid growth has been seen in descriptive approaches to decision choice. As opposed to normative expected utility theory, these approaches are based on individuals' subjective perception of probabilities, which takes place in real situations of risky choice. Perceptions of this kind are modelled on the basis of probability weighting functions. In cumulative prospect theory, which is the focus of this paper, decision weights for prospect outcomes are calculated from the obtained probability weights. If value functions are constructed on the sets of positive and negative outcomes, then, based on the outcome value evaluations and outcome decision weights, generalised evaluations of prospect value are calculated, which are the basis for choosing an optimal prospect.
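
    As an illustration of how outcome decision weights are derived from cumulative probability weights in cumulative prospect theory, the sketch below uses the crisp Tversky-Kahneman weighting function rather than the paper's fuzzy probability weights:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function w(p)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_decision_weights(probs, gamma=0.61):
    """Decision weights for gains, outcomes ranked from best to worst:
    pi_i = w(p_1 + ... + p_i) - w(p_1 + ... + p_{i-1})."""
    weights, cum = [], 0.0
    for p in probs:
        weights.append(tk_weight(cum + p, gamma) - tk_weight(cum, gamma))
        cum += p
    return weights

weights = cpt_decision_weights([0.25, 0.25, 0.5])   # telescopes to w(1) = 1
```

The cumulative (rank-dependent) construction is what distinguishes these decision weights from directly weighted probabilities.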

  19. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
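
    COVAL works by numerical transformation of distributions; a Monte Carlo stand-in illustrates the same idea, the distribution of a function of random variables, on a load/strength reliability example of the kind the record mentions (the distributions below are hypothetical):

```python
import random

random.seed(1)

def sample_function_distribution(f, samplers, n=100_000):
    """Empirical distribution of f(X1, ..., Xk), given a sampler for each Xi."""
    return sorted(f(*(s() for s in samplers)) for _ in range(n))

# Structural margin M = strength - load, with load ~ N(100, 10), strength ~ N(150, 20)
margins = sample_function_distribution(
    lambda load, strength: strength - load,
    [lambda: random.gauss(100, 10), lambda: random.gauss(150, 20)],
)
p_failure = sum(1 for m in margins if m < 0) / len(margins)   # estimate of P(M < 0)
```

The sorted sample is an empirical CDF of the function's distribution, from which any probability statement about M can be read off.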

  20. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
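
    NEWTONP finds the component probability that yields a given system reliability; the record does not show its algorithm, so this sketch uses bisection rather than Newton's method, exploiting the fact that k-out-of-n reliability is monotone increasing in p:

```python
from math import comb

def k_out_of_n(k, n, p):
    """Reliability of a k-out-of-n system with component reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def required_component_reliability(k, n, target, iters=100):
    """Component reliability p such that k_out_of_n(k, n, p) == target,
    found by bisection on [0, 1]."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if k_out_of_n(k, n, mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

For example, a 2-out-of-3 system reaches reliability 0.972 when each component has reliability 0.9.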

  1. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.

  2. CDFTBL: A statistical program for generating cumulative distribution functions from data

    International Nuclear Information System (INIS)

    Eslinger, P.W.

    1991-06-01

    This document describes the theory underlying the CDFTBL code and gives details for using the code. The CDFTBL code provides an automated tool for generating a statistical cumulative distribution function that describes a set of field data. The cumulative distribution function is written in the form of a table of probabilities, which can be used in a Monte Carlo computer code. As a specific application, CDFTBL can be used to analyze field data collected for parameters required by the PORMC computer code. Section 2.0 discusses the mathematical basis of the code. Section 3.0 discusses the code structure. Section 4.0 describes the free-format input command language, while Section 5.0 describes in detail the commands to run the program. Section 6.0 provides example program runs, and Section 7.0 provides references. The Appendix provides a program source listing. 11 refs., 2 figs., 19 tabs.
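
    The idea of a tabulated CDF feeding a Monte Carlo code can be sketched as follows; the table layout and linear interpolation are assumptions for illustration, not CDFTBL's actual output format:

```python
import random

random.seed(7)

def cdf_table(data, n_points=5):
    """Tabulate (value, cumulative probability) pairs from sorted field data."""
    xs = sorted(data)
    n = len(xs)
    qs = [i / (n_points - 1) for i in range(n_points)]
    return [(xs[int(round(q * (n - 1)))], q) for q in qs]

def sample_from_table(table):
    """Inverse-transform sampling with linear interpolation between table rows,
    as a Monte Carlo code might consume the table."""
    u = random.random()
    for (x0, p0), (x1, p1) in zip(table, table[1:]):
        if u <= p1:
            return x0 + (x1 - x0) * (u - p0) / (p1 - p0)
    return table[-1][0]

table = cdf_table([3.1, 1.2, 4.8, 1.9, 5.6, 9.0, 2.2, 6.4, 5.1, 3.7, 5.0])
draws = [sample_from_table(table) for _ in range(100)]
```

Every draw falls between the smallest and largest tabulated values, so the table bounds the sampled range.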

  3. Cumulative processes and quark distribution in nuclei

    International Nuclear Information System (INIS)

    Kondratyuk, L.; Shmatikov, M.

    1984-01-01

    Assuming the existence of multiquark (mainly 12q) bags in nuclei, the spectra of cumulative nucleons and mesons produced in high-energy particle-nucleus collisions are discussed. The exponential form of the quark momentum distribution in the 12q bag (which agrees well with the experimental data on lepton-nucleus interactions at large q²) is shown to result in a quasi-exponential distribution of cumulative particles over the light-cone variable α_B. The dependence of f(α_B; p_⊥) (where p_⊥ is the transverse momentum of the bag) upon p_⊥ is considered. The yields of cumulative resonances, as well as effects related to the different u- and d-quark distributions in N > Z nuclei, are discussed.

  4. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large data set is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.

  5. Fitting the Probability Distribution Functions to Model Particulate Matter Concentrations

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2017-01-01

    The main objective of this study is to identify the best probability distribution and plotting-position formula for modeling the concentrations of Total Suspended Particles (TSP) as well as of Particulate Matter with an aerodynamic diameter < 10 μm (PM10). The best distribution provides the estimated probabilities of exceeding the threshold limit given by the Egyptian Air Quality Limit Value (EAQLV), and the number of exceedance days is estimated as well. The standard limits of the EAQLV for TSP and PM10 concentrations are 24-h averages of 230 μg/m³ and 70 μg/m³, respectively. Five frequency distribution functions with seven plotting-position formulas (empirical cumulative distribution functions) are compared to fit the averages of daily TSP and PM10 concentrations in the year 2014 for Ain Sokhna city. The Quantile-Quantile plot (Q-Q plot) is used as a method for assessing how closely a data set fits a particular distribution. A proper probability distribution that represents the TSP and PM10 concentrations has been chosen based on the statistical performance indicator values. The results show that the Hosking and Wallis plotting position combined with the Frechet distribution gave the best fit for TSP and PM10 concentrations, with the Burr distribution at the same plotting position following the Frechet distribution. The exceedance probability and days over the EAQLV are predicted using the Frechet distribution. In 2014, the exceedance probability and days for TSP concentrations are 0.052 and 19 days, respectively. Furthermore, the PM10 concentration is found to exceed the threshold limit on 174 days.
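
    A sketch of the exceedance calculation with a Frechet CDF follows; the shape and scale values are placeholders for illustration, not the fitted 2014 parameters:

```python
from math import exp

def frechet_cdf(x, shape, scale, loc=0.0):
    """Frechet (inverse Weibull) cumulative distribution function."""
    if x <= loc:
        return 0.0
    return exp(-((x - loc) / scale) ** (-shape))

def exceedance(threshold, shape, scale, loc=0.0, days=365):
    """Exceedance probability and expected number of exceedance days per year."""
    p = 1.0 - frechet_cdf(threshold, shape, scale, loc)
    return p, p * days

# TSP limit of 230 ug/m^3 with hypothetical fitted parameters
p_tsp, days_tsp = exceedance(230.0, shape=2.0, scale=80.0)
```

Multiplying the daily exceedance probability by 365 gives the expected exceedance days, which is how figures like "19 days" in the abstract arise.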

  6. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

    Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of "stripped" prospect theory (subjectively weighted additive utility without the editing rules).

  7. Estimating the distribution of lifetime cumulative radon exposures for California residents: a brief summary

    International Nuclear Information System (INIS)

    Liu, K.-S.; Chang, Y.-L.; Hayward, S.B.; Gadgil, A.J.; Nero, A.V.

    1992-01-01

    Data on residential radon concentrations in California, together with information on California residents' changes of residence and time-activity patterns, have been used to estimate the distribution of lifetime cumulative exposures to ²²²Rn. This distribution was constructed using Monte Carlo techniques to simulate the lifetime occupancy histories and associated radon exposures of 10,000 California residents. For standard male and female lifespans, the simulation sampled from transition probability matrices representing changes of residence within and between six regions of California, as well as into and out of the rest of the United States, and then sampled from the appropriate regional (or national) distribution of indoor concentrations. The resulting distribution of lifetime cumulative exposures has a significantly narrower relative width than the distribution of California indoor concentrations, with only a small fraction (less than 0.2%) of the population having lifetime exposures equivalent to living an entire lifetime in a single home with a radon concentration of 148 Bq·m⁻³ or more. (author)
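
    A toy version of the simulation approach, reduced to two regions with hypothetical transition probabilities and concentration distributions (the study itself used six California regions plus the rest of the United States):

```python
import random

random.seed(3)

# Hypothetical two-region model: annual region-to-region transition
# probabilities and per-home radon concentration samplers (Bq/m^3).
TRANSITIONS = {"north": {"north": 0.95, "south": 0.05},
               "south": {"north": 0.04, "south": 0.96}}
CONC = {"north": lambda: random.lognormvariate(3.0, 0.8),
        "south": lambda: random.lognormvariate(3.4, 0.8)}

def lifetime_exposure(years=70, start="north"):
    """Cumulative exposure (concentration-years) over one simulated lifetime."""
    region, conc, exposure = start, CONC[start](), 0.0
    for _ in range(years):
        exposure += conc
        nxt = "north" if random.random() < TRANSITIONS[region]["north"] else "south"
        if nxt != region:                 # a move: draw a new home concentration
            region, conc = nxt, CONC[nxt]()
    return exposure

exposures = sorted(lifetime_exposure() for _ in range(1000))
```

Averaging each lifetime over several homes narrows the relative width of the exposure distribution compared with the single-home concentration distribution, which is the qualitative effect the study reports.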

  8. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest, M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
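
    CROSSER finds the point where system reliability equals the common component reliability, i.e., the fixed point R(p) = p. CROSSER's own algorithm is not shown in the record; a bisection sketch, assuming 1 < k < n so that R(p) - p has a single sign change on (0, 1):

```python
from math import comb

def k_out_of_n(k, n, p):
    """Reliability of a k-out-of-n system with common component reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossing_point(k, n, iters=200):
    """p in (0, 1) where system reliability equals component reliability."""
    lo, hi = 1e-6, 1.0 - 1e-6
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if k_out_of_n(k, n, mid) < mid:   # system still worse than a component
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

For a 2-out-of-3 system the crossing is at p = 0.5: above it redundancy helps, below it the system is less reliable than a single component.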

  9. Simulation of Daily Weather Data Using Theoretical Probability Distributions.

    Science.gov (United States)

    Bruhn, J. A.; Fry, W. E.; Fick, G. W.

    1980-09-01

    A computer simulation model was constructed to supply daily weather data to a plant disease management model for potato late blight. In the weather model, Monte Carlo techniques were employed to generate daily values of precipitation, maximum temperature, minimum temperature, minimum relative humidity and total solar radiation. Each weather variable is described by a known theoretical probability distribution, but the values of the parameters describing each distribution depend on the occurrence of rainfall. Precipitation occurrence is described by a first-order Markov chain. The amount of rain, given that rain has occurred, is described by a gamma probability distribution. Maximum and minimum temperature are simulated with a trivariate normal probability distribution involving maximum temperature on the previous day, maximum temperature on the current day and minimum temperature on the current day. Parameter values for this distribution depend on the occurrence of rain on the previous day. Both minimum relative humidity and total solar radiation are assumed to be normally distributed. The values of the parameters describing the distribution of minimum relative humidity depend on rainfall occurrence on the previous day and the current day. Parameter values for total solar radiation depend on the occurrence of rain on the current day. The assumptions made during model construction were found to be appropriate for actual weather data from Geneva, New York. The performance of the weather model was evaluated by comparing the cumulative frequency distributions of simulated weather data with the distributions of actual weather data from Geneva, New York and Fort Collins, Colorado. For each location, simulated weather data were similar to actual weather data in terms of mean response, variability and autocorrelation. The possible applications of this model when used with models of other components of the agro-ecosystem are discussed.
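
    The precipitation component described above (first-order Markov occurrence, gamma-distributed amounts on wet days) can be sketched as follows, with hypothetical parameter values:

```python
import random

random.seed(11)

P_RAIN = {True: 0.6, False: 0.25}   # hypothetical P(rain today | rain yesterday)

def simulate_precip(days=365):
    """First-order Markov chain for rain occurrence; gamma-distributed
    amounts (hypothetical shape and scale, in mm) on wet days."""
    rained, series = False, []
    for _ in range(days):
        rained = random.random() < P_RAIN[rained]
        series.append(random.gammavariate(0.8, 6.0) if rained else 0.0)
    return series

precip = simulate_precip()
wet_days = sum(1 for a in precip if a > 0)
```

Conditioning the amount distribution on occurrence, as here, is what lets the model reproduce both wet-day frequency and rainfall-depth statistics.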

  10. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distribution methods to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distribution methods. We used EasyFit and MATLAB software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distribution methods for this region.
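
A minimal version of this model comparison can be reproduced with SciPy, fitting the three candidate distributions and scoring each with the K-S test; the synthetic inter-event data and all parameter values here are assumptions for illustration (SciPy's `invweibull` plays the role of the Frechet distribution):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical inter-event times (years), standing in for the catalogue data.
data = stats.weibull_min.rvs(1.4, scale=8.0, size=80, random_state=rng)

candidates = {
    # Two-parameter Weibull (location fixed at zero)
    "Weibull": stats.weibull_min.fit(data, floc=0),
    # Frechet = inverse Weibull in SciPy's naming
    "Frechet": stats.invweibull.fit(data, floc=0),
    # Three-parameter Weibull (location free)
    "Weibull-3p": stats.weibull_min.fit(data),
}
dists = {"Weibull": stats.weibull_min, "Frechet": stats.invweibull,
         "Weibull-3p": stats.weibull_min}

for name, params in candidates.items():
    d, p = stats.kstest(data, dists[name].cdf, args=params)
    print(f"{name:10s}  K-S D = {d:.3f}  p = {p:.3f}")
```

The candidate with the smallest K-S statistic (highest p-value) would be preferred, mirroring the selection procedure in the abstract.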

  11. INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS

    KAUST Repository

    Potter, Kristin; Kirby, Robert Michael; Xiu, Dongbin; Johnson, Chris R.

    2012-01-01

    The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.

  12. On the method of logarithmic cumulants for parametric probability density function estimation.

    Science.gov (United States)

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
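
For a concrete instance of MoLC, the gamma family admits a closed log-cumulant system: E[ln X] = ψ(κ) + ln θ and Var[ln X] = ψ′(κ). A sketch of the resulting estimator, with synthetic test data (the sample size and parameter values are illustrative assumptions):

```python
import numpy as np
from scipy import special, optimize, stats

def molc_gamma(x):
    """Method-of-log-cumulants estimator for the gamma distribution.

    Matches the sample log-cumulants to E[ln X] = psi(k) + ln(theta)
    and Var[ln X] = psi'(k), then solves for (k, theta).
    """
    logx = np.log(x)
    c1, c2 = logx.mean(), logx.var()
    # psi'(k) is strictly decreasing in k, so the bracketed root is unique.
    k = optimize.brentq(lambda k: float(special.polygamma(1, k)) - c2,
                        1e-4, 1e4)
    theta = np.exp(c1 - special.digamma(k))
    return k, theta

x = stats.gamma.rvs(3.0, scale=2.0, size=50_000, random_state=0)
k_hat, theta_hat = molc_gamma(x)
print(k_hat, theta_hat)  # close to the true (3.0, 2.0) for large samples
```

The strong consistency established in the paper is what justifies expecting the estimates to approach the true parameters as the sample grows.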

  13. Decision making generalized by a cumulative probability weighting function

    Science.gov (United States)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed by decision weights by means of a (cumulative) probability weighting function, w(p). We obtain in this article a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays and is supported by phenomenological models.
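
Two widely used parametric forms of w(p) that appear as special cases in this literature can be sketched as follows (the parameter values are commonly quoted illustrative choices, not the paper's fitted values); both overweight small probabilities and underweight large ones:

```python
import numpy as np

def w_tversky_kahneman(p, gamma=0.61):
    """Tversky-Kahneman (1992) one-parameter probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def w_prelec(p, alpha=0.65):
    """Prelec (1998) one-parameter probability weighting function."""
    return np.exp(-(-np.log(p)) ** alpha)

p = np.array([0.01, 0.1, 0.5, 0.9, 0.99])
# Both forms lie above the diagonal for small p and below it for large p.
print(w_tversky_kahneman(p))
print(w_prelec(p))
```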

  14. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating, given that the probability that any one component is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
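
The k-out-of-n reliability that CUMBIN computes reduces to an upper binomial tail; a minimal re-implementation sketch (not the program's actual C source):

```python
from math import comb

def k_out_of_n_reliability(k, n, p):
    """P(at least k of n independent components work), each with reliability p."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# A 2-out-of-3 system with component reliability 0.9:
print(k_out_of_n_reliability(2, 3, 0.9))  # 0.972
```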

  15. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  16. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  17. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
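
The root-finding task NEWTONP performs can be sketched with Newton's method, using the incomplete-beta identity for the derivative of the binomial tail; this is an illustrative reconstruction, not the program's actual source:

```python
from math import comb

def reliability(p, k, n):
    """Upper binomial tail: P(at least k of n components work)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def solve_p(V, k, n, tol=1e-12):
    """Newton's method for the component reliability p that gives
    k-out-of-n system reliability V."""
    p = 0.5
    for _ in range(100):
        # d/dp of the upper binomial tail (incomplete-beta identity):
        deriv = k * comb(n, k) * p**(k - 1) * (1 - p)**(n - k)
        step = (reliability(p, k, n) - V) / deriv
        p = min(max(p - step, 1e-9), 1 - 1e-9)
        if abs(step) < tol:
            break
    return p

p = solve_p(0.972, 2, 3)
print(p)  # ~0.9
```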

  18. Cumulative Clearness Index Frequency Distributions on the Territory of the Russian Federation

    Science.gov (United States)

    Frid, S. E.; Lisitskaya, N. V.; Popel, O. S.

    2018-02-01

    Cumulative distributions of clearness index values are constructed for the territory of Russia based on ground observation results and NASA POWER data. The obtained distributions lie close to each other, which means that the NASA POWER data can be used in solar power installations simulation at temperate and high latitudes. Approximation of the obtained distributions is carried out. The values of equation coefficients for the cumulative clearness index distributions constructed for a wide range of climatic conditions are determined. Equations proposed for a tropical climate are used in the calculations, so they can be regarded as universal ones.

  19. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    Science.gov (United States)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
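
The two advocated statistics can be read directly off the empirical cumulative distribution of absolute errors; a sketch with synthetic, deliberately skewed "benchmark errors" (the error model is an assumption for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical benchmark errors: skewed and not zero-centred, as in the paper.
errors = rng.normal(loc=0.5, scale=1.0, size=400) + rng.exponential(0.5, 400)

abs_err = np.sort(np.abs(errors))

def prob_below(eta):
    """Empirical P(|error| < eta): fraction of the benchmark set below eta."""
    return np.searchsorted(abs_err, eta) / abs_err.size

def q_conf(level=0.95):
    """Error amplitude not exceeded with the chosen confidence level."""
    return float(np.quantile(abs_err, level))

print(prob_below(1.0), q_conf(0.95))
```

These are exactly the "probability of an absolute error below a threshold" and "maximal expected error amplitude at a confidence level" statistics described above.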

  20. Evaluation of a post-analysis method for cumulative dose distribution in stereotactic body radiotherapy

    International Nuclear Information System (INIS)

    Imae, Toshikazu; Takenaka, Shigeharu; Saotome, Naoya

    2016-01-01

    The purpose of this study was to evaluate a post-analysis method for cumulative dose distribution in stereotactic body radiotherapy (SBRT) using volumetric modulated arc therapy (VMAT). VMAT is capable of acquiring respiratory signals derived from projection images and machine parameters based on machine logs during VMAT delivery. Dose distributions were reconstructed from the respiratory signals and machine parameters for respiratory signals without division and divided into 4 and 10 phases. The dose distribution of each respiratory phase was calculated on the planned four-dimensional CT (4DCT). Summation of the dose distributions was carried out using deformable image registration (DIR), and cumulative dose distributions were compared with those of the corresponding plans. Without division, dose differences between the cumulative distribution and the plan were not significant. When the respiratory signals were divided, dose differences were observed: overdosage in the cranial region and underdosage in the caudal region of the planning target volume (PTV). Differences between 4 and 10 phases were not significant. The present method was feasible for evaluating cumulative dose distribution in VMAT-SBRT using 4DCT and DIR. (author)

  1. Transmuted Lindley-Geometric Distribution and its applications

    OpenAIRE

    Merovci, Faton; Elbatal, Ibrahim

    2013-01-01

    A functional composition of the cumulative distribution function of one probability distribution with the inverse cumulative distribution function of another is called the transmutation map. In this article, we will use the quadratic rank transmutation map (QRTM) in order to generate a flexible family of probability distributions taking Lindley geometric distribution as the base value distribution by introducing a new parameter that would offer more distributional flexibility. It will be show...

  2. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.

  3. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramics and glass depends on the size distribution of flaws in these materials. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramics and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical calculations supporting the strength analysis of silicon nitride material were carried out utilizing MATLAB. (author)
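
A sketch of the Weibull treatment of fracture strength, using SciPy in place of MATLAB; the stress data below are hypothetical, not the silicon nitride measurements of the paper:

```python
import numpy as np
from scipy import stats

# Hypothetical fracture stresses (MPa) for a brittle ceramic sample set.
stress = np.array([310., 340., 355., 370., 385., 400., 410., 430., 450., 480.])

# Fit a two-parameter Weibull distribution (location fixed at zero).
m, _, sigma0 = stats.weibull_min.fit(stress, floc=0)  # m: Weibull modulus

def failure_probability(s):
    """Cumulative failure probability P_f = 1 - exp(-(s/sigma0)^m)."""
    return 1.0 - np.exp(-(s / sigma0) ** m)

def material_reliability(s):
    """Survival probability at stress s."""
    return 1.0 - failure_probability(s)

print(f"Weibull modulus m = {m:.1f}, scale = {sigma0:.0f} MPa")
print(f"P_f(400 MPa) = {failure_probability(400.0):.2f}")
```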

  4. Baseline for the cumulants of net-proton distributions at STAR

    International Nuclear Information System (INIS)

    Luo, Xiaofeng; Mohanty, Bedangadas; Xu, Nu

    2014-01-01

    We present a systematic comparison between the recently measured cumulants of the net-proton distributions by STAR for 0–5% central Au + Au collisions at √(s_NN)=7.7–200 GeV and two kinds of possible baseline measures, the Poisson and binomial baselines. These baseline measures assume that the proton and anti-proton distributions independently follow Poisson statistics or binomial statistics. The higher-order cumulants of the net-proton data are observed to deviate from all the baseline measures studied at 19.6 and 27 GeV. We also compare the net-proton with net-baryon fluctuations in the UrQMD and AMPT models, and convert the net-proton fluctuations to net-baryon fluctuations in the AMPT model by using a set of formulae.
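
The Poisson baseline referred to above is the Skellam distribution: if protons and antiprotons are independent Poisson variables, their difference has cumulants C_n = μ_p + (−1)ⁿ μ_p̄ in closed form. A sketch (the proton/antiproton means are illustrative, not STAR values):

```python
def skellam_cumulants(mu_p, mu_pbar, n_max=4):
    """Cumulants of the difference of two independent Poisson variables:
    C_n = mu_p + (-1)^n * mu_pbar."""
    return [mu_p + (-1) ** n * mu_pbar for n in range(1, n_max + 1)]

C1, C2, C3, C4 = skellam_cumulants(mu_p=5.6, mu_pbar=4.1)
# Baseline cumulant ratios that are commonly compared with data:
print(C2 / C1, C3 / C2, C4 / C2)
```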

  5. Newton/Poisson-Distribution Program

    Science.gov (United States)

    Bowerman, Paul N.; Scheuer, Ernest M.

    1990-01-01

    NEWTPOIS, one of two computer programs making calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) used independently of one another. NEWTPOIS determines the Poisson parameter for a given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for χ² distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Program written in C.
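
The inversion NEWTPOIS performs can be sketched with Newton's method, using the fact that the derivative of the cumulative Poisson probability with respect to λ is minus the pmf at k; this is an illustrative reconstruction, not the program's actual source:

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(exp(-lam) * lam**j / factorial(j) for j in range(k + 1))

def poisson_parameter(k, target, tol=1e-12):
    """Newton's method for the lambda with P(X <= k; lambda) = target.

    Uses d/dlam P(X <= k; lam) = -pmf(k; lam).
    """
    lam = k + 1.0  # reasonable starting point
    for _ in range(100):
        pmf_k = exp(-lam) * lam**k / factorial(k)
        step = (poisson_cdf(k, lam) - target) / (-pmf_k)
        lam = max(lam - step, 1e-12)
        if abs(step) < tol:
            break
    return lam

lam = poisson_parameter(k=3, target=0.5)
print(lam)  # the lambda whose Poisson median is 3, roughly 3.67
```

Because P(X ≤ k; λ) equals the upper incomplete gamma ratio Q(k+1, λ), this same λ is a percentile of the gamma distribution with integer shape k+1, as the record notes.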

  6. Decision analysis with cumulative prospect theory.

    Science.gov (United States)

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.

  7. An uncertainty importance measure using a distance metric for the change in a cumulative distribution function

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Han, Seok-Jung; Tak, Nam-IL

    2000-01-01

    A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of an output distribution, while most of the existing uncertainty importance measures reflect the magnitude of relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first example is an application of the present measure to a typical problem of a system fault tree analysis and the second one is for a hypothetical non-linear model. Comparisons of the present result with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool to express the measure of uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of an output distribution
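
One simple concrete choice of metric distance between two CDFs is the L1 distance between the curves (for empirical CDFs this equals the 1-Wasserstein distance between the samples); the sketch below uses hypothetical "nominal" and "perturbed-input" output samples, not the fault-tree example of the paper:

```python
import numpy as np

def cdf_metric_distance(x, y, n_grid=2000):
    """L1 metric distance between two empirical CDFs on a shared grid."""
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    t = np.linspace(lo, hi, n_grid)
    Fx = np.searchsorted(np.sort(x), t, side="right") / x.size
    Fy = np.searchsorted(np.sort(y), t, side="right") / y.size
    return float(np.abs(Fx - Fy).sum() * (t[1] - t[0]))

rng = np.random.default_rng(3)
base = rng.lognormal(0.0, 0.5, 100_000)     # nominal model output
shifted = rng.lognormal(0.2, 0.5, 100_000)  # output after an input change
print(cdf_metric_distance(base, shifted))
```

An input whose distributional change produces a larger CDF distance in the output would be ranked as more important, in the spirit of the measure described above.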

  8. Joint Probability Distributions for a Class of Non-Markovian Processes

    OpenAIRE

    Baule, A.; Friedrich, R.

    2004-01-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H.C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single time probability distributions to the case of N-time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fr...

  9. Fitness Probability Distribution of Bit-Flip Mutation.

    Science.gov (United States)

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
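
For Onemax the exact fitness distribution under uniform bit-flip mutation can be written down directly: a string with k ones loses X ~ Bin(k, p) ones and gains Y ~ Bin(n−k, p), so the new fitness is k − X + Y and each probability is a polynomial in p, consistent with the result stated above. A direct enumeration sketch (not the Krawtchouk-polynomial derivation of the paper):

```python
from math import comb

def binom_pmf(j, n, p):
    return comb(n, j) * p**j * (1 - p)**(n - j)

def onemax_mutation_distribution(n, k, p):
    """Exact fitness distribution of Onemax after uniform bit-flip mutation
    of a string with k ones out of n bits, flip probability p per bit."""
    dist = {}
    for x in range(k + 1):           # ones flipped to zero
        for y in range(n - k + 1):   # zeros flipped to one
            f = k - x + y
            dist[f] = dist.get(f, 0.0) + binom_pmf(x, k, p) * binom_pmf(y, n - k, p)
    return dist

d = onemax_mutation_distribution(n=10, k=7, p=0.1)
print(sum(d.values()))    # probabilities sum to 1
print(max(d, key=d.get))  # most likely fitness for small p: unchanged, 7
```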

  10. Joint probability distributions for a class of non-Markovian processes.

    Science.gov (United States)

    Baule, A; Friedrich, R

    2005-02-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H. C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single-time probability distributions to the case of N -time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.

  11. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)

  12. Some applications of the fractional Poisson probability distribution

    International Nuclear Information System (INIS)

    Laskin, Nick

    2009-01-01

    Physical and mathematical applications of the recently invented fractional Poisson probability distribution have been presented. As a physical application, a new family of quantum coherent states has been introduced and studied. As mathematical applications, we have developed the fractional generalization of Bell polynomials, Bell numbers, and Stirling numbers of the second kind. The appearance of fractional Bell polynomials is natural if one evaluates the diagonal matrix element of the evolution operator in the basis of newly introduced quantum coherent states. Fractional Stirling numbers of the second kind have been introduced and applied to evaluate the skewness and kurtosis of the fractional Poisson probability distribution function. A representation of the Bernoulli numbers in terms of fractional Stirling numbers of the second kind has been found. In the limit case when the fractional Poisson probability distribution becomes the Poisson probability distribution, all of the above listed developments and implementations turn into the well-known results of the quantum optics and the theory of combinatorial numbers.

  13. Incorporating Skew into RMS Surface Roughness Probability Distribution

    Science.gov (United States)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
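
A sketch of the contrast between the symmetric and asymmetric treatments, using synthetic right-skewed "roughness" data and SciPy's skew-normal family as one possible asymmetric model (the paper does not specify this particular distribution; all parameter values are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical RMS roughness measurements (nm), right-skewed by construction.
roughness = stats.skewnorm.rvs(a=4.0, loc=1.0, scale=0.5, size=2000,
                               random_state=rng)

# Symmetric (Gaussian) treatment: the mode estimate coincides with the mean.
gauss_mode = roughness.mean()

# Asymmetric treatment: fit a skew-normal and locate its mode numerically.
a, loc, scale = stats.skewnorm.fit(roughness)
grid = np.linspace(roughness.min(), roughness.max(), 10_000)
skew_mode = grid[np.argmax(stats.skewnorm.pdf(grid, a, loc, scale))]

print(f"Gaussian mode estimate:    {gauss_mode:.3f} nm")
print(f"Skew-normal mode estimate: {skew_mode:.3f} nm")
```

For right-skewed data the mean exceeds the mode, so the Gaussian treatment overestimates the most probable RMS, which is the effect the abstract describes.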

  14. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  15. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions was developed and submitted to the "Sub-committee on st..."

  16. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
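The abstract does not spell out the model, but a common Poisson-type TCP sketch with a Gaussian spread in radiosensitivity (all parameter values below are hypothetical, not from the paper) illustrates the effect described: heterogeneity in a shallows the population dose-response curve.

```python
import numpy as np

def tcp(dose, n_clonogens=1e7, alpha_mean=0.3, sigma_alpha=0.08):
    """Population-averaged tumour control probability (illustrative sketch).

    Averages the Poisson TCP, exp(-N * exp(-alpha * D)), over a Gaussian
    distribution of the radiosensitivity alpha (inter-patient heterogeneity
    with spread sigma_alpha, playing the role of s_a).
    """
    alphas = np.linspace(alpha_mean - 4 * sigma_alpha,
                         alpha_mean + 4 * sigma_alpha, 801)
    weights = np.exp(-0.5 * ((alphas - alpha_mean) / sigma_alpha) ** 2)
    weights /= weights.sum()
    tcp_per_alpha = np.exp(-n_clonogens * np.exp(-alphas * dose))
    return float(np.sum(weights * tcp_per_alpha))

# A larger sigma_alpha flattens the dose-response curve: control rises
# more gradually with dose than for a radiosensitivity-homogeneous population.
```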

  17. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.

  18. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combining the input distributions is defined by the user by introducing the appropriate FORTRAN statements into a user subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits for the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined.
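The approach is straightforward to sketch in modern terms (Python standing in for STADIC's user-supplied FORTRAN function; the combination function and distribution parameters below are illustrative, not from the code):

```python
import numpy as np

rng = np.random.default_rng(0)

def combine(samples_a, samples_b):
    # User-defined combination function; here the product of the two
    # inputs, as might arise for e.g. failure rate * exposure time.
    return samples_a * samples_b

# Monte Carlo sampling from each input distribution ...
a = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)
b = rng.uniform(1.0, 3.0, size=100_000)

# ... combined sample-by-sample gives a random sample of the output.
out = combine(a, b)

mean = out.mean()
std = out.std(ddof=1)
lo, hi = np.quantile(out, [0.05, 0.95])  # 90% confidence limits
```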

  19. Tumour control probability (TCP) for non-uniform activity distribution in radionuclide therapy

    International Nuclear Information System (INIS)

    Uusijaervi, Helena; Bernhardt, Peter; Forssell-Aronsson, Eva

    2008-01-01

Non-uniform radionuclide distribution in tumours will lead to a non-uniform absorbed dose. The aim of this study was to investigate how tumour control probability (TCP) depends on the radionuclide distribution in the tumour, both macroscopically and at the subcellular level. The absorbed dose in the cell nuclei of tumours was calculated for 90Y, 177Lu, 103mRh and 211At. The radionuclides were uniformly distributed within the subcellular compartment and they were uniformly, normally or log-normally distributed among the cells in the tumour. When all cells contain the same amount of activity, the cumulated activities required for TCP = 0.99 (Ã_TCP=0.99) were 1.5-2 and 2-3 times higher when the activity was distributed on the cell membrane compared to in the cell nucleus for 103mRh and 211At, respectively. TCP for 90Y was not affected by different radionuclide distributions, whereas for 177Lu, it was slightly affected when the radionuclide was in the nucleus. TCP for 103mRh and 211At were affected by different radionuclide distributions to a great extent when the radionuclides were in the cell nucleus and to lesser extents when the radionuclides were distributed on the cell membrane or in the cytoplasm. When the activity was distributed in the nucleus, Ã_TCP=0.99 increased when the activity distribution became more heterogeneous for 103mRh and 211At, and the increase was large when the activity was normally distributed compared to log-normally distributed. When the activity was distributed on the cell membrane, Ã_TCP=0.99 was not affected for 103mRh and 211At when the activity distribution became more heterogeneous. Ã_TCP=0.99 for 90Y and 177Lu were not affected by different activity distributions, neither macroscopic nor subcellular.

  20. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
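A truncated exponential with scale s and upper cut-off x_max has CDF F(x) = (1 − e^(−x/s)) / (1 − e^(−x_max/s)) on [0, x_max], so it is easy to sample by inverse transform (a generic sketch; the parameters below are illustrative, not the paper's fitted values):

```python
import numpy as np

def sample_truncated_exponential(n, scale, x_max, rng=None):
    """Inverse-transform sampling of an exponential truncated at x_max."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=n)
    # Invert F(x) = (1 - exp(-x/scale)) / (1 - exp(-x_max/scale))
    return -scale * np.log1p(-u * (1.0 - np.exp(-x_max / scale)))

slip = sample_truncated_exponential(50_000, scale=1.2, x_max=4.0,
                                    rng=np.random.default_rng(1))
# Every draw respects the physical cap x_max, unlike an untruncated
# exponential, and the sample mean sits below the untruncated scale.
```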

  2. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by using the hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0, 1), as opposed to the Erlang distribution, which has only discrete values of the coefficient of variation.
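A two-phase hypoexponential (sum of two independent exponentials with different rates) can match a given mean m and coefficient of variation c whenever c² ∈ [1/2, 1); the moment-matching algebra can be sketched as follows (the paper's multi-exponential construction handling smaller c is more general than this two-phase special case):

```python
import math
import numpy as np

def hypoexp_phases(mean, cv):
    """Phase means (1/rate1, 1/rate2) of a 2-phase hypoexponential matching
    `mean` and coefficient of variation `cv`, valid for cv**2 in [0.5, 1).

    If a + b = mean and a**2 + b**2 = (cv * mean)**2, then
    a * b = mean**2 * (1 - cv**2) / 2, so a and b solve a quadratic.
    """
    s = mean
    p = mean**2 * (1.0 - cv**2) / 2.0
    disc = s * s - 4.0 * p
    if disc < 0.0:
        raise ValueError("2-phase hypoexponential needs cv**2 >= 0.5")
    a = (s + math.sqrt(disc)) / 2.0
    b = (s - math.sqrt(disc)) / 2.0
    return a, b

a, b = hypoexp_phases(mean=2.0, cv=0.8)
rng = np.random.default_rng(2)
x = rng.exponential(a, 200_000) + rng.exponential(b, 200_000)
# x has mean ~2.0 and coefficient of variation ~0.8 by construction.
```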

  3. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  4. A MULTIVARIATE WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Cheng Lee

    2010-07-01

A multivariate survival function of the Weibull distribution is developed by expanding the theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian matrix, and the general moment are derived.

  5. A technique to obtain a multiparameter radar rainfall algorithm using the probability matching procedure

    International Nuclear Information System (INIS)

    Gorgucci, E.; Scarchilli, G.

    1997-01-01

The natural cumulative distributions of rainfall observed by a network of rain gauges and a multiparameter radar are matched to derive multiparameter radar algorithms for rainfall estimation. The use of multiparameter radar measurements in a statistical framework to estimate rainfall is presented in this paper. The techniques developed in this paper are applied to the radar and rain gauge measurements of rainfall observed in central Florida and central Italy. Conventional pointwise estimates of rainfall are also compared. The probability matching procedure, when applied to the radar and surface measurements, shows that multiparameter radar algorithms can match the probability distribution function better than the reflectivity-based algorithms. It is also shown that the multiparameter radar algorithm derived by matching the cumulative distribution function of rainfall provides more accurate estimates of rainfall on the ground in comparison to any conventional reflectivity-based algorithm.
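The core of the probability matching procedure is to pair values of the two variables at equal cumulative probability, i.e. to match their empirical quantiles. A generic sketch (with synthetic data, not the paper's multiparameter algorithm):

```python
import numpy as np

def probability_match(radar_obs, gauge_rain, probs=None):
    """Pair a radar observable and gauge rain rate at equal cumulative
    probability: the q-quantile of one is mapped to the q-quantile of
    the other, yielding a monotone calibration curve."""
    if probs is None:
        probs = np.linspace(0.01, 0.99, 99)
    return np.quantile(radar_obs, probs), np.quantile(gauge_rain, probs)

rng = np.random.default_rng(3)
rain = rng.gamma(shape=2.0, scale=3.0, size=5000)           # gauge rain rates
radar = 300.0 * rain**1.4 * rng.lognormal(0, 0.1, 5000)     # synthetic Z-R-like observable
z_q, r_q = probability_match(radar, rain)
# (z_q, r_q) pairs define a radar-to-rainfall calibration curve.
```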

  6. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    Science.gov (United States)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce a finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane, generating the peak ground motion. The appropriate ground motion prediction equations (GMPE) for PSHA can then be applied. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with the centroid fixed at the geometrical mean is discussed; in this model hazard is reduced at the edges because the effective size is reduced. There is currently a trend of using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.

  7. NEWTPOIS- NEWTON POISSON DISTRIBUTION PROGRAM

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

The cumulative Poisson distribution program, NEWTPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714), can be used independently of one another. NEWTPOIS determines percentiles for gamma distributions with integer shape parameters and calculates percentiles for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. NEWTPOIS determines the Poisson parameter (lambda), that is, the mean (or expected) number of events occurring in a given unit of time, area, or space. Given that the user already knows the cumulative probability for a specific number of occurrences (n), it is usually a simple matter of substitution into the Poisson distribution summation to arrive at lambda. However, direct calculation of the Poisson parameter becomes difficult for small positive values of n and unmanageable for large values. NEWTPOIS uses Newton's iteration method to extract lambda from the initial value condition of the Poisson distribution where n=0, taking successive estimations until some user-specified error term (epsilon) is reached. The NEWTPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting epsilon, n, and the cumulative probability of the occurrence of n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 30K. NEWTPOIS was developed in 1988.
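The inversion NEWTPOIS performs can be sketched in a few lines (Python here rather than the program's C, and not NEWTPOIS's actual iteration details). It uses the identity that the derivative of the cumulative Poisson P(X ≤ n; λ) with respect to λ is −e^(−λ)λⁿ/n!:

```python
import math

def poisson_cdf(n, lam):
    """P(X <= n) for X ~ Poisson(lam), by direct summation."""
    term = total = math.exp(-lam)
    for k in range(1, n + 1):
        term *= lam / k
        total += term
    return total

def poisson_lambda(n, p, eps=1e-12):
    """Newton's method for lam such that P(X <= n; lam) = p, 0 < p < 1.
    Starting near lam = n + 1 keeps the iteration in the well-behaved
    region of the CDF; a positivity clamp guards against overshoot."""
    lam = n + 1.0
    for _ in range(200):
        f = poisson_cdf(n, lam) - p
        # d/dlam P(X <= n; lam) = -exp(-lam) * lam**n / n!
        fprime = -math.exp(-lam) * lam**n / math.factorial(n)
        new_lam = lam - f / fprime
        if new_lam <= 0.0:          # overshot past zero: damp the step
            new_lam = lam / 2.0
        if abs(new_lam - lam) < eps * max(1.0, lam):
            return new_lam
        lam = new_lam
    return lam
```

For n = 0 the answer is simply lam = −ln(p), which gives a quick sanity check of the iteration.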

  8. Computation of Probabilities in Causal Models of History of Science

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2006-12-01

The aim of this paper is to investigate the ascription of probabilities in a causal model of an episode in the history of science. The aim of such a quantitative approach is to allow the implementation of the causal model in a computer, to run simulations. As an example, we look at the beginning of the science of magnetism, "explaining" — in a probabilistic way, in terms of a single causal model — why the field advanced in China but not in Europe (the difference is due to different prior probabilities of certain cultural manifestations). Given the number of years between the occurrences of two causally connected advances X and Y, one proposes a criterion for stipulating the value p(Y|X) of the conditional probability of an advance Y occurring, given X. Next, one must assume a specific form for the cumulative probability function p(Y|X)(t), which we take to be the time integral of an exponential distribution function, as is done in the physics of radioactive decay. Rules for calculating the cumulative functions for more than two events are mentioned, involving composition, disjunction and conjunction of causes. We also consider the problems involved in supposing that the appearance of events in time follows an exponential distribution, which are a consequence of the fact that a composition of causes does not follow an exponential distribution, but a "hypoexponential" one. We suggest that a gamma distribution function might more adequately represent the appearance of advances.
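Under the stated assumption, the cumulative probability that Y has occurred by time t after X takes the familiar exponential-waiting-time form (a sketch; the time constant τ is a hypothetical fitted parameter, not given in the abstract):

```latex
P_{Y|X}(t) \;=\; p_{Y|X}\int_{0}^{t} \frac{1}{\tau}\, e^{-t'/\tau}\, dt'
          \;=\; p_{Y|X}\left(1 - e^{-t/\tau}\right),
\qquad \lim_{t\to\infty} P_{Y|X}(t) \;=\; p_{Y|X}.
```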

  9. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel-time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distributions as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
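For independent links, the route travel-time distribution is the convolution of the link distributions. The paper's method additionally conditions on the upstream link to capture correlation; this sketch shows only the independent-links baseline, with made-up discretized PMFs rather than FPM data:

```python
import numpy as np

# Travel-time PMFs for two successive links, discretized in 1-minute bins
# starting at 0 minutes (illustrative numbers).
link_a = np.array([0.0, 0.1, 0.5, 0.3, 0.1])   # P(T_a = 0..4 min)
link_b = np.array([0.0, 0.2, 0.6, 0.2])        # P(T_b = 0..3 min)

# Route PMF under independence: P(T_a + T_b = t) via discrete convolution.
route = np.convolve(link_a, link_b)

expected_route_time = float(np.dot(np.arange(route.size), route))
# E[T_a + T_b] equals E[T_a] + E[T_b] under any dependence structure,
# but the shape (and hence reliability measures) depends on correlation.
```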

  10. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC on increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge seems small for any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be well fitted by the gamma distribution. Finally, an application to a small watershed, with the aim of testing the possibility of preparing in advance the rational runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.

  11. Expansion formulae for characteristics of cumulative cost in finite horizon production models

    NARCIS (Netherlands)

    Ayhan, H.; Schlegel, S.

    2001-01-01

    We consider the expected value and the tail probability of cumulative shortage and holding cost (i.e. the probability that cumulative cost is more than a certain value) in finite horizon production models. An exact expression is provided for the expected value of the cumulative cost for general

  12. Most probable degree distribution at fixed structural entropy

    Indian Academy of Sciences (India)

    Here we derive the most probable degree distribution emerging ... the structural entropy of power-law networks is an increasing function of the expo- .... tition function Z of the network as the sum over all degree distributions, with given energy.

  13. GammaCHI: a package for the inversion and computation of the gamma and chi-square cumulative distribution functions (central and noncentral)

    NARCIS (Netherlands)

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2015-01-01

A Fortran 90 module GammaCHI for computing and inverting the gamma and chi-square cumulative distribution functions (central and noncentral) is presented. The main novelty of this package is the reliable and accurate inversion routines for the noncentral cumulative distribution
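GammaCHI's Fortran routines are not reproduced here, but the central even-degrees-of-freedom case can be sketched in pure Python using the chi-square/Poisson identity noted elsewhere on this page (P(χ²_df ≤ x) = 1 − P(N ≤ df/2 − 1) with N ~ Poisson(x/2)), with inversion by simple bisection:

```python
import math

def chi2_cdf_even_df(x, df):
    """Central chi-square CDF for even df, via the cumulative Poisson:
    P(chi2_df <= x) = 1 - P(N <= df/2 - 1), N ~ Poisson(x / 2)."""
    assert df % 2 == 0 and df > 0
    lam = x / 2.0
    term = tail = math.exp(-lam)
    for k in range(1, df // 2):
        term *= lam / k
        tail += term
    return 1.0 - tail

def chi2_ppf_even_df(p, df, tol=1e-12):
    """Invert the CDF by bisection (robust, if far slower than GammaCHI's
    dedicated inversion routines)."""
    lo, hi = 0.0, 1.0
    while chi2_cdf_even_df(hi, df) < p:   # grow the bracket
        hi *= 2.0
    while hi - lo > tol * max(1.0, hi):
        mid = 0.5 * (lo + hi)
        if chi2_cdf_even_df(mid, df) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For df = 2 the CDF reduces to 1 − e^(−x/2), so the median is 2 ln 2, a handy spot check.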

  14. Geometry of q-Exponential Family of Probability Distributions

    Directory of Open Access Journals (Sweden)

    Shun-ichi Amari

    2011-06-01

The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found many distributions obeying the power law rather than the standard Gibbs-type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution or the q-exponential family by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give a new mathematical structure to the q-exponential family different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation, and conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (maximum a posteriori probability) estimator.
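The q-family of power functions referred to here is the standard Tsallis deformation of exp and log, which recovers the ordinary functions in the limit q → 1:

```latex
\exp_q(x) \;=\; \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}},
\qquad
\ln_q(x) \;=\; \frac{x^{\,1-q} - 1}{1-q},
\qquad
\lim_{q \to 1} \exp_q(x) \;=\; e^{x}.
```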

  15. Comparing the ISO-recommended and the cumulative data-reduction algorithms in S-on-1 laser damage test by a reverse approach method

    Science.gov (United States)

    Zorila, Alexandru; Stratan, Aurel; Nemes, George

    2018-01-01

    We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid additional errors due to real experiments, a simulated test is performed, named the reverse approach. This approach simulates the real damage experiments, by generating artificial test-data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function to induce the damage. In this work, a database of 12 sets of test-data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as usual for the S-on-1 test. Each of the test-data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness of fit estimator (adjusted R-squared) is almost the same for all three algorithms.

  16. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  17. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
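One common parameterization of this pair (an assumption here; the paper's own convention may place the parameter differently) defines the generalized logarithm as ln_q̃(x) = (x^q̃ − 1)/q̃ with inverse exp_q̃(x) = (1 + q̃x)^(1/q̃), both reducing to ln and exp as q̃ → 0:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm; q = 0 recovers ln(x)."""
    if q == 0.0:
        return math.log(x)
    return (x**q - 1.0) / q

def gen_exp(x, q):
    """Inverse of gen_log; q = 0 recovers exp(x)."""
    if q == 0.0:
        return math.exp(x)
    return (1.0 + q * x) ** (1.0 / q)

# Round trip: gen_exp(gen_log(x, q), q) == x for admissible x and q.
```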

  18. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

The likelihood of a slip is related to the available and required friction for a certain activity, here, gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed to be normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
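A sketch of a single-integral form with trapezoidal quadrature (distributions and parameter values are illustrative, not the paper's data): for independent frictions, P(slip) = P(available < required) = ∫ f_req(u) F_avail(u) du, which for two normals can be cross-checked against the closed form.

```python
import math
import numpy as np

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def normal_cdf(x, mu, sd):
    return np.array([0.5 * (1.0 + math.erf((xi - mu) / (sd * math.sqrt(2.0))))
                     for xi in x])

# Illustrative friction-coefficient parameters (not measured data).
mu_avail, sd_avail = 0.50, 0.08   # available friction
mu_req, sd_req = 0.30, 0.05       # required friction

u = np.linspace(0.0, 1.0, 20_001)
integrand = normal_pdf(u, mu_req, sd_req) * normal_cdf(u, mu_avail, sd_avail)

# Trapezoidal method, as in the paper.
p_slip = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(u)))

# Closed form for independent normals: P(A - R < 0) with A - R normal.
p_exact = 0.5 * (1.0 + math.erf((mu_req - mu_avail) /
                                (math.sqrt(sd_avail**2 + sd_req**2) * math.sqrt(2.0))))
```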

  19. CDF-XL: computing cumulative distribution functions of reaction time data in Excel.

    Science.gov (United States)

    Houghton, George; Grange, James A

    2011-12-01

    In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.

  20. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness-of-fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of Bayes' theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
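
    One of the fitting routes the report surveys, maximum likelihood estimation followed by a goodness-of-fit check, can be sketched briefly. The data here are synthetic, not the report's Yucca Mountain data, and the lognormal choice is an assumption for demonstration.

```python
import numpy as np
from scipy import stats

# Synthetic data standing in for a positive-valued input parameter
rng = np.random.default_rng(0)
data = rng.lognormal(mean=1.0, sigma=0.5, size=500)

# Maximum likelihood fit of a lognormal (location fixed at zero)
shape, loc, scale = stats.lognorm.fit(data, floc=0)

# Goodness of fit via a Kolmogorov-Smirnov test against the fitted model
ks = stats.kstest(data, 'lognorm', args=(shape, loc, scale))
print(shape, scale, ks.pvalue)
```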

  1. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness-of-fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of Bayes' theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  2. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one contributing fracture, the true flowing interval spacing could be smaller than the spacing estimated here.

  3. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, and has led to the recent development of information-theoretic methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PRA), machine learning, pattern recognition, image processing, neur

  4. A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex.

    Science.gov (United States)

    Chan, Stephanie C Y; Niv, Yael; Norman, Kenneth A

    2016-07-27

    The orbitofrontal cortex (OFC) has been implicated in both the representation of "state," in studies of reinforcement learning and decision making, and also in the representation of "schemas," in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or "latent cause" that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or "belief distribution") over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true "state" of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or "schema"). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. Copyright © 2016 the authors 0270-6474/16/367817-12$15.00/0.
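
    The inference the participants performed is ordinary Bayesian updating over a small discrete hypothesis space. A minimal sketch, with hypothetical likelihood values rather than the study's stimuli:

```python
import numpy as np

# Four latent causes, uniform prior; likelihoods P(observation | cause)
# are illustrative assumptions, not the experiment's values.
prior = np.full(4, 0.25)
likelihood = np.array([0.6, 0.2, 0.15, 0.05])

# Bayes' rule: posterior proportional to prior * likelihood, renormalized
posterior = prior * likelihood
posterior /= posterior.sum()

# The study found OFC activity best explained by the log of this quantity
log_posterior = np.log(posterior)
print(posterior)
```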

  5. LAW DISTRIBUTION APPROXIMATION ON EIGENSTATE ERRORS OF ADS-B BASED ON CUMULANT ANALYSIS OF ADS-B-RAD SYSTEM DATA DISPARITY

    Directory of Open Access Journals (Sweden)

    2017-01-01

    Full Text Available The article deals with a new method for approximating the error distribution of an enhanced-accuracy measurement system. The method is based on the mistie (disparity) analysis between this system and more robust existing equipment. It is considered on the example of comparing Automatic Dependent Surveillance - Broadcast (ADS-B) with the ground radar warning system used at present. The peculiarity of the considered problem is that the target parameter (aircraft swerve value) may drastically change on the scale of both measurement systems' errors during observation. It is therefore impossible to determine the position of the aircraft by repeatedly observing it with the ground radar warning system; it is only possible to compare the systems' one-shot measurements, which are called errors here. The article assumes that the probability density of the robust measurement system's errors (the system that has been continuously in operation) is known, that a histogram of errors is given, and that an asymptotic estimate of the error distribution for a new, improved measurement system is needed. The approach is based on cumulant analysis of the measurement systems' error distribution functions, which allows the corresponding infinite series to be truncated properly. The author shows that, due to the measurement systems' independence, the cumulants of their error distributions are connected by a simple ratio, which makes the values easy to calculate. To reconstruct the initial form of the distribution, Edgeworth's asymptotic series is used, with a derivative of the normal distribution as the basis function; the latter is proportional to a Hermite polynomial, so the series can be considered an orthogonal decomposition. The author presents the results of calculating the distribution of the coordinate error component measured along the normal to the aircraft path, using experimental error statistics obtained in ”RI of

  6. Cumulative distribution functions associated with bubble-nucleation processes in cavitation

    KAUST Repository

    Watanabe, Hiroshi

    2010-11-15

    Bubble-nucleation processes of a Lennard-Jones liquid are studied by molecular dynamics simulations. Waiting time, which is the lifetime of a superheated liquid, is determined for several system sizes, and the apparent finite-size effect of the nucleation rate is observed. From the cumulative distribution function of the nucleation events, the bubble-nucleation process is found to be not a simple Poisson process but a Poisson process with an additional relaxation time. The parameters of the exponential distribution associated with the process are determined by taking the relaxation time into account, and the apparent finite-size effect is removed. These results imply that the use of the arithmetic mean of the waiting time until a bubble grows to the critical size leads to an incorrect estimation of the nucleation rate. © 2010 The American Physical Society.

  7. Probability distributions for Markov chain based quantum walks

    Science.gov (United States)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  8. Application of approximations for joint cumulative k-distributions for mixtures to FSK radiation heat transfer in multi-component high temperature non-LTE plasmas

    International Nuclear Information System (INIS)

    Maurente, André; França, Francis H.R.; Miki, Kenji; Howell, John R.

    2012-01-01

    Approximations for joint cumulative k-distributions for mixtures are efficient for full spectrum k-distribution (FSK) computations. These approximations reduce the database needed to perform FSK computation compared to the direct approach, which uses cumulative k-distributions computed from the spectrum of the mixture, and are also less computationally expensive than techniques in which an RTE must be solved for each component of the mixture. The aim of the present paper is to extend the approximations for joint cumulative k-distributions to non-LTE media. To that end, an FSK formulation for non-LTE media well suited to be applied along with approximations for joint cumulative k-distributions is presented. The application of the proposed methodology is demonstrated by solving the radiation heat transfer in non-LTE high temperature plasmas composed of N, O, N2, NO, N2+ and mixtures of these species. The two most efficient approximations, superposition and multiplication, are employed and analyzed.

  9. Evaluation of burst probability for tubes by Weibull distributions

    International Nuclear Information System (INIS)

    Kao, S.

    1975-10-01

    The investigation of candidate distributions that best describe the burst pressure failure probability characteristics of nuclear power steam generator tubes has been continued. To date it has been found that the Weibull distribution provides an acceptable fit for the available data from both the statistical and physical viewpoints. The reasons for the acceptability of the Weibull distribution are stated, together with the results of tests for the suitability of fit. In exploring the acceptability of the Weibull distribution for the fitting, a graphical method called the "density-gram" is employed instead of the usual histogram. With this method a more sensible graphical observation of the empirical density may be made for cases where the available data are very limited. Based on these methods, estimates of failure pressure are made for the left-tail probabilities.
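
    The Weibull fit and a left-tail failure probability can be sketched as follows. The burst-pressure data here are synthetic stand-ins for the steam-generator measurements, and the shape/scale values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Synthetic burst pressures (arbitrary units), not the report's data
rng = np.random.default_rng(1)
burst = rng.weibull(8.0, size=200) * 30.0

# Two-parameter Weibull fit (location fixed at zero)
c, loc, scale = stats.weibull_min.fit(burst, floc=0)

# Left-tail probability: chance a tube bursts at or below a given pressure
p_fail = stats.weibull_min.cdf(20.0, c, loc=0, scale=scale)
print(c, scale, p_fail)
```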

  10. Confidence intervals for the lognormal probability distribution

    International Nuclear Information System (INIS)

    Smith, D.L.; Naberejnev, D.G.

    2004-01-01

    The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a constructed numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large, a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication.
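
    The point about large uncertainty can be made with a short worked example. For a lognormal with underlying normal parameters mu and sigma, the mode, median, and mean separate badly as sigma grows, while a symmetric confidence interval in log space remains informative. The parameter values below are illustrative, not the paper's example.

```python
import math

mu, sigma = 0.0, 1.5   # large uncertainty, illustrative values
z = 1.96               # roughly 95% coverage

# The three candidate "best values" diverge for large sigma
mode = math.exp(mu - sigma**2)
median = math.exp(mu)
mean = math.exp(mu + sigma**2 / 2)

# Symmetric confidence interval constructed in log space
ci_low, ci_high = math.exp(mu - z * sigma), math.exp(mu + z * sigma)
print(mode, median, mean, (ci_low, ci_high))
```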

  11. About the cumulants of periodic signals

    Science.gov (United States)

    Barrau, Axel; El Badaoui, Mohammed

    2018-01-01

    This note studies cumulants of time series. Although these functions originate in probability theory, they are commonly used as features of deterministic signals, so their classical properties are examined in this modified framework. We show that the additivity of cumulants, which is ensured in the case of independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.
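
    The additivity property the note starts from, kappa_n(X + Y) = kappa_n(X) + kappa_n(Y) for independent random variables, can be checked empirically for low orders. The distributions below are arbitrary choices for demonstration; the second and third cumulants equal the variance and the third central moment.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(2.0, size=1_000_000)   # cumulants: k2 = 4, k3 = 16
y = rng.gamma(3.0, 1.0, size=1_000_000)    # cumulants: k2 = 3, k3 = 6
s = x + y

def k2(a):
    # second cumulant = variance
    return a.var()

def k3(a):
    # third cumulant = third central moment
    return np.mean((a - a.mean())**3)

print(k2(s), k2(x) + k2(y))   # both close to 7
print(k3(s), k3(x) + k3(y))   # both close to 22
```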

  12. Galactic Subsystems on the Basis of Cumulative Distribution of Space Velocities

    Directory of Open Access Journals (Sweden)

    Vidojević, S.

    2008-12-01

    Full Text Available A sample containing 4614 stars with available space velocities and high-quality kinematical data from the ARIHIP Catalogue is formed. For the purpose of distinguishing galactic subsystems, the cumulative distribution of space velocities is studied. The fractions of the three subsystems are found to be: thin disc 92%, thick disc 6% and halo 2%. These results are verified by analysing the elements of velocity ellipsoids and the shape and size of the galactocentric orbits of the sample stars, i.e. the planar and vertical eccentricities of the orbits.

  13. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)

    2014-06-19

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented, and the two solution strategies are: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  14. A Monte Carlo procedure for the construction of complementary cumulative distribution functions for comparison with the EPA release limits for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.; Shiver, A.W.

    1994-10-01

    A Monte Carlo procedure for the construction of complementary cumulative distribution functions (CCDFs) for comparison with the US Environmental Protection Agency (EPA) release limits for radioactive waste disposal (40 CFR 191, Subpart B) is described and illustrated with results from a recent performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The Monte Carlo procedure produces CCDF estimates similar to those obtained with stratified sampling in several recent PAs for the WIPP. The advantages of the Monte Carlo procedure over stratified sampling include increased resolution in the calculation of probabilities for complex scenarios involving drilling intrusions and better use of the necessarily limited number of mechanistic calculations that underlie CCDF construction.
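
    The basic CCDF construction can be sketched in a few lines: sample scenario outcomes, then estimate P(Release > R) for each release level. The release model below is a made-up stand-in for the mechanistic WIPP calculations, with hypothetical intrusion rates and release magnitudes.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical scenario model: Poisson drilling intrusions, lognormal
# release per intrusion (all parameters are illustrative assumptions)
intrusions = rng.poisson(0.8, size=n)
release = np.where(intrusions > 0,
                   rng.lognormal(-1.0, 1.0, size=n) * intrusions, 0.0)

# CCDF: exceedance probability P(Release > R) at selected release levels
levels = np.array([0.1, 1.0, 10.0])
ccdf = [(release > r).mean() for r in levels]
print(ccdf)
```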

  15. A Monte Carlo procedure for the construction of complementary cumulative distribution functions for comparison with the EPA release limits for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.; Shiver, A.W.

    1994-10-01

    A Monte Carlo procedure for the construction of complementary cumulative distribution functions (CCDFs) for comparison with the US Environmental Protection Agency (EPA) release limits for radioactive waste disposal (40 CFR 191, Subpart B) is described and illustrated with results from a recent performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The Monte Carlo procedure produces CCDF estimates similar to those obtained with stratified sampling in several recent PAs for the WIPP. The advantages of the Monte Carlo procedure over stratified sampling include increased resolution in the calculation of probabilities for complex scenarios involving drilling intrusions and better use of the necessarily limited number of mechanistic calculations that underlie CCDF construction.

  16. Time dependent and asymptotic neutron number probability distribution calculation using discrete Fourier transform

    International Nuclear Information System (INIS)

    Humbert, Ph.

    2005-01-01

    In this paper we consider the probability distribution of neutrons in a multiplying assembly. The problem is studied using a space independent one group neutron point reactor model without delayed neutrons. We recall the generating function methodology and analytical results obtained by G.I. Bell when the c 2 approximation is used and we present numerical solutions in the general case, without this approximation. The neutron source induced distribution is calculated using the single initial neutron distribution which satisfies a master (Kolmogorov backward) equation. This equation is solved using the generating function method. The generating function satisfies a differential equation and the probability distribution is derived by inversion of the generating function. Numerical results are obtained using the same methodology where the generating function is the Fourier transform of the probability distribution. Discrete Fourier transforms are used to calculate the discrete time dependent distributions and continuous Fourier transforms are used to calculate the asymptotic continuous probability distributions. Numerical applications are presented to illustrate the method. (author)
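
    The inversion step can be illustrated concisely: a probability generating function evaluated on the unit circle is a discrete Fourier transform of the probabilities, so an inverse FFT recovers the distribution. As a hedged stand-in for the neutron-number generating function, the sketch uses a Poisson PGF, G(z) = exp(lam*(z - 1)), whose inversion can be checked against the known probabilities.

```python
import numpy as np

lam, N = 4.0, 64

# Unit-circle sample points matching numpy's forward-FFT convention,
# so that G(z_j) equals the DFT of the probability vector
z = np.exp(-2j * np.pi * np.arange(N) / N)
G = np.exp(lam * (z - 1.0))   # Poisson generating function values

# Inverse DFT recovers p_0 .. p_{N-1}
p = np.fft.ifft(G).real
print(p[:6])
```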

  17. Fractional Fokker-Planck equation and oscillatory behavior of cumulant moments

    International Nuclear Information System (INIS)

    Suzuki, N.; Biyajima, M.

    2002-01-01

    The Fokker-Planck equation is considered, which is connected to the birth and death process with immigration by the Poisson transform. A fractional derivative in the time variable is introduced into the Fokker-Planck equation in order to investigate an origin of the oscillatory behavior of cumulant moments. From its solution (the probability density function), the generating function (GF) for the corresponding probability distribution is derived. We consider the case when the GF reduces to that of the negative binomial distribution (NBD) if the fractional derivative is replaced by the ordinary one. The H_j moment derived from the GF of the NBD decreases monotonically as the rank j increases. However, the H_j moment derived in our approach oscillates, in contrast with the case of the NBD. Calculated H_j moments are compared with those of charged multiplicities observed in p p-bar, e+e-, and e+p collisions. A phenomenological meaning of introducing the fractional derivative in the time variable is discussed.

  18. R Programs for Truncated Distributions

    Directory of Open Access Journals (Sweden)

    Saralees Nadarajah

    2006-08-01

    Full Text Available Truncated distributions arise naturally in many practical situations. In this note, we provide programs for computing six quantities of interest (probability density function, mean, variance, cumulative distribution function, quantile function and random numbers) for any truncated distribution, whether it is left truncated, right truncated or doubly truncated. The programs are written in R, a freely downloadable statistical software environment.
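
    The underlying recipe is the same in any language: truncation to [a, b] renormalizes the density by F(b) - F(a). A sketch in Python rather than R, shown for a normal truncated to [0, 2] so it can be checked against SciPy's built-in truncated normal:

```python
from scipy import stats

base = stats.norm(loc=0.0, scale=1.0)
a, b = 0.0, 2.0
mass = base.cdf(b) - base.cdf(a)   # probability retained by truncation

def tpdf(x):
    # truncated density: renormalized base density on [a, b]
    return base.pdf(x) / mass

def tcdf(x):
    # truncated CDF
    return (base.cdf(x) - base.cdf(a)) / mass

def tquant(q):
    # truncated quantile function, via the base quantile function
    return base.ppf(base.cdf(a) + q * mass)

# SciPy's built-in equivalent for the normal case, for comparison
tn = stats.truncnorm(a, b, loc=0.0, scale=1.0)
print(tcdf(1.0), tn.cdf(1.0))
```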

  19. The exact probability distribution of the rank product statistics for replicated experiments.

    Science.gov (United States)

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  20. Feynman quasi probability distribution for spin-(1/2), and its generalizations

    International Nuclear Information System (INIS)

    Colucci, M.

    1999-01-01

    Feynman's paper Negative probability is examined, in which, after a discussion about the possibility of attributing a real physical meaning to quasi-probability distributions, he introduces a new kind of distribution for spin-(1/2), with a possible method of generalization to systems with an arbitrary number of states. The principal aim of this article is to shed light upon the method of construction of these distributions, taking into consideration their application to some experiments, and discussing their positive and negative aspects.

  1. Influence of dose distribution homogeneity on the tumor control probability in heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wei Zengquan

    2001-01-01

    In order to estimate the influence of a non-uniform dose distribution on clinical treatment results, the influence of dose distribution homogeneity on the tumor control probability was investigated. Based on the formula deduced previously for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field and the theory of tumor control probability, the tumor control probability was calculated for a tumor model exposed to different degrees of dose distribution homogeneity. The results show that the tumor control probability for the same total dose decreases if the dose distribution homogeneity worsens. In clinical treatment, the dose distribution homogeneity should be better than 95%.

  2. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters
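
    The family-of-distributions idea can be sketched with the law of total variance: the variable is normal, but its mean is itself uncertain, so an outer loop samples the distribution parameter while the conditional moments (known in closed form here, so the inner loop is analytic) supply the variability contribution. All numbers are illustrative, not the paper's case studies.

```python
import numpy as np

rng = np.random.default_rng(4)
n_outer = 2000

# Outer loop: sample the uncertain distribution parameter (the mean of X)
mu_samples = rng.normal(10.0, 0.5, size=n_outer)
sigma = 2.0   # fixed natural variability of X given its mean

# Conditional moments of X given mu are analytic for a normal variable:
# E[X | mu] = mu and Var[X | mu] = sigma^2
inner_means = mu_samples
inner_vars = np.full(n_outer, sigma**2)

# Law of total variance splits the overall uncertainty in X:
var_from_parameter = inner_means.var()      # Var over mu of E[X | mu]
var_from_variability = inner_vars.mean()    # E over mu of Var[X | mu]
print(var_from_parameter, var_from_variability)
```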

  3. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)

  4. Superthermal photon bunching in terms of simple probability distributions

    Science.gov (United States)

    Lettau, T.; Leymann, H. A. M.; Melcher, B.; Wiersig, J.

    2018-05-01

    We analyze the second-order photon autocorrelation function g(2 ) with respect to the photon probability distribution and discuss the generic features of a distribution that results in superthermal photon bunching [g(2 )(0 ) >2 ]. Superthermal photon bunching has been reported for a number of optical microcavity systems that exhibit processes such as superradiance or mode competition. We show that a superthermal photon number distribution cannot be constructed from the principle of maximum entropy if only the intensity and the second-order autocorrelation are given. However, for bimodal systems, an unbiased superthermal distribution can be constructed from second-order correlations and the intensities alone. Our findings suggest modeling superthermal single-mode distributions by a mixture of a thermal and a lasinglike state and thus reveal a generic mechanism in the photon probability distribution responsible for creating superthermal photon bunching. We relate our general considerations to a physical system, i.e., a (single-emitter) bimodal laser, and show that its statistics can be approximated and understood within our proposed model. Furthermore, the excellent agreement of the statistics of the bimodal laser and our model reveals that the bimodal laser is an ideal source of bunched photons, in the sense that it can generate statistics that contain no other features but the superthermal bunching.

  5. Generalization of Poisson distribution for the case of changing probability of consequential events

    International Nuclear Information System (INIS)

    Kushnirenko, E.

    1995-01-01

    The generalization of the Poisson distribution to the case of changing probabilities of the consequential events is carried out. It is shown that the classical Poisson distribution is the special case of this generalized distribution in which the probabilities of the consequential events are constant. Using the generalized Poisson distribution makes it possible, in some cases, to obtain an analytical result instead of resorting to Monte Carlo calculation.

  6. Family of probability distributions derived from maximal entropy principle with scale invariant restrictions.

    Science.gov (United States)

    Sonnino, Giorgio; Steinbrecher, György; Cardinali, Alessandro; Sonnino, Alberto; Tlidi, Mustapha

    2013-01-01

    Using statistical thermodynamics, we derive a general expression of the stationary probability distribution for thermodynamic systems driven out of equilibrium by several thermodynamic forces. The local equilibrium is defined by imposing the minimum entropy production and the maximum entropy principle under the scale invariance restrictions. The obtained probability distribution presents a singularity that has an immediate physical interpretation in terms of intermittency models. The derived reference probability distribution function is interpreted as the time and ensemble average of the real physical one. A generic family of stochastic processes describing noise-driven intermittency, where the stationary density distribution coincides exactly with the one resulting from entropy maximization, is presented.

  7. Study on probability distribution of fire scenarios in risk assessment to emergency evacuation

    International Nuclear Information System (INIS)

    Chu Guanquan; Wang Jinhui

    2012-01-01

    Event tree analysis (ETA) is a frequently used technique for analyzing the probability of a probable fire scenario. The event probability is usually characterized by a definite value. This is not appropriate, as such estimates may be the result of poor-quality statistics and limited knowledge. Without addressing uncertainties, ETA will give imprecise results and the credibility of the risk assessment will be undermined. This paper presents an approach to address event probability uncertainties and to analyze the probability distribution of a probable fire scenario. ETA is performed to construct probable fire scenarios. The activation time of every event is characterized as a stochastic variable by considering uncertainties of the fire growth rate and other input variables. To obtain the probability distribution of a probable fire scenario, a Markov chain is proposed in combination with ETA. To demonstrate the approach, a case study is presented.

  8. CROSSER - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
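
    The crossing computed by CROSSER can be sketched in a few lines. This is not the program's C source, just a hedged Python reimplementation of the same idea: the reliability of a k-out-of-n system is a binomial tail probability in the common component reliability p, and Newton's method finds the p where the two are equal.

```python
from math import comb

def rel(n, k, p):
    """Reliability of a k-out-of-n system: P(at least k of n components work)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossing_point(n, k, tol=1e-12, p0=0.5):
    """Newton's method for the p in (0, 1) where the system reliability
    equals the common component reliability, i.e. rel(n, k, p) = p."""
    p = p0
    for _ in range(100):
        f = rel(n, k, p) - p
        # d/dp P(Bin(n, p) >= k) = n * C(n-1, k-1) * p^(k-1) * (1-p)^(n-k)
        df = n * comb(n - 1, k - 1) * p**(k - 1) * (1 - p)**(n - k) - 1
        step = f / df
        p -= step
        if abs(step) < tol:
            break
    return p

print(crossing_point(5, 3))  # 0.5 by symmetry of the 3-out-of-5 system
```

As in CROSSER, the iteration count of Newton's method is what governs the achieved error; for a 3-out-of-5 majority system the crossing sits at p = 0.5 by symmetry.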

  9. Externally studentized normal midrange distribution

    Directory of Open Access Journals (Sweden)

    Ben Dêivide de Oliveira Batista

    Full Text Available ABSTRACT The distribution of the externally studentized midrange was created based on the original studentization procedures of Student and was inspired by the distribution of the externally studentized range. The widespread use of the externally studentized range in multiple comparisons was a further motivation for developing this new distribution. This work aimed to derive analytic equations for the distribution of the externally studentized midrange, obtaining the cumulative distribution, probability density and quantile functions, and generating random values. This is a new distribution for which the authors could find no previous report in the literature. A second objective was to build an R package that obtains the probability density, cumulative distribution and quantile functions numerically, and to make it available to the scientific community. The algorithms were proposed and implemented using Gauss-Legendre quadrature and the Newton-Raphson method in the R software, resulting in the SMR package, available for download from the CRAN site. The implemented routines showed high accuracy, as verified by Monte Carlo simulations and by comparing results obtained with different numbers of quadrature points. Regarding the precision of the quantiles in cases where the degrees of freedom are close to 1 and the percentiles are close to 100%, it is recommended to use more than 64 quadrature points.
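
    The numerical core of such routines, Gauss-Legendre quadrature of a density to obtain a cumulative distribution, can be illustrated with a generic sketch. The standard normal density below is only a stand-in for the midrange density, which is not reproduced here:

```python
import numpy as np

def cdf_by_gauss_legendre(pdf, a, x, n_points=64):
    """Approximate CDF(x) = integral of pdf from a to x using
    Gauss-Legendre quadrature mapped from [-1, 1] onto [a, x]."""
    nodes, weights = np.polynomial.legendre.leggauss(n_points)
    mid, half = (a + x) / 2.0, (x - a) / 2.0
    return half * np.sum(weights * pdf(mid + half * nodes))

# Standard normal density as a stand-in for the midrange density
phi = lambda t: np.exp(-t * t / 2.0) / np.sqrt(2.0 * np.pi)
print(cdf_by_gauss_legendre(phi, -8.0, 0.0))  # ≈ 0.5
```

Raising `n_points` and comparing successive results is the same accuracy check the abstract describes; smooth densities converge very quickly under Gauss-Legendre quadrature.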

  10. A new expression of the probability distribution in Incomplete Statistics and fundamental thermodynamic relations

    International Nuclear Information System (INIS)

    Huang Zhifu; Lin Bihong; Chen Jincan

    2009-01-01

    In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics, and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant under uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given and the relationship between the new and the original expressions of the probability distribution is discussed.

  11. Collective motions of globally coupled oscillators and some probability distributions on circle

    Energy Technology Data Exchange (ETDEWEB)

    Jaćimović, Vladimir [Faculty of Natural Sciences and Mathematics, University of Montenegro, Cetinjski put, bb., 81000 Podgorica (Montenegro); Crnkić, Aladin, E-mail: aladin.crnkic@hotmail.com [Faculty of Technical Engineering, University of Bihać, Ljubijankićeva, bb., 77000 Bihać, Bosnia and Herzegovina (Bosnia and Herzegovina)

    2017-06-28

    In 2010 Kato and Jones described a new family of probability distributions on the circle, obtained as Möbius transformations of the von Mises distribution. We present a model demonstrating that these distributions appear naturally in the study of populations of coupled oscillators. We use this opportunity to point out certain relations between Directional Statistics and the collective motion of coupled oscillators. - Highlights: • We specify probability distributions on the circle that arise in the Kuramoto model. • We study how the mean-field coupling affects the shape of the distribution of phases. • We discuss potential applications in some experiments on the cell cycle. • We apply Directional Statistics to study the collective dynamics of coupled oscillators.

  12. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    Science.gov (United States)

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  13. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximate any such distribution. We prove that formulas from renewal theory, with particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest such as the renewal density and the ruin probability. It might be of interest to approximate a given heavy tailed distribution of some other type by a distribution from the class of infinite-dimensional phase-type distributions, and to this end we provide a calibration procedure which works for the approximation …

  14. Probability distribution of extreme share returns in Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share prices data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data which will provide a clue on the likely candidates for the best fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO) and Generalized Pareto (GPA), the Lognormal (GNO) and the Pearson (PE3) distributions are evaluated. The method of L-moments is used in parameter estimation. Based on several goodness of fit tests and L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best fitted distribution to represent the weekly and monthly maximum share returns in Malaysia stock market during the studied period, respectively.
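
    The method of L-moments used above for parameter estimation rests on sample L-moments computed from order statistics. A minimal sketch of the first two, following Hosking's unbiased estimators, might look like:

```python
import numpy as np

def sample_l_moments(x):
    """Unbiased sample estimates of the first two L-moments (Hosking).
    l1 is the mean; l2 measures dispersion; t2 = l2 / l1 is the L-CV."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    # b1 weights the i-th order statistic (0-based) by i / (n - 1)
    b1 = np.sum((np.arange(n) / (n - 1)) * x) / n
    l1, l2 = b0, 2.0 * b1 - b0
    return l1, l2, l2 / l1
```

Higher L-moment ratios (L-skewness, L-kurtosis) built the same way are what the L-moment diagram test compares across candidate distributions such as the GPA and PE3.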

  15. A formalism to generate probability distributions for performance-assessment modeling

    International Nuclear Information System (INIS)

    Kaplan, P.G.

    1990-01-01

    A formalism is presented for generating probability distributions of parameters used in performance-assessment modeling. The formalism is used when data are either sparse or nonexistent. The appropriate distribution is a function of the known or estimated constraints and is chosen to maximize a quantity known as Shannon's informational entropy. The formalism is applied to a parameter used in performance-assessment modeling. The functional form of the model that defines the parameter, data from the actual field site, and natural analog data are analyzed to estimate the constraints. A beta probability distribution of the example parameter is generated after finding four constraints. As an example of how the formalism is applied to the site characterization studies of Yucca Mountain, the distribution is generated for an input parameter in a performance-assessment model currently used to estimate compliance with disposal of high-level radioactive waste in geologic repositories, 10 CFR 60.113(a)(2), commonly known as the ground water travel time criterion. 8 refs., 2 figs
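
    The entropy-maximization step can be illustrated on the simplest case: with a known support [0, 1] and a single mean constraint, the entropy-maximizing density is of exponential form, f(x) ∝ exp(λx), and λ is found numerically. This is an illustrative sketch under those assumptions, not the paper's four-constraint beta example:

```python
import math

def maxent_lambda(m, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy density on [0, 1] with mean m is f(x) ∝ exp(lam * x);
    solve mean(lam) = 1/(1 - exp(-lam)) - 1/lam = m for lam by bisection."""
    def mean(lam):
        if abs(lam) < 1e-8:
            return 0.5 + lam / 12.0  # series expansion near lam = 0
        return 1.0 / (1.0 - math.exp(-lam)) - 1.0 / lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) < m:   # mean(lam) is increasing in lam
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = maxent_lambda(0.3)  # mean below 0.5 forces a decaying density (lam < 0)
```

Adding further moment constraints enlarges the exponential family in the same way; with four constraints one recovers richer shapes such as the beta distribution generated in the paper.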

  16. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  17. Probability Distribution and Deviation Information Fusion Driven Support Vector Regression Model and Its Application

    Directory of Open Access Journals (Sweden)

    Changhao Fan

    2017-01-01

    Full Text Available In modeling, only information from the deviation between the output of the support vector regression (SVR) model and the training sample is considered, whereas other prior information about the training sample, such as probability distribution information, is ignored. Probability distribution information describes the overall distribution of sample data in a training sample that contains different degrees of noise and potential outliers, and can help develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information, weight SVR (PDISVR), is proposed. In the PDISVR model, the probability distribution of each sample is considered as the weight and is then introduced into the error coefficient and slack variables of SVR. Thus, the deviation and probability distribution information of the training sample are both used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with that of three SVR-based methods. The results showed that PDISVR performs better than the three other methods.

  18. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    … to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known, the resulting uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature, due to lack of knowledge, and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects are thoroughly discussed in the case of rectangular representation of uncertainty by the uniform probability distribution and the interval, respectively. Also triangular representations are dealt with and compared. Calculation of monotonic as well as non-monotonic functions of variables represented …

  19. WIENER-HOPF SOLVER WITH SMOOTH PROBABILITY DISTRIBUTIONS OF ITS COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mr. Vladimir A. Smagin

    2016-12-01

    Full Text Available The Wiener-Hopf solver with smooth probability distributions of its components is presented. The method is based on hyper-delta approximations of the initial distributions. The use of the Fourier series transformation and the characteristic function allows the method to work with random variables concentrated on the axis of abscissas.

  20. Probability distribution of long-run indiscriminate felling of trees in ...

    African Journals Online (AJOL)

    The study was undertaken to determine the probability distribution of Long-run indiscriminate felling of trees in northern senatorial district of Adamawa State. Specifically, the study focused on examining the future direction of indiscriminate felling of trees as well as its equilibrium distribution. A multi-stage and simple random ...

  1. Theoretical derivation of wind power probability distribution function and applications

    International Nuclear Information System (INIS)

    Altunkaynak, Abdüsselam; Erdik, Tarkan; Dabanlı, İsmail; Şen, Zekai

    2012-01-01

    Highlights: ► Stochastic characteristics of wind power, the standard deviation and the dimensionless skewness, are derived. ► Perturbation expressions for the wind power statistics are obtained from the Weibull probability distribution function (PDF). ► Comparisons are made with the corresponding characteristics of the wind speed PDF. ► The wind power abides by the Weibull PDF. -- Abstract: The instantaneous wind power contained in the air current is directly proportional to the cube of the wind speed. In practice, there is a record of wind speeds in the form of a time series. It is, therefore, necessary to develop a formulation that takes into consideration the statistical parameters of such a time series. The purpose of this paper is to derive the general wind power formulation in terms of the statistical parameters by using perturbation theory, which leads to a general formulation of the wind power expectation and other statistical parameter expressions such as the standard deviation and the coefficient of variation. The formulation is very general and can be applied to any wind speed probability distribution function. Its application to the two-parameter Weibull probability distribution of wind speeds is presented in full detail. It is concluded that, provided wind speed is distributed according to a Weibull distribution, the wind power can be derived from wind speed data. It is possible to determine wind power at any desired risk level; in practical studies, 5% or 10% risk levels are most often preferred, and a simple procedure for this purpose is presented in this paper.
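
    The Weibull special case admits a closed form: since power is proportional to v³, the expected power follows from the third Weibull moment, E[v³] = c³ Γ(1 + 3/k). A quick sketch with hypothetical shape and scale values (not from the paper), checked against simulation:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Weibull wind-speed parameters: shape k, scale c (m/s)
k, c = 2.0, 8.0
rho, area = 1.225, 1.0          # air density (kg/m^3), swept area (m^2)

# Mean wind power follows from the third Weibull moment:
# E[v^3] = c^3 * Gamma(1 + 3/k)
mean_power_analytic = 0.5 * rho * area * c**3 * math.gamma(1.0 + 3.0 / k)

# Monte Carlo check against a simulated wind-speed record
v = c * rng.weibull(k, size=1_000_000)
mean_power_mc = 0.5 * rho * area * np.mean(v**3)
print(mean_power_analytic, mean_power_mc)
```

Because the cube amplifies the upper tail, the mean power is substantially larger than the power at the mean speed; that gap is exactly what the statistical-parameter formulation captures.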

  2. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    International Nuclear Information System (INIS)

    Hall, Jim W.; Lawry, Jonathan

    2004-01-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM
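
    The p-box construction of Williamson and Downs mentioned above can be sketched directly: when a distribution is specified by interval parameters, the lower and upper cumulative probability bounds are the pointwise envelope of the family of CDFs. A minimal sketch over a hypothetical normal family with interval mean and standard deviation:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Hypothetical interval parameters: mu in [9, 11], sigma in [1, 2]
mus = np.linspace(9.0, 11.0, 21)
sigmas = np.linspace(1.0, 2.0, 11)
xs = np.linspace(4.0, 16.0, 121)

# p-box: pointwise envelope of all CDFs in the parameter family
grid = np.array([[norm_cdf(x, m, s) for x in xs] for m in mus for s in sigmas])
lower_cdf = grid.min(axis=0)   # lower cumulative probability bound
upper_cdf = grid.max(axis=0)   # upper cumulative probability bound
```

As the paper notes, such a p-box tightly brackets the lower and upper cumulative distributions but carries less information than belief and plausibility measures over the full power set, which is what the IRM targets instead.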

  3. Numerical Loading of a Maxwellian Probability Distribution Function

    International Nuclear Information System (INIS)

    Lewandowski, J.L.V.

    2003-01-01

    A renormalization procedure for the numerical loading of a Maxwellian probability distribution function (PDF) is formulated. The procedure, which involves the solution of three coupled nonlinear equations, yields a numerically loaded PDF with improved properties for higher velocity moments. This method is particularly useful for low-noise particle-in-cell simulations with electron dynamics

  4. Calculation of magnetization curves and probability distribution for monoclinic and uniaxial systems

    International Nuclear Information System (INIS)

    Sobh, Hala A.; Aly, Samy H.; Yehia, Sherif

    2013-01-01

    We present the application of a simple classical statistical mechanics-based model to selected monoclinic and hexagonal model systems. In this model, we treat the magnetization as a classical vector whose angular orientation is dictated by the laws of equilibrium classical statistical mechanics. We calculate, for these anisotropic systems, the magnetization curves, energy landscapes and probability distributions for different sets of relevant parameters and magnetic fields of different strengths and directions. Our results demonstrate a correlation between the most probable orientation of the magnetization vector, the system's parameters, and the external magnetic field. -- Highlights: ► We calculate magnetization curves and the probability angular distribution of the magnetization. ► The magnetization curves are consistent with probability results for the studied systems. ► Monoclinic and hexagonal systems behave differently due to their different anisotropies

  5. Quantum Fourier transform, Heisenberg groups and quasi-probability distributions

    International Nuclear Information System (INIS)

    Patra, Manas K; Braunstein, Samuel L

    2011-01-01

    This paper aims to explore the inherent connection between Heisenberg groups, quantum Fourier transform (QFT) and (quasi-probability) distribution functions. Distribution functions for continuous and finite quantum systems are examined from three perspectives and all of them lead to Weyl-Gabor-Heisenberg groups. The QFT appears as the intertwining operator of two equivalent representations arising out of an automorphism of the group. Distribution functions correspond to certain distinguished sets in the group algebra. The marginal properties of a particular class of distribution functions (Wigner distributions) arise from a class of automorphisms of the group algebra of the Heisenberg group. We then study the reconstruction of the Wigner function from the marginal distributions via inverse Radon transform giving explicit formulae. We consider some applications of our approach to quantum information processing and quantum process tomography.

  6. New family of probability distributions with applications to Monte Carlo studies

    International Nuclear Information System (INIS)

    Johnson, M.E.; Tietjen, G.L.; Beckman, R.J.

    1980-01-01

    A new probability distribution is presented that offers considerable potential for providing stochastic inputs to Monte Carlo simulation studies. The distribution includes the exponential power family as a special case. An efficient computational strategy is proposed for random variate generation. An example for testing the hypothesis of unit variance illustrates the advantages of the proposed distribution

  7. Simulation by the method of inverse cumulative distribution function applied in optimising of foundry plant production

    Directory of Open Access Journals (Sweden)

    J. Szymszal

    2009-01-01

    Full Text Available The study discusses the application of computer simulation based on the method of the inverse cumulative distribution function. The simulation refers to an elementary static case, which can also be solved by physical experiment, consisting mainly in observations of foundry production in a selected foundry plant. For the simulation and forecasting of foundry production quality for a selected cast iron grade, the random number generator of an Excel calculation sheet was chosen. The very wide potential of this type of simulation, when applied to the evaluation of foundry production quality, was demonstrated by using a generator of uniformly distributed numbers to generate a variable of an arbitrary distribution, especially of a preset empirical distribution, without any need to fit smooth theoretical distributions to this variable.
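
    The inverse cumulative distribution function method described above translates readily from an Excel sheet to code: uniform random numbers are pushed through the empirical quantile function. The data values below are hypothetical, not from the cited foundry:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical empirical data, e.g. measured tensile strength of castings (MPa)
data = np.array([310.0, 295.0, 322.0, 301.0, 315.0, 308.0, 299.0, 318.0])

# Inverse-CDF (inverse transform) simulation: uniform variates fed through
# the empirical quantile function, as done with Excel's RAND()
u = rng.uniform(size=10_000)
simulated = np.quantile(data, u)   # linear interpolation of the empirical CDF
```

No theoretical distribution is fitted at any point: the simulated values inherit the empirical distribution directly, which is precisely the advantage the abstract highlights.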

  8. Precipitation intensity probability distribution modelling for hydrological and construction design purposes

    International Nuclear Information System (INIS)

    Koshinchanov, Georgy; Dimitrov, Dobri

    2008-01-01

    The characteristics of rainfall intensity are important for many purposes, including the design of sewage and drainage systems, the tuning of flood warning procedures, etc. Those estimates are usually statistical estimates of the intensity of precipitation realized over a certain period of time (e.g. 5, 10 min, etc.) with a given return period (e.g. 20, 100 years, etc.). The traditional approach to evaluating the mentioned precipitation intensities is to process the pluviometer records and fit a probability distribution to samples of intensities valid for certain locations or regions. Those estimates then become part of the state regulations to be used for various economic activities. Two problems occur with this approach: 1. Due to various factors the climate conditions change, and the precipitation intensity estimates need regular updates; 2. As the extremes of the probability distribution are of particular importance for practice, the methodology of distribution fitting needs specific attention in those parts of the distribution. The aim of this paper is to review the existing methodologies for processing intensive rainfalls and to refresh some of the statistical estimates for the studied areas. The methodologies used in Bulgaria for analyzing intensive rainfalls and producing the relevant statistical estimates are: - the method of maximum intensity, used in the National Institute of Meteorology and Hydrology to process and decode the pluviometer records, followed by distribution fitting for each precipitation duration period; - as above, but with separate modelling of the probability distribution for the middle and high probability quantiles; - a method similar to the first one, but with an intensity threshold of 0.36 mm/min; - another method proposed by the Russian hydrologist G. A. Aleksiev for regionalization of estimates over a territory, improved and adapted for Bulgaria by S. Gerasimov; - the next method considers only …

  9. Evaluation of the probability distribution of intake from a single measurement on a personal air sampler

    International Nuclear Information System (INIS)

    Birchall, A.; Muirhead, C.R.; James, A.C.

    1988-01-01

    An analytical expression has been derived for the k-sum distribution, formed by summing k random variables from a lognormal population. Poisson statistics are used with this distribution to derive the distribution of intake when breathing an atmosphere with a constant particle number concentration. Bayesian inference is then used to calculate the posterior probability distribution of concentrations from a given measurement. This is combined with the above intake distribution to give the probability distribution of intake resulting from a single measurement of activity made by an ideal sampler. It is shown that the probability distribution of intake is very dependent on the prior distribution used in Bayes' theorem. The usual prior assumption, that all number concentrations are equally probable, leads to an imbalance in the posterior intake distribution. This can be resolved if a new prior proportional to w^(-2/3) is used, where w is the expected number of particles collected. (author)
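
    The k-sum construction, a Poisson-distributed number k of lognormal variates added together, is easy to simulate. The geometric mean and geometric standard deviation below are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative lognormal activity population: geometric mean 1, GSD 2.5
mu, sigma = 0.0, np.log(2.5)

# k-sum distribution: k variates from the lognormal population are summed,
# with k Poisson-distributed around w, the expected number of particles
w = 5.0
k = rng.poisson(w, size=50_000)
sums = np.array([rng.lognormal(mu, sigma, ki).sum() for ki in k])

# Compound-Poisson mean: E[sum] = w * E[X] = w * exp(sigma**2 / 2)
print(sums.mean())
```

The simulated k-sum sample plays the role of the intake distribution before the Bayesian step; the analytical expression in the paper replaces this Monte Carlo stage.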

  10. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics …

  11. Bounds for the probability distribution function of the linear ACD process

    OpenAIRE

    Fernandes, Marcelo

    2003-01-01

    This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.

  12. Gravitational wave chirp search: no-signal cumulative distribution of the maximum likelihood detection statistic

    International Nuclear Information System (INIS)

    Croce, R P; Demma, Th; Longo, M; Marano, S; Matta, V; Pierro, V; Pinto, I M

    2003-01-01

    The cumulative distribution of the supremum of a set (bank) of correlators is investigated in the context of maximum likelihood detection of gravitational wave chirps from coalescing binaries with unknown parameters. Accurate (lower-bound) approximants are introduced based on a suitable generalization of previous results by Mohanty. Asymptotic properties (in the limit where the number of correlators goes to infinity) are highlighted. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian correlation inequality

  13. On Selection of the Probability Distribution for Representing the Maximum Annual Wind Speed in East Cairo, Egypt

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh. I.; El-Hemamy, S.T.

    2013-01-01

    The main objective of this paper is to identify an appropriate probability model and the best plotting position formula to represent the maximum annual wind speed in east Cairo. This model can be used to estimate the extreme wind speed and return period at a particular site, as well as to determine the radioactive release distribution in case of an accident at a nuclear power plant. Wind speed probabilities can be estimated by using probability distributions. An accurate determination of the probability distribution for maximum wind speed data is very important when estimating the extreme value. The probability plots of the maximum annual wind speed (MAWS) in east Cairo are fitted to six major statistical distributions, namely Gumbel, Weibull, Normal, Log-Normal, Logistic and Log-Logistic, while eight plotting positions of Hosking and Wallis, Hazen, Gringorten, Cunnane, Blom, Filliben, Benard and Weibull are used for determining their exceedance probabilities. A proper probability distribution for representing the MAWS is selected by the statistical test criteria in frequency analysis. Therefore, the best plotting position formula which can be used to select the appropriate probability model representing the MAWS data must be determined. The statistical test criteria, namely the probability plot correlation coefficient (PPCC), the root mean square error (RMSE), the relative root mean square error (RRMSE) and the maximum absolute error (MAE), are used to select the appropriate plotting position and distribution. The data obtained show that the maximum annual wind speed in east Cairo varies from 44.3 km/h to 96.1 km/h over a duration of 39 years. The Weibull plotting position combined with the Normal distribution gave the best fit and the most reliable and accurate predictions of the wind speed in the study area, having the highest value of PPCC and the lowest values of RMSE, RRMSE and MAE
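The PPCC selection criterion used above is straightforward to compute: it is the correlation between the ordered data and the theoretical quantiles evaluated at the plotting positions. A hedged sketch for the Normal candidate with the Weibull plotting position p_i = i/(n+1); the synthetic 39-value sample stands in for the station data, which are not reproduced here:

```python
import math
import random
from statistics import NormalDist

def ppcc_normal(sample):
    """Probability plot correlation coefficient for a Normal fit,
    using the Weibull plotting position p_i = i / (n + 1)."""
    xs = sorted(sample)
    n = len(xs)
    # Theoretical standard-normal quantiles at the plotting positions.
    qs = [NormalDist().inv_cdf(i / (n + 1)) for i in range(1, n + 1)]
    mx, mq = sum(xs) / n, sum(qs) / n
    cov = sum((x - mx) * (q - mq) for x, q in zip(xs, qs))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sq = math.sqrt(sum((q - mq) ** 2 for q in qs))
    return cov / (sx * sq)

random.seed(1)
speeds = [random.gauss(70.0, 12.0) for _ in range(39)]  # 39 illustrative annual maxima
r = ppcc_normal(speeds)
print(round(r, 3))  # close to 1 when the candidate distribution fits
```

The same scaffold works for the other candidate distributions by swapping in their inverse CDFs, and for the other plotting positions by changing p_i.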

  14. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results are illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.

  15. Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence

    Directory of Open Access Journals (Sweden)

    C. C. Wu

    2011-04-01

    Full Text Available Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristics of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.

  16. Probability distributions for first neighbor distances between resonances that belong to two different families

    International Nuclear Information System (INIS)

    Difilippo, F.C.

    1994-01-01

    For a mixture of two families of resonances, we found the probability distribution for the distance, as first neighbors, between resonances that belong to different families. Integration of this distribution gives the probability of accidental overlapping of resonances of one isotope by resonances of the other, provided that the resonances of each isotope belong to a single family. (author)

  17. Diachronic changes in word probability distributions in daily press

    Directory of Open Access Journals (Sweden)

    Stanković Jelena

    2006-01-01

    Full Text Available Changes in probability distributions of individual words and word types were investigated within two samples of daily press spanning fifty years. The first sample derives from the Corpus of Serbian Language (CSL) /Kostić, Đ., 2001/, which covers the period between 1945 and 1957, and the other from the Ebart Media Documentation (EBR), which was compiled from seven daily newspapers and five weekly magazines from 2002 and 2003. Each sample consisted of about 1 million words. The obtained results indicate that nouns and adjectives were more frequent in the CSL, while verbs and prepositions are more frequent in the EBR sample, suggesting a decrease in sentence length over the last five decades. Conspicuous changes in the probability distribution of individual words were observed for nouns and adjectives, while minimal or no changes were observed for verbs and prepositions. Such an outcome suggests that nouns and adjectives are most susceptible to diachronic changes, while verbs and prepositions appear to be resistant to such changes.

  18. Probability distributions in conservative energy exchange models of multiple interacting agents

    International Nuclear Information System (INIS)

    Scafetta, Nicola; West, Bruce J

    2007-01-01

    Herein we study energy exchange models of multiple interacting agents that conserve energy in each interaction. The models differ regarding the rules that regulate the energy exchange and boundary effects. We find a variety of stochastic behaviours that manifest energy equilibrium probability distributions of different types and interaction rules that yield not only the exponential distributions such as the familiar Maxwell-Boltzmann-Gibbs distribution of an elastically colliding ideal particle gas, but also uniform distributions, truncated exponential distributions, Gaussian distributions, Gamma distributions, inverse power law distributions, mixed exponential and inverse power law distributions, and evolving distributions. This wide variety of distributions should be of value in determining the underlying mechanisms generating the statistical properties of complex phenomena including those to be found in complex chemical reactions
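The simplest of the interaction rules studied in models of this kind is easy to simulate. A sketch, assuming a rule in which two randomly chosen agents pool their energies and split the pool uniformly at random (one of several rules known to yield the exponential Boltzmann-Gibbs equilibrium; the paper's specific rules and boundary effects are not reproduced):

```python
import random

def equilibrate(n_agents=2000, n_steps=40000, seed=7):
    """Conservative pairwise energy exchange: pick two agents, pool
    their energies, and split the pool uniformly at random."""
    rng = random.Random(seed)
    energy = [1.0] * n_agents          # everyone starts at the mean
    for _ in range(n_steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pool = energy[i] + energy[j]
        r = rng.random()
        energy[i], energy[j] = r * pool, (1 - r) * pool
    return energy

e = equilibrate()
mean = sum(e) / len(e)
frac_below = sum(1 for x in e if x < mean) / len(e)
print(round(mean, 6), round(frac_below, 3))
# Total energy is conserved in every exchange; at the exponential
# equilibrium the fraction of agents below the mean approaches
# 1 - 1/e ≈ 0.632.
```

Changing the split rule (e.g., saving a fixed fraction before splitting) is what produces the Gamma, truncated exponential, and power-law equilibria catalogued in the abstract.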

  19. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    Science.gov (United States)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid and semi-arid countries such as Iran. For this reason, regional analysis of this parameter is important. However, the ET0 process is affected by several meteorological parameters such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations of Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting the annual ET0 and its four effective parameters. The RMSE results showed that the ability of the PPCC test and the L-moment method for regional analysis of reference evapotranspiration and its effective parameters was similar. The results also showed that the distribution type of the parameters which affect ET0 values can affect the distribution of reference evapotranspiration.
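The sample L-moments behind the L-moment method are computed from probability-weighted moments. A minimal sketch of the standard unbiased estimators (the paper's regional averaging across stations is not reproduced):

```python
def sample_l_moments(sample):
    """First three sample L-moments (l1, l2, l3) from the unbiased
    probability-weighted moments b0, b1, b2."""
    xs = sorted(sample)
    n = len(xs)
    b0 = sum(xs) / n
    b1 = sum((i - 1) / (n - 1) * x for i, x in enumerate(xs, 1)) / n
    b2 = sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x
             for i, x in enumerate(xs, 1)) / n
    l1 = b0                      # L-location (the mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # L-skewness numerator (t3 = l3 / l2)
    return l1, l2, l3

l1, l2, l3 = sample_l_moments([1.0, 2.0, 3.0])
print(l1, l3)  # 2.0 and 0.0: a symmetric sample has zero L-skewness
```

The L-moment ratios t3 = l3/l2 and t4 = l4/l2 are what get plotted on the L-moment diagram to pick the regional distribution (Pearson type III in this study).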

  20. Determination of 114Pd cumulative yield and investigation of the fine-structure at light peak in mass distribution of 252Cf spontaneous fission

    International Nuclear Information System (INIS)

    Yu Runlan; Li Xueliang; Cui Anzhi; Guo Jingru; Yan Shuhen; Tang Peijia; Liu Daming

    1991-07-01

    A rapid radiochemical procedure for Pd separation was developed. It was used, for the first time, to determine by radiochemical techniques the 114 Pd cumulative yield, (2.50 ± 0.14)%, in 252 Cf spontaneous fission. The cumulative yields of (3.50 ± 0.13)% and (3.70 ± 0.11)% for 112 Pd and 113g Ag were also obtained. These are in agreement with Skovorodkin's results. The cumulative yields determined show that there is a fine structure at the light peak, at mass number A = 113, in the mass distribution of 252 Cf spontaneous fission

  1. The joint probability distribution of structure factors incorporating anomalous-scattering and isomorphous-replacement data

    International Nuclear Information System (INIS)

    Peschar, R.; Schenk, H.

    1991-01-01

    A method to derive joint probability distributions of structure factors is presented which incorporates anomalous-scattering and isomorphous-replacement data in a unified procedure. The structure factors F H and F -H , whose magnitudes are different due to anomalous scattering, are shown to be isomorphously related. This leads to a definition of isomorphism by means of which isomorphous-replacement and anomalous-scattering data can be handled simultaneously. The definition and calculation of the general term of the joint probability distribution for isomorphous structure factors turn out to be crucial. Its analytical form leads to an algorithm by means of which any particular joint probability distribution of structure factors can be constructed. The calculation of the general term is discussed for the case of four isomorphous structure factors in P1, assuming the atoms to be independently and uniformly distributed. A main result is the construction of the probability distribution of the 64 triplet phase sums present in space group P1 amongst four isomorphous structure factors F H , four isomorphous F K and four isomorphous F -H-K . The procedure is readily generalized to the case where an arbitrary number of isomorphous structure factors are available for F H , F K and F -H-K . (orig.)

  2. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices. Unreliable channels and devices are error-prone objects, and error detection codes allow detecting such errors. There are two classes of error detecting codes - classical codes and security-oriented codes. The classical codes have a high percentage of detected errors; however, they have a high probability of missing an error under algebraic manipulation. In contrast, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection into the encoding device. Moreover, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with less computational complexity and a low probability of masking are the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It will be shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown in the paper that increasing the function complexity changes the error masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and average value of the error masking probability. Our results have shown that functions with greater complexity have smoothed maximums of error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, in the case of a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach how to measure the error masking

  3. Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics

    International Nuclear Information System (INIS)

    Marshman, Emily; Singh, Chandralekha

    2017-01-01

    A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations. (paper)

  4. Long-Term Fatigue and Its Probability of Failure Applied to Dental Implants

    Directory of Open Access Journals (Sweden)

    María Prados-Privado

    2016-01-01

    Full Text Available It is well known that dental implants have a high success rate, but even so, there are many factors that can cause dental implant failure. Fatigue is very sensitive to the many variables involved in this phenomenon. This paper takes a close look at fatigue analysis and explains a new method to study fatigue from a probabilistic point of view, based on a cumulative damage model and probabilistic finite elements, with the goal of obtaining the expected life and the probability of failure. Two different dental implants were analysed. The model simulated a load of 178 N applied at angles of 0°, 15°, and 20°, and a force of 489 N at the same angles. The von Mises stress distribution was evaluated, and once the methodology proposed here was applied, the statistics of the fatigue life and the cumulative probability function were obtained. This function allows us to relate each cycle life with its probability of failure. The cylindrical implant behaves worse under the same loading force than the conical implant analysed here. The methodology employed in the present study provides very accurate results because all possible uncertainties have been taken into account from the beginning.
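The cycle-life-to-failure-probability mapping described above can be sketched in closed form once a life distribution is assumed. Everything below is illustrative: a lognormal fatigue-life model with hypothetical parameters, not the paper's probabilistic finite element results:

```python
import math
from statistics import NormalDist

def failure_probability(n_cycles, median_life=1e6, sigma_log=0.5):
    """P(fatigue life <= n_cycles) under a lognormal life model.

    median_life and sigma_log are hypothetical parameters chosen for
    illustration; in practice they would come from the fitted fatigue
    life statistics.
    """
    z = (math.log(n_cycles) - math.log(median_life)) / sigma_log
    return NormalDist().cdf(z)

# The cumulative probability function relates each cycle life to its
# probability of failure, and grows monotonically with demanded life.
print(failure_probability(1e6))  # 0.5 at the median life
```

Comparing two implants then amounts to comparing these curves: the design whose curve rises earlier (e.g., the cylindrical implant here) has the higher failure probability at any given life.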

  5. Cumulative human impacts on marine predators.

    Science.gov (United States)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J; Halpern, Benjamin S; Breed, Greg A; Nickel, Barry; Teutschel, Nicole M; Crowder, Larry B; Benson, Scott; Dutton, Peter H; Bailey, Helen; Kappes, Michelle A; Kuhn, Carey E; Weise, Michael J; Mate, Bruce; Shaffer, Scott A; Hassrick, Jason L; Henry, Robert W; Irvine, Ladd; McDonald, Birgitte I; Robinson, Patrick W; Block, Barbara A; Costa, Daniel P

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact (CUI) on marine predators by combining electronic tracking data of eight protected predator species (n=685 individuals) in the California Current Ecosystem with data on 24 anthropogenic stressors. We show significant variation in CUI with some of the highest impacts within US National Marine Sanctuaries. High variation in underlying species and cumulative impact distributions means that neither alone is sufficient for effective spatial management. Instead, comprehensive management approaches accounting for both cumulative human impacts and trade-offs among multiple stressors must be applied in planning the use of marine resources.

  6. Multimode Interference: Identifying Channels and Ridges in Quantum Probability Distributions

    OpenAIRE

    O'Connell, Ross C.; Loinaz, Will

    2004-01-01

    The multimode interference technique is a simple way to study the interference patterns found in many quantum probability distributions. We demonstrate that this analysis not only explains the existence of so-called "quantum carpets," but can explain the spatial distribution of channels and ridges in the carpets. With an understanding of the factors that govern these channels and ridges we have a limited ability to produce a particular pattern of channels and ridges by carefully choosing the ...

  7. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

    Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to the other distributions in collected diagrams. Four diagrams are sketched as networks. The first one concerns the continuous distributions and their relations. The second one presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.
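One edge of the continuous-distribution network — the Weibull with shape parameter 1 reducing to the exponential — can be verified numerically from the closed-form CDFs. A minimal sketch:

```python
import math

def weibull_cdf(x, shape, scale):
    """CDF of the two-parameter Weibull distribution."""
    return 1 - math.exp(-((x / scale) ** shape))

def exponential_cdf(x, rate):
    """CDF of the exponential distribution with the given rate."""
    return 1 - math.exp(-rate * x)

# Weibull(shape=1, scale=1/rate) coincides with Exponential(rate).
rate = 0.5
for x in [0.1, 1.0, 2.5, 10.0]:
    assert abs(weibull_cdf(x, 1.0, 1 / rate) - exponential_cdf(x, rate)) < 1e-12
print("Weibull(shape=1) matches the exponential distribution")
```

Other edges in the diagrams (e.g., chi-square as a Gamma with half-integer shape) can be checked the same way.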

  8. Detecting spatial patterns with the cumulant function – Part 1: The theory

    Directory of Open Access Journals (Sweden)

    P. Naveau

    2008-02-01

    Full Text Available In climate studies, detecting spatial patterns that largely deviate from the sample mean still remains a statistical challenge. Although a Principal Component Analysis (PCA), or equivalently an Empirical Orthogonal Functions (EOF) decomposition, is often applied for this purpose, it provides meaningful results only if the underlying multivariate distribution is Gaussian. Indeed, PCA is based on optimizing second order moments, and the covariance matrix captures the full dependence structure of multivariate Gaussian vectors. Whenever the application at hand cannot satisfy this normality hypothesis (e.g. precipitation data), alternatives and/or improvements to PCA have to be developed and studied. To go beyond this second order statistics constraint, which limits the applicability of the PCA, we take advantage of the cumulant function that can produce higher order moments information. The cumulant function, well-known in the statistical literature, allows us to propose a new, simple and fast procedure to identify spatial patterns for non-Gaussian data. Our algorithm consists in maximizing the cumulant function. Three families of multivariate random vectors, for which explicit computations are obtained, are implemented to illustrate our approach. In addition, we show that our algorithm corresponds to selecting the directions along which projected data display the largest spread over the marginal probability density tails.

  9. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...

  10. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
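The abundance adjustment described here divides the catch by the cumulative capture probability. For k electrofishing passes with a constant per-pass capture probability p, that cumulative probability is 1 − (1 − p)^k; a sketch (a constant p is a simplification of the paper's regression-based, covariate-dependent estimates):

```python
def cumulative_capture_probability(p, k):
    """Probability a fish is captured at least once in k passes,
    each with per-pass capture probability p."""
    return 1 - (1 - p) ** k

def abundance_estimate(catch, p, k):
    """Adjust the total catch by the cumulative capture probability."""
    return catch / cumulative_capture_probability(p, k)

print(cumulative_capture_probability(0.5, 3))   # 0.875
print(abundance_estimate(70, 0.5, 3))           # 80.0
```

With covariate-dependent capture probabilities, the same adjustment is applied per size class or habitat stratum before summing.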

  11. EXAFS cumulants of CdSe

    International Nuclear Information System (INIS)

    Diop, D.

    1997-04-01

    EXAFS functions were extracted from measurements on the K edge of Se at different temperatures between 20 and 300 K. The analysis of the EXAFS of the filtered first two shells has been done in the wavevector range lying between 2 and 15.5 Å -1 in terms of the cumulants of the effective distribution of distances. The cumulants C 3 and C 4 obtained from the phase difference and the amplitude ratio methods have shown the anharmonicity in the vibrations of atoms around their equilibrium position. (author). 13 refs, 3 figs

  12. On the Meta Distribution of Coverage Probability in Uplink Cellular Networks

    KAUST Repository

    Elsawy, Hesham; Alouini, Mohamed-Slim

    2017-01-01

    This letter studies the meta distribution of coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP

  13. The p-sphere and the geometric substratum of power-law probability distributions

    International Nuclear Information System (INIS)

    Vignat, C.; Plastino, A.

    2005-01-01

    Links between power law probability distributions and marginal distributions of uniform laws on p-spheres in R n show that a mathematical derivation of the Boltzmann-Gibbs distribution necessarily passes through power law ones. Results are also given that link parameters p and n to the value of the non-extensivity parameter q that characterizes these power laws in the context of non-extensive statistics

  14. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring along human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides detonated by rain, commonly known as "soil-slip", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in suction when a humid front enters, as a consequence of the infiltration initiated by rain and ruled by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for the mean intensity and duration of the storms. The Philip infiltration model
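The derived-distributions idea — pushing the storm distribution through a deterministic stability model to obtain the FOS distribution — can be sketched by Monte Carlo. Everything below is hypothetical: the toy FOS function and its parameters stand in for the paper's RPPP and Philip infiltration formulation, which are not reproduced:

```python
import random

def fos(intensity, duration, resistance=60.0):
    """Toy deterministic failure model: the safety factor decreases
    with total storm depth (intensity * duration). Hypothetical form."""
    depth = intensity * duration
    return resistance / (10.0 + depth)

def landslide_probability(mean_intensity=5.0, mean_duration=6.0,
                          n_samples=20000, seed=3):
    """Derived-distribution estimate of P(FOS < 1) for storms with
    independent exponential intensity and duration."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        i = rng.expovariate(1 / mean_intensity)
        d = rng.expovariate(1 / mean_duration)
        if fos(i, d) < 1.0:
            failures += 1
    return failures / n_samples

print(round(landslide_probability(), 3))
```

In the paper the transformation is carried out analytically rather than by sampling, but the sampled FOS values trace out the same derived pdf, from which exceedance probability and return period follow directly.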

  15. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    Science.gov (United States)

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
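The two-step recipe in this abstract — spectrally color a white Gaussian field, then map it through an inverse CDF — can be sketched directly. The Gaussian-shaped power spectrum and the exponential target marginal below are illustrative choices, not the paper's:

```python
import math
import numpy as np

def simulate_field(n=128, corr_len=8.0, seed=0):
    """Colored Gaussian field via FFT filtering of white noise, then
    transformed to an exponential marginal (translation method)."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((n, n))
    # Gaussian-shaped amplitude filter in the frequency domain.
    fx = np.fft.fftfreq(n)
    k2 = fx[:, None] ** 2 + fx[None, :] ** 2
    amp = np.exp(-k2 * corr_len ** 2)
    g = np.real(np.fft.ifft2(np.fft.fft2(white) * amp))
    g = (g - g.mean()) / g.std()          # standardize the Gaussian field
    # Map through the standard normal CDF, then the exponential inverse CDF.
    erf = np.vectorize(math.erf)
    u = 0.5 * (1 + erf(g / math.sqrt(2)))
    u = np.clip(u, 1e-12, 1 - 1e-12)
    return -np.log(1 - u)                 # unit-mean exponential marginal

field = simulate_field()
print(field.shape, round(float(field.mean()), 2))
```

The nonlinear marginal transform slightly distorts the target spectrum, which is why the paper characterizes the approach as an engineering approximation rather than an exact construction.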

  16. Estimating probable flaw distributions in PWR steam generator tubes

    International Nuclear Information System (INIS)

    Gorman, J.A.; Turner, A.P.L.

    1997-01-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses

  17. Predicting dihedral angle probability distributions for protein coil residues from primary sequence using neural networks

    DEFF Research Database (Denmark)

    Helles, Glennie; Fonseca, Rasmus

    2009-01-01

    ...seem to have a significant influence on the dihedral angles adopted by the individual amino acids in coil segments. In this work we attempt to predict a probability distribution of these dihedral angles based on the flanking residues. While attempts to predict dihedral angles of coil segments have been done previously, none have, to our knowledge, presented comparable results for the probability distribution of dihedral angles. Results: In this paper we develop an artificial neural network that uses an input-window of amino acids to predict a dihedral angle probability distribution for the middle residue in the input-window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30°×30° area of the dihedral angle space) for all amino acids in the data set compared to first order statistics. An accuracy comparable to that of secondary...

  18. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained for the mutual divergence among two or more probability distributions.

  19. Efficient Outage Probability Evaluation of Diversity Receivers Over Generalized Gamma Channels

    KAUST Repository

    Ben Issaid, Chaouki; Alouini, Mohamed-Slim; Tempone, Raul

    2016-10-01

    In this paper, we are interested in determining the cumulative distribution function of the sum of generalized Gamma random variables in the setting of rare event simulations. To this end, we present an efficient importance sampling estimator. The main result of this work is the bounded relative error property of the proposed estimator. This result is used to accurately estimate the outage probability of multibranch maximum ratio combining and equal gain combining diversity receivers over generalized Gamma fading channels. Selected numerical simulations are discussed to show the robustness of our estimator compared to naive Monte Carlo.
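The quantity of interest — the CDF of a sum of generalized Gamma variates — can be estimated by naive Monte Carlo, which is the baseline the proposed importance sampling estimator improves on (naive sampling degrades in the deep left tail, exactly the rare-event regime). A sketch with illustrative parameters; X = a·G^(1/p) with G ~ Gamma(d/p) is one common generalized-Gamma construction:

```python
import random

def gg_sample(rng, a=1.0, d=2.0, p=1.5):
    """Generalized Gamma variate via X = a * G**(1/p), G ~ Gamma(d/p, 1).

    The parameters are illustrative, not the paper's test cases.
    """
    return a * rng.gammavariate(d / p, 1.0) ** (1.0 / p)

def sum_cdf_naive(threshold, n_branches=4, n_samples=20000, seed=11):
    """Naive Monte Carlo estimate of P(sum of n_branches generalized
    Gamma variates <= threshold)."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n_samples)
        if sum(gg_sample(rng) for _ in range(n_branches)) <= threshold
    )
    return hits / n_samples

print(sum_cdf_naive(1.0), sum_cdf_naive(10.0))
```

For small thresholds the hit count collapses and the relative error of this estimator blows up, which is precisely what the bounded-relative-error importance sampling scheme is designed to avoid.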


  1. Rational Arithmetic Mathematica Functions to Evaluate the Two-Sided One Sample K-S Cumulative Sampling Distribution

    Directory of Open Access Journals (Sweden)

    J. Randall Brown

    2007-06-01

    Full Text Available One of the most widely used goodness-of-fit tests is the two-sided one sample Kolmogorov-Smirnov (K-S) test, which has been implemented by many computer statistical software packages. To calculate a two-sided p value (that is, to evaluate the cumulative sampling distribution), these packages use various methods including recursion formulae, limiting distributions, and approximations of unknown accuracy developed over thirty years ago. Based on an extensive literature search for the two-sided one sample K-S test, this paper identifies an exact formula for sample sizes up to 31, six recursion formulae, and one matrix formula that can be used to calculate a p value. To ensure accurate calculation by avoiding catastrophic cancellation and eliminating rounding error, each of these formulae is implemented in rational arithmetic. For the six recursion formulae and the matrix formula, computational experience for sample sizes up to 500 shows that computational times are increasing functions of both the sample size and the number of digits in the numerator and denominator integers of the rational number test statistic. The computational times of the seven formulae vary immensely, but the Durbin recursion formula is almost always the fastest. Linear search is used to calculate the inverse of the cumulative sampling distribution (to find the confidence interval half-width), and tables of calculated half-widths are presented for sample sizes up to 500. Using calculated half-widths as input, computational times for the fastest formula, the Durbin recursion formula, are given for sample sizes up to two thousand.
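The matrix formula mentioned above lends itself to an exact rational-arithmetic implementation. A sketch of the Marsaglia-Tsang-Wang form of Durbin's matrix method using Python's `fractions` to avoid rounding error; this is a compact illustration for modest n, not the paper's optimized recursions:

```python
from fractions import Fraction
from math import ceil, factorial

def ks_cdf_exact(n, d):
    """Exact P(D_n < d) for the two-sided one-sample K-S statistic,
    via Durbin's matrix formula in rational arithmetic."""
    d = Fraction(d)
    if d <= 0:
        return Fraction(0)
    if d >= 1:
        return Fraction(1)
    k = ceil(n * d)
    h = k - n * d                      # fractional part, 0 <= h < 1
    m = 2 * k - 1
    # Lower-Hessenberg matrix of the Marsaglia-Tsang-Wang formulation.
    H = [[Fraction(1) if i - j + 1 >= 0 else Fraction(0)
          for j in range(m)] for i in range(m)]
    for i in range(m):
        H[i][0] -= h ** (i + 1)
        H[m - 1][i] -= h ** (m - i)
    if 2 * h > 1:
        H[m - 1][0] += (2 * h - 1) ** m
    for i in range(m):
        for j in range(m):
            if i - j + 1 > 0:
                H[i][j] /= factorial(i - j + 1)

    def matmul(A, B):
        return [[sum(A[i][l] * B[l][j] for l in range(m))
                 for j in range(m)] for i in range(m)]

    P = H
    for _ in range(n - 1):             # H ** n (naive power; fine for small n)
        P = matmul(P, H)
    return P[k - 1][k - 1] * factorial(n) / Fraction(n) ** n

print(ks_cdf_exact(1, Fraction(3, 5)))  # matches P(D_1 < d) = 2d - 1
```

Because every entry stays a `Fraction`, the result is an exact rational number, illustrating why the paper insists on rational arithmetic: floating-point versions of the same recursions lose accuracy to cancellation as n grows.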

  2. Outage probability of distributed beamforming with co-channel interference

    KAUST Repository

    Yang, Liang

    2012-03-01

    In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.

  3. A transmission probability method for calculation of neutron flux distributions in hexagonal geometry

    International Nuclear Information System (INIS)

    Wasastjerna, F.; Lux, I.

    1980-03-01

    A transmission probability method implemented in the program TPHEX is described. This program was developed for the calculation of neutron flux distributions in hexagonal light water reactor fuel assemblies. The accuracy appears to be superior to diffusion theory, and the computation time is shorter than that of the collision probability method. (author)

  4. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    Science.gov (United States)

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  5. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  6. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
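The structure of the model described, averaging a per-partnership acquisition probability over the distribution of lifetime partner numbers, can be sketched as follows. The partner distribution and per-partnership probability below are hypothetical placeholders for illustration, not the paper's inputs.

```python
def lifetime_acquisition_prob(partner_dist, p_per_partner):
    """Average P(acquire at least one infection) over the distribution of
    lifetime partner numbers; per-partnership risks are treated as independent."""
    return sum(w * (1 - (1 - p_per_partner) ** n) for n, w in partner_dist.items())

# hypothetical partner-number distribution {n_partners: probability mass}
# and per-partnership acquisition probability, chosen only for illustration
dist = {1: 0.25, 2: 0.20, 4: 0.30, 10: 0.15, 20: 0.10}
print(round(lifetime_acquisition_prob(dist, 0.4), 3))
```

Even with modest per-partnership probabilities, the 1 - (1 - p)^n term saturates quickly with n, which is why the paper's lifetime estimates come out so high.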

  7. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  8. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    Directory of Open Access Journals (Sweden)

    Kristian Hovde Liland

    2016-01-01

    Full Text Available When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
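A minimal simulation of the quantity the package describes, distances between successes in a Bernoulli sequence of fixed length, might look like this (the R package provides the exact pmf; this stdlib-Python sketch only samples it):

```python
import random

def distances_between_successes(n, p, rng):
    """Gaps between consecutive successes in one length-n Bernoulli(p) sequence."""
    hits = [i for i in range(n) if rng.random() < p]
    return [b - a for a, b in zip(hits, hits[1:])]

rng = random.Random(42)
gaps = [g for _ in range(2000) for g in distances_between_successes(50, 0.2, rng)]
# for an unconstrained Bernoulli process the mean gap would be 1/p = 5;
# the fixed window shortens it slightly, which is why approximations
# become inaccurate for small n and exact results matter
print(sum(gaps) / len(gaps))
```

Testing for over-representation of short distances, as the article proposes, would compare the left tail of this empirical distribution against the exact one.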

  9. Probabilities of filaments in a Poissonian distribution of points -I

    International Nuclear Information System (INIS)

    Betancort-Rijo, J.

    1989-01-01

    Statistical techniques are devised to assess the likelihood of a Poisson sample of points in two and three dimensions, containing specific filamentary structures. For that purpose, the expression of Otto et al. (1986, Astrophys. J., 304) for the probability density of clumps in a Poissonian distribution of points is generalized for any value of the density contrast. A way of counting filaments differing from that of Otto et al. is proposed, because at low density contrast the filaments counted by Otto et al. are distributed in a clumpy fashion, each clump of filaments corresponding to a distinct observed filament. (author)

  10. Cumulative history recorded in the depth distribution of radiocesium in sediments deposited on a sandbar

    International Nuclear Information System (INIS)

    Tanaka, Kazuya; Kondo, Hiroaki; Sakaguchi, Aya; Takahashi, Yoshio

    2015-01-01

    We collected sediments deposited on a sandbar from the surface to 20 cm in depth in the Abukuma River to clarify the history of radiocesium derived from the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident. We analyzed the ¹³⁷Cs concentration in the sediments from size-fractioned samples as well as bulk samples. The depth distribution of ¹³⁷Cs showed the highest concentration in the deepest sediment layer (18–20 cm) studied, which indicates that sediments with a lower ¹³⁷Cs concentration were transported and deposited on sediments having a higher ¹³⁷Cs concentration. At the same time, the depth distribution suggests a decrease in radioactivity in provenance areas of the sediments. Analysis of the size-fractioned sediments indicated that the three sediment layers at 4–6 cm, 16–18 cm and 18–20 cm intervals had similar size distribution of ¹³⁷Cs and grain size composition although the concentration levels of ¹³⁷Cs were different according to their bulk concentrations. The size distribution of ¹³⁷Cs also supported the possibility that the decrease in ¹³⁷Cs concentration in bulk sediments above 18 cm is due to a decrease in the level of radioactivity in the catchment area. A comparison of the size distribution of ¹³⁷Cs between the sediment layers above and below 18 cm suggested that the ¹³⁷Cs concentration in the transported fine sediment particles decreased more with time than the ¹³⁷Cs concentration in the coarse particles, reflecting the selective transport of the finer particles. The results of this study demonstrated that sediment layers deposited on a sandbar retained the cumulative history of the fluvial transport of radiocesium after the FDNPP accident. - Highlights: • We investigated the history of ¹³⁷Cs recorded in sediments in the Abukuma River. • ¹³⁷Cs concentration was the highest in the deepest sediment layer studied. • The depth distribution suggests a decrease in radioactivity in

  11. A microcomputer program for energy assessment and aggregation using the triangular probability distribution

    Science.gov (United States)

    Crovelli, R.A.; Balay, R.H.

    1991-01-01

    A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language for performing the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 kbytes of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132 column printer. A graphics adapter and color display are optional. © 1991.
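The per-component quantities TRIAGG computes follow from closed-form expressions for the triangular distribution. A sketch (note the sign convention: the paper's F95 is an exceedance fractile, the value exceeded with probability 0.95, whereas the function below inverts the ordinary non-exceedance CDF):

```python
def triangular_stats(a, m, b):
    """Mean and standard deviation of a triangular distribution with
    minimum a, mode m and maximum b."""
    mean = (a + m + b) / 3
    var = (a * a + m * m + b * b - a * m - a * b - m * b) / 18
    return mean, var ** 0.5

def triangular_fractile(a, m, b, f):
    """Value x with P(X <= x) = f, from the closed-form triangular CDF."""
    fc = (m - a) / (b - a)  # CDF value at the mode
    if f <= fc:
        return a + ((b - a) * (m - a) * f) ** 0.5
    return b - ((b - a) * (b - m) * (1 - f)) ** 0.5

print(triangular_stats(0, 1, 2))          # symmetric case: mean 1.0
print(triangular_fractile(0, 1, 2, 0.5))  # median of the symmetric case: 1.0
```

Under perfect positive correlation the component fractiles add directly; under independence, means add and variances add, which is how the aggregation cases in the abstract differ.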

  12. Idealized models of the joint probability distribution of wind speeds

    Science.gov (United States)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
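The construction underlying both models, speeds as magnitudes of Gaussian velocity components, is easy to verify numerically: two iid mean-zero Gaussian components give a Rayleigh speed (a Weibull with shape 2), whose mean is σ√(π/2). A stdlib sketch:

```python
import math
import random

def speeds_from_components(n, sigma, rng):
    """Speeds as magnitudes of two iid mean-zero Gaussian velocity components;
    the result is Rayleigh(sigma), i.e. a Weibull distribution with shape 2."""
    return [math.hypot(rng.gauss(0, sigma), rng.gauss(0, sigma)) for _ in range(n)]

speeds = speeds_from_components(50_000, 2.0, random.Random(3))
print(sum(speeds) / len(speeds))  # should be near sigma * sqrt(pi/2) ≈ 2.51
```

The bivariate models in the paper extend this idea to pairs of speeds by correlating the underlying component pairs; the power-law transformation mentioned turns the Rayleigh marginal into a general Weibull.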

  13. Dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses.

    Science.gov (United States)

    Zhang, Xiang; Loda, Justin B; Woodall, William H

    2017-07-20

    For a patient who has survived a surgery, there could be several levels of recovery. Thus, it is reasonable to consider more than two outcomes when monitoring surgical outcome quality. The risk-adjusted cumulative sum (CUSUM) chart based on multiresponses has been developed for monitoring a surgical process with three or more outcomes. However, there is a significant effect of varying risk distributions on the in-control performance of the chart when constant control limits are applied. To overcome this disadvantage, we apply the dynamic probability control limits to the risk-adjusted CUSUM charts for multiresponses. The simulation results demonstrate that the in-control performance of the charts with dynamic probability control limits can be controlled for different patient populations because these limits are determined for each specific sequence of patients. Thus, the use of dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses allows each chart to be designed for the corresponding patient sequence of a surgeon or a hospital and therefore does not require estimating or monitoring the patients' risk distribution. Copyright © 2017 John Wiley & Sons, Ltd.
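At its core, a CUSUM chart accumulates log-likelihood-ratio scores and resets at zero; risk adjustment replaces the fixed in-control probabilities below with patient-specific ones, and dynamic probability control limits then replace the constant threshold. This stripped-down binary-outcome sketch (probabilities chosen arbitrarily) shows only the accumulation step:

```python
import math

def cusum_update(w, p0, p1, outcome):
    """One CUSUM step: add the log-likelihood ratio of the observed binary
    outcome under the out-of-control (p1) vs in-control (p0) model, reset at 0."""
    lr = (p1 / p0) if outcome else ((1 - p1) / (1 - p0))
    return max(0.0, w + math.log(lr))

w = 0.0
for outcome in [1, 0, 1, 1, 0, 1]:  # 1 = adverse outcome
    w = cusum_update(w, 0.2, 0.4, outcome)
print(w)  # a signal would be raised once w crosses a control limit
```

The multiresponse chart in the paper generalizes the two-point likelihood ratio to three or more outcome categories per patient.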

  14. Probability distribution functions of turbulence in seepage-affected alluvial channel

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Anurag; Kumar, Bimlesh, E-mail: anurag.sharma@iitg.ac.in, E-mail: bimk@iitg.ac.in [Department of Civil Engineering, Indian Institute of Technology Guwahati, 781039 (India)

    2017-02-15

    The present experimental study is carried out on the probability distribution functions (PDFs) of turbulent flow characteristics within near-bed-surface and away-from-bed surfaces for both no seepage and seepage flow. Laboratory experiments were conducted in the plane sand bed for no seepage (NS), 10% seepage (10%S) and 15% seepage (15%S) cases. The experimental calculation of the PDFs of turbulent parameters such as Reynolds shear stress, velocity fluctuations, and bursting events is compared with the theoretical expression obtained by Gram–Charlier (GC)-based exponential distribution. Experimental observations follow the computed PDF distributions for both no seepage and seepage cases. The Jensen-Shannon divergence (JSD) method is used to measure the similarity between theoretical and experimental PDFs. The value of JSD for PDFs of velocity fluctuation lies between 0.0005 and 0.003 while the JSD value for PDFs of Reynolds shear stress varies between 0.001 and 0.006. Even with the application of seepage, the PDF distribution of bursting events, sweeps and ejections are well characterized by the exponential distribution of the GC series, except that a slight deflection of inward and outward interactions is observed which may be due to weaker events. The value of JSD for outward and inward interactions ranges from 0.0013 to 0.032, while the JSD value for sweep and ejection events varies between 0.0001 and 0.0025. The theoretical expression for the PDF of turbulent intensity is developed in the present study, which agrees well with the experimental observations and JSD lies between 0.007 and 0.015. The work presented is potentially applicable to the probability distribution of mobile-bed sediments in seepage-affected alluvial channels typically characterized by the various turbulent parameters. The purpose of PDF estimation from experimental data is that it provides a complete numerical description in the areas of turbulent flow either at a single or finite number of points

  15. Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum

    Directory of Open Access Journals (Sweden)

    Farshid eSepehrband

    2016-05-01

    Full Text Available Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters which has been consistently shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as location and scale of the mode and the profile of distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with their own distribution profiles. In addition, we observed that several other distributions outperformed the gamma distribution, yet had the same number of unknown parameters; these were the inverse Gaussian, log normal, log logistic and Birnbaum-Saunders distributions.

  16. Universal Probability Distribution Function for Bursty Transport in Plasma Turbulence

    International Nuclear Information System (INIS)

    Sandberg, I.; Benkadda, S.; Garbet, X.; Ropokis, G.; Hizanidis, K.; Castillo-Negrete, D. del

    2009-01-01

    Bursty transport phenomena associated with convective motion present universal statistical characteristics among different physical systems. In this Letter, a stochastic univariate model and the associated probability distribution function for the description of bursty transport in plasma turbulence are presented. The proposed stochastic process recovers the universal distribution of density fluctuations observed in the plasma edge of several magnetic confinement devices and the remarkable scaling between their skewness S and kurtosis K. Similar statistical characteristics have also been observed in other physical systems characterized by convection, such as the x-ray fluctuations emitted by the Cygnus X-1 accretion disc plasma and sea surface temperature fluctuations.

  17. Non-Gaussian probability distributions of solar wind fluctuations

    Directory of Open Access Journals (Sweden)

    E. Marsch

    Full Text Available The probability distributions of field differences ∆x(τ) = x(t+τ) - x(t), where the variable x(t) may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lag τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation, and they are enhanced for large, reduced for intermediate and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scale τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.

  18. Sharing a quota on cumulative carbon emissions

    International Nuclear Information System (INIS)

    Raupach, Michael R.; Davis, Steven J.; Peters, Glen P.; Andrew, Robbie M.; Canadell, Josep G.; Ciais, Philippe

    2014-01-01

    Any limit on future global warming is associated with a quota on cumulative global CO₂ emissions. We translate this global carbon quota to regional and national scales, on a spectrum of sharing principles that extends from continuation of the present distribution of emissions to an equal per-capita distribution of cumulative emissions. A blend of these endpoints emerges as the most viable option. For a carbon quota consistent with a 2 °C warming limit (relative to pre-industrial levels), the necessary long-term mitigation rates are very challenging (typically over 5% per year), both because of strong limits on future emissions from the global carbon quota and also the likely short-term persistence in emissions growth in many regions. (authors)
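The spectrum of sharing principles described, from continuing the present distribution of emissions to equal per-capita shares, amounts to a convex blend of two shares. A sketch with hypothetical regional figures (not values from the paper):

```python
def blended_quota(global_quota, emissions_share, population_share, w):
    """A region's share of the global carbon quota as a convex blend:
    w = 0 keeps the present distribution of emissions (inertia),
    w = 1 gives equal per-capita cumulative emissions (equity)."""
    return global_quota * ((1 - w) * emissions_share + w * population_share)

# hypothetical region holding 15% of current emissions but 5% of population
for w in (0.0, 0.5, 1.0):
    print(w, blended_quota(1000.0, 0.15, 0.05, w))
```

Sweeping w from 0 to 1 shows why the paper's blend matters politically: high-emitting, low-population regions see their quota shrink sharply as the sharing principle moves toward equal per-capita shares.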

  19. Newdistns: An R Package for New Families of Distributions

    Directory of Open Access Journals (Sweden)

    Saralees Nadarajah

    2016-03-01

    Full Text Available The contributed R package Newdistns written by the authors is introduced. This package computes the probability density function, cumulative distribution function, quantile function, random numbers and some measures of inference for nineteen families of distributions. Each family is flexible enough to encompass a large number of structures. The use of the package is illustrated using a real data set. Also robustness of random number generation is checked by simulation.

  20. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    Science.gov (United States)

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.
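The paper's formula C = Np makes the needed adjustment explicit: dividing a count by the detection probability recovers an estimate of N, and ignoring variation in p distorts comparisons across sites or years. A one-line sketch:

```python
def estimate_population(count, detection_prob):
    """Adjust a raw count C for imperfect detection: N_hat = C / p."""
    return count / detection_prob

# the same raw count implies very different true population sizes as p varies,
# which is why unadjusted counts are unreliable evidence of decline
print(estimate_population(40, 0.8))
print(estimate_population(40, 0.4))
```

A drop in counts from one year to the next is ambiguous without p: it could reflect a real decline in N or merely a drop in detectability.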

  1. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  2. A comparison of alternative methods of calculating complementary cumulative distribution functions of health effects following an atmospheric radioactive release

    International Nuclear Information System (INIS)

    Ponting, A.C.; Nair, S.

    1984-04-01

    A concept extensively used in studying the consequences of accidental atmospheric radioactive releases is that of the Complementary Cumulative Distribution Function, CCDF. Various methods of calculating CCDFs have been developed, with particular applications in putting degraded core accidents in perspective and in identifying release sequences leading to high risks. This note compares three methods with specific reference to their accuracy and computational efficiency. For two of the methods (that used in the US Reactor Safety Study code CRAC2 and an extended version of that method), the effects of varying the sector width and considering site-specific population distributions have been determined. For the third method it is only necessary to consider the effects of site-specific population distributions. (author)
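Whatever the calculation method, a CCDF answers the question "what is the probability that the consequence exceeds x?". An empirical sketch from a set of consequence samples (toy numbers, not reactor data):

```python
def ccdf(samples, x):
    """Empirical complementary CDF: fraction of outcomes exceeding x."""
    return sum(1 for s in samples if s > x) / len(samples)

# toy consequence values (e.g. health effects per release sequence)
outcomes = [1, 3, 3, 7, 9, 12]
print(ccdf(outcomes, 3))  # 3 of the 6 outcomes exceed 3, so 0.5
```

The methods compared in the note differ in how they build this exceedance curve from the underlying weather sequences and population distributions, not in what the curve means.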

  3. Extreme points of the convex set of joint probability distributions with ...

    Indian Academy of Sciences (India)

    Here we address the following problem: If G is a standard ... convex set of all joint probability distributions on the product Borel space (X1 × X2, F1 ⊗ F2) which .... cannot be identically zero when X and Y vary in A1 and u and v vary in H2. Thus.

  4. Flux-probability distributions from the master equation for radiation transport in stochastic media

    International Nuclear Information System (INIS)

    Franke, Brian C.; Prinja, Anil K.

    2011-01-01

    We present numerical investigations into the accuracy of approximations in the master equation for radiation transport in discrete binary random media. Our solutions of the master equation yield probability distributions of particle flux at each element of phase space. We employ the Levermore-Pomraning interface closure and evaluate the effectiveness of closures for the joint conditional flux distribution for estimating scattering integrals. We propose a parameterized model for this joint-pdf closure, varying between correlation neglect and a full-correlation model. The closure is evaluated for a variety of parameter settings. Comparisons are made with benchmark results obtained through suites of fixed-geometry realizations of random media in rod problems. All calculations are performed using Monte Carlo techniques. Accuracy of the approximations in the master equation is assessed by examining the probability distributions for reflection and transmission and by evaluating the moments of the pdfs. The results suggest the correlation-neglect setting in our model performs best and shows improved agreement in the atomic-mix limit. (author)

  5. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
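The interplay the authors describe, cumulants fluctuating from sample to sample with uncertainties that grow with the order, can be explored with the standard moment-to-cumulant relations (here up to fourth order, where c4 = m4 - 3·m2²). For Gaussian data the higher cumulants should scatter around zero:

```python
import random
import statistics

def cumulants_up_to_4(xs):
    """First four cumulants from central moments:
    c1 = mean, c2 = m2, c3 = m3, c4 = m4 - 3*m2**2."""
    mu = statistics.fmean(xs)
    m2, m3, m4 = (statistics.fmean([(x - mu) ** k for x in xs]) for k in (2, 3, 4))
    return mu, m2, m3, m4 - 3 * m2 ** 2

rng = random.Random(0)
xs = [rng.gauss(0, 1) for _ in range(20_000)]
print(cumulants_up_to_4(xs))  # for Gaussian data, c3 and c4 scatter around 0
```

Repeating this over many independent samples reproduces the effect in the abstract: the scatter of c3 and c4 across samples is much larger than that of c1 and c2 at the same statistics.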

  6. The force distribution probability function for simple fluids by density functional theory.

    Science.gov (United States)

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(−AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft sphere at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.

  7. Application of the Unbounded Probability Distribution of the Johnson System for Floods Estimation

    Directory of Open Access Journals (Sweden)

    Campos-Aranda Daniel Francisco

    2015-09-01

    Full Text Available Design floods are key to sizing new hydraulic works and to reviewing the hydrological safety of existing ones. The most reliable method for estimating their magnitudes associated with given return periods is to fit a probabilistic model to the available record of maximum annual flows. Since the appropriate model is unknown a priori, several models must be tested in order to select the most suitable one according to a statistical index, commonly the standard error of fit. Several probability distributions have shown versatility and consistency of results when processing flood records, and their application has therefore been established as a norm or precept. The Johnson system comprises three families of distributions, one of which is the Log-Normal model with three fit parameters, which also marks the border between the bounded distributions and those with no upper limit. These families have four fit parameters and converge to the standard normal distribution, so that their predictions are obtained with that model. Having contrasted the three probability distributions established by precept on 31 historical records of hydrological events, the Johnson system is applied to the same data. The results of the unbounded distribution of the Johnson system (SJU) are compared with the best results from the three distributions. The predictions of the SJU distribution were found to be similar to those obtained with the other models for low return periods (< 1000 years. Because of its theoretical support, the SJU model is recommended in flood estimation.
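
A minimal sketch of the fitting step described above, assuming SciPy's `johnsonsu` as an implementation of the unbounded Johnson (SU) family. The flow record is synthetic and for illustration only.

```python
# Sketch: fit the unbounded Johnson (SU) distribution to an annual maximum
# flow series and read off design floods for selected return periods.
# The flow record below is synthetic, not one of the paper's 31 records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
flows = rng.lognormal(mean=6.0, sigma=0.5, size=60)   # synthetic annual maxima

a, b, loc, scale = stats.johnsonsu.fit(flows)          # a, b: shape parameters

design = {}
for T in (10, 100, 1000):                  # return periods in years
    p = 1.0 - 1.0 / T                      # non-exceedance probability
    design[T] = stats.johnsonsu.ppf(p, a, b, loc=loc, scale=scale)
    print(f"T = {T:4d} yr  ->  design flood ~ {design[T]:.1f}")
```

The design flood grows with the return period, since the quantile function is evaluated at ever higher non-exceedance probabilities.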

  8. Joint probability distributions and fluctuation theorems

    International Nuclear Information System (INIS)

    García-García, Reinaldo; Kolton, Alejandro B; Domínguez, Daniel; Lecomte, Vivien

    2012-01-01

    We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, for unveiling symmetries of correlation functions appearing in fluctuation–dissipation relations recently generalized to non-equilibrium steady states, and also for mapping averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we embed the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators

  9. THREE-MOMENT BASED APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING SYSTEMS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2014-03-01

    Full Text Available The paper deals with the problem of approximating probability distributions of random variables defined on the positive real axis with a coefficient of variation different from unity. When queueing systems are used as models for computer networks, characteristics are usually calculated at the level of expectation and variance. At the same time, one of the main characteristics of multimedia data transmission quality in computer networks is delay jitter. For jitter calculation the distribution function of packet time delay should be known. It is shown that changing the third moment of the packet delay distribution changes the calculated jitter by tens or hundreds of percent, even with the same values of the first two moments, the expectation and the delay variation coefficient. This means that the delay distribution approximation used for jitter calculation should match the third moment of the delay distribution. For random variables with coefficients of variation greater than unity, an iterative approximation algorithm with a hyper-exponential two-phase distribution based on three moments of the approximated distribution is offered. It is shown that for random variables with coefficients of variation less than unity, the impact of the third moment of the distribution becomes negligible, and such distributions should be approximated by an Erlang distribution matched to the first two moments. This approach makes it possible to obtain upper bounds for the relevant characteristics, particularly the upper bound of delay jitter.

  10. Quantile selection procedure and associated distribution of ratios of order statistics from a restricted family of probability distributions

    International Nuclear Information System (INIS)

    Gupta, S.S.; Panchapakesan, S.

    1975-01-01

    A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure

  11. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    International Nuclear Information System (INIS)

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

    In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high temperature, high purity water. The following conclusions were drawn: (1) The initiation process of intergranular stress corrosion cracking can be approximated by a Poisson stochastic process, based on the CBB test results. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could indeed be fitted to the exponential probability distribution. (author)
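
The Poisson-initiation argument above implies exponentially distributed lives, for which the rate is estimated by the reciprocal of the sample mean. A minimal sketch with hypothetical failure times:

```python
# Sketch: exponential fit to (hypothetical, synthetic) failure times.
# Under a Poisson initiation process, time to cracking is exponential and
# the maximum-likelihood rate is 1/mean.
import math

lives = [310.0, 150.0, 480.0, 95.0, 260.0, 700.0, 210.0, 330.0]  # hours, synthetic
mean_life = sum(lives) / len(lives)
rate = 1.0 / mean_life                      # MLE of the exponential rate

def cdf(t):
    """Fitted probability of cracking by time t."""
    return 1.0 - math.exp(-rate * t)

median_life = math.log(2.0) / rate          # median of the exponential
print(f"mean life {mean_life:.0f} h, median {median_life:.0f} h, "
      f"F(500 h) = {cdf(500):.2f}")
```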

  12. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    Science.gov (United States)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with a cluster size dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and derivation of the distributions is discussed.

  13. On the issues of probability distribution of GPS carrier phase observations

    Science.gov (United States)

    Luo, X.; Mayer, M.; Heck, B.

    2009-04-01

    In common practice, observables related to the Global Positioning System (GPS) are assumed to follow a Gauss-Laplace normal distribution. Actually, full knowledge of the observables' distribution is not required for parameter estimation by means of the least-squares algorithm, which relies on the functional relation between observations and unknown parameters as well as the associated variance-covariance matrix. However, the probability distribution of GPS observations plays a key role in procedures for quality control (e.g. outlier and cycle slip detection, ambiguity resolution) and in reliability-related assessments of the estimation results. Under non-ideal observation conditions with respect to the factors impacting GPS data quality, for example multipath effects and atmospheric delays, the validity of the normal distribution postulate for GPS observations is in doubt. This paper presents a detailed analysis of the distribution properties of GPS carrier phase observations using double difference residuals. For this purpose 1-Hz observation data from the permanent SAPOS

  14. Exact solutions and symmetry analysis for the limiting probability distribution of quantum walks

    International Nuclear Information System (INIS)

    Xu, Xin-Ping; Ide, Yusuke

    2016-01-01

    In the literature, there are numerous studies of one-dimensional discrete-time quantum walks (DTQWs) using a moving shift operator. However, there is no exact solution for the limiting probability distributions of DTQWs on cycles using a general coin or swapping shift operator. In this paper, we derive exact solutions for the limiting probability distribution of quantum walks using a general coin and swapping shift operator on cycles for the first time. Based on the exact solutions, we show how to generate symmetric quantum walks and determine the condition under which a symmetric quantum walk appears. Our results suggest that choosing various coin and initial state parameters can achieve a symmetric quantum walk. By defining a quantity to measure the variation of symmetry, deviation and mixing time of symmetric quantum walks are also investigated.
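
A numerical sketch of the object studied above, not the paper's closed-form solutions: a Hadamard-coined DTQW on a cycle with the moving shift, and its time-averaged ("limiting") probability distribution estimated over many steps. Cycle size, step count and initial state are illustrative.

```python
# Sketch: discrete-time quantum walk on an N-cycle (moving shift, Hadamard
# coin) and its time-averaged probability distribution.
import numpy as np

N, steps = 8, 4000
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard coin

psi = np.zeros((N, 2), dtype=complex)
psi[0, 0] = 1.0                                  # walker at site 0, coin |0>

avg = np.zeros(N)
for _ in range(steps):
    psi = psi @ H.T                              # coin toss at every site
    # moving shift: coin 0 steps right, coin 1 steps left around the cycle
    psi = np.stack((np.roll(psi[:, 0], 1), np.roll(psi[:, 1], -1)), axis=1)
    avg += (np.abs(psi) ** 2).sum(axis=1)
avg /= steps                                     # time-averaged distribution
print(np.round(avg, 3))
```

For a general coin one would replace `H` by the parameterized SU(2) matrix; the paper derives the long-time limit of `avg` analytically.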

  15. Exact solutions and symmetry analysis for the limiting probability distribution of quantum walks

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Xin-Ping, E-mail: xuxp@mail.ihep.ac.cn [School of Physical Science and Technology, Soochow University, Suzhou 215006 (China); Ide, Yusuke [Department of Information Systems Creation, Faculty of Engineering, Kanagawa University, Yokohama, Kanagawa, 221-8686 (Japan)

    2016-10-15

    In the literature, there are numerous studies of one-dimensional discrete-time quantum walks (DTQWs) using a moving shift operator. However, there is no exact solution for the limiting probability distributions of DTQWs on cycles using a general coin or swapping shift operator. In this paper, we derive exact solutions for the limiting probability distribution of quantum walks using a general coin and swapping shift operator on cycles for the first time. Based on the exact solutions, we show how to generate symmetric quantum walks and determine the condition under which a symmetric quantum walk appears. Our results suggest that choosing various coin and initial state parameters can achieve a symmetric quantum walk. By defining a quantity to measure the variation of symmetry, deviation and mixing time of symmetric quantum walks are also investigated.

  16. On the symmetric α-stable distribution with application to symbol error rate calculations

    KAUST Repository

    Soury, Hamza

    2016-12-24

    The probability density function (PDF) of the symmetric α-stable distribution is investigated using the inverse Fourier transform of its characteristic function. For general values of the stable parameter α, it is shown that the PDF and the cumulative distribution function of the symmetric stable distribution can be expressed in closed form in terms of the Fox H function. As an application, the probability of error of single input single output communication systems using different modulation schemes with an α-stable perturbation is studied. In more detail, a generic formula is derived for generalized fading distributions, such as the extended generalized-k distribution. Later, simpler expressions of these error rates are deduced for selected special cases, and compact approximations are derived using asymptotic expansions.
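
The inverse-transform step can be illustrated numerically, assuming the standard symmetric stable characteristic function exp(−|t|^α) with unit scale and zero location, so that f(x) = (1/π) ∫₀^∞ cos(tx) e^{−t^α} dt:

```python
# Sketch: symmetric alpha-stable PDF by numerical inversion of the
# characteristic function exp(-|t|**alpha) (unit scale, zero location).
import numpy as np

def sas_pdf(x, alpha, t_max=50.0, n=200001):
    t = np.linspace(0.0, t_max, n)
    g = np.cos(t * x) * np.exp(-t**alpha)
    dt = t[1] - t[0]
    return (dt * (g.sum() - 0.5 * (g[0] + g[-1]))) / np.pi  # trapezoid rule

# sanity checks against known special cases:
# alpha = 2 is Gaussian with variance 2, so f(0) = 1/(2*sqrt(pi));
# alpha = 1 is the standard Cauchy, so f(0) = 1/pi.
print(sas_pdf(0.0, 2.0), 1.0 / (2.0 * np.sqrt(np.pi)))
print(sas_pdf(0.0, 1.0), 1.0 / np.pi)
```

The paper replaces this quadrature with exact Fox H-function expressions; the numeric version is only a cross-check.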

  17. Examining barrier distributions and, in extension, energy derivative of probabilities for surrogate experiments

    International Nuclear Information System (INIS)

    Romain, P.; Duarte, H.; Morillon, B.

    2012-01-01

    The energy derivatives of probabilities are functions suited to a better understanding of certain mechanisms. Applied to compound nuclear reactions, they can provide information on fusion barrier distributions, as originally introduced, and also, as presented here, on fission barrier distributions and heights. By extension, they give access to the compound nucleus spin-parity states preferentially populated through a given entrance channel at a given energy. (authors)

  18. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    Science.gov (United States)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.
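
The estimation idea above can be illustrated with a stand-in (not the paper's exact algorithm): treat the observed count histogram as a Poisson mixture over a finite set of candidate true pixel values and recover their probabilities by non-negative least squares, here with SciPy's `nnls` on noise-free synthetic data.

```python
# Sketch: recover a finite probability mass function from Poisson-blurred
# counts via non-negative least squares.  Data are synthetic and noise-free,
# so the recovery should be essentially exact.
import numpy as np
from scipy.optimize import nnls
from scipy.stats import poisson

true_vals = np.arange(1, 6)                         # candidate true pixel values
true_pmf = np.array([0.1, 0.2, 0.4, 0.2, 0.1])      # unknown pmf to recover

obs = np.arange(0, 16)                              # observable photon counts
A = poisson.pmf(obs[:, None], true_vals[None, :])   # forward (blur) model

h = A @ true_pmf                                    # ideal observed histogram
est, _ = nnls(A, h)                                 # constrained LS estimate
print(np.round(est, 3))
```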

  19. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    Science.gov (United States)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z)  =  0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f  =  0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  20. Correlator bank detection of gravitational wave chirps--False-alarm probability, template density, and thresholds: Behind and beyond the minimal-match issue

    International Nuclear Information System (INIS)

    Croce, R.P.; Demma, Th.; Pierro, V.; Pinto, I.M.; Longo, M.; Marano, S.; Matta, V.

    2004-01-01

    The general problem of computing the false-alarm probability vs the detection-threshold relationship for a bank of correlators is addressed, in the context of maximum-likelihood detection of gravitational waves in additive stationary Gaussian noise. Specific reference is made to chirps from coalescing binary systems. Accurate (lower-bound) approximants for the cumulative distribution of the whole-bank supremum are deduced from a class of Bonferroni-type inequalities. The asymptotic properties of the cumulative distribution are obtained, in the limit where the number of correlators goes to infinity. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian-correlation inequality. The result is used to readdress the problem of relating the template density to the fraction of potentially observable sources which could be dismissed as an effect of template space discreteness
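
The simplest member of the Bonferroni family used above is the union bound: for a bank of m correlated outputs X_i under noise only, P(sup_i X_i > u) ≤ m·Q(u), equivalently a lower bound on the CDF of the whole-bank supremum. A Monte Carlo sketch on equicorrelated Gaussians (parameters illustrative):

```python
# Sketch: first Bonferroni (union) bound on the false-alarm probability of
# a bank of correlated matched filters, checked by Monte Carlo.
import math
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 200000
C = 0.5 * np.ones((m, m)) + 0.5 * np.eye(m)   # equicorrelated, rho = 0.5
L = np.linalg.cholesky(C)
X = L @ rng.standard_normal((m, n))           # bank outputs, unit-variance marginals

u = 3.0
p_emp = float((X.max(axis=0) > u).mean())     # empirical false-alarm probability
q_u = 0.5 * math.erfc(u / math.sqrt(2.0))     # Gaussian tail Q(u)
bound = m * q_u                               # union (Bonferroni) upper bound
print(p_emp, "<=", bound)
```

The paper's sharper lower-bound approximants tighten this as correlations grow.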

  1. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    Science.gov (United States)

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  2. A Bernstein-Von Mises Theorem for discrete probability distributions

    OpenAIRE

    Boucheron, S.; Gassiat, E.

    2008-01-01

    We investigate the asymptotic normality of the posterior distribution in the discrete setting, when model dimension increases with sample size. We consider a probability mass function θ0 on ℕ∖{0} and a sequence of truncation levels (k_n)_n satisfying k_n^3 ≤ n inf_{i≤k_n} θ0(i). Let θ̂_n denote the maximum likelihood estimate of (θ0(i))_{i≤k_n} and let Δ_n(θ0) denote the k_n-dimensional vector whose i-th coordinate is defined by √n(θ̂_n(i) − θ0(i)) for 1 ≤ i ≤ k_n. We check that under mild ...

  3. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.

  4. MORSEC-SP, Step Function Angular Distribution for Cross-Sections Calculation by Program MORSE

    International Nuclear Information System (INIS)

    1980-01-01

    1 - Description of problem or function: MORSEC-SP allows one to utilize a step distribution to describe the angular dependence of the multi- group function in the MORSEC cross section module of the MORSE Monte Carlo code. The step distribution is always non-negative and may be used in the random walk and for making point detector estimators. 2 - Method of solution: MORSEC-SP utilizes a table look up procedure to provide the probability of scattering when making point detector estimates for a given incident energy group and scattering angle. In the random walk, the step distributions are converted to cumulative distributions and an angle of scatter is selected from the cumulative distributions. Step distributions are obtained from calculation using the converted moments from the given Legendre coefficients of the scattering distributions. 3 - Restrictions on the complexity of the problem: Additional coding to the MORSEC module is variable dimensional and fully incorporated into blank common
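
The random-walk procedure described above (step distribution converted to a cumulative table, then an angle selected from it) can be sketched as follows; the bin edges and step heights are illustrative, not MORSE data.

```python
# Sketch: inverse-transform sampling of a scattering cosine from a step
# (piecewise-constant) angular distribution via its cumulative table.
import bisect
import random

edges = [-1.0, -0.5, 0.0, 0.5, 1.0]    # bin edges in mu = cos(theta)
heights = [0.1, 0.2, 0.3, 0.4]         # non-negative step heights per bin

total = sum(heights)
cdf = []
acc = 0.0
for h in heights:
    acc += h / total
    cdf.append(acc)                     # cumulative distribution over bins

def sample_mu(rng):
    u = rng.random()
    i = min(bisect.bisect_left(cdf, u), len(heights) - 1)
    prev = cdf[i - 1] if i else 0.0
    frac = (u - prev) / (cdf[i] - prev)        # flat within the chosen bin
    return edges[i] + frac * (edges[i + 1] - edges[i])

rng = random.Random(1)
mus = [sample_mu(rng) for _ in range(20000)]
print(sum(m > 0.5 for m in mus) / len(mus))    # ~ 0.4, the last bin's weight
```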

  5. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  6. Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate claim size is investigated under these assumptions. For both light-tailed and heavy-tailed claim size distributions, asymptotic estimates for infinite-time and finite-time ruin probabilities are derived. Moreover, we discuss an extension of the model to an adaptive premium rule that is dynamically adjusted according to past claims experience.

  7. Probability distribution functions for intermittent scrape-off layer plasma fluctuations

    Science.gov (United States)

    Theodorsen, A.; Garcia, O. E.

    2018-03-01

    A stochastic model for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas has been constructed based on a superposition of uncorrelated pulses arriving according to a Poisson process. In the most common applications of the model, the pulse amplitudes are assumed exponentially distributed, supported by conditional averaging of large-amplitude fluctuations in experimental measurement data. This basic assumption has two potential limitations. First, statistical analysis of measurement data using conditional averaging only reveals the tail of the amplitude distribution to be exponentially distributed. Second, exponentially distributed amplitudes lead to a positive definite signal which cannot capture fluctuations in, for example, electric potential and radial velocity. Assuming pulse amplitudes which are not positive definite often makes finding a closed form for the probability density function (PDF) difficult, even if the characteristic function remains relatively simple. Thus estimating model parameters requires an approach based on the characteristic function rather than the PDF. In this contribution, the effect of changing the amplitude distribution on the moments, PDF and characteristic function of the process is investigated, and a parameter estimation method using the empirical characteristic function is presented and tested on synthetically generated data. This proves valuable for describing intermittent fluctuations of all plasma parameters in the boundary region of magnetized plasmas.
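
Empirical-characteristic-function fitting can be sketched on a stand-in model (not the paper's scrape-off-layer process): a filtered Poisson process with one-sided exponential pulses and exponentially distributed amplitudes has a Gamma(k, θ) stationary distribution, whose characteristic function is (1 − iθt)^(−k). The parameters are recovered by least squares between the empirical and model CFs on a grid.

```python
# Sketch: parameter estimation from the empirical characteristic function,
# on synthetic Gamma-distributed "measurements".
import numpy as np

rng = np.random.default_rng(2)
true_k, true_theta = 2.5, 1.2
x = rng.gamma(true_k, true_theta, 20000)           # synthetic measurements

t = np.linspace(0.1, 2.0, 40)
ecf = np.exp(1j * np.outer(t, x)).mean(axis=1)     # empirical CF on a t-grid

def model_cf(t, k, theta):
    return (1.0 - 1j * theta * t) ** (-k)          # Gamma(k, theta) CF

ks = np.linspace(1.0, 4.0, 61)
thetas = np.linspace(0.5, 2.0, 61)
err = np.array([
    (np.abs(ecf - model_cf(t, k, th)) ** 2).sum()
    for k in ks for th in thetas
]).reshape(len(ks), len(thetas))
i, j = np.unravel_index(err.argmin(), err.shape)
k_hat, theta_hat = ks[i], thetas[j]
print(k_hat, theta_hat)                            # should land near (2.5, 1.2)
```

A grid search keeps the sketch dependency-free; a gradient-based minimizer would serve equally well.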

  8. Radionuclide and colloid transport in the Culebra Dolomite and associated complementary cumulative distribution functions in the 1996 performance assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    RAMSEY, JAMES L.; BLAINE,R.; GARNER,J.W.; HELTON,JON CRAIG; JOHNSON,J.D.; SMITH,L.N.; WALLACE,M.

    2000-05-22

    The following topics related to radionuclide and colloid transport in the Culebra Dolomite in the 1996 performance assessment for the Waste Isolation Pilot Plant (WIPP) are presented: (1) mathematical description of models, (2) uncertainty and sensitivity analysis results arising from subjective (i.e., epistemic) uncertainty for individual releases, and (3) construction of complementary cumulative distribution functions (CCDFs) arising from stochastic (i.e., aleatory) uncertainty. The presented results indicate that radionuclide and colloid transport in the Culebra Dolomite does not constitute a serious threat to the effectiveness of the WIPP as a disposal facility for transuranic waste. Even when the effects of uncertain analysis inputs are taken into account, no radionuclide transport to the boundary with the accessible environment was observed; thus the associated CCDFs for comparison with the boundary line specified in the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, 40 CFR 194) are degenerate in the sense of having a probability of zero of exceeding a release of zero.
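
CCDF construction itself is simple: it is one minus the empirical CDF of the sampled releases. A minimal sketch with synthetic values (not WIPP results):

```python
# Sketch: empirical complementary cumulative distribution function (CCDF),
# the probability that a sampled normalized release exceeds a value r.
import random

random.seed(4)
releases = [random.lognormvariate(-6.0, 1.5) for _ in range(1000)]  # synthetic

def ccdf(r):
    """Empirical probability that a sampled release exceeds r."""
    return sum(x > r for x in releases) / len(releases)

for r in (1e-4, 1e-3, 1e-2):
    print(f"P(Release > {r:g}) = {ccdf(r):.3f}")
```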

  9. Audio feature extraction using probability distribution function

    Science.gov (United States)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently in biometric and multimedia information retrieval systems. This technology builds on successive research in audio feature extraction. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processing steps in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses only the PDF itself as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.

  10. Rate coefficients from quantum and quasi-classical cumulative reaction probabilities for the S(1D) + H2 reaction

    Science.gov (United States)

    Jambrina, P. G.; Lara, Manuel; Menéndez, M.; Launay, J.-M.; Aoiz, F. J.

    2012-10-01

    Cumulative reaction probabilities (CRPs) at various total angular momenta have been calculated for the barrierless reaction S(1D) + H2 → SH + H at total energies up to 1.2 eV using three different theoretical approaches: time-independent quantum mechanics (QM), quasiclassical trajectories (QCT), and statistical quasiclassical trajectories (SQCT). The calculations have been carried out on the widely used potential energy surface (PES) by Ho et al. [J. Chem. Phys. 116, 4124 (2002), 10.1063/1.1431280] as well as on the recent PES developed by Song et al. [J. Phys. Chem. A 113, 9213 (2009), 10.1021/jp903790h]. The results show that the differences between these two PESs are relatively minor and mostly related to the different topologies of the well. In addition, the agreement between the three theoretical methodologies is good, even for the highest total angular momenta and energies. In particular, the good accordance between the CRPs obtained with dynamical methods (QM and QCT) and the statistical model (SQCT) indicates that the reaction can be considered statistical over the whole range of energies, in contrast with the findings for other prototypical barrierless reactions. In addition, total CRPs and rate coefficients in the range of 20-1000 K have been calculated using the QCT and SQCT methods and found to be somewhat smaller than the experimental total removal rates of S(1D).

  11. Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Huiwu Luo

    2015-01-01

    Full Text Available The computational procedure for hyperspectral images (HSI) is extremely complex, due not only to the high dimensional information but also to the highly correlated data structure. Effective processing and analysis of HSI therefore faces many difficulties. Dimensionality reduction has proved to be a powerful tool for high dimensional data analysis. Local Fisher's linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weaknesses of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with the local variance in the construction of the weight matrix, and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in the low dimensional space. Experimental results on the Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.

  12. Multivariate η-μ fading distribution with arbitrary correlation model

    Science.gov (United States)

    Ghareeb, Ibrahim; Atiani, Amani

    2018-03-01

    An extensive analysis of the multivariate η-μ distribution with arbitrary correlation is presented, where novel analytical expressions for the multivariate probability density function, cumulative distribution function and moment generating function (MGF) of arbitrarily correlated and not necessarily identically distributed η-μ power random variables are derived. The paper also provides an exact-form expression for the MGF of the instantaneous signal-to-noise ratio at the combiner output in a diversity reception system with maximal-ratio combining and post-detection equal-gain combining operating in slow, frequency-nonselective, arbitrarily correlated and not necessarily identically distributed η-μ fading channels. The average bit error probability of differentially detected quadrature phase shift keying signals with post-detection diversity reception over η-μ fading channels with arbitrarily correlated and not necessarily identical fading parameters is determined using the MGF-based approach. The effects of fading correlation between diversity branches, fading severity parameters and diversity level are studied.

  13. PHOTOMETRIC REDSHIFT PROBABILITY DISTRIBUTIONS FOR GALAXIES IN THE SDSS DR8

    International Nuclear Information System (INIS)

    Sheldon, Erin S.; Cunha, Carlos E.; Mandelbaum, Rachel; Brinkmann, J.; Weaver, Benjamin A.

    2012-01-01

    We present redshift probability distributions for galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 8 imaging data. We used the nearest-neighbor weighting algorithm to derive the ensemble redshift distribution N(z), and individual redshift probability distributions P(z) for galaxies with r < 21.8 and u < 29.0. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five-dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training-set redshifts. We derived P(z)'s for individual objects by using training-set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy can reduce the statistical error in measurements that depend on the redshifts of individual galaxies. The spectroscopic training sample is substantially larger than that used for the DR7 release. The newly added PRIMUS catalog is now the most important training set used in this analysis by a wide margin. We expect the primary sources of error in the N(z) reconstruction to be sample variance and spectroscopic failures: The training sets are drawn from relatively small volumes of space, and some samples have large incompleteness. Using simulations we estimated the uncertainty in N(z) due to sample variance at a given redshift to be ∼10%-15%. The uncertainty on calculations incorporating N(z) or P(z) depends on how they are used; we discuss the case of weak lensing measurements. The P(z) catalog is publicly available from the SDSS Web site.
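
The central N(z) step above is a weighted histogram of training-set redshifts. A minimal sketch with synthetic redshifts and weights (in the paper the weights come from nearest-neighbor density matching in color-magnitude space):

```python
# Sketch: ensemble redshift distribution N(z) as a weighted histogram of
# training-set redshifts, normalized to unit integral over the binned range.
import numpy as np

rng = np.random.default_rng(3)
z_train = rng.gamma(4.0, 0.08, 5000)       # synthetic training redshifts
w = rng.uniform(0.5, 1.5, 5000)            # synthetic per-object weights

bins = np.linspace(0.0, 1.2, 25)
nz, edges = np.histogram(z_train, bins=bins, weights=w, density=True)
print(float((nz * np.diff(edges)).sum()))  # N(z) integrates to 1 over the range
```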

  14. Outage probability of dual-hop partial relay selection with feedback delay in the presence of interference

    KAUST Repository

    Al-Qahtani, Fawaz S.

    2011-09-01

In this paper, we investigate the outage performance of dual-hop relaying systems with partial relay selection and feedback delay. The analysis considers the case of Rayleigh fading channels in which both the relay station and the destination undergo mutually independent interfering signals. In particular, we derive the cumulative distribution function (c.d.f.) of a new type of random variable involving a sum of multiple independent exponential random variables, based on which we present closed-form expressions for the exact outage probability of fixed amplify-and-forward (AF) and decode-and-forward (DF) relaying protocols. Numerical results are provided to illustrate the joint effect of the delayed feedback and co-channel interference on the outage probability. © 2011 IEEE.
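The c.d.f. of a sum of independent exponential random variables with distinct rates (the hypoexponential distribution) has a well-known closed form. The sketch below is a generic illustration of that family of random variables, cross-checked by simulation; it is not the paper's exact interference model, and the rates are made up.

```python
import math, random

def hypoexp_cdf(t, rates):
    """CDF of a sum of independent exponentials with distinct rates (hypoexponential)."""
    total = 0.0
    for i, li in enumerate(rates):
        coeff = 1.0
        for j, lj in enumerate(rates):
            if j != i:
                coeff *= lj / (lj - li)
        total += coeff * math.exp(-li * t)
    return 1.0 - total

rates = [1.0, 2.0, 3.5]
t = 1.5
cdf_val = hypoexp_cdf(t, rates)

# Monte Carlo cross-check against direct simulation of the sum
rng = random.Random(1)
n = 200_000
mc = sum(1 for _ in range(n) if sum(rng.expovariate(r) for r in rates) <= t) / n
print(round(cdf_val, 4), round(mc, 4))  # the two estimates should agree closely
```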

  15. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  16. The limiting conditional probability distribution in a stochastic model of T cell repertoire maintenance.

    Science.gov (United States)

    Stirk, Emily R; Lythe, Grant; van den Berg, Hugo A; Hurst, Gareth A D; Molina-París, Carmen

    2010-04-01

    The limiting conditional probability distribution (LCD) has been much studied in the field of mathematical biology, particularly in the context of epidemiology and the persistence of epidemics. However, it has not yet been applied to the immune system. One of the characteristic features of the T cell repertoire is its diversity. This diversity declines in old age, whence the concepts of extinction and persistence are also relevant to the immune system. In this paper we model T cell repertoire maintenance by means of a continuous-time birth and death process on the positive integers, where the origin is an absorbing state. We show that eventual extinction is guaranteed. The late-time behaviour of the process before extinction takes place is modelled by the LCD, which we prove always exists for the process studied here. In most cases, analytic expressions for the LCD cannot be computed but the probability distribution may be approximated by means of the stationary probability distributions of two related processes. We show how these approximations are related to the LCD of the original process and use them to study the LCD in two special cases. We also make use of the large N expansion to derive a further approximation to the LCD. The accuracy of the various approximations is then analysed. (c) 2009 Elsevier Inc. All rights reserved.
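The approximation idea, following the distribution of the process conditioned on non-extinction, can be illustrated with a toy birth-death chain. This is a minimal sketch under invented rates and a crude time discretization, not the paper's T cell model: iterating the sub-stochastic transition dynamics on states 1..N and renormalizing converges to the quasi-stationary (limiting conditional) distribution.

```python
# dt is chosen so that 1 - (b + d) * dt stays non-negative for all states
def lcd_birth_death(birth, death, n_states, steps=5000, dt=0.01):
    p = [1.0 / n_states] * n_states          # distribution over states 1..n_states
    for _ in range(steps):
        q = [0.0] * n_states
        for i in range(n_states):
            state = i + 1
            b, d = birth(state), death(state)
            q[i] += p[i] * (1.0 - (b + d) * dt)
            if i + 1 < n_states:
                q[i + 1] += p[i] * b * dt
            if i > 0:
                q[i - 1] += p[i] * d * dt
            # mass leaving state 1 downward is absorbed at 0 and dropped
        total = sum(q)
        p = [x / total for x in q]           # renormalize: condition on survival
    return p

# toy rates: logistic birth, linear death; deterministic equilibrium at n = 25
lcd = lcd_birth_death(lambda n: 2.0 * n * max(0.0, 1 - n / 50), lambda n: 1.0 * n, 60)
mode = max(range(60), key=lambda i: lcd[i]) + 1
print(mode)  # the LCD concentrates near the quasi-stationary equilibrium
```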

  17. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

The theory of probability has been debated for centuries: as early as the 1600s, French mathematicians used the rules of probability to place and win bets. Since then, the knowledge of probability has evolved significantly and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution," a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the distribution most widely applied in statistical analysis.
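As a small companion example, the normal distribution singled out above can be evaluated directly from its density and its cumulative distribution function, which is expressible with the error function:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Cumulative distribution function of the normal distribution, via erf."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

print(round(normal_cdf(1.96), 3))  # ≈ 0.975, the familiar two-sided 95% bound
```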

  18. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this, one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...

  19. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.

  20. Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation

    Energy Technology Data Exchange (ETDEWEB)

    Barajas-Solano, David A.; Tartakovsky, Alexandre M.

    2018-01-01

We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d + 1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.

  1. PROBCON-HDW: A probability and consequence system of codes for long-term analysis of Hanford defense wastes

    International Nuclear Information System (INIS)

    Piepho, M.G.; Nguyen, T.H.

    1988-12-01

    The PROBCON-HDW (PROBability and CONsequence analysis for Hanford defense waste) computer code system calculates the long-term cumulative releases of radionuclides from the Hanford defense wastes (HDW) to the accessible environment and compares the releases to environmental release limits as defined in 40 CFR 191. PROBCON-HDW takes into account the variability of input parameter values used in models to calculate HDW release and transport in the vadose zone to the accessible environment (taken here as groundwater). A human intrusion scenario, which consists of drilling boreholes into the waste beneath the waste sites and bringing waste to the surface, is also included in PROBCON-HDW. PROBCON-HDW also includes the capability to combine the cumulative releases according to various long-term (10,000 year) scenarios into a composite risk curve or complementary cumulative distribution function (CCDF). The system structure of the PROBCON-HDW codes, the mathematical models in PROBCON-HDW, the input files, the input formats, the command files, and the graphical output results of several HDW release scenarios are described in the report. 3 refs., 7 figs., 9 tabs

  2. Impact of spike train autostructure on probability distribution of joint spike events.

    Science.gov (United States)

    Pipa, Gordon; Grün, Sonja; van Vreeswijk, Carl

    2013-05-01

The discussion of whether temporally coordinated spiking activity really exists and whether it is relevant has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that have been suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability of false positives if a process type different from the one actually present is assumed. We also determine how manipulations of such spike trains, here dithering, used for the generation of surrogate data, change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically and in a nontrivial way on the detailed properties of the spike train structure, as characterized by the coefficient of variation CV. Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even if as small as a fraction of the interspike interval, can falsify the inference on coordinated firing.
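A bin-based coincidence count and its Fano factor can be sketched for two independent gamma renewal trains. The counting convention, parameters, and bin width below are assumptions for illustration; the paper's analysis uses its own conventions and also treats log-normal processes.

```python
import random

def gamma_train(shape, rate_hz, duration, rng):
    """Renewal spike train with gamma ISIs: mean rate rate_hz, CV = 1/sqrt(shape)."""
    scale = 1.0 / (shape * rate_hz)
    t, spikes = 0.0, []
    while True:
        t += rng.gammavariate(shape, scale)
        if t > duration:
            return spikes
        spikes.append(t)

def coincidences(a, b, bin_w):
    """Number of time bins occupied by both trains."""
    return len({int(t / bin_w) for t in a} & {int(t / bin_w) for t in b})

rng = random.Random(7)
counts = []
for _ in range(400):  # 400 independent trials
    a = gamma_train(4.0, 10.0, 10.0, rng)  # CV = 0.5, 10 Hz, 10 s
    b = gamma_train(4.0, 10.0, 10.0, rng)
    counts.append(coincidences(a, b, 0.005))  # 5 ms coincidence bins
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
ff = var / mean
print(round(mean, 2), round(ff, 2))  # coincidence-count mean and Fano factor
```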

  3. Probability Distribution and Projected Trends of Daily Precipitation in China

    Institute of Scientific and Technical Information of China (English)

CAO Li-Ge; ZHONG Jun; SU Bu-Da; ZHAI Jian-Qing; Marco GEMMER

    2013-01-01

Based on observed daily precipitation data from 540 stations and 3,839 gridded data points from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the ability of CCLM to simulate daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of daily precipitation series and daily precipitation extremes are analyzed. Results show that, except for the western Qinghai-Tibetan Plateau and South China, the distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. The CCLM can thus capture the distribution characteristics of daily precipitation over China well. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China, and Inner Mongolia, the kurtosis and skewness will increase significantly and precipitation extremes will increase during 2011–2050. The projected increases of the maximum daily rainfall and of the longest non-precipitation period during the flood season in these regions also point to increasing trends of droughts and floods in the next 40 years.

  4. Fusing probability density function into Dempster-Shafer theory of evidence for the evaluation of water treatment plant.

    Science.gov (United States)

    Chowdhury, Shakhawat

    2013-05-01

The evaluation of the status of a municipal drinking water treatment plant (WTP) is important. The evaluation depends on several factors, including human health risks from disinfection by-products (R), disinfection performance (D), and cost (C) of water production and distribution. The Dempster-Shafer theory (DST) of evidence can combine the individual statuses with respect to R, D, and C to generate a new indicator, from which the overall status of a WTP can be evaluated. In the DST, the ranges of the different factors affecting the overall status are divided into several segments. The basic probability assignments (BPA) for each segment of these factors are provided by multiple experts and are then combined to obtain the overall status. In assigning the BPA, the experts use their individual judgments, which can impart subjective biases into the overall evaluation. In this research, an approach has been introduced to avoid the assignment of subjective BPA. The factors contributing to the overall status were characterized using probability density functions (PDF). The cumulative probabilities for the different segments of these factors were determined from the cumulative density function and were then assigned as the BPA for these factors. A case study is presented to demonstrate the application of PDF in DST to evaluate a WTP, leading to the selection of the required level of upgrade for the WTP.
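The core idea, deriving the BPA of each segment from a CDF instead of expert judgment, can be sketched as follows. The normal model and the segment boundaries below are hypothetical, chosen only to illustrate differencing the CDF at the boundaries:

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def bpa_from_cdf(boundaries, mu, sigma):
    """BPA of each segment = CDF difference across its boundaries."""
    cdf = [normal_cdf(b, mu, sigma) for b in boundaries]
    return [hi - lo for lo, hi in zip(cdf, cdf[1:])]

# three segments ("low", "medium", "high") for a factor with mean 50, sd 10
boundaries = [-math.inf, 40.0, 60.0, math.inf]
bpa = bpa_from_cdf(boundaries, 50.0, 10.0)
print([round(m, 3) for m in bpa])  # the masses sum to 1
```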

  5. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    Science.gov (United States)

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  6. Scarred resonances and steady probability distribution in a chaotic microcavity

    International Nuclear Information System (INIS)

    Lee, Soo-Young; Rim, Sunghwan; Kim, Chil-Min; Ryu, Jung-Wan; Kwon, Tae-Yoon

    2005-01-01

We investigate scarred resonances of a stadium-shaped chaotic microcavity. It is shown that the two components of the scarring pattern with different chirality are slightly rotated, in opposite ways, from the underlying unstable periodic orbit when the incident angles of the scarring pattern are close to the critical angle for total internal reflection. In addition, the correspondence of the emission pattern with the scarring pattern disappears when the incident angles are much larger than the critical angle. The steady probability distribution gives a consistent explanation of these interesting phenomena and makes it possible to predict the emission pattern in the latter case.

  7. Loaded dice in Monte Carlo : importance sampling in phase space integration and probability distributions for discrepancies

    NARCIS (Netherlands)

    Hameren, Andreas Ferdinand Willem van

    2001-01-01

Discrepancies play an important role in the study of uniformity properties of point sets. Their probability distributions aid the analysis of the efficiency of the quasi-Monte Carlo method of numerical integration, which uses point sets that are distributed more uniformly than sets of...

  8. A probabilistic analysis of cumulative carbon emissions and long-term planetary warming

    International Nuclear Information System (INIS)

    Fyke, Jeremy; Matthews, H Damon

    2015-01-01

Efforts to mitigate and adapt to long-term climate change could benefit greatly from probabilistic estimates of cumulative carbon emissions due to fossil fuel burning and resulting CO2-induced planetary warming. Here we demonstrate the use of a reduced-form model to project these variables. We performed simulations using a large-ensemble framework with parametric uncertainty sampled to produce distributions of future cumulative emissions and consequent planetary warming. A hindcast ensemble of simulations captured 1980–2012 historical CO2 emissions trends, and an ensemble of future projection simulations generated a distribution of emission scenarios that qualitatively resembled the suite of Representative and Extended Concentration Pathways. The resulting cumulative carbon emission and temperature change distributions are characterized by 5–95th percentile ranges of 0.96–4.9 teratonnes C (Tt C) and 1.4 °C–8.5 °C, respectively, with 50th percentiles at 3.1 Tt C and 4.7 °C. Within the wide range of policy-related parameter combinations that produced these distributions, we found that low-emission simulations were characterized by both high carbon prices and low costs of non-fossil fuel energy sources, suggesting the importance of these two policy levers in particular for avoiding dangerous levels of climate warming. With this analysis we demonstrate a probabilistic approach to the challenge of identifying strategies for limiting cumulative carbon emissions and assessing likelihoods of surpassing dangerous temperature thresholds. (letter)
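Summarizing an ensemble by percentile ranges, as in the 5–95th percentile ranges reported above, reduces to order statistics on the sampled values. A toy sketch (the lognormal ensemble below is invented and unrelated to the paper's model):

```python
import random

def percentile(samples, q):
    """Linear-interpolation percentile of a sample (q in 0..100)."""
    s = sorted(samples)
    idx = q / 100 * (len(s) - 1)
    lo, hi = int(idx), min(int(idx) + 1, len(s) - 1)
    frac = idx - lo
    return s[lo] * (1 - frac) + s[hi] * frac

rng = random.Random(3)
ensemble = [rng.lognormvariate(1.0, 0.5) for _ in range(10_000)]  # toy "Tt C" values
p5, p50, p95 = (percentile(ensemble, q) for q in (5, 50, 95))
print(round(p5, 2), round(p50, 2), round(p95, 2))  # 5th, 50th, 95th percentiles
```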

  9. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
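A biased spinner can be modeled by giving each side an unequal sector angle (an assumption about the article's setup, which is not detailed in the abstract); the landing probabilities then follow from the angular fractions, as a quick simulation confirms:

```python
import random

angles = {"A": 180.0, "B": 120.0, "C": 60.0}   # sector angles in degrees, sum 360
exact = {side: a / 360.0 for side, a in angles.items()}

rng = random.Random(11)
counts = {side: 0 for side in angles}
for _ in range(100_000):
    x = rng.random() * 360.0                   # landing angle, uniform on [0, 360)
    cum = 0.0
    for side, a in angles.items():
        cum += a
        if x < cum:
            counts[side] += 1
            break
est = {side: c / 100_000 for side, c in counts.items()}
print(exact)
print(est)  # the simulated frequencies approach the angular fractions
```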

  10. Effects of the financial crisis on the wealth distribution of Korea's companies

    Science.gov (United States)

    Lim, Kyuseong; Kim, Soo Yong; Swanson, Todd; Kim, Jooyun

    2017-02-01

We investigated the distribution functions of Korea's top-rated companies during two financial crises. A power-law scaling was found as a general pattern in the rank distribution as well as in the cumulative probability distribution. Similar distributions appear in other studies of wealth and income. In our study, the Pareto exponents characterizing the distribution differed before and after the crisis. The companies covered in this research were divided into two subgroups over the period in which the subprime mortgage crisis occurred. Various industrial sectors of Korea's companies were found to respond differently during the two financial crises, especially the construction sector, the financial sector, and insurance groups.

  11. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

Chaos optimization algorithms (COAs) usually utilize a chaotic map, such as the logistic map, to generate pseudo-random numbers mapped to the design variables for global optimization. Many existing studies have indicated that a COA can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs, from the new perspective of both the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, where BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of a COA. To achieve high efficiency, it is recommended to adopt a chaotic map that generates sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
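One of the two quantities compared above, the Lyapunov exponent, can be estimated for the fully chaotic logistic map x → 4x(1−x), whose exact exponent is ln 2. This is a standard textbook computation, not code from the paper:

```python
import math

def lyapunov_logistic(x0=0.123, n=200_000):
    """Average log|f'(x)| along an orbit of f(x) = 4x(1-x); the exact value is ln 2."""
    x, acc = x0, 0.0
    for _ in range(n):
        acc += math.log(max(abs(4 - 8 * x), 1e-300))  # |f'(x)| = |4 - 8x|
        x = 4 * x * (1 - x)
    return acc / n

lam = lyapunov_logistic()
print(round(lam, 3), round(math.log(2), 3))  # numerical estimate vs exact ln 2
```

A positive exponent like this one is exactly the "large Lyapunov exponent" criterion the abstract recommends for choosing a map.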

  12. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    Energy Technology Data Exchange (ETDEWEB)

Cherniz, Analía S; Bonell, Claudia E; Tabernig, Carolina B [Laboratorio de Ingenieria de Rehabilitacion e Investigaciones Neuromusculares y Sensoriales, Facultad de Ingenieria, UNER, Oro Verde (Argentina)]

    2007-11-15

The surface electromyographic (SEMG) signal is a stochastic signal that has been modeled as a Gaussian process with zero mean. It has been experimentally shown that this probability distribution can be fitted with less error by a Laplacian-type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In subjects with lesions of the upper motor neuron, the lack of central control affects the muscular tone, the force, and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle during gait are evaluated, both in two healthy subjects and in two hemiparetic ones, in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function is the one that best fits the experimental data in the studied subjects, although this depends largely on the subject and on the data segment analyzed.

  13. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    International Nuclear Information System (INIS)

Cherniz, Analía S; Bonell, Claudia E; Tabernig, Carolina B

    2007-01-01

The surface electromyographic (SEMG) signal is a stochastic signal that has been modeled as a Gaussian process with zero mean. It has been experimentally shown that this probability distribution can be fitted with less error by a Laplacian-type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In subjects with lesions of the upper motor neuron, the lack of central control affects the muscular tone, the force, and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle during gait are evaluated, both in two healthy subjects and in two hemiparetic ones, in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function is the one that best fits the experimental data in the studied subjects, although this depends largely on the subject and on the data segment analyzed.

  14. Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future

    Science.gov (United States)

    Cates, Grant R.

    2014-01-01

The Space Shuttle was launched 135 times, and nearly half of those launches required two or more launch attempts. The Space Shuttle launch countdown history of 250 launch attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting the launch countdown performance of future launch vehicles. This paper provides a statistical analysis of all Space Shuttle launch attempts, including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions for future launch vehicles such as NASA's Space Shuttle-derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars, since the launch opportunities are relatively short in duration and one must wait two years before a subsequent attempt can begin.
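Both empirical quantities are simple ratios over the attempt record. A sketch with made-up attempt counts (the real study uses the 250 actual Shuttle countdowns):

```python
# toy data: number of countdown attempts each mission needed before launching
attempts_per_mission = [1, 2, 1, 1, 3, 1, 2, 1, 1, 4, 1, 2, 1, 1, 1]

launches = len(attempts_per_mission)
total_attempts = sum(attempts_per_mission)
p_launch_per_attempt = launches / total_attempts   # empirical per-attempt probability

def cumulative_by_attempt(data, k):
    """Fraction of missions that had launched by attempt k."""
    return sum(1 for a in data if a <= k) / len(data)

print(round(p_launch_per_attempt, 3))
print([round(cumulative_by_attempt(attempts_per_mission, k), 3) for k in (1, 2, 3, 4)])
```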

  15. Spectral shaping of a randomized PWM DC-DC converter using maximum entropy probability distributions

    CSIR Research Space (South Africa)

    Dove, Albert

    2017-01-01

Maintaining constraints in a DC-DC converter is investigated. A probability distribution whose aim is to ensure maximal harmonic spreading and yet maintain constraints is presented. The PDFs are determined from a direct application of the method of maximum...

  16. Degree distribution in discrete case

    International Nuclear Information System (INIS)

    Wang, Li-Na; Chen, Bin; Yan, Zai-Zai

    2011-01-01

Vertex degree in many network models and real-life networks is limited to non-negative integers. By means of measure and integral, the relation between the degree distribution and the cumulative degree distribution in the discrete case is analyzed. The degree distribution, obtained by differentiating its cumulative, is only suitable for the continuous case or for the discrete case with constant degree change. When the degree change is not constant but proportional to the degree itself, the power-law degree distribution and its cumulative have the same exponent, and the mean value is finite for a power-law exponent greater than 1. -- Highlights: → Degree change is the crux for using the cumulative degree distribution method. → It suits the discrete case with constant degree change. → If degree change is proportional to degree, the power-law degree distribution and its cumulative have the same exponent. → In addition, the mean value is finite for a power-law exponent greater than 1.
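The exponent bookkeeping for the continuous case can be checked numerically: for a power-law density with exponent gamma, the complementary cumulative falls with exponent gamma − 1. This is the standard continuous-case relation; the paper's point is that the discrete case with proportional degree change behaves differently.

```python
import math, random

rng = random.Random(5)
gamma = 3.0
# paretovariate(a) has complementary CDF x^(-a) for x >= 1; choose a = gamma - 1
samples = [rng.paretovariate(gamma - 1.0) for _ in range(200_000)]

def ccdf(xs, x):
    """Empirical complementary cumulative distribution at x."""
    return sum(1 for v in xs if v > x) / len(xs)

# log-log slope of the empirical CCDF between two points
x1, x2 = 2.0, 8.0
slope = (math.log(ccdf(samples, x2)) - math.log(ccdf(samples, x1))) / math.log(x2 / x1)
print(round(slope, 2))  # ≈ -(gamma - 1) = -2
```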

  17. A mixture of exponentials distribution for a simple and precise assessment of the volcanic hazard

    Directory of Open Access Journals (Sweden)

    A. T. Mendoza-Rosas

    2009-03-01

The assessment of volcanic hazard is the first step in disaster mitigation. The distribution of repose periods between eruptions provides important information about the probability of new eruptions occurring within given time intervals. The quality of the probability estimate, i.e., of the hazard assessment, depends on the capacity of the chosen statistical model to describe the actual distribution of the repose times. In this work, we use a mixture of exponentials distribution, namely the sum of exponential distributions characterized by the different eruption occurrence rates that may be recognized by inspecting the cumulative number of eruptions with time in specific VEI (Volcanic Explosivity Index) categories. The most striking property of an exponential mixture density is that the shape of the density function is flexible in a way similar to the frequently used Weibull distribution, matching long-tailed distributions and allowing for clustering and time dependence of the eruption sequence, with distribution parameters that can be readily obtained from the observed occurrence rates. Thus, the mixture of exponentials turns out to be more precise and much easier to apply than the Weibull distribution. We recommend the use of a mixture of exponentials distribution when regimes with well-defined eruption rates can be identified in the cumulative series of events. As an example, we apply the mixture of exponential distributions to the repose-time sequences between explosive eruptions of the Colima and Popocatépetl volcanoes, México, and compare the results with those obtained with the Weibull and other distributions.
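A mixture-of-exponentials survival function is easy to evaluate once regime weights and rates are chosen. The parameters below are invented for illustration and are not fitted to Colima or Popocatépetl data:

```python
import math

def mixture_survival(t, weights, rates):
    """P(repose time > t) for a mixture of exponentials."""
    return sum(w * math.exp(-r * t) for w, r in zip(weights, rates))

weights = [0.7, 0.3]          # fraction of repose times in each regime (toy values)
rates = [1 / 2.0, 1 / 20.0]   # eruption rates per year: a frequent and a rare regime

t = 5.0
p_eruption = 1.0 - mixture_survival(t, weights, rates)
print(round(p_eruption, 3))  # probability that the repose ends within 5 years
```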

  18. Probability evolution method for exit location distribution

    Science.gov (United States)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

The exit problem in the framework of large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit is characterized by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise of finite strength may induce nontrivial phenomena such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes exponentially large time as the noise approaches zero, and the majority of that time is wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor, by introducing a series of interfaces and applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified on two classical examples and compared with theoretical predictions. The results show that the method performs well for weak noise but may induce certain deviations for large noise. Finally, some possible ways to improve the method are discussed.

  19. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1990-12-01

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)

  20. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, M S

    1990-12-15

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)

  1. On the probability distribution of daily streamflow in the United States

    Science.gov (United States)

    Blum, Annalise G.; Archfield, Stacey A.; Vogel, Richard M.

    2017-06-01

    Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for period-of-record FDCs, which represent the long-term hydrologic regime at a site, and for median annual FDCs, which represent the behavior of flows in a typical year.
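The empirical FDC against which candidate distributions are compared can be computed directly from the daily record; a minimal sketch using the Weibull plotting position (the flow values are invented):

```python
def flow_duration_curve(flows):
    """Empirical FDC: pair each observed flow with the fraction of days on
    which it is equaled or exceeded, using the Weibull plotting position
    p_i = i / (n + 1) with flows sorted in descending order."""
    ordered = sorted(flows, reverse=True)
    n = len(ordered)
    return [((i + 1) / (n + 1), q) for i, q in enumerate(ordered)]

daily = [12.0, 3.5, 7.2, 150.0, 0.9, 22.0, 5.1, 48.0]  # hypothetical daily flows
fdc = flow_duration_curve(daily)
```

A fitted distribution's quantile function evaluated at the same exceedance probabilities gives the modeled FDC for comparison.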

  2. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms has high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilage in the knee. This paper used kernel-based probability density estimation to model the distributions of VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed distinct distributions for the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. Signal classification was performed using Fisher's linear discriminant analysis, a support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion provided a total classification accuracy of 86.67% and an area (Az) of 0.9096 under the receiver operating characteristic curve, superior to the results obtained with either Fisher's linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). These results demonstrate the merits of bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for the analysis of knee joint VAG signals.

  3. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks

    International Nuclear Information System (INIS)

    Zhuang Jiancang; Ogata, Yosihiko

    2006-01-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases, when the process is subcritical, critical, and supercritical. One direct use of these probability distributions is to evaluate the probability that an earthquake is a foreshock, together with the magnitude distributions of foreshock and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background events and triggered events differ, are consistent with the results obtained in [Ogata et al., Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  4. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    Science.gov (United States)

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases, when the process is subcritical, critical, and supercritical. One direct use of these probability distributions is to evaluate the probability that an earthquake is a foreshock, together with the magnitude distributions of foreshock and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background events and triggered events differ, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  5. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which

  6. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    The Monte Carlo method is commonly used to observe the overall distribution and to determine lower or upper bound values in a statistical approach when direct analytical calculation is unavailable. However, this method is not efficient when the tail area of a distribution is of concern. A new method, 'Two-Step Tail Area Sampling', is developed; it assumes a discrete probability distribution and samples only the tail area without distorting the overall distribution. The method uses a two-step sampling procedure: first, sampling at points separated by large intervals, and second, sampling at points separated by small intervals around check points determined in the first step. Comparison with the Monte Carlo method shows that, for the same number of calculations, the results of the new method converge to the analytic value faster than those of the Monte Carlo method. The new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of pressurized light-water nuclear reactors.
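The coarse-then-fine idea can be sketched on a Poisson tail: a first pass probes the pmf at wide intervals to locate where the tail becomes negligible, and a second pass sums every point up to that cutoff. This is a loose illustration of the two-step principle, not the published sampling algorithm:

```python
import math

def poisson_pmf(k, lam=4.0):
    # log-space evaluation avoids overflow for large k
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

def two_step_tail(pmf, threshold, coarse=8, tol=1e-12, kmax=10000):
    """P(X > threshold) for a discrete distribution on the nonnegative
    integers. Step 1: probe the pmf at coarse intervals beyond the
    threshold to find where it drops below tol. Step 2: sum the pmf at
    every point from threshold + 1 up to that cutoff."""
    cut = threshold
    while cut < kmax and pmf(cut) > tol:
        cut += coarse                                   # step 1: coarse probing
    return sum(pmf(k) for k in range(threshold + 1, cut + 1))  # step 2: fine sum

p_tail = two_step_tail(poisson_pmf, threshold=10)
```

Only the region that actually matters is evaluated at full resolution, which is the efficiency gain the abstract describes.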

  7. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2), and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma, and Weibull, in the search for the best-fit distribution to the Malaysian air pollutant data. To determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This helps minimize the uncertainty in pollution resource estimates and improve the assessment phase of planning. The conflict among criterion results in selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of air pollutant data in Kuala Lumpur.
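A typical goodness-of-fit criterion in such comparisons is a Kolmogorov-Smirnov-type distance between the empirical and fitted CDFs. A simplified sketch ranking just two candidate models by KS distance (the data values are invented, and the paper's actual five criteria and weighting scheme differ):

```python
import math
import statistics

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF of `data`
    and a candidate CDF."""
    xs = sorted(data)
    n = len(xs)
    return max(max((i + 1) / n - cdf(x), cdf(x) - i / n)
               for i, x in enumerate(xs))

def fit_and_rank(data):
    """Fit exponential and log-normal models by moments/log-moments and
    rank them by KS distance (smallest = best)."""
    mean = statistics.fmean(data)
    logs = [math.log(x) for x in data]
    mu, sd = statistics.fmean(logs), statistics.pstdev(logs)
    models = {
        "exponential": lambda x: 1.0 - math.exp(-x / mean),
        "log-normal": lambda x: 0.5 * (1.0 + math.erf((math.log(x) - mu)
                                                      / (sd * math.sqrt(2.0)))),
    }
    return sorted((ks_statistic(data, cdf), name) for name, cdf in models.items())

api = [55, 62, 48, 80, 71, 90, 58, 65, 77, 102]  # hypothetical API readings
ranking = fit_and_rank(api)
```

When several criteria disagree, combining their per-criterion ranks (the weight-of-ranks idea) resolves the conflict.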

  8. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  9. Optimization of radiation therapy, III: a method of assessing complication probabilities from dose-volume histograms

    International Nuclear Information System (INIS)

    Lyman, J.T.; Wolbarst, A.B.

    1987-01-01

    To predict the likelihood of success of a therapeutic strategy, one must be able to assess the effects of the treatment upon both diseased and healthy tissues. This paper proposes a method for determining the probability that a healthy organ receiving a non-uniform distribution of X-irradiation, heat, chemotherapy, or other agent will escape complications. Starting with any given dose distribution, a dose-cumulative-volume histogram for the organ is generated. This is then reduced by an interpolation scheme (involving the volume-weighting of complication probabilities) to a slightly different histogram that corresponds to the same overall likelihood of complications but contains one less step. The procedure is repeated, one step at a time, until there remains a final, single-step histogram, for which the complication probability can be determined. The formalism makes use of a complication response function C(D, V) which, for the given treatment schedule, represents the probability of complications arising when the fraction V of the organ receives dose D and the rest of the organ gets none. Although the data required to generate this function are sparse at present, it should be possible to obtain the necessary information from in vivo and clinical studies. Volume effects are taken explicitly into account in two ways: the precise shape of the patient's histogram is employed in the calculation, and the complication response function is a function of the volume.
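Histogram reduction of this kind can be illustrated with the closely related power-law (effective-volume) reduction later formalized by Kutcher and Burman, which collapses a dose-volume histogram to a single step at the maximum dose. This is shown only as an illustration of the reduction idea, not the paper's step-by-step interpolation scheme; the histogram bins and volume-effect parameter below are invented:

```python
def effective_volume(dvh, n):
    """Reduce a differential DVH, given as [(dose_i, fractional_volume_i)],
    to a single effective fractional volume irradiated at the maximum dose:
    v_eff = sum_i v_i * (D_i / D_max)**(1/n), where n parameterizes the
    organ's volume effect."""
    d_max = max(d for d, _ in dvh)
    return sum(v * (d / d_max) ** (1.0 / n) for d, v in dvh)

dvh = [(20.0, 0.4), (40.0, 0.3), (60.0, 0.3)]  # hypothetical histogram bins
v_eff = effective_volume(dvh, n=0.5)
```

A uniform histogram reduces to its own total volume, and the single-step pair (D_max, v_eff) can then be fed to a complication response function such as C(D, V).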

  10. Research on Energy-Saving Design of Overhead Travelling Crane Camber Based on Probability Load Distribution

    Directory of Open Access Journals (Sweden)

    Tong Yifei

    2014-01-01

    A crane is a mechanical device used widely to move materials in modern production. It is reported that China's energy consumption is at least 5–8 times that of other developing countries; thus, energy consumption has become an unavoidable topic. Several factors influence the energy loss, and the camber of the girder is one that should not be neglected. In this paper, the problem of the deflections induced by a moving payload in the girder of an overhead travelling crane is examined. The evaluation of a camber giving a counter-deflection of the girder is proposed in order to minimize the energy consumption of the trolley moving along a non-straight support. To this aim, probabilistic payload distributions are considered instead of the fixed or rated loads used in other research. Taking a 50/10 t bridge crane as the research object, the probability loads are determined by analysis of load distribution density functions. According to the load distribution, camber design under different probability loads is discussed in detail, as well as the energy consumption distribution. The research results provide a design reference for a reasonable camber that obtains the least energy consumption for climbing corresponding to different P0; thus an energy-saving design can be achieved.

  11. On Farmer's line, probability density functions, and overall risk

    International Nuclear Information System (INIS)

    Munera, H.A.; Yadigaroglu, G.

    1986-01-01

    Limit lines used to define quantitative probabilistic safety goals can be categorized according to whether they are based on discrete pairs of event sequences and associated probabilities, on probability density functions (pdfs), or on complementary cumulative density functions (CCDFs). In particular, the concept of the well-known Farmer's line and its subsequent reinterpretations is clarified. It is shown that Farmer's lines are pdfs and, therefore, the overall risk (defined as the expected value of the pdf) that they represent can be easily calculated. It is also shown that the area under Farmer's line is proportional to probability, while the areas under CCDFs are generally proportional to expected value.

  12. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.
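The probability-weighting component of cumulative prospect theory that such parameter fits estimate can be written compactly. A sketch of the standard one-parameter Tversky-Kahneman form; the gamma and alpha values shown are the often-cited median estimates, used here only as an illustration:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function: gamma < 1
    overweights small probabilities and underweights large ones."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def cpt_value_simple(outcome, p, alpha=0.88, gamma=0.61):
    """Subjective value of a single-outcome gain prospect: v(x) * w(p),
    with a power value function v(x) = x**alpha."""
    return (outcome ** alpha) * tk_weight(p, gamma)

# a 1% chance of winning 100 feels worth more than its expected value
subjective = cpt_value_simple(100.0, 0.01)
```

Changing the probability format, as the study manipulates, effectively shifts the fitted gamma, which is why format and risk attitude can be confounded.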

  13. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  14. Various models for pion probability distributions from heavy-ion collisions

    International Nuclear Information System (INIS)

    Mekjian, A.Z.; Mekjian, A.Z.; Schlei, B.R.; Strottman, D.; Schlei, B.R.

    1998-01-01

    Various models for pion multiplicity distributions produced in relativistic heavy-ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting-source pion laser model, and a description that generates a negative binomial distribution. The approach developed can be used to discuss other cases, which are mentioned. The pion probability distributions for these various cases are compared. Comparisons are made between the pion laser model, Bose-Einstein condensation in a laser trap, and the thermal model. The thermal and hydrodynamic models are also used to illustrate why the number of pions never diverges and why the Bose-Einstein correction effects are relatively small. The pion emission strength η of a Poisson emitter and a critical density η_c are connected in a thermal model by η/η_c = e^(−m/T) < 1, and this fact reduces any Bose-Einstein correction effects in the number and number fluctuation of pions. Fluctuations can be much larger than Poisson in the pion laser model and for a negative binomial description. The clan representation of the negative binomial distribution due to Van Hove and Giovannini is discussed using the present description. Applications to CERN/NA44 and CERN/NA49 data are discussed in terms of the relativistic hydrodynamic model.

  15. Analytical models of probability distribution and excess noise factor of solid state photomultiplier signals with crosstalk

    International Nuclear Information System (INIS)

    Vinogradov, S.

    2012-01-01

    Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown limited by a strong negative feedback. An SSPM can detect and resolve single photons due to the high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon-number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their significance for SSPM design, characterization, optimization, and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals based on the Borel distribution, as an advance on the geometric-distribution models, are presented and discussed. The models are found to be in good agreement with the experimental probability distributions for dark counts and few-photon spectra over a wide range of fired-pixel numbers, as well as with the observed super-linear behavior of the crosstalk ENF.
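The Borel distribution at the core of such a branching-Poisson crosstalk model is easy to evaluate numerically. A sketch checking its mean, 1/(1 − λ), and the second-moment-to-squared-mean ratio (one common definition of an excess noise factor); the branching parameter value is arbitrary:

```python
import math

def borel_pmf(n, lam):
    """Borel distribution P(N = n) = exp(-lam*n) * (lam*n)**(n-1) / n!,
    n >= 1: the total progeny of a branching Poisson process with mean
    offspring number lam < 1. Evaluated in log space to avoid overflow."""
    return math.exp(-lam * n + (n - 1) * math.log(lam * n) - math.lgamma(n + 1))

def borel_mean_enf(lam, nmax=400):
    """Mean and ratio E[N^2] / E[N]^2 of the crosstalk-multiplied
    single-pixel signal, by truncated summation of the pmf."""
    m1 = sum(n * borel_pmf(n, lam) for n in range(1, nmax))
    m2 = sum(n * n * borel_pmf(n, lam) for n in range(1, nmax))
    return m1, m2 / m1 ** 2

mean, enf = borel_mean_enf(0.2)  # mean should equal 1 / (1 - lam) = 1.25
```

The super-linear growth of the ENF with the branching parameter λ falls out of this model directly.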

  16. Prediction of the cumulated dose for external beam irradiation of prostate cancer patients with 3D-CRT technique

    Directory of Open Access Journals (Sweden)

    Giżyńska Marta

    2016-03-01

    Nowadays in radiotherapy, much effort is devoted to minimizing the irradiated volume and consequently minimizing doses to healthy tissues. In this work, we tested the hypothesis that the mean dose distribution calculated from the first few fractions can serve as a prediction of the cumulated dose distribution representing the whole treatment. We performed our tests for 25 prostate cancer patients treated with a three-orthogonal-fields technique, comparing the dose distribution calculated as a sum of the dose distributions from each fraction with the dose distribution calculated with the isocenter shifted by the mean setup error of the first few fractions. The cumulative and predicted dose distributions are similar in terms of gamma (3 mm, 3%) analysis, provided that the setup error is known from the first seven fractions. We showed that the dose distribution calculated for the original plan, with the isocenter shifted to the point defined as the original isocenter corrected by the mean setup error estimated from the first seven fractions, supports our hypothesis, i.e., it can serve as a prediction of the cumulative dose distribution.

  17. The probability distribution of maintenance cost of a system affected by the gamma process of degradation: Finite time solution

    International Nuclear Information System (INIS)

    Cheng, Tianjin; Pandey, Mahesh D.; Weide, J.A.M. van der

    2012-01-01

    The stochastic gamma process has been widely used to model uncertain degradation in engineering systems and structures. The optimization of the condition-based maintenance (CBM) policy is typically based on the minimization of the asymptotic cost rate. In the financial planning of a maintenance program, however, a more accurate prediction interval for the cost is needed for prudent decision making. The prediction interval cannot be estimated unless the probability distribution of cost is known. In this context, the asymptotic cost rate has a limited utility. This paper presents the derivation of the probability distribution of maintenance cost, when the system degradation is modelled as a stochastic gamma process. A renewal equation is formulated to derive the characteristic function, then the discrete Fourier transform of the characteristic function leads to the complete probability distribution of cost in a finite time setting. The proposed approach is useful for a precise estimation of prediction limits and optimization of the maintenance cost.
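The finite-time cost distribution that the paper derives analytically via characteristic functions can be cross-checked by direct simulation. A sketch with an invented parameter set, approximating the stationary gamma degradation process by small periodic gamma increments and renewing the unit at each threshold crossing:

```python
import random
import statistics

def cost_distribution(horizon=50.0, n_runs=1000, rate=0.5, scale=1.0,
                      threshold=10.0, c_replace=1.0, dt=0.1, seed=5):
    """Monte Carlo sample of total maintenance cost over a finite horizon:
    degradation accumulates as a stationary gamma process (the increment
    over dt is Gamma(rate*dt, scale)); each threshold crossing triggers a
    renewal costing c_replace and resets degradation to zero."""
    rng = random.Random(seed)
    costs = []
    for _ in range(n_runs):
        t, cost, x = 0.0, 0.0, 0.0
        while t < horizon:
            x += rng.gammavariate(rate * dt, scale)
            t += dt
            if x >= threshold:
                cost += c_replace
                x = 0.0
        costs.append(cost)
    return costs

costs = cost_distribution()
mean_cost = statistics.fmean(costs)
```

The empirical quantiles of `costs` give exactly the kind of prediction interval the abstract argues the asymptotic cost rate cannot provide.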

  18. Measuring sensitivity in pharmacoeconomic studies. Refining point sensitivity and range sensitivity by incorporating probability distributions.

    Science.gov (United States)

    Nuijten, M J

    1999-07-01

    The aim of the present study is to describe a refinement of a previously presented method, based on the concept of point sensitivity, for dealing with uncertainty in economic studies. The original method was refined by the incorporation of probability distributions, which allow a more accurate assessment of the level of uncertainty in the model. In addition, a bootstrap method was used to create a probability distribution for a fixed input variable based on a limited number of data points. The original method was limited in that the sensitivity measurement was based on a uniform distribution of the variables and the overall sensitivity measure was based on a subjectively chosen range, which excludes the impact of values outside the range on the overall sensitivity. The concepts of the refined method were illustrated using a Markov model of depression. The application of the refined method substantially changed the ranking of the most sensitive variables compared with the original method: the response rate became the most sensitive variable instead of the 'per diem' for hospitalisation. The refinement of the original method yields sensitivity outcomes which better reflect the real uncertainty in economic studies.
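The bootstrap step, creating a probability distribution for a fixed input variable from a limited number of data points, can be sketched as follows (the cost observations are invented):

```python
import random
import statistics

def bootstrap_distribution(data, n_boot=5000, stat=statistics.fmean, seed=1):
    """Approximate the sampling distribution of `stat` by repeatedly
    resampling the observed data points with replacement."""
    rng = random.Random(seed)
    return sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_boot))

# hypothetical per-diem cost observations from a small sample
observed = [120.0, 135.0, 118.0, 160.0, 142.0, 128.0]
boot = bootstrap_distribution(observed)
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]  # 95% interval
```

The resulting empirical distribution can then be propagated through the economic model instead of a subjectively chosen fixed range.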

  19. Morphology of the cumulative logistic distribution when used as a model of radiologic film characteristic curves

    International Nuclear Information System (INIS)

    Prince, J.R.

    1988-01-01

    The cumulative logistic distribution (CLD) is an empirical model for film characteristic curves. Characterizing the shape parameters of the CLD in terms of contrast, latitude, and speed is required. The CLD is written as Y − F = D/[1 + exp(−(K + k1·X))], where Y is the optical density (OD) at log exposure X, F is the fog level, D is a constant equal to Dm − F, K and k1 are shape parameters, and Dm is the maximum attainable OD. Further analysis demonstrates that when K is held constant, k1 characterizes contrast (the larger k1, the greater the contrast) and hence latitude; when k1 is held constant, K characterizes film speed (the larger K, the faster the film). These equations and concepts are further illustrated with examples from radioscintigraphy, diagnostic radiology, and light sensitometry.
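The model and the roles of its parameters can be sketched directly (the parameter values are invented):

```python
import math

def cld(x, fog, d_max, K, k1):
    """Cumulative logistic model of a film characteristic curve: optical
    density at log exposure x, with D = d_max - fog."""
    return fog + (d_max - fog) / (1.0 + math.exp(-(K + k1 * x)))

fog, d_max, K, k1 = 0.2, 3.2, 2.0, 1.5
x_mid = -K / k1   # log exposure at the curve's midpoint, Y = fog + D/2
# the slope at the midpoint is (d_max - fog) * k1 / 4, so k1 sets contrast
# (and hence latitude); increasing K shifts the curve toward lower
# exposures, i.e. a faster film
```

The density saturates at d_max for large exposures and falls to the fog level for small ones, reproducing the familiar S-shaped H&D curve.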

  20. Dynamic prediction of cumulative incidence functions by direct binomial regression.

    Science.gov (United States)

    Grand, Mia K; de Witte, Theo J M; Putter, Hein

    2018-03-25

    In recent years there has been a series of advances in the field of dynamic prediction. Among these is the development of methods for dynamic prediction of the cumulative incidence function in a competing risk setting. These models enable the predictions to be updated as time progresses and more information becomes available; for example, when a patient comes back for a follow-up visit after completing a year of treatment, the risks of death and adverse events may have changed since treatment initiation. One approach to modeling the cumulative incidence function in competing risks is direct binomial regression, where right censoring of the event times is handled by inverse probability of censoring weights. We extend the approach by combining it with landmarking to enable dynamic prediction of the cumulative incidence function. The proposed models are very flexible, as they allow the covariates to have complex time-varying effects, and we illustrate how to investigate possible time-varying structures using Wald tests. The models are fitted using generalized estimating equations. The method is applied to bone marrow transplant data and its performance is investigated in a simulation study.

  1. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  2. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
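For reference, the crude MC baseline that the IS estimator improves upon can be sketched as follows (the parameter values are invented, and the hazard-rate-twisted sampler itself is beyond a short sketch):

```python
import math
import random

def mc_ccdf_lognormal_sum(threshold, mus, sigmas, n_samples=50000, seed=7):
    """Crude Monte Carlo estimate of the CCDF value
    P(sum of independent Log-normal RVs > threshold).
    The paper's IS estimator instead draws from a hazard-rate-twisted
    distribution and reweights, which is far more efficient when this
    probability is very small."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        s = sum(math.exp(rng.gauss(mu, sd)) for mu, sd in zip(mus, sigmas))
        if s > threshold:
            hits += 1
    return hits / n_samples

p = mc_ccdf_lognormal_sum(10.0, [0.0, 0.0], [1.0, 1.0])
```

For a genuinely rare threshold the hit count here collapses to zero, which is precisely the regime where importance sampling is needed.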

  3. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.

  4. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    Science.gov (United States)

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze simultaneous spike trains for precise spike correlations, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that, compared to distributions based on homogeneous Poisson processes and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for estimating the significance of joint spike events appear inadequate.
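
    The Monte Carlo approach described, estimating the coincidence count distribution under different autostructures, can be sketched with renewal (gamma-ISI) spike trains. This is a simplified stand-in for the authors' non-renewal model; the rates, coincidence window, and trial counts below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def spike_train(rate, shape, t_max):
    """Renewal spike train with gamma-distributed ISIs.
    shape=1 gives a Poisson process; shape>1 gives more regular firing."""
    scale = 1.0 / (rate * shape)          # keeps the mean rate fixed
    isis = rng.gamma(shape, scale, size=int(3 * rate * t_max) + 10)
    t = np.cumsum(isis)
    return t[t < t_max]

def coincidences(a, b, window):
    """Number of spikes in `a` with a spike in `b` within +/- window."""
    idx = np.searchsorted(b, a)
    near = np.zeros(len(a), dtype=bool)
    for shift in (-1, 0):                 # check both nearest neighbours
        j = np.clip(idx + shift, 0, len(b) - 1)
        near |= np.abs(a - b[j]) <= window
    return int(near.sum())

def count_dist(shape, n_trials=500):
    return np.array([coincidences(spike_train(20, shape, 10.0),
                                  spike_train(20, shape, 10.0), 0.005)
                     for _ in range(n_trials)])

poisson_counts = count_dist(shape=1.0)    # Poisson autostructure
regular_counts = count_dist(shape=4.0)    # more regular autostructure
```

    Comparing np.std(poisson_counts) with np.std(regular_counts) illustrates how the ISI autostructure alone changes the width of the coincidence distribution even when the mean rate is held fixed.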

  5. Evaluating the suitability of wind speed probability distribution models: A case of study of east and southeast parts of Iran

    International Nuclear Information System (INIS)

    Alavi, Omid; Mohammadi, Kasra; Mostafaeipour, Ali

    2016-01-01

    Highlights: • Suitability of different wind speed probability functions is assessed. • Five stations distributed in the east and southeast of Iran are considered as case studies. • The Nakagami distribution is tested for the first time and compared with 7 other functions. • Owing to differences in wind features, the best function is not the same for all stations. - Abstract: Precise information on the wind speed probability distribution is truly significant for many wind energy applications. The objective of this study is to evaluate the suitability of different probability functions for estimating the wind speed distribution at five stations distributed in the east and southeast of Iran. The Nakagami distribution function is utilized for the first time to estimate the distribution of wind speed. The performance of the Nakagami function is compared with seven typically used distribution functions. The results reveal that the most effective function is not the same for all stations. Wind speed characteristics and the quantity and quality of the recorded wind speed data can be considered influential parameters for the performance of the distribution functions. Also, the skewness of the recorded wind speed data may influence the accuracy of the Nakagami distribution. For the Chabahar and Khaf stations the Nakagami distribution shows the highest performance, while for the Lutak, Rafsanjan and Zabol stations the Gamma, Generalized Extreme Value and Inverse-Gaussian distributions offer the best fits, respectively. Based on the analysis, the Nakagami distribution can generally be considered an effective distribution since it provides the best fits at 2 stations and ranks 3rd to 5th at the remaining stations; however, owing to the close performance of the Nakagami and Weibull distributions and the widely proven flexibility of the Weibull function, more assessments of the performance of the Nakagami distribution are required.
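
    A goodness-of-fit comparison of the kind described can be sketched with scipy, which ships the Nakagami distribution alongside the usual candidates. The data below are synthetic (drawn from a Weibull) because the Iranian station records are not available, and the candidate set is abbreviated to three functions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic "wind speed" sample (m/s), standing in for a station record.
speeds = stats.weibull_min.rvs(2.0, scale=6.0, size=2000, random_state=rng)

candidates = {
    "weibull":  stats.weibull_min,
    "gamma":    stats.gamma,
    "nakagami": stats.nakagami,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(speeds, floc=0)          # fix location at zero
    ll = np.sum(dist.logpdf(speeds, *params))  # maximized log-likelihood
    k = len(params) - 1                        # loc was fixed, not fitted
    results[name] = 2 * k - 2 * ll             # AIC: lower is better

best = min(results, key=results.get)
```

    Since a Weibull with shape 2 is also a Nakagami with nu = 1, the Weibull and Nakagami fits are expected to be nearly indistinguishable here, echoing the close performance the study reports.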

  6. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
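
    The core effect can be checked numerically for a log-normal risk factor: set the threshold at the plug-in (1 - eps) quantile estimated from n observations, and average the true exceedance probability over repeated samples. The sample size, eps, and repetition count below are invented for this sketch.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

def realized_failure_prob(n, eps=0.01, n_rep=20_000):
    """Average true exceedance probability when the threshold is the
    plug-in (1 - eps) quantile estimated from n log-observations."""
    z = norm.ppf(1 - eps)
    x = rng.normal(size=(n_rep, n))          # log of the risk factor
    mu_hat = x.mean(axis=1)
    s_hat = x.std(axis=1, ddof=1)
    threshold = mu_hat + z * s_hat           # estimated quantile (log scale)
    p_true = 1.0 - norm.cdf(threshold)       # exceedance under the true law
    return float(p_true.mean())

realized = realized_failure_prob(n=20)       # nominal eps = 0.01
```

    For a normal log-risk the realized frequency equals P(t_{n-1} > z / sqrt(1 + 1/n)), about 0.017 for n = 20 against a nominal 0.01, illustrating the paper's point that parameter uncertainty inflates failure frequency.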

  7. New method for extracting tumors in PET/CT images based on the probability distribution

    International Nuclear Information System (INIS)

    Nitta, Shuhei; Hontani, Hidekata; Hukami, Tadanori

    2006-01-01

    In this report, we propose a method for extracting tumors from PET/CT images by referring to the probability distribution of pixel values in the PET image. In the proposed method, first, the organs that normally take up fluorodeoxyglucose (FDG) (e.g., the liver, kidneys, and brain) are extracted. Then, the tumors are extracted from the images. The distribution of pixel values in PET images differs in each region of the body. Therefore, the threshold for detecting tumors is adaptively determined by referring to the distribution. We applied the proposed method to 37 cases and evaluated its performance. This report also presents the results of experiments comparing the proposed method and another method in which the pixel values are normalized for extracting tumors. (author)

  8. A comparison of the probability distribution of observed substorm magnitude with that predicted by a minimal substorm model

    Directory of Open Access Journals (Sweden)

    S. K. Morley

    2007-11-01

    We compare the probability distributions of substorm magnetic bay magnitudes from observations and a minimal substorm model. The observed distribution was derived previously and independently using the IL index from the IMAGE magnetometer network. The model distribution is derived from a synthetic AL index time series created using real solar wind data and a minimal substorm model, which was previously shown to reproduce observed substorm waiting times. There are two free parameters in the model which scale the contributions to AL from the directly-driven DP2 electrojet and the loading-unloading DP1 electrojet, respectively. In a limited region of the 2-D parameter space of the model, the probability distribution of modelled substorm bay magnitudes is not significantly different from the observed distribution. The ranges of the two parameters giving acceptable (95% confidence level) agreement are consistent with expectations using results from other studies. The approximately linear relationship between the two free parameters over these ranges implies that the substorm magnitude simply scales linearly with the solar wind power input at the time of substorm onset.

  9. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population

    OpenAIRE

    Tomitaka, Shinichiro; Kawasaki, Yohei; Ide, Kazuki; Akutagawa, Maiko; Yamada, Hiroshi; Furukawa, Toshiaki A.; Ono, Yutaka

    2016-01-01

    [Background] Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution for each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item th...

  10. On the Meta Distribution of Coverage Probability in Uplink Cellular Networks

    KAUST Repository

    Elsawy, Hesham

    2017-04-07

    This letter studies the meta distribution of coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP) for modeling the spatial locations of base stations (BSs), we obtain the percentiles of users that achieve a target uplink CP over an arbitrary, but fixed, realization of the PPP. To this end, the effects of the user activity factor (p) and the path-loss compensation factor on the uplink performance are analyzed. The results show that decreasing p and/or increasing the path-loss compensation factor reduces the CP variation around the spatially averaged value.

  11. Conditional probability distribution associated to the E-M image reconstruction algorithm for neutron stimulated emission tomography

    International Nuclear Information System (INIS)

    Viana, R.S.; Yoriyaz, H.; Santos, A.

    2011-01-01

    The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimates, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of the E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes, called Neutron Stimulated Emission Computed Tomography (NSECT). In the iteration of the E-M algorithm, the conditional probability distribution plays a very important role in achieving a high-quality image. The present work proposes an alternative methodology for generating the conditional probability distribution associated with the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and applying the reciprocity theorem. (author)

  12. Conditional probability distribution associated to the E-M image reconstruction algorithm for neutron stimulated emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Viana, R.S.; Yoriyaz, H.; Santos, A., E-mail: rodrigossviana@gmail.com, E-mail: hyoriyaz@ipen.br, E-mail: asantos@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimates, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of the E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes, called Neutron Stimulated Emission Computed Tomography (NSECT). In the iteration of the E-M algorithm, the conditional probability distribution plays a very important role in achieving a high-quality image. The present work proposes an alternative methodology for generating the conditional probability distribution associated with the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and applying the reciprocity theorem. (author)

  13. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    Science.gov (United States)

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
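
    For the special case of an m-alternative identification task with equal priors, the ideal-observer Pc reduces to a simple sum over discretized pmfs. The sketch below is a textbook special case for illustration, not the article's general formula or its MATLAB code; the grid and stimulus parameters are invented.

```python
import numpy as np
from scipy import stats

def max_pc_identification(pmfs):
    """Ideal-observer proportion correct for m equally likely
    alternatives with pmfs p_i: Pc = (1/m) * sum_x max_i p_i(x)."""
    p = np.asarray(pmfs)                    # shape (m, n_bins)
    return p.max(axis=0).sum() / p.shape[0]

# Two Gaussian alternatives separated by d' = 1, discretized on a grid.
x = np.linspace(-6.0, 7.0, 2001)
dx = x[1] - x[0]
p0 = stats.norm.pdf(x, 0.0, 1.0) * dx
p1 = stats.norm.pdf(x, 1.0, 1.0) * dx
pc = max_pc_identification([p0 / p0.sum(), p1 / p1.sum()])
```

    Here the discrete formula reproduces the classical yes/no result Pc = Phi(d'/2), about 0.691, confirming the article's observation that a sufficiently fine sampling of a continuous density keeps the computation accurate.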

  14. The probability distribution of extreme precipitation

    Science.gov (United States)

    Korolev, V. Yu.; Gorshenin, A. K.

    2017-12-01

    On the basis of the negative binomial distribution of the duration (in days) of wet periods, an asymptotic model is proposed for the distribution of the maximum daily rainfall volume during a wet period; the model has the form of a mixture of Frechet distributions and coincides with the distribution of a positive power of a random variable having the Fisher-Snedecor distribution. The method of proving the corresponding result is based on limit theorems for extreme order statistics in samples of random size having a mixed Poisson distribution. The adequacy of the proposed models and methods of their statistical analysis is demonstrated by the example of estimating the extreme distribution parameters based on real data.

  15. Snow-melt flood frequency analysis by means of copula based 2D probability distributions for the Narew River in Poland

    Directory of Open Access Journals (Sweden)

    Bogdan Ozga-Zielinski

    2016-06-01

    New hydrological insights for the region: The results indicated that the 2D normal probability distribution model gives a better probabilistic description of snowmelt floods characterized by the 2-dimensional random variable (Qmax,f, Vf) compared to the elliptical Gaussian copula and the Archimedean 1-parameter Gumbel–Hougaard copula models, in particular from the viewpoint of probability of exceedance as well as complexity and time of computation. Nevertheless, the copula approach offers a new perspective for estimating the 2D probability distribution of multidimensional random variables. Results showed that the 2D model for snowmelt floods built using the Gumbel–Hougaard copula is much better than the model built using the Gaussian copula.
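
    The Gumbel–Hougaard copula used in the comparison is simple enough to sketch directly; from it, the "OR" and "AND" joint return periods of a flood event follow. The dependence parameter theta, the marginal probabilities, and the one-year mean inter-arrival time below are illustrative values, not results from the Narew study.

```python
import numpy as np

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta)
                    ** (1.0 / theta)))

def joint_return_periods(u, v, theta, mu=1.0):
    """'OR' and 'AND' return periods for the event (Qmax > q, V > s)
    whose marginal non-exceedance probabilities are u and v.
    mu is the mean inter-arrival time of floods (years)."""
    c = gumbel_hougaard(u, v, theta)
    t_or = mu / (1.0 - c)                  # either variable exceeds
    t_and = mu / (1.0 - u - v + c)         # both variables exceed
    return t_or, t_and

# Both marginals at their 100-year level, with moderate dependence.
t_or, t_and = joint_return_periods(0.99, 0.99, theta=2.0)
```

    As expected, the "OR" return period is shorter than the marginal 100 years and the "AND" return period is longer, which is why the choice of copula matters for exceedance-based design values.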

  16. Exact Outage Probability of Dual-Hop CSI-Assisted AF Relaying Over Nakagami-m Fading Channels

    KAUST Repository

    Xia, Minghua

    2012-10-01

    In this correspondence, considering dual-hop channel state information (CSI)-assisted amplify-and-forward (AF) relaying over Nakagami-m fading channels, the cumulative distribution function (CDF) of the end-to-end signal-to-noise ratio (SNR) is derived. In particular, when the fading shape factors m1 and m2 at consecutive hops take non-integer values, the bivariate H-function and G-function are exploited to obtain an exact analytical expression for the CDF. The obtained CDF is then applied to evaluate the outage performance of the system under study. The analytical results of outage probability coincide exactly with Monte-Carlo simulation results and outperform the previously reported upper bounds in the low and medium SNR regions.
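
    The Monte-Carlo side of such a validation can be sketched as follows: under Nakagami-m fading the per-hop SNR is gamma distributed, and the end-to-end SNR of CSI-assisted AF relaying is g1*g2/(g1 + g2 + 1). The shape factors, mean SNRs, and threshold below are invented example values.

```python
import numpy as np

rng = np.random.default_rng(3)

def outage_prob_mc(m1, m2, omega1, omega2, gamma_th, n=200_000):
    """Monte Carlo outage probability of dual-hop CSI-assisted AF
    relaying.  Per-hop SNRs under Nakagami-m fading are gamma
    distributed with shape m_i and mean omega_i."""
    g1 = rng.gamma(m1, omega1 / m1, size=n)
    g2 = rng.gamma(m2, omega2 / m2, size=n)
    g_end = g1 * g2 / (g1 + g2 + 1.0)      # end-to-end SNR
    return float(np.mean(g_end < gamma_th))

# Non-integer shape factors, as in the case the correspondence targets.
p_out = outage_prob_mc(m1=1.5, m2=2.5, omega1=10.0, omega2=10.0,
                       gamma_th=1.0)
```

    An analytical CDF evaluated at gamma_th would be checked against p_out in exactly this way.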

  17. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    Science.gov (United States)

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  18. Adaptive strategies for cumulative cultural learning.

    Science.gov (United States)

    Ehn, Micael; Laland, Kevin

    2012-05-21

    The demographic and ecological success of our species is frequently attributed to our capacity for cumulative culture. However, it is not yet known how humans combine social and asocial learning to generate effective strategies for learning in a cumulative cultural context. Here we explore how cumulative culture influences the relative merits of various pure and conditional learning strategies, including pure asocial and social learning, critical social learning, conditional social learning and individual refiner strategies. We replicate Rogers' paradox in the cumulative setting. However, our analysis suggests that strategies that resolved Rogers' paradox in a non-cumulative setting may not necessarily evolve in a cumulative setting; thus, different strategies will optimize cumulative and non-cumulative cultural learning. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Uncertainty of Hydrological Drought Characteristics with Copula Functions and Probability Distributions: A Case Study of Weihe River, China

    Directory of Open Access Journals (Sweden)

    Panpan Zhao

    2017-05-01

    This study investigates the sensitivity and uncertainty of hydrological drought frequency and severity in the Weihe Basin, China during 1960–2012, using six commonly used univariate probability distributions and three Archimedean copulas to fit the marginal and joint distributions of drought characteristics. The Anderson-Darling method is used for testing the goodness-of-fit of the univariate models, and the Akaike information criterion (AIC) is applied to select the best distribution and copula functions. The results demonstrate that there is a very strong correlation between drought duration and drought severity at the three stations. The drought return period varies depending on the selected marginal distributions and copula functions and, with an increase of the return period, the differences become larger. In addition, the estimated return periods (both co-occurrence and joint) from the best-fitted copulas are the closest to those from the empirical distribution. Therefore, it is critical to select the appropriate marginal distribution and copula function to model hydrological drought frequency and severity. The results of this study can not only help drought investigations select a suitable probability distribution and copula function, but are also useful for regional water resource management. However, a few limitations remain in this study, such as the assumption of stationarity of the runoff series.

  20. Probability distribution of dose rates in body tissue as a function of the rhythm of Sr-90 administration and the age of animals

    International Nuclear Information System (INIS)

    Rasin, I.M.; Sarapul'tsev, I.A.

    1975-01-01

    The probability distribution of tissue radiation doses in the skeleton was studied in experiments on swine and dogs. When Sr-90 is introduced into the organism from the day of birth until 90 days of age, the dose-rate probability distribution is characterized by one aggregate, or, for adult animals, by two independent aggregates. Each of these aggregates corresponds to the normal distribution law.

  1. Assessing the Adequacy of Probability Distributions for Estimating the Extreme Events of Air Temperature in Dabaa Region

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2015-01-01

    Assessing the adequacy of probability distributions for estimating the extreme events of air temperature in the Dabaa region is one of the prerequisites for any design purpose at the Dabaa site, which can be achieved by a probability approach. In the present study, three extreme value distributions are considered and compared to estimate the extreme events of monthly and annual maximum and minimum temperature. These distributions include the Gumbel/Frechet distributions for estimating the extreme maximum values and the Gumbel/Weibull distributions for estimating the extreme minimum values. The Lieblein technique and the Method of Moments are applied for estimating the distribution parameters. Subsequently, the required design values with a given return period of exceedance are obtained. Goodness-of-fit tests involving Kolmogorov-Smirnov and Anderson-Darling are used for checking the adequacy of fitting the method/distribution for the estimation of maximum/minimum temperature. Mean Absolute Relative Deviation, Root Mean Square Error and Relative Mean Square Deviation are calculated, as performance indicators, to judge which distribution and method of parameter estimation are the most appropriate for estimating the extreme temperatures. The present study indicated that the Weibull distribution combined with Method of Moments estimators gives the best fit and the most reliable and accurate predictions for estimating the extreme monthly and annual minimum temperature. The Gumbel distribution combined with Method of Moments estimators showed the best fit and accurate predictions for the estimation of the extreme monthly and annual maximum temperature, except for July, August, October and November. The study shows that the combination of the Frechet distribution with the Method of Moments is the most accurate for estimating the extreme maximum temperature in July, August and November, while the Gumbel distribution with the Lieblein technique is the best for October
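
    The Gumbel/Method-of-Moments combination amounts to two closed-form estimators plus a return-level formula, sketched below on synthetic annual maxima since the Dabaa records are not available.

```python
import numpy as np

def gumbel_mom_fit(annual_maxima):
    """Method-of-moments estimates for the Gumbel (EV-I) distribution:
    scale beta = s * sqrt(6) / pi, location mu = mean - 0.5772 * beta."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    mu = x.mean() - 0.5772 * beta
    return mu, beta

def gumbel_return_level(mu, beta, T):
    """Design value exceeded on average once every T years."""
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

rng = np.random.default_rng(4)
# Synthetic annual maximum temperatures (deg C), standing in for the data.
sample = rng.gumbel(loc=42.0, scale=1.5, size=60)
mu, beta = gumbel_mom_fit(sample)
x100 = gumbel_return_level(mu, beta, T=100)
```

    With the true parameters (42.0, 1.5) the 100-year level is about 48.9 deg C; the estimate from 60 synthetic years should land near it, and the same two-step recipe applies to minima after negating the data.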

  2. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    International Nuclear Information System (INIS)

    Caleyo, F.; Velazquez, J.C.; Valor, A.; Hallen, J.M.

    2009-01-01

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.

  3. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    Energy Technology Data Exchange (ETDEWEB)

    Caleyo, F. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)], E-mail: fcaleyo@gmail.com; Velazquez, J.C. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico); Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400, La Habana (Cuba); Hallen, J.M. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)

    2009-09-15

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.

  4. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
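
    The maximum-entropy assignment described can be sketched for the simplest case, a mean constraint over a finite outcome set; the solution has the Gibbs/Boltzmann form, with the Lagrange multiplier found by root-finding. The outcome grid and target mean below are invented.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_pmf(values, target_mean):
    """Maximum-entropy pmf over `values` subject to a fixed mean.
    The solution is the Gibbs/Boltzmann form p_i ~ exp(-lam * x_i),
    with lam chosen so the mean constraint holds."""
    x = np.asarray(values, dtype=float)

    def mean_gap(lam):
        w = np.exp(-lam * (x - x.min()))   # shift for numerical stability
        return (w * x).sum() / w.sum() - target_mean

    lam = brentq(mean_gap, -50.0, 50.0)    # root of the constraint equation
    w = np.exp(-lam * (x - x.min()))
    return w / w.sum()

p = maxent_pmf(np.arange(6), target_mean=1.0)
```

    With a target mean below the midpoint of the grid, the multiplier is positive and the pmf decays geometrically, a discrete analogue of the Maxwell-Boltzmann form mentioned in the abstract.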

  5. Matrix-exponential distributions in applied probability

    CERN Document Server

    Bladt, Mogens

    2017-01-01

    This book contains an in-depth treatment of matrix-exponential (ME) distributions and their sub-class of phase-type (PH) distributions. Loosely speaking, an ME distribution is obtained through replacing the intensity parameter in an exponential distribution by a matrix. The ME distributions can also be identified as the class of non-negative distributions with rational Laplace transforms. If the matrix has the structure of a sub-intensity matrix for a Markov jump process we obtain a PH distribution which allows for nice probabilistic interpretations facilitating the derivation of exact solutions and closed form formulas. The full potential of ME and PH unfolds in their use in stochastic modelling. Several chapters on generic applications, like renewal theory, random walks and regenerative processes, are included together with some specific examples from queueing theory and insurance risk. We emphasize our intention towards applications by including an extensive treatment on statistical methods for PH distribu...
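
    The probabilistic interpretation described, a PH distribution as the absorption time of a Markov jump process, yields a one-line CDF. The Erlang-2 example below is a standard sanity check chosen for this sketch, not taken from the book.

```python
import numpy as np
from scipy.linalg import expm

def phase_type_cdf(alpha, T, t):
    """CDF of a phase-type distribution: F(t) = 1 - alpha @ expm(T*t) @ 1,
    where alpha is the initial distribution over the transient states and
    T is the sub-intensity matrix of the underlying Markov jump process."""
    ones = np.ones(len(alpha))
    return 1.0 - alpha @ expm(T * t) @ ones

# Erlang-2 with rate 3 as a PH distribution: start in state 1,
# pass through state 2, then absorb.
alpha = np.array([1.0, 0.0])
T = np.array([[-3.0, 3.0],
              [0.0, -3.0]])
f_half = phase_type_cdf(alpha, T, 0.5)
```

    The result matches the closed form 1 - exp(-3t)(1 + 3t) at t = 0.5, illustrating how the matrix replacement of the exponential's intensity parameter plays out in practice.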

  6. Cumulative effective dose associated with radiography and CT of adolescents with spinal injuries.

    Science.gov (United States)

    Lemburg, Stefan P; Peters, Soeren A; Roggenland, Daniela; Nicolas, Volkmar; Heyer, Christoph M

    2010-12-01

    The purpose of this study was to analyze the quantity and distribution of cumulative effective doses in diagnostic imaging of adolescents with spinal injuries. At a level 1 trauma center, from July 2003 through June 2009, imaging procedures during initial evaluation and hospitalization and after discharge of all patients 10-20 years old with spinal fractures were retrospectively analyzed. The cumulative effective doses for all imaging studies were calculated, and the doses to patients with spinal injuries who had multiple traumatic injuries were compared with the doses to patients with spinal injuries but without multiple injuries. The significance level was set at 5%. Imaging studies of 72 patients (32 with multiple injuries; average age, 17.5 years) entailed a median cumulative effective dose of 18.89 mSv. Patients with multiple injuries had a significantly higher total cumulative effective dose (29.70 versus 10.86 mSv) and a significantly higher cumulative effective dose during the initial evaluation (18.39 versus 2.83 mSv). Adolescents with spinal injuries receive a cumulative effective dose equal to that of adult trauma patients and nearly three times that of pediatric trauma patients. Areas of focus in lowering cumulative effective dose should be appropriate initial estimation of trauma severity and careful selection of CT scan parameters.

  7. Cumulative effects of climate and landscape change drive spatial distribution of Rocky Mountain wolverine (Gulo gulo L.).

    Science.gov (United States)

    Heim, Nicole; Fisher, Jason T; Clevenger, Anthony; Paczkowski, John; Volpe, John

    2017-11-01

    Contemporary landscapes are subject to a multitude of human-derived stressors. Effects of such stressors are increasingly realized by population declines and large-scale extirpation of taxa worldwide. Most notably, cumulative effects of climate and landscape change can limit species' local adaptation and dispersal capabilities, thereby reducing realized niche space and range extent. Resolving the cumulative effects of multiple stressors on species persistence is a pressing challenge in ecology, especially for declining species. For example, wolverines (Gulo gulo L.) persist on only 40% of their historic North American range. While climate change has been shown to be a mechanism of range retractions, anthropogenic landscape disturbance has been recently implicated. We hypothesized that these two interact to effect declines. We surveyed wolverine occurrence using camera trapping and genetic tagging at 104 sites at the wolverine range edge, spanning a 15,000 km² gradient of climate, topographic, anthropogenic, and biotic variables. We used occupancy and generalized linear models to disentangle the factors explaining wolverine distribution. Persistent spring snowpack, which is expected to decrease with climate change, was a significant predictor, but so was anthropogenic landscape change. Canid mesocarnivores, which we hypothesize are competitors supported by anthropogenic landscape change, had a comparatively weaker effect. Wolverine population declines and range shifts likely result from climate change and landscape change operating in tandem. We contend that similar results are likely for many species and that research that simultaneously examines climate change, landscape change, and the biotic landscape is warranted. Ecology research and species conservation plans that address these interactions are more likely to meet their objectives.

  8. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    S. Kuzio

    2004-01-01

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the "Saturated Zone Flow and Transport Model Abstraction" (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be

  9. Cumulative incidence, distribution, and determinants of catastrophic health expenditure in Nepal: results from the living standards survey.

    Science.gov (United States)

    Ghimire, Mamata; Ayer, Rakesh; Kondo, Masahide

    2018-02-14

    Nepal has committed to the global community to achieve universal health coverage by 2030. Nevertheless, Nepal still has a high proportion of out-of-pocket health payment and a limited risk-pooling mechanism. Out-of-pocket payment for healthcare services could result in catastrophic health expenditure (CHE). Evidence is required to effectively channel the efforts to lower those expenses in order to achieve universal health coverage. However, little is known about CHE and its determinants in a broad national context in Nepal. Therefore, this study was conducted to explore the cumulative incidence, distribution, and determinants of CHE in Nepal. Data were obtained from the nationally representative third Nepal Living Standards Survey, undertaken in 2010/11. Information from 5988 households was used for the analyses. Households were classified as having CHE when their out-of-pocket health payment was greater than or equal to 40% of their capacity to pay. The remaining households were classified as not having CHE. Logistic regression analyses were used to identify determinants of CHE. Based on the household-weighted sample, the cumulative incidence of CHE was 10.3% per month in Nepal. This incidence was concentrated in the far-western region and in households in the poorer expenditure quartiles. Multivariable logistic regression revealed that households were more likely to face CHE if they: consisted of chronically ill member(s), had a higher burden of acute illness and injuries, had elderly (≥60 years) member(s), belonged to the poor expenditure quartile, or were located in the far-western region. In contrast, households were less likely to incur CHE when their household head was educated. Having children (≤5 years) in households did not significantly affect catastrophic health expenditure. This study identified a high cumulative incidence of CHE. CHE was disproportionately concentrated in the poor households and households located in the far
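    The 40%-of-capacity-to-pay rule used in this study is straightforward to operationalize. A minimal sketch of the weighted cumulative-incidence calculation (toy, made-up household records, not the NLSS survey data; the tuple layout is an assumption for illustration):

    ```python
    def che_incidence(households, threshold=0.40):
        """Weighted cumulative incidence of catastrophic health expenditure.

        `households` holds (out_of_pocket, capacity_to_pay, sampling_weight)
        tuples; a household incurs CHE when its out-of-pocket health payment
        is >= `threshold` of its capacity to pay.
        """
        flagged = sum(w for oop, ctp, w in households if oop >= threshold * ctp)
        total = sum(w for _, _, w in households)
        return flagged / total

    # Toy records: (out-of-pocket payment, capacity to pay, survey weight)
    survey = [(50, 400, 1.0), (200, 450, 2.0), (10, 300, 1.5), (180, 400, 1.0)]
    print(round(che_incidence(survey), 3))  # → 0.545
    ```

    Weighting matters here: the second household counts twice as much as the first, which is why the incidence is 3.0/5.5 rather than 2/4.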

  10. p-adic probability interpretation of Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1995-01-01

    We study the violation of Bell's inequality using a p-adic generalization of the theory of probability. p-adic probability is introduced as a limit of relative frequencies but this limit exists with respect to a p-adic metric. In particular, negative probability distributions are well defined on the basis of the frequency definition. This new type of stochastics can be used to describe hidden-variables distributions of some quantum models. If the hidden variables have a p-adic probability distribution, Bell's inequality is not valid and it is not necessary to discuss the experimental violations of this inequality. ((orig.))

  11. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    Stations are often the limiting capacity factor in a railway network. This induces interdependencies, especially at at-grade junctions, causing network effects. This paper presents three traditional methods that can be used to measure the complexity of a station, indicating the robustness...... of the station’s infrastructure layout and plan of operation. However, these three methods do not take the timetable at the station into consideration. Therefore, two methods are introduced in this paper, making it possible to estimate the robustness of different timetables at a station or different...... infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time...

  12. Description of atomic burials in compact globular proteins by Fermi-Dirac probability distributions.

    Science.gov (United States)

    Gomes, Antonio L C; de Rezende, Júlia R; Pereira de Araújo, Antônio F; Shakhnovich, Eugene I

    2007-02-01

    We perform a statistical analysis of atomic distributions as a function of the distance R from the molecular geometrical center in a nonredundant set of compact globular proteins. The number of atoms increases quadratically for small R, indicating a constant average density inside the core, reaches a maximum at a size-dependent distance R(max), and falls rapidly for larger R. The empirical curves turn out to be consistent with the volume increase of spherical concentric solid shells and a Fermi-Dirac distribution in which the distance R plays the role of an effective atomic energy epsilon(R) = R. The effective chemical potential mu governing the distribution increases with the number of residues, reflecting the size of the protein globule, while the temperature parameter beta decreases. Interestingly, beta*mu is not as strongly dependent on protein size and appears to be tuned to maintain approximately half of the atoms in the high-density interior and the other half in the exterior region of rapidly decreasing density. A normalized size-independent distribution was obtained for the atomic probability as a function of the reduced distance, r = R/R(g), where R(g) is the radius of gyration. The global normalized Fermi distribution, F(r), can be reasonably decomposed into Fermi-like subdistributions for different atomic types tau, F(tau)(r), with Sigma(tau) F(tau)(r) = F(r), which depend on two additional parameters mu(tau) and h(tau). The chemical potential mu(tau) affects a scaling prefactor and depends on the overall frequency of the corresponding atomic type, while the maximum position of the subdistribution is determined by h(tau), which appears in a type-dependent atomic effective energy, epsilon(tau)(r) = h(tau)*r, and is strongly correlated to available hydrophobicity scales. Better adjustments are obtained when the effective energy is not assumed to be necessarily linear, i.e., epsilon(tau)*(r) = h(tau)*r^alpha, in which case a correlation with hydrophobicity

  13. Stochastic Economic Dispatch with Wind using Versatile Probability Distribution and L-BFGS-B Based Dual Decomposition

    DEFF Research Database (Denmark)

    Huang, Shaojun; Sun, Yuanzhang; Wu, Qiuwei

    2018-01-01

    This paper focuses on economic dispatch (ED) in power systems with intermittent wind power, which is a very critical issue in future power systems. A stochastic ED problem is formed based on the recently proposed versatile probability distribution (VPD) of wind power. The problem is then analyzed...

  14. Introduction and application of non-stationary standardized precipitation index considering probability distribution function and return period

    Science.gov (United States)

    Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk

    2018-05-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI was proposed that considers not only the modified probability distribution parameter but also the return period under the non-stationary process. The results were evaluated for two severe drought cases during the last 10 years in South Korea. SPIs based on the non-stationary hypothesis estimated a lower drought severity than the stationary SPI, even though these two past droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which can make the probability distribution wider than before. This implies that drought expressions by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.
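    For intuition, a stationary SPI can be sketched nonparametrically: rank-based plotting positions are mapped through the inverse standard-normal CDF. This is a deliberately simplified variant (the classical SPI, and this study, instead fit a parametric distribution such as the gamma to the precipitation record first):

    ```python
    from statistics import NormalDist

    def empirical_spi(precip):
        """Nonparametric SPI sketch: Weibull plotting positions rank/(n+1)
        mapped through the inverse standard-normal CDF.  A simplification
        of the classical gamma-fit SPI, for illustration only."""
        n = len(precip)
        order = sorted(range(n), key=lambda i: precip[i])
        nd = NormalDist()
        spi = [0.0] * n
        for rank, i in enumerate(order, start=1):
            spi[i] = nd.inv_cdf(rank / (n + 1))
        return spi

    # Five monthly precipitation totals (mm); the median month maps to SPI = 0.
    print([round(v, 2) for v in empirical_spi([10.0, 55.0, 30.0, 80.0, 20.0])])
    ```

    The record's driest month gets the most negative SPI and the wettest the most positive; the non-stationary index of the study additionally lets the underlying distribution parameters drift in time.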

  15. Analyses of moments in pseudorapidity intervals at √s = 546 GeV by means of two probability distributions in pure-birth process

    International Nuclear Information System (INIS)

    Biyajima, M.; Shirane, K.; Suzuki, N.

    1988-01-01

    Moments in pseudorapidity intervals at the CERN Spp̄S collider (√s = 546 GeV) are analyzed by means of two probability distributions in the pure-birth stochastic process. Our results show that a probability distribution obtained from the Poisson distribution as an initial condition is more useful than that obtained from the Kronecker δ function. Analyses of moments by Koba-Nielsen-Olesen scaling functions derived from solutions of the pure-birth stochastic process are also made. Moreover, analyses of preliminary data at √s = 200 and 900 GeV are added.

  16. Complete cumulative index (1963-1983)

    International Nuclear Information System (INIS)

    1983-01-01

    This complete cumulative index covers all regular and special issues and supplements published by Atomic Energy Review (AER) during its lifetime (1963-1983). The complete cumulative index consists of six Indexes: the Index of Abstracts, the Subject Index, the Title Index, the Author Index, the Country Index and the Table of Elements Index. The complete cumulative index supersedes the Cumulative Indexes for Volumes 1-7: 1963-1969 (1970), and for Volumes 1-10: 1963-1972 (1972); this Index also finalizes Atomic Energy Review, the publication of which has recently been terminated by the IAEA

  17. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    Science.gov (United States)

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). For that, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods of their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful to describe the road accident blackspots datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.
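    A discrete Pareto-type probability mass function of the kind used above can be obtained by differencing the continuous generalized Pareto survival function at the integers. A sketch for the shape > 0 case (parameter values are illustrative, not the fitted DGT values):

    ```python
    def dgpd_pmf(k, shape, scale):
        """P(X = k), k = 0, 1, ..., for a discrete generalized Pareto
        distribution built by differencing the continuous GPD survival
        function S(x) = (1 + shape*x/scale)**(-1/shape), shape > 0."""
        def surv(x):
            return (1.0 + shape * x / scale) ** (-1.0 / shape)
        return surv(k) - surv(k + 1)

    # Illustrative parameters; probability mass over counts 0..1999.
    mass = [dgpd_pmf(k, shape=0.5, scale=2.0) for k in range(2000)]
    ```

    Because the survival function is convex and decreasing, the mass function is monotone decreasing with a heavy power-law tail, which is what makes it a candidate for over-dispersed crash counts where the negative binomial tail is too light.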

  18. Probability distribution of magnetization in the one-dimensional Ising model: effects of boundary conditions

    Energy Technology Data Exchange (ETDEWEB)

    Antal, T [Physics Department, Simon Fraser University, Burnaby, BC V5A 1S6 (Canada); Droz, M [Departement de Physique Theorique, Universite de Geneve, CH 1211 Geneva 4 (Switzerland); Racz, Z [Institute for Theoretical Physics, Eoetvoes University, 1117 Budapest, Pazmany setany 1/a (Hungary)

    2004-02-06

    Finite-size scaling functions are investigated both for the mean-square magnetization fluctuations and for the probability distribution of the magnetization in the one-dimensional Ising model. The scaling functions are evaluated in the limit of the temperature going to zero (T → 0) and the size of the system going to infinity (N → ∞) while N[1 - tanh(J/k_B T)] is kept finite (J being the nearest-neighbour coupling). Exact calculations using various boundary conditions (periodic, antiperiodic, free, block) demonstrate explicitly how the scaling functions depend on the boundary conditions. We also show that the block (small part of a large system) magnetization distribution results are identical to those obtained for free boundary conditions.
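    The boundary-condition dependence of the magnetization distribution is easy to see numerically. A brute-force sketch for a small chain (N = 10 and the temperature are illustrative choices; this is exact enumeration, not the scaling limit studied in the paper):

    ```python
    import itertools
    import math
    from collections import defaultdict

    def magnetization_pmf(N, J=1.0, kT=1.0, periodic=True):
        """Exact P(M) for a 1D Ising chain: enumerate all 2**N spin
        configurations and Boltzmann-weight each total magnetization M."""
        weights = defaultdict(float)
        for spins in itertools.product((-1, 1), repeat=N):
            bonds = range(N) if periodic else range(N - 1)
            energy = -J * sum(spins[i] * spins[(i + 1) % N] for i in bonds)
            weights[sum(spins)] += math.exp(-energy / kT)
        Z = sum(weights.values())
        return {M: w / Z for M, w in weights.items()}

    p_periodic = magnetization_pmf(10, periodic=True)
    p_free = magnetization_pmf(10, periodic=False)
    ```

    Even at this tiny size the two boundary conditions assign visibly different probabilities to the fully aligned states M = ±N, while both distributions remain exactly symmetric under M → -M.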

  19. Understanding the distinctively skewed and heavy tailed character of atmospheric and oceanic probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Sardeshmukh, Prashant D., E-mail: Prashant.D.Sardeshmukh@noaa.gov [CIRES, University of Colorado, Boulder, Colorado 80309 (United States); NOAA/Earth System Research Laboratory, Boulder, Colorado 80305 (United States); Penland, Cécile [NOAA/Earth System Research Laboratory, Boulder, Colorado 80305 (United States)

    2015-03-15

    The probability distributions of large-scale atmospheric and oceanic variables are generally skewed and heavy-tailed. We argue that their distinctive departures from Gaussianity arise fundamentally from the fact that in a quadratically nonlinear system with a quadratic invariant, the coupling coefficients between system components are not constant but depend linearly on the system state in a distinctive way. In particular, the skewness arises from a tendency of the system trajectory to linger near states of weak coupling. We show that the salient features of the observed non-Gaussianity can be captured in the simplest such nonlinear 2-component system. If the system is stochastically forced and linearly damped, with one component damped much more strongly than the other, then the strongly damped fast component becomes effectively decoupled from the weakly damped slow component, and its impact on the slow component can be approximated as a stochastic noise forcing plus an augmented nonlinear damping. In the limit of large time-scale separation, the nonlinear augmentation of the damping becomes small, and the noise forcing can be approximated as an additive noise plus a correlated additive and multiplicative noise (CAM noise) forcing. Much of the diversity of observed large-scale atmospheric and oceanic probability distributions can be interpreted in this minimal framework.
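    The skew and heavy tails produced by state-dependent noise can be illustrated with a one-variable Langevin sketch. The Euler-Maruyama simulation below uses correlated additive and multiplicative (CAM) noise in the spirit of the paper's limiting approximation; the specific equation form and all parameter values are illustrative assumptions, not the authors' model:

    ```python
    import math
    import random

    def simulate_cam(g=0.8, E=0.5, b=0.3, lam=1.0, dt=0.01,
                     steps=200_000, seed=3):
        """Euler-Maruyama sketch of a linearly damped scalar driven by CAM
        noise plus an independent additive noise:
            dx = -lam*x dt + (E + g*x) dW1 + b dW2
        The (E + g*x) factor makes noise kicks larger on one side of the
        mean, producing a skewed, heavy-tailed stationary distribution.
        """
        random.seed(seed)
        sq = math.sqrt(dt)
        x, xs = 0.0, []
        for _ in range(steps):
            x += (-lam * x * dt
                  + (E + g * x) * random.gauss(0.0, sq)
                  + b * random.gauss(0.0, sq))
            xs.append(x)
        return xs

    samples = simulate_cam()
    ```

    With E and g both positive, positive excursions are amplified more than negative ones, so the sample is positively skewed with excess kurtosis well above the Gaussian value, mirroring the observed atmospheric non-Gaussianity.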

  20. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    Science.gov (United States)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    The Tunguska and Chelyabinsk impact events occurred inside a geographical area of only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand if this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, 'Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the times of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while antapex RIPs are slightly larger than average. We present here preliminary maps of RIP at the times of the Tunguska and Chelyabinsk events and found no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIPs at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  1. Influence of random setup error on dose distribution

    International Nuclear Information System (INIS)

    Zhai Zhenyu

    2008-01-01

    Objective: To investigate the influence of random setup error on dose distribution in radiotherapy and to determine the margin from ITV to PTV. Methods: A random sampling approach was used to simulate the field positions in the target coordinate system. The cumulative effect of random setup error was the sum of the dose distributions of all individual treatment fractions. Analyzing 100 such cumulative effects yielded the shift sizes of the 90% dose point position. Margins from ITV to PTV caused by random setup error were chosen at 95% probability. Spearman's correlation was used to analyze the influence of each factor. Results: The average shift sizes of the 90% dose point position were 0.62, 1.84, 3.13, 4.78, 6.34 and 8.03 mm for random setup errors of 1, 2, 3, 4, 5 and 6 mm, respectively. Univariate analysis showed the margin size was associated only with the size of the random setup error. Conclusions: The margin from ITV to PTV is 1.2 times the random setup error for head-and-neck cancer and 1.5 times for thoracic and abdominal cancer. Field size, energy and target depth, unlike random setup error, have no relation to the size of the margin. (authors)
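    The cumulative-effect simulation can be sketched as a 1-D Monte Carlo: blur a field edge with random per-fraction shifts and track the displacement of the 90% dose point. The Gaussian-penumbra geometry and all numbers below are illustrative assumptions, not the treatment-planning model of the study:

    ```python
    import math
    import random

    def shift_of_90pct_point(sigma_mm, penumbra=3.0, fractions=30, seed=1):
        """Displacement (mm) of the 90% dose point of the cumulative dose
        after `fractions` treatments with setup shifts ~ N(0, sigma_mm),
        for a field edge modelled as a Gaussian penumbra (illustrative)."""
        random.seed(seed)

        def edge(x):
            # Dose falls from 1 to 0 across the field edge at x = 0.
            return 0.5 * (1.0 - math.erf(x / (math.sqrt(2.0) * penumbra)))

        grid = [i * 0.01 for i in range(-2000, 2001)]   # -20..20 mm, 0.01 mm
        shifts = [random.gauss(0.0, sigma_mm) for _ in range(fractions)]

        def point90(dose):
            for x, d in zip(grid, dose):
                if d <= 0.9:        # first grid point below the 90% level
                    return x

        static = [edge(x) for x in grid]
        cumulative = [sum(edge(x - s) for s in shifts) / fractions
                      for x in grid]
        return point90(cumulative) - point90(static)
    ```

    With sigma = 0 the displacement is zero; as sigma grows the blurred penumbra widens and the 90% point retracts further into the field, which is the basis for choosing the ITV-to-PTV margin as a multiple of the random setup error.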

  2. Multiscale probability distribution of pressure fluctuations in fluidized beds

    International Nuclear Information System (INIS)

    Ghasemi, Fatemeh; Sahimi, Muhammad; Reza Rahimi Tabar, M; Peinke, Joachim

    2012-01-01

    Analysis of flow in fluidized beds, a common chemical reactor, is of much current interest due to its fundamental as well as industrial importance. Experimental data for the successive increments of the pressure fluctuations time series in a fluidized bed are analyzed by computing a multiscale probability density function (PDF) of the increments. The results demonstrate the evolution of the shape of the PDF from the short to long time scales. The deformation of the PDF across time scales may be modeled by the log-normal cascade model. The results are also in contrast to the previously proposed PDFs for the pressure fluctuations that include a Gaussian distribution and a PDF with a power-law tail. To understand better the properties of the pressure fluctuations, we also construct the shuffled and surrogate time series for the data and analyze them with the same method. It turns out that long-range correlations play an important role in the structure of the time series that represent the pressure fluctuation. (paper)
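    The multiscale increment analysis itself is simple to reproduce on synthetic data. Below, a random walk with slowly varying (log-AR(1)) volatility stands in for the pressure signal — an assumption for illustration, since the paper uses experimental fluidized-bed data — and the change of shape of the increment PDF across scales is summarized by its excess kurtosis:

    ```python
    import math
    import random

    def excess_kurtosis(xs):
        n = len(xs)
        m = sum(xs) / n
        s2 = sum((v - m) ** 2 for v in xs) / n
        m4 = sum((v - m) ** 4 for v in xs) / n
        return m4 / (s2 * s2) - 3.0

    def increment_kurtosis(series, scales):
        """Excess kurtosis of increments x(t+tau) - x(t) at each scale tau:
        heavy-tailed (positive) at short scales, near-Gaussian (about zero)
        at long scales, as in a cascade-like process."""
        return {tau: excess_kurtosis([series[i + tau] - series[i]
                                      for i in range(len(series) - tau)])
                for tau in scales}

    # Synthetic stand-in: random walk with log-AR(1) stochastic volatility.
    random.seed(7)
    phi, s_inf = 0.95, 0.5          # AR(1) coefficient, stationary log-vol std
    x, logv, series = 0.0, 0.0, [0.0]
    for _ in range(60_000):
        logv = phi * logv + random.gauss(0.0, s_inf * math.sqrt(1 - phi * phi))
        x += math.exp(logv) * random.gauss(0.0, 1.0)
        series.append(x)

    kurt = increment_kurtosis(series, scales=[1, 4, 16, 64, 256])
    ```

    The deformation from a fat-tailed increment PDF at short lags toward a Gaussian at long lags is the signature the paper models with the log-normal cascade.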

  3. Cumulative interpersonal traumas and social support as risk and resiliency factors in predicting PTSD and depression among inner-city women.

    Science.gov (United States)

    Schumm, Jeremiah A; Briggs-Phillips, Melissa; Hobfoll, Stevan E

    2006-12-01

    This study represents one of the largest examinations of how child abuse, adult rape, and social support impact inner-city women (N = 777). Using retrospective self-report, the effects of interpersonal trauma were shown to be cumulative such that women who experienced either child abuse or adult rape were 6 times more likely to have probable posttraumatic stress disorder (PTSD), whereas women who experienced both child abuse and rape were 17 times more likely to have probable PTSD. High social support predicted lower PTSD severity for women who experienced both child abuse and adult rape, but not for women who reported one or none of these traumas. Results suggest that social support, when left intact, might buffer the cumulative impact of child and adult interpersonal traumas.

  4. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
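    Two of the relationships covered in the report, the cumulative binomial law and its Poisson limit, can be sketched directly (a minimal illustration, not the report's treatment):

    ```python
    from math import comb, exp, factorial

    def cum_binomial(k, n, p):
        """P(X <= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p ** i * (1.0 - p) ** (n - i)
                   for i in range(k + 1))

    def cum_poisson(k, lam):
        """P(X <= k) for X ~ Poisson(lam)."""
        return exp(-lam) * sum(lam ** i / factorial(i) for i in range(k + 1))

    # Poisson limit of the binomial: n large, p small, n*p held fixed.
    b = cum_binomial(3, 10_000, 0.0002)
    q = cum_poisson(3, 2.0)
    ```

    For n = 10,000 and p = 0.0002 the two cumulative probabilities agree to within roughly n*p² ≈ 4e-4, the classical Le Cam-style bound on the quality of the Poisson approximation.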

  5. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  6. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  7. Hydrological model calibration for derived flood frequency analysis using stochastic rainfall and probability distributions of peak flows

    Science.gov (United States)

    Haberlandt, U.; Radtke, I.

    2014-01-01

    Derived flood frequency analysis allows the estimation of design floods with hydrological modeling for poorly observed basins, considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and consequently the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets and to propose the most suitable approach. Event-based and continuous observed hourly rainfall data as well as disaggregated daily rainfall and stochastically generated hourly rainfall data are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated using the obtained different model parameter sets for continuous simulation of discharge in an independent validation period and by comparing the model-derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in northern Germany with the hydrological model HEC-HMS (Hydrologic Engineering Center's Hydrologic Modeling System). The results show that (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated using a small sample of extreme values works quite well for the simulation of continuous time series with moderate length but not vice versa, and (III) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. This outcome suggests calibrating a hydrological model directly on probability distributions of observed peak flows, using stochastic rainfall as input, if its purpose is the

  8. Probability distribution of wave packet delay time for strong overlapping of resonance levels

    International Nuclear Information System (INIS)

    Lyuboshits, V.L.

    1983-01-01

    The time behaviour of nuclear reactions in the case of high level densities is investigated on the basis of the theory of overlapping resonances. In the framework of a model of n equivalent channels, an analytical expression is obtained for the probability distribution function of the wave packet delay time in compound-nucleus production. It is shown that at strong overlapping of the resonance levels, the relative fluctuation of the delay time is small at the stage of compound-nucleus production. A possible increase in the duration of nuclear reactions with rising excitation energy is discussed.

  9. Probability distribution of machining center failures

    International Nuclear Information System (INIS)

    Jia Yazhou; Wang Molin; Jia Zhixin

    1995-01-01

    Through field-tracing research on 24 Chinese cutter-changeable CNC machine tools (machining centers) over a period of one year, a database of operation and maintenance for machining centers was built. The failure data were fitted to the Weibull distribution and the exponential distribution, the goodness of fit was tested, and the failure distribution pattern of machining centers was found. Finally, reliability characterizations for machining centers are proposed.

  10. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  11. An innovative method for offshore wind farm site selection based on the interval number with probability distribution

    Science.gov (United States)

    Wu, Yunna; Chen, Kaifeng; Xu, Hu; Xu, Chuanbo; Zhang, Haobo; Yang, Meng

    2017-12-01

    There is insufficient research relating to offshore wind farm site selection in China. The current methods for site selection have some defects. First, information loss is caused by two aspects: the implicit assumption that the probability distribution on the interval number is uniform; and ignoring the value of decision makers' (DMs') common opinion on the criteria information evaluation. Secondly, the difference in DMs' utility function has failed to receive attention. An innovative method is proposed in this article to solve these drawbacks. First, a new form of interval number and its weighted operator are proposed to reflect the uncertainty and reduce information loss. Secondly, a new stochastic dominance degree is proposed to quantify the interval number with a probability distribution. Thirdly, a two-stage method integrating the weighted operator with stochastic dominance degree is proposed to evaluate the alternatives. Finally, a case from China proves the effectiveness of this method.

  12. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  13. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  14. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  15. Fission-fragment mass distribution and estimation of the cluster emission probability in the γ + 232Th and 181Ta reactions

    International Nuclear Information System (INIS)

    Karamyan, S.A.; Adam, J.; Belov, A.G.; Chaloun, P.; Norseev, Yu.V.; Stegajlov, V.I.

    1997-01-01

The fission-fragment mass distribution has been measured via the cumulative yields of radionuclides detected in the 232Th(γ,f) reaction at bremsstrahlung endpoint energies of 12 and 24 MeV. Upper limits have been estimated for the yields of the light nuclei 24Na, 28Mg, 38S, etc. for Th and Ta targets exposed to the 24 MeV bremsstrahlung. The results are discussed in terms of multimodal fission phenomena and cluster emission from a deformed fissioning system or from a compound nucleus.

  16. Spatial distribution and occurrence probability of regional new particle formation events in eastern China

    Directory of Open Access Journals (Sweden)

    X. Shen

    2018-01-01

Full Text Available In this work, the spatial extent of new particle formation (NPF) events and the relative probability of observing particles originating from different spatial origins around three rural sites in eastern China were investigated using the NanoMap method, based on particle number size distribution (PNSD) data and air mass back trajectories. The lengths of the datasets used were 7, 1.5, and 3 years at the rural sites Shangdianzi (SDZ) in the North China Plain (NCP), Mt. Tai (TS) in central eastern China, and Lin'an (LAN) in the Yangtze River Delta region in eastern China, respectively. Regional NPF events were observed to occur with a horizontal extent larger than 500 km at SDZ and TS, favoured by the fast transport of northwesterly air masses. At LAN, however, the spatial footprint of NPF events was mostly observed around the site within 100–200 km. The difference in the horizontal spatial distribution of new particle source areas at the different sites was connected to the typical meteorological conditions at each site. Consecutive large-scale regional NPF events were observed at SDZ and TS simultaneously and were associated with a high surface pressure system dominating over this area. Simultaneous NPF events at SDZ and LAN were seldom observed. At SDZ the polluted air masses arriving over the NCP were associated with a higher particle growth rate (GR) and new particle formation rate (J) than air masses from Inner Mongolia (IM). At TS the same phenomenon was observed for J, but GR was somewhat lower in air masses arriving over the NCP compared to those arriving from IM. The capability of NanoMap to capture the NPF occurrence probability depends on the length of the PNSD measurement dataset, but also on the topography around the measurement site and the typical air mass advection speed during NPF events. Thus long-term measurements of PNSD in the planetary boundary layer are necessary for further study of the spatial extent and probability of NPF events.

  17. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  18. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper will briefly describe the capabilities of NESTEM, QRAS, and the interface, and will illustrate the stepwise process the interface uses with an example.

  19. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper will briefly describe the capabilities of NESTEM, QRAS, and the interface, and will illustrate the stepwise process the interface uses with an example.

  20. Cumulative t-link threshold models for the genetic analysis of calving ease scores

    Directory of Open Access Journals (Sweden)

    Tempelman Robert J

    2003-09-01

Full Text Available Abstract In this study, a hierarchical threshold mixed model based on a cumulative t-link specification for the analysis of ordinal data or, more specifically, calving ease scores, was developed. The validation of this model and the Markov chain Monte Carlo (MCMC) algorithm was carried out on data simulated from normally and t4 (i.e. a t-distribution with four degrees of freedom) distributed populations, using the deviance information criterion (DIC) and a pseudo Bayes factor (PBF) measure to validate recently proposed model choice criteria. The simulation study indicated that although inference on the degrees of freedom parameter is possible, MCMC mixing was problematic. Nevertheless, the DIC and PBF were validated to be satisfactory measures of model fit to data. A sire and maternal grandsire cumulative t-link model was applied to a calving ease dataset from 8847 Italian Piemontese first-parity dams. The cumulative t-link model was shown to lead to posterior means of direct and maternal heritabilities (0.40 ± 0.06, 0.11 ± 0.04) and a direct-maternal genetic correlation (-0.58 ± 0.15) that were not different from the corresponding posterior means of the heritabilities (0.42 ± 0.07, 0.14 ± 0.04) and the genetic correlation (-0.55 ± 0.14) inferred under the conventional cumulative probit link threshold model. Furthermore, the correlation (> 0.99) between posterior means of sire progeny merit from the two models suggested no meaningful rerankings. Nevertheless, the cumulative t-link model was decisively chosen as the better fitting model for this calving ease data using DIC and PBF.
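The contrast between the cumulative t-link and the conventional probit link can be sketched numerically. The cutpoints and linear predictor below are made-up illustrative values, not estimates from the paper:

```python
from scipy import stats

def ordinal_probs(eta, thresholds, link_cdf):
    """Category probabilities under a cumulative link model:
    P(Y <= k) = F(tau_k - eta), differenced to give P(Y = k)."""
    cum = [link_cdf(t - eta) for t in thresholds] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

thresholds = [-1.0, 0.5, 1.5]   # hypothetical cutpoints for 4 calving ease scores
eta = 0.3                       # hypothetical sire + herd effect

probit = ordinal_probs(eta, thresholds, stats.norm.cdf)       # cumulative probit
t_link = ordinal_probs(eta, thresholds, stats.t(df=4).cdf)    # cumulative t4 link
```

Because the t4 link has heavier tails than the probit, it places more mass on the extreme score categories for the same cutpoints, which is what makes it robust to outlying records.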

  1. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  2. Cumulative effects of wind turbines. A guide to assessing the cumulative effects of wind energy development

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

This guidance provides advice on how to assess the cumulative effects of wind energy developments in an area and is aimed at developers, planners, and stakeholders interested in the development of wind energy in the UK. The principles of cumulative assessment, wind energy development in the UK, cumulative assessment of wind energy development, and best practice conclusions are discussed. The identification and assessment of the cumulative effects is examined in terms of global environmental sustainability, local environmental quality and socio-economic activity. Supplementary guidance for assessing the principal cumulative effects on the landscape, on birds, and on the visual effect is provided. The consensus building approach behind the preparation of this guidance is outlined in the annexes of the report.

  3. The challenge of cumulative impacts

    Energy Technology Data Exchange (ETDEWEB)

    Masden, Elisabeth

    2011-07-01

Full text: As governments pledge to combat climate change, wind turbines are becoming a common feature of terrestrial and marine environments. Although wind power is a renewable energy source and a means of reducing carbon emissions, there is a need to ensure that the wind farms themselves do not damage the environment. There is particular concern over the impacts of wind farms on bird populations, and with increasing numbers of wind farm proposals, the concern focuses on cumulative impacts. Individually, a wind farm, or indeed any activity/action, may have minor effects on the environment, but collectively these may be significant, potentially greater than the sum of the individual parts acting alone. Cumulative impact assessment is a legislative requirement of environmental impact assessment, but such assessments are rarely adequate, restricting the acquisition of basic knowledge about the cumulative impacts of wind farms on bird populations. Reasons for this are numerous, but a recurring theme is the lack of clear definitions and guidance on how to perform cumulative assessments. Here we present a conceptual framework and include illustrative examples to demonstrate how the framework can be used to improve the planning and execution of cumulative impact assessments. The core concept is that explicit definitions of impacts, actions and scales of assessment are required to reduce uncertainty in the process of assessment and improve communication between stakeholders. Only when it is clear what has been included within a cumulative assessment is it possible to make comparisons between developments. Our framework requires improved legislative guidance on the actions to include in assessments, and advice on the appropriate baselines against which to assess impacts. Cumulative impacts are currently considered on restricted scales (spatial and temporal) relating to individual development assessments. We propose that benefits would be gained from elevating cumulative

  4. A statistical model for deriving probability distributions of contamination for accidental releases

    International Nuclear Information System (INIS)

    ApSimon, H.M.; Davison, A.C.

    1986-01-01

    Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)

  5. Probability distributions of placental morphological measurements and origins of variability of placental shapes.

    Science.gov (United States)

    Yampolsky, M; Salafia, C M; Shlakhter, O

    2013-06-01

While the mean shape of the human placenta is round with a centrally inserted umbilical cord, significant deviations from this ideal are fairly common, and may be clinically meaningful. Traditionally, they are explained by trophotropism. We have proposed a hypothesis explaining typical variations in placental shape by randomly determined fluctuations in the growth process of the vascular tree. It has been recently reported that umbilical cord displacement in a birth cohort has a log-normal probability distribution, which indicates that the displacement between an initial point of origin and the centroid of the mature shape is a result of the accumulation of random fluctuations of the dynamic growth of the placenta. To confirm this, we investigate statistical distributions of other features of placental morphology. In a cohort of 1023 births at term, digital photographs of placentas were recorded at delivery. Excluding cases with velamentous cord insertion or missing clinical data left 1001 (97.8%) for which placental surface morphology features were measured. Best-fit statistical distributions for them were obtained using EasyFit. The best-fit distributions of umbilical cord displacement, placental disk diameter, area, perimeter, and maximal radius calculated from the cord insertion point are of heavy-tailed type, similar in shape to log-normal distributions. This is consistent with a stochastic origin of deviations of placental shape from normal. Deviations of placental shape descriptors from average have heavy-tailed distributions similar in shape to log-normal. This evidence points away from trophotropism, and towards a spontaneous stochastic evolution of the variants of placental surface shape features. Copyright © 2013 Elsevier Ltd. All rights reserved.
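EasyFit is proprietary, but the underlying model-comparison step can be sketched with SciPy. The data below are a synthetic stand-in for a morphology feature (not the cohort's measurements), and the ranking uses AIC rather than EasyFit's criteria:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for a feature such as cord displacement,
# generated log-normally so the heavy-tailed candidate should win.
data = rng.lognormal(mean=1.0, sigma=0.5, size=1000)

def aic(dist, data):
    """Akaike information criterion for an MLE fit of a scipy distribution."""
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik

# Lower AIC = better fit.
print(aic(stats.lognorm, data), aic(stats.norm, data))
```

On log-normally generated data the heavy-tailed candidate beats the normal, mirroring the kind of comparison the paper reports for the real morphology features.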

  6. Cumulative doses analysis in young trauma patients: a single-centre experience.

    Science.gov (United States)

    Salerno, Sergio; Marrale, Maurizio; Geraci, Claudia; Caruso, Giuseppe; Lo Re, Giuseppe; Lo Casto, Antonio; Midiri, Massimo

    2016-02-01

Multidetector computed tomography (MDCT) represents the main source of radiation exposure in trauma patients. The radiation exposure of young patients is a matter of considerable medical concern due to possible long-term effects. Multiple MDCT studies have been observed in the young trauma population, with an increase in radiation exposure. We identified 249 young adult patients (178 men and 71 women; age range 14-40 years) who had received more than one MDCT study between June 2010 and June 2014. Following the International Commission on Radiological Protection publication, we calculated cumulative organ doses using tissue-weighting factors with the CT-EXPO® software. We observed a mean cumulative dose of about 27 mSv (range 3 to 297 mSv). The distribution analysis is characterised by a low effective dose, below 20 mSv, in the majority of the patients. However, in 29 patients the effective dose was found to be higher than 20 mSv. The dose distribution for the various organs analysed (breasts, ovaries, testicles, heart and eye lenses) shows an intense peak at lower doses, but in some cases high doses were recorded. Even though cumulative doses may have long-term effects, which are still under debate, high doses are observed in this specific group of young patients.

  7. The first-passage time distribution for the diffusion model with variable drift

    DEFF Research Database (Denmark)

    Blurton, Steven Paul; Kesselmeier, Miriam; Gondan, Matthias

    2017-01-01

The Ratcliff diffusion model is now arguably the most widely applied model for response time data. Its major advantage is its description of both response times and the probabilities for correct as well as incorrect responses. The model assumes a Wiener process with drift between two constant … across trials. This extra flexibility allows accounting for slow errors that often occur in response time experiments. So far, the predicted response time distributions were obtained by numerical evaluation, as analytical solutions were not available. Here, we present an analytical expression for the cumulative first-passage time distribution in the diffusion model with normally distributed trial-to-trial variability in the drift. The solution is obtained with predefined precision, and its evaluation turns out to be extremely fast.

  8. Cumulative keyboard strokes: a possible risk factor for carpal tunnel syndrome

    Directory of Open Access Journals (Sweden)

    Eleftheriou Andreas

    2012-08-01

Full Text Available Abstract Background: Contradictory reports have been published regarding the association of carpal tunnel syndrome (CTS) with the use of computer keyboards. Previous studies did not take into account the cumulative exposure to keyboard strokes among computer workers. The aim of the present study was to investigate the association between cumulative keyboard use (keyboard strokes) and CTS. Methods: Employees (461) from a governmental data entry & processing unit agreed to participate (response rate: 84.1%) in a cross-sectional study. A questionnaire was distributed to the participants to obtain information on socio-demographics and risk factors for CTS. The participants were examined for signs and symptoms related to CTS and were asked if they had a previous history of, or surgery for, CTS. The cumulative amount of keyboard strokes per worker per year was calculated by use of the payroll's registry. Two case definitions for CTS were used. The first included subjects with a personal history/surgery for CTS, while the second included subjects that belonged to the first case definition plus those participants identified through clinical examination. Results: Multivariate analysis, used for both case definitions, indicated that those employees with high cumulative exposure to keyboard strokes were at increased risk of CTS (case definition A: OR = 2.23; 95% CI = 1.09-4.52; case definition B: OR = 2.41; 95% CI = 1.36-4.25). A dose-response pattern between cumulative exposure to keyboard strokes and CTS was revealed (p … Conclusions: The present study indicated a possible association between cumulative exposure to keyboard strokes and the development of CTS. Cumulative exposure to keyboard strokes should be taken into account as an exposure indicator in the exposure assessment of computer workers. Further research is needed in order to test the results of the current study and assess causality between cumulative keyboard strokes and

  9. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  10. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    Science.gov (United States)

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.

  11. The use of the multi-cumulant tensor analysis for the algorithmic optimisation of investment portfolios

    Science.gov (United States)

    Domino, Krzysztof

    2017-02-01

Cumulant analysis plays an important role in the analysis of non-Gaussian distributed data, and share price returns are a good example of such data. The purpose of this research is to develop a cumulant-based algorithm and use it to determine eigenvectors that represent investment portfolios with low variability. The algorithm is based on the Alternating Least Squares method and involves the simultaneous minimisation of the 2nd-6th cumulants of a multidimensional random variable (the percentage returns of many companies' shares). The algorithm was then tested during the recent crash on the Warsaw Stock Exchange. To detect the incoming crash and provide entry and exit signals for the investment strategy, the Hurst exponent was calculated using local DFA. It was shown that the introduced algorithm is on average better than the benchmark and other portfolio determination methods, but only within an examination window determined by low values of the Hurst exponent. Note that the algorithm is based on cumulant tensors up to the 6th order calculated for a multidimensional random variable, which is the novel idea. It can be expected that the algorithm would be useful in financial data analysis on a worldwide scale, as well as in the analysis of other types of non-Gaussian distributed data.
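The cumulant estimators underlying such methods are available in SciPy as k-statistics, though only up to the 4th order (the paper's tensor algorithm goes to the 6th). A small sketch on simulated heavy-tailed returns, with all values illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
returns = rng.standard_t(df=5, size=10_000)   # heavy-tailed stand-in for share returns

# k-statistics: unbiased estimators of the first four cumulants
# (scipy supports n = 1..4; the paper's algorithm needs orders up to 6).
k = {n: stats.kstat(returns, n) for n in (1, 2, 3, 4)}

# For a Gaussian all cumulants above the 2nd vanish; a markedly nonzero
# standardised 4th cumulant signals the non-Gaussianity the method exploits.
excess_kurtosis = k[4] / k[2] ** 2
```

The 2nd k-statistic coincides with the unbiased sample variance, which gives a quick sanity check on the estimator.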

  12. Estimating species occurrence, abundance, and detection probability using zero-inflated distributions.

    Science.gov (United States)

    Wenger, Seth J; Freeman, Mary C

    2008-10-01

Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a superior fit to the data compared with other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
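The zero-inflation idea can be sketched as a zero-inflated Poisson fit by maximum likelihood on simulated counts. This is a minimal sketch only: it omits the replicated-sampling detection component of the full model, and all names and parameter values are illustrative:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
# Simulated counts: occupancy (presence) and abundance as distinct processes.
psi, lam = 0.6, 3.0                      # true occupancy prob., mean abundance
present = rng.random(500) < psi
counts = np.where(present, rng.poisson(lam, 500), 0)

def neg_loglik(params):
    """Negative log-likelihood of a zero-inflated Poisson: zeros arise
    either structurally (site unoccupied) or from the Poisson itself."""
    psi_, lam_ = params
    p_zero = (1 - psi_) + psi_ * np.exp(-lam_)
    ll = np.where(counts == 0,
                  np.log(p_zero),
                  np.log(psi_) + stats.poisson.logpmf(counts, lam_))
    return -ll.sum()

fit = optimize.minimize(neg_loglik, x0=[0.5, 1.0],
                        bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])
psi_hat, lam_hat = fit.x
```

With the mean abundance well above zero, the structural and sampling zeros are distinguishable and both parameters are recovered close to their true values.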

  13. Probability distribution of distance in a uniform ellipsoid: Theory and applications to physics

    International Nuclear Information System (INIS)

    Parry, Michelle; Fischbach, Ephraim

    2000-01-01

A number of authors have previously found the probability P_n(r) that two points uniformly distributed in an n-dimensional sphere are separated by a distance r. This result greatly facilitates the calculation of self-energies of spherically symmetric matter distributions interacting by means of an arbitrary radially symmetric two-body potential. We present here the analogous results for P_2(r;ε) and P_3(r;ε), which respectively describe an ellipse and an ellipsoid whose major and minor axes are 2a and 2b. It is shown that for ε = (1 - b²/a²)^(1/2) ≤ 1, P_2(r;ε) and P_3(r;ε) can be obtained as an expansion in powers of ε, and our results are valid through order ε⁴. As an application of these results we calculate the Coulomb energy of an ellipsoidal nucleus, and compare our result to an earlier result quoted in the literature. (c) 2000 American Institute of Physics
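The spherical limit (n = 3, ε = 0) can be checked by Monte Carlo: the mean of P_3(r) for the unit ball is the classical value 36/35. A quick sketch, with a standard sampling scheme that is not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_ball(n, r=1.0):
    """n points uniformly distributed in a 3-D ball of radius r."""
    v = rng.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)       # uniform directions
    return r * v * rng.random(n)[:, None] ** (1 / 3)    # radius ~ U^(1/3)

# Monte Carlo estimate of the mean pair distance in the unit ball;
# the known closed-form value is 36/35 ≈ 1.0286.
d = np.linalg.norm(uniform_ball(200_000) - uniform_ball(200_000), axis=1)
print(d.mean())
```

The same sampler, with the radius map adjusted per axis, gives a quick numerical check of the ellipsoidal expansions for small ε.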

  14. 32 CFR 651.16 - Cumulative impacts.

    Science.gov (United States)

    2010-07-01

§ 651.16 Cumulative impacts. (a) NEPA analyses must assess cumulative effects, which are the impact on the environment resulting from the incremental impact of the action when added to other past, present...

  15. Calculation of probability density functions for temperature and precipitation change under global warming

    International Nuclear Information System (INIS)

    Watterson, Ian G.

    2007-01-01

Full text: The IPCC Fourth Assessment Report (Meehl et al. 2007) presents multi-model means of the CMIP3 simulations as projections of the global climate change over the 21st century under several SRES emission scenarios. To assess the possible range of change for Australia based on the CMIP3 ensemble, we can follow Whetton et al. (2005) and use the 'pattern scaling' approach, which separates the uncertainty in the global mean warming from that in the local change per degree of warming. This study presents several ways of representing these two factors as probability density functions (PDFs). The beta distribution, a smooth, bounded function allowing skewness, is found to provide a useful representation of the range of CMIP3 results. A weighting of models based on their skill in simulating seasonal means in the present climate over Australia is included. Dessai et al. (2005) and others have used Monte Carlo sampling to recombine such global warming and scaled change factors into values of net change. Here, we use a direct integration of the product across the joint probability space defined by the two PDFs. The result is a cumulative distribution function (CDF) for change, for each variable, location, and season. The median of this distribution provides a best estimate of change, while the 10th and 90th percentiles represent a likely range. The probability of exceeding a specified threshold can also be extracted from the CDF. The presentation focuses on changes in Australian temperature and precipitation at 2070 under the A1B scenario. However, the assumption of linearity behind pattern scaling allows results for different scenarios and times to be simply obtained. In the case of precipitation, which must remain non-negative, a simple modification of the calculations (based on decreases being exponential with warming) is used to avoid unrealistic results. These approaches are currently being used for the new CSIRO/Bureau of Meteorology climate projections.
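The direct-integration step can be sketched as follows. The beta and normal parameters are invented placeholders, not the study's fitted values:

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Hypothetical inputs: global warming as a beta distribution scaled to
# 1-4 °C, and local change per degree of warming as a normal distribution.
warming = stats.beta(2.0, 3.0, loc=1.0, scale=3.0)
per_degree = stats.norm(loc=0.8, scale=0.3)

def net_cdf(x, n=2001):
    """P(net change <= x), net = warming * per-degree scaling, by direct
    integration over the warming PDF (no Monte Carlo sampling)."""
    t = np.linspace(1.0, 4.0, n)            # support of the warming PDF
    return trapezoid(per_degree.cdf(x / t) * warming.pdf(t), t)

# Best estimate (median) and likely range (10th/90th percentiles):
grid = np.linspace(-1.0, 6.0, 1401)
cdf_vals = np.array([net_cdf(x) for x in grid])
q10, q50, q90 = np.interp([0.1, 0.5, 0.9], cdf_vals, grid)
```

Inverting the CDF on a grid gives the median and percentiles directly; the probability of exceeding any threshold x is simply 1 - net_cdf(x).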

  16. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  17. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Full Text Available Abstract Background The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (> 15 km outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m than 100%-matched automated geocoding (median error length = 168 m. The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.

  18. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
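The standard normal probabilities discussed above can be computed directly; a small sketch using SciPy's standard normal CDF:

```python
from scipy.stats import norm

# Probability mass within k standard deviations of the mean for any
# normal distribution (the "68-95-99.7" rule), via the standard normal CDF.
for k in (1, 2, 3):
    p = norm.cdf(k) - norm.cdf(-k)
    print(f"within ±{k} SD: {p:.4f}")
```

Because every normal variable standardises to z = (x - μ)/σ, these three numbers apply to any normal distribution, which is what makes the standard normal table so broadly useful.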

  19. Cumulative nucleon production in 3Hep- and 3Hp interactions at momenta of colliding nuclei 5 GeV/c

    International Nuclear Information System (INIS)

    Abdullin, S.K.; Blinov, A.V.; Vanyushin, I.A.

    1989-01-01

    Inclusive cross sections of cumulative protons produced in 3 Hep and 3 Hp interactions at a colliding-nucleus momentum of 5 GeV/c are investigated. The experimental material was obtained with the 80-cm liquid-hydrogen ITEP chamber. For cumulative proton kinetic energies exceeding 50 MeV, the inclusive cross-section ratio is σ( 3 Hep→pX)/σ( 3 Hp→pX) = 1.6±0.1. Within the same energy interval, the estimated yield ratio of cumulative protons to neutrons produced in 3 Hep interactions is ∼ 1.6. An asymmetry in the mean multiplicity of protons emitted forward and backward is observed in both 3 Hep and 3 Hp interactions. Averaged invariant distribution functions are constructed for cumulative protons and neutrons in 3 Hep interactions and for protons in 3 Hp interactions (the averaging is performed over the 90-180 deg and 120-180 deg intervals), and the temperatures of these distributions are determined. 43 refs.; 1 fig

  20. Tools for Bramwell-Holdsworth-Pinton Probability Distribution

    Directory of Open Access Journals (Sweden)

    Mirela Danubianu

    2009-01-01

    Full Text Available This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth, and J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.

  1. Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization

    Science.gov (United States)

    Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.

    2014-01-01

    Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406

  2. Charged-particle thermonuclear reaction rates: II. Tables and graphs of reaction rates and probability density functions

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.; Fitzgerald, R.

    2010-01-01

    Numerical values of charged-particle thermonuclear reaction rates for nuclei in the A=14 to 40 region are tabulated. The results are obtained using a method, based on Monte Carlo techniques, that has been described in the preceding paper of this issue (Paper I). We present a low rate, median rate and high rate which correspond to the 0.16, 0.50 and 0.84 quantiles, respectively, of the cumulative reaction rate distribution. The meaning of these quantities is in general different from the commonly reported, but statistically meaningless expressions, 'lower limit', 'nominal value' and 'upper limit' of the total reaction rate. In addition, we approximate the Monte Carlo probability density function of the total reaction rate by a lognormal distribution and tabulate the lognormal parameters μ and σ at each temperature. We also provide a quantitative measure (Anderson-Darling test statistic) for the reliability of the lognormal approximation. The user can implement the approximate lognormal reaction rate probability density functions directly in a stellar model code for studies of stellar energy generation and nucleosynthesis. For each reaction, the Monte Carlo reaction rate probability density functions, together with their lognormal approximations, are displayed graphically for selected temperatures in order to provide a visual impression. Our new reaction rates are appropriate for bare nuclei in the laboratory. The nuclear physics input used to derive our reaction rates is presented in the subsequent paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
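    The summary described above (0.16/0.50/0.84 quantiles plus a lognormal approximation with parameters μ and σ taken from the logarithm of the rate) can be sketched as follows. The Monte Carlo sample is synthetic stand-in data, lognormal by construction, not an actual reaction rate.

```python
import math
import random
import statistics

random.seed(1)
# Synthetic stand-in for a Monte Carlo reaction-rate sample (lognormal here
# by construction, so the lognormal approximation is exact in expectation).
rates = [math.exp(random.gauss(-2.0, 0.5)) for _ in range(20000)]

def quantile(xs, q):
    """Empirical quantile by sorting (nearest-rank, adequate for a sketch)."""
    s = sorted(xs)
    return s[int(q * (len(s) - 1))]

# Low, median and high rates at the 0.16, 0.50 and 0.84 quantiles.
low, med, high = (quantile(rates, q) for q in (0.16, 0.50, 0.84))

# Lognormal parameters: mu and sigma are the mean and std of ln(rate);
# the lognormal median is exp(mu).
logs = [math.log(r) for r in rates]
mu, sigma = statistics.fmean(logs), statistics.stdev(logs)
print(low < med < high, abs(math.exp(mu) - med) < 0.05 * med)
```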

  3. Constituent quarks as clusters in quark-gluon-parton model. [Total cross sections, probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Kanki, T [Osaka Univ., Toyonaka (Japan). Coll. of General Education

    1976-12-01

    We present a quark-gluon-parton model in which quark-partons and gluons make clusters corresponding to two or three constituent quarks (or anti-quarks) in the meson or in the baryon, respectively. We explicitly construct the constituent quark state (cluster), by employing the Kuti-Weisskopf theory and by requiring the scaling. The quark additivity of the hadronic total cross sections and the quark counting rules on the threshold powers of various distributions are satisfied. For small x (Feynman fraction), it is shown that the constituent quarks and quark-partons have quite different probability distributions. We apply our model to hadron-hadron inclusive reactions, and clarify that the fragmentation and the diffractive processes relate to the constituent quark distributions, while the processes in or near the central region are controlled by the quark-partons. Our model gives a reasonable interpretation of the experimental data and much improves the usual ''constituent interchange model'' result near and in the central region (x ≈ x_T ≈ 0).

  4. Concise method for evaluating the probability distribution of the marginal cost of power generation

    International Nuclear Information System (INIS)

    Zhang, S.H.; Li, Y.Z.

    2000-01-01

    In the developing electricity market, many questions on electricity pricing and the risk modelling of forward contracts require the evaluation of the expected value and probability distribution of the short-run marginal cost of power generation at any given time. A concise forecasting method is provided, which is consistent with the definitions of marginal costs and the techniques of probabilistic production costing. The method embodies clear physical concepts, so that it can be easily understood theoretically and computationally realised. A numerical example has been used to test the proposed method. (author)

  5. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    Science.gov (United States)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly for minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model cosmic voids size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.

  6. Measurement of temperature distributions in large pool fires with the use of directional flame thermometers

    International Nuclear Information System (INIS)

    Koski, Jorman A.

    2000-01-01

    Temperatures inside the flame zone of large regulatory pool fires measured during tests of radioactive materials packages vary widely with both time and position. Measurements made with several Directional Flame Thermometers, in which a thermocouple is attached to a thin metal sheet that quickly approaches flame temperatures, have been used to construct fire temperature distributions and cumulative probability distributions. As an aid to computer simulations of these large fires, these distributions are presented. The distributions are constructed by sorting fire temperature data into bins 10 C wide. A typical fire temperature distribution curve has a gradual increase starting at about 600 C, with the number of observations increasing to a peak near 1000 C, followed by an abrupt decrease in frequency, with no temperatures observed above 1200 C
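    The binning procedure described above (sorting temperature observations into 10 C bins and accumulating a cumulative probability distribution) can be sketched as follows; the readings are hypothetical, chosen only to mimic the reported shape with a peak near 1000 C.

```python
from collections import Counter

def temperature_distribution(temps_c, bin_width=10):
    """Sort temperatures into bin_width-degree bins; return relative
    frequencies and the cumulative probability distribution keyed by bin start."""
    n = len(temps_c)
    counts = Counter(int(t // bin_width) * bin_width for t in temps_c)
    freq = {b: c / n for b, c in sorted(counts.items())}
    cdf, total = {}, 0.0
    for b, f in freq.items():
        total += f
        cdf[b] = total
    return freq, cdf

# Hypothetical fire-temperature readings (degrees C).
readings = [612, 895, 990, 1005, 1010, 1095, 1150, 998, 1003, 760]
freq, cdf = temperature_distribution(readings)
print(max(freq, key=freq.get))  # modal 10-degree bin
```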

  7. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  8. The correlation of defect distribution in collisional phase with measured cascade collapse probability

    International Nuclear Information System (INIS)

    Morishita, K.; Ishino, S.; Sekimura, N.

    1995-01-01

    The spatial distributions of atomic displacement at the end of the collisional phase of cascade damage processes were calculated using the computer simulation code MARLOWE, which is based on the binary collision approximation (BCA). The densities of the atomic displacement were evaluated in high dense regions (HDRs) of cascades in several pure metals (Fe, Ni, Cu, Ag, Au, Mo and W). They were compared with the measured cascade collapse probabilities reported in the literature where TEM observations were carried out using thin metal foils irradiated by low-dose ions at room temperature. We found that there exist minimum, or ''critical'', values of the atomic displacement density for an HDR to collapse into TEM-visible vacancy clusters. The critical densities are generally independent of the cascade energy in the same metal. Furthermore, the material dependence of the critical densities can be explained by the difference in the vacancy mobility at the melting temperature of the target materials. This critical density calibration, which is extracted from the ion-irradiation experiments and the BCA simulations, is applied to the estimation of cascade collapse probabilities in metals irradiated by fusion neutrons. (orig.)

  9. Divergent Cumulative Cultural Evolution

    OpenAIRE

    Marriott, Chris; Chebib, Jobran

    2016-01-01

    Divergent cumulative cultural evolution occurs when the cultural evolutionary trajectory diverges from the biological evolutionary trajectory. We consider the conditions under which divergent cumulative cultural evolution can occur. We hypothesize that two conditions are necessary: first, that genetic and cultural information are stored separately in the agent; second, that cultural information is transferred horizontally between agents of different generations. We implement a model with these ...

  10. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  11. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4x10^-4 for the exponential distribution and 2.3x10^-4 for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5x10^-4, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km^3 in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km^3, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6x10^-4. For erupted volumes ≥10 km^3, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
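    Under the exponential model used above, the probability of at least one eruption in the next year follows directly from the mean recurrence interval. The 7,000-year interval below is a hypothetical figure chosen only to land near the 10^-4 order of magnitude the record discusses, not a value from the study.

```python
import math

def annual_eruption_probability(mean_recurrence_years):
    """P(at least one event in the next year) under an exponential
    (Poisson-process) model of eruption inter-event times."""
    rate = 1.0 / mean_recurrence_years
    return 1.0 - math.exp(-rate)

# Hypothetical mean recurrence interval of 7,000 years.
p = annual_eruption_probability(7000.0)
print(f"{p:.2e}")
```

For rare events the result is essentially the rate itself, since 1 - exp(-x) ≈ x for small x.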

  12. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  13. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  14. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
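    A minimal sketch of the plot's ingredients: ordered sample values against standard normal quantiles, with the Pearson correlation as a crude linearity summary. The (i - 0.375)/(n + 0.25) plotting positions and the sample below are conventional assumptions for illustration, not choices taken from the paper.

```python
import math
from statistics import NormalDist, fmean

def probability_plot_points(sample):
    """Pairs (theoretical standard normal quantile, ordered observation),
    using the common (i - 0.375) / (n + 0.25) plotting positions."""
    n = len(sample)
    ys = sorted(sample)
    xs = [NormalDist().inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    return xs, ys

def correlation(xs, ys):
    """Pearson correlation: a crude summary of how straight the plot is."""
    mx, my = fmean(xs), fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den

# A roughly bell-shaped hypothetical sample should plot close to a straight line.
sample = [12.1, 9.8, 10.4, 11.0, 10.9, 9.5, 10.1, 11.5, 10.6, 10.2]
xs, ys = probability_plot_points(sample)
r = correlation(xs, ys)
print(round(r, 3))
```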

  15. The Multivariate Gaussian Probability Distribution

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2005-01-01

    This technical report intends to gather information about the multivariate Gaussian distribution, that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical ...

  16. Probability distribution function values in mobile phones

    Directory of Open Access Journals (Sweden)

    Luis Vicente Chamorro Marcillo

    2013-06-01

    Full Text Available Engineering, in both its academic and applied forms, like any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. The management of those tables, however, presents physical problems (cumbersome transport and consultation) and operational ones (incomplete listings and limited accuracy). The study, “Probability distribution function values in mobile phones”, determined – through a needs survey applied to students involved in statistics studies at Universidad de Nariño – that the best known and most used values correspond to the Chi-Square, Binomial, Student’s t, and Standard Normal distributions. Similarly, it showed users' interest in having the values in question available in an alternative medium that corrects, at least in part, the problems presented by “the famous tables”. As a contribution to the solution, we built software that allows the values of the most commonly used probability distribution functions to be obtained immediately and dynamically on mobile phones.
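    Two of the table lookups the study identifies (standard normal and binomial) can be computed exactly with nothing beyond a standard library, which illustrates why a small program can replace the printed tables; the particular arguments below are arbitrary examples.

```python
import math
from statistics import NormalDist

def binomial_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), summed exactly."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Values that would traditionally be looked up in printed tables.
std_normal = NormalDist().cdf(1.645)   # standard normal table entry
binom = binomial_cdf(3, 10, 0.5)       # binomial table entry
print(round(std_normal, 3), round(binom, 4))
```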

  17. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  18. Advancing Shannon Entropy for Measuring Diversity in Systems

    Directory of Open Access Journals (Sweden)

    R. Rajaram

    2017-01-01

    Full Text Available From economic inequality and species diversity to power laws and the analysis of multiple trends and trajectories, diversity within systems is a major issue for science. Part of the challenge is measuring it. Shannon entropy H has been used to rethink diversity within probability distributions, based on the notion of information. However, there are two major limitations to Shannon’s approach. First, it cannot be used to compare diversity distributions that have different levels of scale. Second, it cannot be used to compare parts of diversity distributions to the whole. To address these limitations, we introduce a renormalization of probability distributions based on the notion of case-based entropy Cc as a function of the cumulative probability c. Given a probability density p(x), Cc measures the diversity of the distribution up to a cumulative probability of c, by computing the length or support of an equivalent uniform distribution that has the same Shannon information as the conditional distribution of p^c(x) up to cumulative probability c. We illustrate the utility of our approach by renormalizing and comparing three well-known energy distributions in physics, namely, the Maxwell-Boltzmann, Bose-Einstein, and Fermi-Dirac distributions for the energy of subatomic particles. The comparison shows that Cc is a vast improvement over H as it provides a scale-free comparison of these diversity distributions and also allows for a comparison between parts of these diversity distributions.
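    A discrete sketch of the case-based entropy Cc follows. The paper works with continuous densities; here, as an assumption for illustration, the distribution is a finite probability vector, entropy is measured in bits, and probabilities are accumulated in the order given.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def case_based_entropy(p, c):
    """Support of the uniform distribution carrying the same Shannon
    information as the conditional distribution of p truncated at
    cumulative probability c (a discrete sketch of Cc)."""
    acc, head = 0.0, []
    for q in p:
        if acc >= c:
            break
        head.append(q)
        acc += q
    cond = [q / acc for q in head]   # renormalized conditional distribution
    return 2 ** shannon_entropy(cond)

# For a uniform distribution over 4 outcomes the full diversity is 4,
# and the first half (c = 0.5) has diversity 2.
full = case_based_entropy([0.25] * 4, 1.0)
half = case_based_entropy([0.25] * 4, 0.5)
print(full, half)
```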

  19. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of the extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC61400-1. The present paper addresses this task by means of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-mega-watt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of extreme value distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  20. Cumulative risk, cumulative outcome: a 20-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Leslie Atkinson

    Full Text Available Cumulative risk (CR) models provide some of the most robust findings in the developmental literature, predicting numerous and varied outcomes. Typically, however, these outcomes are predicted one at a time, across different samples, using concurrent designs, longitudinal designs of short duration, or retrospective designs. We predicted that a single CR index, applied within a single sample, would prospectively predict diverse outcomes, i.e., depression, intelligence, school dropout, arrest, smoking, and physical disease from childhood to adulthood. Further, we predicted that the number of risk factors would predict the number of adverse outcomes (cumulative outcome; CO). We also predicted that early CR (assessed at age 5/6 explains variance in CO above and beyond that explained by subsequent risk (assessed at ages 12/13 and 19/20. The sample consisted of 284 individuals, 48% of whom were diagnosed with a speech/language disorder. Cumulative risk, assessed at 5/6-, 12/13-, and 19/20-years-old, predicted the aforementioned outcomes at age 25/26 in every instance. Furthermore, the number of risk factors was positively associated with the number of negative outcomes. Finally, early risk accounted for variance beyond that explained by later risk in the prediction of CO. We discuss these findings in terms of five criteria posed by these data, positing a "mediated net of adversity" model, suggesting that CR may increase some central integrative factor, simultaneously augmenting risk across cognitive, quality of life, psychiatric and physical health outcomes.

  1. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
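    For orientation, the classical Poisson (independent out-crossings) approximation that such integral-equation methods improve upon can be written down in a few lines, using Rice's mean up-crossing rate for a stationary zero-mean Gaussian process; all parameter values below are hypothetical.

```python
import math

def first_passage_probability(b, sigma_x, sigma_xdot, duration):
    """Poisson (independent out-crossings) approximation of the first-passage
    failure probability of a stationary zero-mean Gaussian process against
    barrier b, using Rice's mean up-crossing rate
    nu = sigma_xdot / (2*pi*sigma_x) * exp(-b^2 / (2*sigma_x^2))."""
    rate = sigma_xdot / (2.0 * math.pi * sigma_x) * math.exp(-b * b / (2.0 * sigma_x ** 2))
    return 1.0 - math.exp(-rate * duration)

# Hypothetical process: unit displacement std, velocity std 2, barrier at 3 sigma.
p = first_passage_probability(b=3.0, sigma_x=1.0, sigma_xdot=2.0, duration=10.0)
print(f"{p:.3e}")
```

The approximation is known to be conservative for narrow-band processes, which is precisely the regime the integral-equation corrections target.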

  2. Secant cumulants and toric geometry

    NARCIS (Netherlands)

    Michalek, M.; Oeding, L.; Zwiernik, P.W.

    2012-01-01

    We study the secant line variety of the Segre product of projective spaces using special cumulant coordinates adapted for secant varieties. We show that the secant variety is covered by open normal toric varieties. We prove that in cumulant coordinates its ideal is generated by binomial quadrics. We

  3. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Science.gov (United States)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most basins in the humid and semi-humid south and east of China have been studied for probability modeling of high flow extremes, whereas for the inland river basins, which occupy about 35% of the country's area, such studies remain scarce, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peak over threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions of POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma distributions, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, the stability of model parameters, the return level plot and the inherent independence assumption of POT series, an optimum threshold of 340 m3/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become increasingly uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to 2008.
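    A method-of-moments GPD fit, a simpler alternative to the maximum likelihood estimation used in the study, can be sketched as follows; the exceedance values are hypothetical, not observations from Yingluoxia station.

```python
import statistics

def fit_gpd_moments(exceedances):
    """Method-of-moments estimates of the Generalized Pareto shape (xi) and
    scale (sigma) from threshold exceedances, via the GPD's mean and variance:
    mean = sigma/(1-xi),  var = sigma^2 / ((1-xi)^2 (1-2 xi))."""
    m = statistics.fmean(exceedances)
    v = statistics.variance(exceedances)
    shape = 0.5 * (1.0 - m * m / v)
    scale = 0.5 * m * (m * m / v + 1.0)
    return shape, scale

# Hypothetical flow exceedances (m3/s) over a threshold such as 340 m3/s.
exc = [12.0, 30.0, 5.0, 80.0, 22.0, 55.0, 9.0, 140.0, 18.0, 40.0]
shape, scale = fit_gpd_moments(exc)
print(round(shape, 3), round(scale, 1))
```

Inverting the moment equations reproduces the sample mean and variance exactly, which is a quick self-check on the fit.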

  4. Approximation of ruin probabilities via Erlangized scale mixtures

    DEFF Research Database (Denmark)

    Peralta, Oscar; Rojas-Nandayapa, Leonardo; Xie, Wangyue

    2018-01-01

    In this paper, we extend an existing scheme for numerically calculating the probability of ruin of a classical Cramér–Lundberg reserve process having absolutely continuous but otherwise general claim size distributions. We employ a dense class of distributions that we denominate Erlangized scale mixtures ... a simple methodology for constructing a sequence of distributions having the form Π⋆G with the purpose of approximating the integrated tail distribution of the claim sizes. Then we adapt a recent result which delivers an explicit expression for the probability of ruin in the case that the claim size distribution is modeled as an Erlangized scale mixture. We provide simplified expressions for the approximation of the probability of ruin and construct explicit bounds for the error of approximation. We complement our results with a classical example where the claim sizes are heavy-tailed.
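    For comparison with the approximation scheme above, the one case with a simple closed form is exponential claim sizes, where the classical Cramér-Lundberg ruin probability is ψ(u) = ρ·exp(-(1-ρ)u/μ) with ρ = λμ/c; this textbook result is a reference point, not the paper's method.

```python
import math

def ruin_probability_exponential(u, lam, mean_claim, premium_rate):
    """Classical Cramer-Lundberg ruin probability for exponential claim sizes:
    psi(u) = rho * exp(-(1 - rho) * u / mean_claim), rho = lam*mean_claim/c."""
    rho = lam * mean_claim / premium_rate
    if rho >= 1:
        return 1.0   # net profit condition violated: ruin is certain
    return rho * math.exp(-(1 - rho) * u / mean_claim)

# Hypothetical parameters: claim rate 1, mean claim 1, premium rate 1.25.
p0 = ruin_probability_exponential(0.0, lam=1.0, mean_claim=1.0, premium_rate=1.25)
p10 = ruin_probability_exponential(10.0, lam=1.0, mean_claim=1.0, premium_rate=1.25)
print(p0, p10)
```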

  5. The probability distribution of side-chain conformations in [Leu] and [Met]enkephalin determines the potency and selectivity to mu and delta opiate receptors

    DEFF Research Database (Denmark)

    Nielsen, Bjørn Gilbert; Jensen, Morten Østergaard; Bohr, Henrik

    2003-01-01

    The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse.

  6. The probability representation as a new formulation of quantum mechanics

    International Nuclear Information System (INIS)

    Man'ko, Margarita A; Man'ko, Vladimir I

    2012-01-01

    We present a new formulation of conventional quantum mechanics, in which the notion of a quantum state is identified via a fair probability distribution of the position measured in a reference frame of the phase space with rotated axes. In this formulation, the quantum evolution equation as well as the equation for finding energy levels are expressed as linear equations for the probability distributions that determine the quantum states. We also give the integral transforms relating the probability distribution (called the tomographic-probability distribution or the state tomogram) to the density matrix and the Wigner function and discuss their connection with the Radon transform. Qudit states are considered and the invertible map of the state density operators onto the probability vectors is discussed. The tomographic entropies and entropic uncertainty relations are reviewed. We demonstrate the uncertainty relations for the position and momentum and the entropic uncertainty relations in the tomographic-probability representation, which is suitable for an experimental check of the uncertainty relations.
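For reference, the integral transform mentioned here, relating the tomographic-probability distribution (tomogram) to the Wigner function, is usually written as a Radon-type integral (normalization conventions vary across the literature):

```latex
w(X \mid \theta) = \int W(q,p)\,\delta\!\left(X - q\cos\theta - p\sin\theta\right)\,\frac{dq\,dp}{2\pi},
```

where $X$ is the position measured in the phase-space reference frame rotated by angle $\theta$; inverting this transform recovers the Wigner function from the measured tomograms.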

  7. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2017-01-01

    significantly due to risk aversion. We characterize an approach for eliciting the entire subjective belief distribution that is minimally biased due to risk aversion. We offer simulated examples to demonstrate the intuition of our approach. We also provide theory to formally characterize our framework. And we...... provide experimental evidence which corroborates our theoretical results. We conclude that for empirically plausible levels of risk aversion, one can reliably elicit most important features of the latent subjective belief distribution without undertaking calibration for risk attitudes providing one...

  8. Using the cumulative sum algorithm against distributed denial of service attacks in Internet of Things

    CSIR Research Space (South Africa)

    Machaka, Pheeha

    2015-11-01

    Full Text Available The paper presents the threats that are present in Internet of Things (IoT) systems and how they can be used to perpetrate a large scale DDoS attack. The paper investigates how the Cumulative Sum (CUSUM) algorithm can be used to detect a DDoS attack...
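The CUSUM test referred to here is, in its generic one-sided form, only a few lines of code: accumulate deviations of a traffic statistic above its expected level and raise an alarm when the cumulative sum crosses a threshold. The feature, threshold, and drift values below are hypothetical, not those tuned in the paper.

```python
def cusum_detect(samples, mean, threshold, drift=0.0):
    """One-sided CUSUM change detector: accumulates positive deviations of
    each observation from the expected mean (minus a drift allowance) and
    returns the first index where the cumulative sum crosses `threshold`,
    or None if it never does."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - mean - drift))
        if s > threshold:
            return i          # alarm: sustained upward shift in traffic
    return None               # no alarm

# Baseline packet rate ~100 requests/s; a sustained surge trips the alarm.
traffic = [100, 102, 98, 101, 99, 150, 155, 160, 158, 162]
alarm_at = cusum_detect(traffic, mean=100, threshold=120, drift=5)  # index 7
```

The drift term makes the detector insensitive to small fluctuations around the baseline, so only a sustained shift, such as a flood of attack traffic, accumulates toward the threshold.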

  9. Influence of nucleon density distribution in nucleon emission probability

    International Nuclear Information System (INIS)

    Paul, Sabyasachi; Nandy, Maitreyee; Mohanty, A.K.; Sarkar, P.K.; Gambhir, Y.K.

    2014-01-01

    Different decay modes are observed in heavy ion reactions at low to intermediate energies. It is interesting to study total neutron emission in these reactions, which may be contributed by all or many of these decay modes. In an attempt to understand the importance of the mean field and the entrance channel angular momentum, we study their influence on the emission probability of nucleons in heavy ion reactions in this work. This study owes its significance to the fact that once the populations of different states are determined, the emission probability governs the double differential neutron yield.

  10. Cumulative Culture and Future Thinking: Is Mental Time Travel a Prerequisite to Cumulative Cultural Evolution?

    Science.gov (United States)

    Vale, G. L.; Flynn, E. G.; Kendal, R. L.

    2012-01-01

    Cumulative culture denotes the, arguably, human capacity to build on the cultural behaviors of one's predecessors, allowing increases in cultural complexity to occur such that many of our cultural artifacts, products and technologies have progressed beyond what a single individual could invent alone. This process of cumulative cultural evolution…

  11. Cumulative life events, traumatic experiences, and psychiatric symptomatology in transition-aged youth with autism spectrum disorder

    OpenAIRE

    Taylor, Julie Lounds; Gotham, Katherine O.

    2016-01-01

    Background Co-occurring mood and anxiety symptomatology is commonly observed among youth with autism spectrum disorders (ASD) during adolescence and adulthood. Yet, little is known about the factors that might predispose youth with ASD to mood and anxiety problems. In this study, we focus on the role of cumulative stressful life events and trauma in co-occurring psychopathology among youth with ASD who are preparing to exit high school. Specifically, we examined the distribution of cumulative...

  12. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  13. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem.

  14. Cumulative prospect theory and mean variance analysis. A rigorous comparison

    OpenAIRE

    Hens, Thorsten; Mayer, Janos

    2012-01-01

    We compare asset allocations derived for cumulative prospect theory (CPT) based on two different methods: maximizing CPT along the mean–variance efficient frontier and maximizing it without that restriction. We find that with normally distributed returns the difference is negligible. However, using standard asset allocation data of pension funds the difference is considerable. Moreover, with derivatives like call options the restriction to the mean–variance efficient frontier results in a sizable...

  15. Cumulative effective and individual organ dose levels in paediatric patients undergoing multiple catheterizations for congenital heart disease

    International Nuclear Information System (INIS)

    Jones, T.P.; Brennan, P.C.; Ryan, E.

    2017-01-01

    This study examines the cumulative radiation dose levels received by a group of children who underwent multiple cardiac catheterisation procedures during the investigation and management of congenital heart disease (CHD). The purpose is to calculate cumulative doses, identify higher dose individuals, outline the inconsistencies with risk assessment and encourage the establishment of dose databases in order to facilitate the longitudinal research necessary to better understand health risks. A retrospective review of patient records for 117 paediatric patients who have undergone two or more cardiac catheterizations for the investigation of CHD was undertaken. This cohort consisted of patients who were catheterised over a period from September 2002 to August 2014. The age distribution was from newborn to 17 y. Archived kerma-area product (PKA) and fluoroscopy time (T) readings were retrieved and analysed. Cumulative effective and individual organ doses were determined. The cumulative PKA levels ranged from 1.8 to 651.2 Gy cm², whilst cumulative effective dose levels varied from 2 to 259 mSv. The cumulative fluoroscopy time was shown to vary from 8.1 to 193.5 min. Median cumulative organ doses ranged from 3 to 94 mGy. Cumulative effective dose levels are highly variable but may exceed 250 mSv. Individual organ and effective dose measurements remain useful for comparison purposes between institutions although current methodologies used for determining lifetime risks are inadequate. (authors)

  16. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  17. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains lower than the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived so that the probability of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
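The core calculation, the chance that a Gaussian-distributed relative displacement stays within a vibration criterion, reduces to an error-function evaluation. The sketch below uses hypothetical numbers, not the system parameters from the article.

```python
import math

def prob_within_criterion(sigma, limit, mean=0.0):
    """P(|displacement| <= limit) for a Gaussian response with standard
    deviation `sigma` and mean `mean`, computed via the error function."""
    a = (limit - mean) / (sigma * math.sqrt(2.0))
    b = (-limit - mean) / (sigma * math.sqrt(2.0))
    return 0.5 * (math.erf(a) - math.erf(b))

# If the RMS relative displacement is 1 um and the criterion is 3 um
# (hypothetical values), the chance of exceeding the criterion is:
p_exceed = 1.0 - prob_within_criterion(sigma=1.0, limit=3.0)  # about 0.0027
```

In the article's setting, `sigma` would itself depend on the damping and natural frequency of the isolator, which is what turns this one-line probability into an optimization over the system parameters.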

  18. The EPA's human exposure research program for assessing cumulative risk in communities.

    Science.gov (United States)

    Zartarian, Valerie G; Schultz, Bradley D

    2010-06-01

    Communities are faced with challenges in identifying and prioritizing environmental issues, taking actions to reduce their exposures, and determining their effectiveness for reducing human health risks. Additional challenges include determining what scientific tools are available and most relevant, and understanding how to use those tools; given these barriers, community groups tend to rely more on risk perception than science. The U.S. Environmental Protection Agency's Office of Research and Development, National Exposure Research Laboratory (NERL) and collaborators are developing and applying tools (models, data, methods) for enhancing cumulative risk assessments. The NERL's "Cumulative Communities Research Program" focuses on key science questions: (1) How to systematically identify and prioritize key chemical stressors within a given community?; (2) How to develop estimates of exposure to multiple stressors for individuals in epidemiologic studies?; and (3) What tools can be used to assess community-level distributions of exposures for the development and evaluation of the effectiveness of risk reduction strategies? This paper provides community partners and scientific researchers with an understanding of the NERL research program and other efforts to address cumulative community risks; and key research needs and opportunities. Some initial findings include the following: (1) Many useful tools exist for components of risk assessment, but need to be developed collaboratively with end users and made more comprehensive and user-friendly for practical application; (2) Tools for quantifying cumulative risks and impact of community risk reduction activities are also needed; (3) More data are needed to assess community- and individual-level exposures, and to link exposure-related information with health effects; and (4) Additional research is needed to incorporate risk-modifying factors ("non-chemical stressors") into cumulative risk assessments. The products of this...

  19. Probability Distributions for Cyclone Key Parameters and Cyclonic Wind Speed for the East Coast of Indian Region

    Directory of Open Access Journals (Sweden)

    Pradeep K. Goyal

    2011-09-01

    Full Text Available This paper presents a study conducted on the probabilistic distribution of key cyclone parameters and the cyclonic wind speed by analyzing the cyclone track records obtained from the India Meteorological Department for the east coast region of India. The dataset of historical landfalling storm tracks in India from 1975–2007 with latitude/longitude and landfall locations is used to map the cyclone tracks in a region of study. Statistical tests were performed to find a best fit distribution to the track data for each cyclone parameter. These parameters include the central pressure difference, the radius of maximum wind speed, the translation velocity and the track angle with the site, and are used to generate digitally simulated cyclones using wind field simulation techniques. For this, different sets of values for all the cyclone key parameters are generated randomly from their probability distributions. Using these simulated values of the cyclone key parameters, the distribution of wind velocity at a particular site is obtained. The same distribution of wind velocity at the site is also obtained from actual track records and using the distributions of the cyclone key parameters as published in the literature. The simulated distribution is compared with the wind speed distributions obtained from actual track records. The findings are useful in cyclone disaster mitigation.
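The overall procedure, drawing each cyclone key parameter from its fitted distribution and mapping the draws to a site wind speed, can be sketched as below. Everything here is a hypothetical placeholder: the distribution families, their parameters, and especially the toy wind relation stand in for the fitted models and the wind-field simulation technique used in the study.

```python
import random
import statistics

def simulate_wind_speeds(n=5000, seed=0):
    """Monte Carlo sketch: sample hypothetical key-parameter distributions
    and map each draw to a site wind speed via a toy relation."""
    rng = random.Random(seed)
    speeds = []
    for _ in range(n):
        dp   = rng.lognormvariate(3.0, 0.5)   # central pressure difference (hPa)
        rmax = rng.lognormvariate(3.5, 0.4)   # radius of maximum winds (km)
        vt   = rng.gammavariate(2.0, 2.5)     # translation velocity (m/s)
        # Toy wind relation: gradient-wind-like dependence on dp, a weak
        # radius correction, and a translation boost (illustrative only).
        v = 7.0 * dp ** 0.5 * (30.0 / rmax) ** 0.1 + 0.5 * vt
        speeds.append(v)
    return sorted(speeds)

speeds = simulate_wind_speeds()
median_v = statistics.median(speeds)   # the empirical wind-speed distribution
```

The sorted sample is an empirical cumulative distribution of wind speed at the site; comparing it against the distribution built from actual track records is the validation step the abstract describes.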

  20. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  1. Wigner function and the probability representation of quantum states

    Directory of Open Access Journals (Sweden)

    Man’ko Margarita A.

    2014-01-01

    Full Text Available The relation of the Wigner function with the fair probability distribution called tomographic distribution or quantum tomogram associated with the quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner–Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed having a standard form of averaging in the probability theory. New uncertainty relations for the position and momentum are written in terms of optical tomograms suitable for direct experimental check. Some recent experiments on checking the uncertainty relations including the entropic uncertainty relations are discussed.

  2. 40 CFR 194.34 - Results of performance assessments.

    Science.gov (United States)

    2010-07-01

    ... the probability of exceeding various levels of cumulative release caused by all significant processes and events. (b) Probability distributions for uncertain disposal system parameter values used in... techniques, which draw random samples from across the entire range of the probability distributions developed...

  3. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

    International Nuclear Information System (INIS)

    Juang, K.-W.; Lee, D.-Y.; Teng, Y.-L.

    2005-01-01

    Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging
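The cumulative distribution function of order statistics (CDFOS) on which the adaptive sampling is built has a closed binomial form; a minimal sketch follows (the paper's sampling rule around the contamination threshold is not reproduced here).

```python
from math import comb

def order_statistic_cdf(k, n, F_x):
    """CDF of the k-th order statistic of n i.i.d. samples, evaluated at a
    point where the parent CDF equals F_x:
        P(X_(k) <= x) = sum_{j=k..n} C(n, j) * F_x**j * (1 - F_x)**(n - j).
    This is the textbook CDFOS formula."""
    return sum(comb(n, j) * F_x**j * (1.0 - F_x)**(n - j)
               for j in range(k, n + 1))

# Median of 5 uniform samples: P(X_(3) <= 0.5) = 0.5 exactly, by symmetry.
p = order_statistic_cdf(k=3, n=5, F_x=0.5)  # = 0.5
```

Intuitively, evaluating this CDF near the regulatory threshold quantifies how likely a sample rank is to fall on the "contaminated" side, which is what lets the adaptive scheme place additional samples where the kriged estimate is most likely to be misclassified.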

  4. A probability distribution model of tooth pits for evaluating time-varying mesh stiffness of pitting gears

    Science.gov (United States)

    Lei, Yaguo; Liu, Zongyao; Wang, Delong; Yang, Xiao; Liu, Huan; Lin, Jing

    2018-06-01

    Tooth damage often causes a reduction in gear mesh stiffness. Thus time-varying mesh stiffness (TVMS) can be treated as an indication of gear health conditions. This study is devoted to investigating the mesh stiffness variations of a pair of external spur gears with tooth pitting, and proposes a new model for describing tooth pitting based on probability distribution. In the model, considering the appearance and development process of tooth pitting, we model the pitting on the surface of spur gear teeth as a series of pits with a uniform distribution in the direction of tooth width and a normal distribution in the direction of tooth height, respectively. In addition, four pitting degrees, from no pitting to severe pitting, are modeled. Finally, influences of tooth pitting on TVMS are analyzed in detail and the proposed model is validated by comparing with a finite element model. The comparison results show that the proposed model is effective for the TVMS evaluations of pitting gears.
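The pit-placement part of such a model, uniform across the tooth width and normal across the tooth height, can be sketched in a few lines. Pit sizes, the four pitting degrees, and the mapping from pit coverage to mesh-stiffness reduction are omitted; all parameter values here are hypothetical.

```python
import random

def generate_pits(n_pits, tooth_width, tooth_height,
                  mean_h_frac=0.5, sd_h_frac=0.15, seed=0):
    """Sample pit centre locations on a tooth flank: uniform in the width
    direction, normal in the height direction (clamped to stay on the flank).
    Returns a list of (width, height) coordinates."""
    rng = random.Random(seed)
    pits = []
    for _ in range(n_pits):
        w = rng.uniform(0.0, tooth_width)                        # uniform in width
        h = rng.gauss(mean_h_frac * tooth_height,
                      sd_h_frac * tooth_height)                  # normal in height
        h = min(max(h, 0.0), tooth_height)                       # clamp to flank
        pits.append((w, h))
    return pits

# 200 pits on a hypothetical 20 mm wide, 8 mm high flank, concentrated
# around mid-height as the normal distribution dictates.
pits = generate_pits(n_pits=200, tooth_width=20.0, tooth_height=8.0)
```

A TVMS evaluation would then reduce the effective contact area (and hence the local stiffness) wherever the mesh line passes through a pitted region.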

  5. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
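The coin-tossing point, that the probability to assign is the mean of the distribution of the "definitive number" (the coin's bias), and that seeing a head must therefore increase it, can be checked with a conjugate Beta prior. The Beta choice is an illustrative convenience, not taken from the paper.

```python
from fractions import Fraction

def posterior_mean_heads(alpha, beta, heads_seen, tails_seen):
    """Mean of the coin's bias under a Beta(alpha, beta) prior after the
    observed counts. Per the abstract, this mean (not the median) is the
    probability to assign to heads on the next toss."""
    a = Fraction(alpha + heads_seen)
    b = Fraction(beta + tails_seen)
    return a / (a + b)

p_before = posterior_mean_heads(1, 1, 0, 0)   # uniform uncertainty: 1/2
p_after  = posterior_mean_heads(1, 1, 1, 0)   # after one head: 2/3 > 1/2
```

Only in the degenerate case of a point-mass prior (absolute certainty the coin is fair) does observing a head leave the assigned probability unchanged, which is exactly the abstract's claim.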

  6. Probability distribution functions for ELM bursts in a series of JET tokamak discharges

    International Nuclear Information System (INIS)

    Greenhough, J; Chapman, S C; Dendy, R O; Ward, D J

    2003-01-01

    A novel statistical treatment of the full raw edge localized mode (ELM) signal from a series of previously studied JET plasmas is tested. The approach involves constructing probability distribution functions (PDFs) for ELM amplitudes and time separations, and quantifying the fit between the measured PDFs and model distributions (Gaussian, inverse exponential) and Poisson processes. Uncertainties inherent in the discreteness of the raw signal require the application of statistically rigorous techniques to distinguish ELM data points from background, and to extrapolate peak amplitudes. The accuracy of PDF construction is further constrained by the relatively small number of ELM bursts (several hundred) in each sample. In consequence the statistical technique is found to be difficult to apply to low frequency (typically Type I) ELMs, so the focus is narrowed to four JET plasmas with high frequency (typically Type III) ELMs. The results suggest that there may be several fundamentally different kinds of Type III ELMing process at work. It is concluded that this novel statistical treatment can be made to work, may have wider applications to ELM data, and has immediate practical value as an additional quantitative discriminant between classes of ELMing behaviour

  7. Cumulative effects assessment: Does scale matter?

    International Nuclear Information System (INIS)

    Therivel, Riki; Ross, Bill

    2007-01-01

    Cumulative effects assessment (CEA) is (or should be) an integral part of environmental assessment at both the project and the more strategic level. CEA helps to link the different scales of environmental assessment in that it focuses on how a given receptor is affected by the totality of plans, projects and activities, rather than on the effects of a particular plan or project. This article reviews how CEAs consider, and could consider, scale issues: spatial extent, level of detail, and temporal issues. It is based on an analysis of Canadian project-level CEAs and UK strategic-level CEAs. Based on a review of literature and, especially, case studies with which the authors are familiar, it concludes that scale issues are poorly considered at both levels, with particular problems being unclear or non-existing cumulative effects scoping methodologies; poor consideration of past or likely future human activities beyond the plan or project in question; attempts to apportion 'blame' for cumulative effects; and, at the plan level, limited management of cumulative effects caused particularly by the absence of consent regimes. Scale issues are important in most of these problems. However both strategic-level and project-level CEA have much potential for managing cumulative effects through better siting and phasing of development, demand reduction and other behavioural changes, and particularly through setting development consent rules for projects. The lack of strategic resource-based thresholds constrains the robust management of strategic-level cumulative effects

  8. Cumulative radiation effect

    International Nuclear Information System (INIS)

    Kirk, J.; Cain, O.; Gray, W.M.

    1977-01-01

    Cumulative Radiation Effect (CRE) represents a scale of accumulative sub-tolerance radiation damage, with a unique value of the CRE describing a specific level of radiation effect. Computer calculations have been used to simplify the evaluation of problems associated with the applications of the CRE-system in radiotherapy. In a general appraisal of the applications of computers to the CRE-system, the various problems encountered in clinical radiotherapy have been categorised into those involving the evaluation of a CRE at a point in tissue and those involving the calculation of CRE distributions. As a general guide, the computer techniques adopted at the Glasgow Institute of Radiotherapeutics for the solution of CRE problems are presented, and consist basically of a package of three interactive programs for point CRE calculations and a Fortran program which calculates CRE distributions for iso-effect treatment planning. Many examples are given to demonstrate the applications of these programs, and special emphasis has been laid on the problem of treating a point in tissue with different doses per fraction on alternate treatment days. The wide range of possible clinical applications of the CRE-system has been outlined and described under the categories of routine clinical applications, retrospective and prospective surveys of patient treatment, and experimental and theoretical research. Some of these applications such as the results of surveys and studies of time optimisation of treatment schedules could have far-reaching consequences and lead to significant improvements in treatment and cure rates with the minimum damage to normal tissue. (author)

  9. Calculating the Prior Probability Distribution for a Causal Network Using Maximum Entropy: Alternative Approaches

    Directory of Open Access Journals (Sweden)

    Michael J. Markham

    2011-07-01

    Full Text Available Some problems occurring in Expert Systems can be resolved by employing a causal (Bayesian) network, and methodologies exist for this purpose. These require data in a specific form and make assumptions about the independence relationships involved. Methodologies using Maximum Entropy (ME) are free from these conditions and have the potential to be used in a wider context including systems consisting of given sets of linear and independence constraints, subject to consistency and convergence. ME can also be used to validate results from the causal network methodologies. Three ME methods for determining the prior probability distribution of causal network systems are considered. The first method is Sequential Maximum Entropy, in which the computation of a progression of local distributions leads to the overall distribution. This is followed by development of the Method of Tribus. The development takes the form of an algorithm that includes the handling of explicit independence constraints. These fall into two groups: those relating parents of vertices, and those deduced from triangulation of the remaining graph. The third method involves a variation in the part of that algorithm which handles independence constraints. Evidence is presented that this adaptation only requires the linear constraints and the parental independence constraints to emulate the second method in a substantial class of examples.
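To illustrate the kind of linear-constraint ME problem these methods operate on (though not the causal-network algorithms themselves), consider the classic task of finding the maximum-entropy distribution over die faces 1..6 with a prescribed mean. The solution is exponential in the face value, p_i ∝ exp(λ·i), with the multiplier λ fixed by the constraint; a bisection sketch:

```python
import math

def max_entropy_die(target_mean, lo=-10.0, hi=10.0, tol=1e-12):
    """Maximum-entropy distribution over faces 1..6 subject to a mean
    constraint: p_i proportional to exp(lam * i), with lam found by
    bisection (the constraint mean is increasing in lam)."""
    def mean_for(lam):
        w = [math.exp(lam * i) for i in range(1, 7)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, 7), w)) / z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * i) for i in range(1, 7)]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' Brandeis dice problem: mean 4.5 yields an increasing distribution
# starting near p_1 = 0.054.
p = max_entropy_die(4.5)
```

Adding independence constraints of the kind discussed in the abstract turns this one-dimensional search into the larger constrained-optimization problems that the three methods address.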

  10. Cumulative cultural learning: Development and diversity

    Science.gov (United States)

    2017-01-01

    The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children’s learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission—the cornerstone of human cultural diversity. PMID:28739945

  11. Cumulative cultural learning: Development and diversity.

    Science.gov (United States)

    Legare, Cristine H

    2017-07-24

    The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children's learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission-the cornerstone of human cultural diversity.

  12. A full-angle Monte-Carlo scattering technique including cumulative and single-event Rutherford scattering in plasmas

    Science.gov (United States)

    Higginson, Drew P.

    2017-11-01

    We describe and justify a full-angle scattering (FAS) method to faithfully reproduce the accumulated differential angular Rutherford scattering probability distribution function (pdf) of particles in a plasma. The FAS method splits the scattering events into two regions. At small angles it is described by cumulative scattering events resulting, via the central limit theorem, in a Gaussian-like pdf; at larger angles it is described by single-event scatters and retains a pdf that follows the form of the Rutherford differential cross-section. The FAS method is verified using discrete Monte-Carlo scattering simulations run at small timesteps to include each individual scattering event. We identify the FAS regime of interest as where the ratio of temporal/spatial scale-of-interest to slowing-down time/length is from 10^-3 to 0.3-0.7; the upper limit corresponds to a Coulomb logarithm of 20-2, respectively. Two test problems, high-velocity interpenetrating plasma flows and keV-temperature ion equilibration, are used to highlight systems where including FAS is important to capture relevant physics.
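    The two-region split described in the abstract can be illustrated with a short sampling sketch. This is illustrative only, not the authors' implementation: the Gaussian width, tail probability, and cutoff angles are arbitrary placeholders.

    ```python
    import math
    import random

    def sample_tail_angle(theta_min, theta_max, u=None):
        """Inverse-CDF sample from p(theta) proportional to theta**-3 on
        [theta_min, theta_max], the small-angle limit of the Rutherford
        differential cross-section (dsigma/dOmega ~ theta**-4 times the
        solid-angle factor theta)."""
        if u is None:
            u = random.random()
        inv2_min = theta_min ** -2
        inv2_max = theta_max ** -2
        return (inv2_min - u * (inv2_min - inv2_max)) ** -0.5

    def scatter_step(theta_rms_dt, tail_prob, theta_min, theta_max, rng=random):
        """One timestep of a two-region scatter: with probability tail_prob draw
        a single large-angle Rutherford event; otherwise draw the cumulative
        small-angle kick from the Gaussian that the central limit theorem
        predicts for many unresolved collisions."""
        if rng.random() < tail_prob:
            return sample_tail_angle(theta_min, theta_max)
        return rng.gauss(0.0, theta_rms_dt)
    ```

    Passing `u=0` or `u=1` to the tail sampler returns the cutoff angles exactly, which is a quick sanity check on the inverse CDF.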

  13. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving......

  14. Assignment of probability distributions for parameters in the 1996 performance assessment for the Waste Isolation Pilot Plant. Part 1: description of process

    International Nuclear Information System (INIS)

    Rechard, Rob P.; Tierney, Martin S.

    2005-01-01

    A managed process was used to consistently and traceably develop probability distributions for parameters representing epistemic uncertainty in four preliminary and the final 1996 performance assessments (PA) for the Waste Isolation Pilot Plant (WIPP). The key to the success of the process was the use of a three-member team consisting of a Parameter Task Leader, PA Analyst, and Subject Matter Expert. This team, in turn, relied upon a series of guidelines for selecting distribution types. The primary function of the guidelines was not to constrain the actual process of developing a parameter distribution but rather to establish a series of well-defined steps where recognized methods would be consistently applied to all parameters. An important guideline was to use a small set of distributions satisfying the maximum entropy formalism. Another important guideline was the consistent use of the log transform for parameters with large ranges (i.e. maximum/minimum > 10^3). A parameter development team assigned 67 probability density functions (PDFs) in the 1989 PA and 236 PDFs in the 1996 PA using these and other guidelines described
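    The log-transform guideline can be sketched as a simple assignment rule. This is a schematic reconstruction of the stated guideline, not the WIPP team's actual procedure; the function name and return format are invented for illustration.

    ```python
    import math

    def assign_distribution(p_min, p_max):
        """Guideline sketch: use a log-uniform distribution when a strictly
        positive parameter's range spans more than three orders of magnitude
        (maximum/minimum > 10^3), otherwise a uniform distribution -- the
        maximum-entropy choice given only bounds."""
        if p_min > 0 and p_max / p_min > 1e3:
            # sample uniformly in log10 space, then exponentiate
            return ("loguniform", math.log10(p_min), math.log10(p_max))
        return ("uniform", p_min, p_max)
    ```

    A permeability range of 1e-9 to 1e-3, for example, would be assigned a log-uniform distribution, while a narrow porosity range would stay uniform.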

  15. An exact power series formula of the outage probability with noise and interference over generalized fading channels

    KAUST Repository

    Rached, Nadhir B.

    2016-12-24

    In this paper, we develop a generalized moment-based approach for the evaluation of the outage probability (OP) in the presence of co-channel interference and additive white Gaussian noise. The proposed method allows the evaluation of the OP of the signal-to-interference-plus-noise ratio by a power series expansion in the threshold value. Its main advantage is that it does not require a particular distribution for the interference channels. The only necessary ingredients are a power series expansion for the cumulative distribution function of the desired user power and the cross-moments of the interferers' powers. These requirements are easily met in many practical fading models, for which the OP might not be obtainable in closed form. For the sake of illustration, we consider the application of our method to the Rician fading environment. Under this setting, we carry out a convergence study of the proposed power series and corroborate the validity of our method for different values of fading parameters and various numbers of co-channel interferers.
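    A brute-force Monte Carlo estimate of the OP is the kind of reference such a power series expansion would be validated against. The sketch below uses a Rician desired channel and Rayleigh (exponential-power) interferers; all parameter values are arbitrary placeholders, and this is not the authors' method.

    ```python
    import math
    import random

    def outage_prob_mc(threshold, K=3.0, n_interf=2, noise=0.1,
                       trials=20000, seed=1):
        """Monte Carlo outage probability P(SINR < threshold) for a Rician
        desired channel (K-factor K, unit mean power) and Rayleigh-faded
        interferers whose powers are unit-mean exponentials."""
        rng = random.Random(seed)
        los = math.sqrt(K / (K + 1))          # line-of-sight amplitude
        sig = math.sqrt(1 / (2 * (K + 1)))    # scatter std per real dimension
        outages = 0
        for _ in range(trials):
            re = rng.gauss(los, sig)
            im = rng.gauss(0.0, sig)
            p_des = re * re + im * im         # Rician-faded desired power
            p_int = sum(rng.expovariate(1.0) for _ in range(n_interf))
            if p_des / (noise + p_int) < threshold:
                outages += 1
        return outages / trials
    ```

    With a fixed seed the estimate is deterministic and monotone in the threshold, which makes it convenient for checking a truncated series term by term.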

  16. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  17. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
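    The central claim above, that choice probabilities are obtained from the gradient of the CPGF, is easy to verify numerically for the multinomial logit special case, where the CPGF is the log-sum-exp function and its gradient is the softmax. This is a minimal sketch; the paper treats the general MEV/ARUM setting.

    ```python
    import math

    def logit_cpgf(utilities):
        """CPGF of the multinomial logit model: G(u) = log(sum_i exp(u_i)),
        computed with the max-shift trick for numerical stability."""
        m = max(utilities)
        return m + math.log(sum(math.exp(u - m) for u in utilities))

    def choice_probs(utilities, eps=1e-6):
        """Choice probabilities as the (forward-difference) gradient of the
        CPGF; for logit this reproduces the softmax."""
        g0 = logit_cpgf(utilities)
        probs = []
        for i in range(len(utilities)):
            bumped = list(utilities)
            bumped[i] += eps
            probs.append((logit_cpgf(bumped) - g0) / eps)
        return probs
    ```

    For equal utilities the numerical gradient returns probabilities close to 1/n each, and they sum to one, as the CPGF characterization requires.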

  18. Reduction of CMIP5 models bias using Cumulative Distribution Function transform and impact on crops yields simulations across West Africa.

    Science.gov (United States)

    Moise Famien, Adjoua; Defrance, Dimitri; Sultan, Benjamin; Janicot, Serge; Vrac, Mathieu

    2017-04-01

    Different CMIP exercises show that simulations of current and future temperature and precipitation are complex, with a high degree of uncertainty. For example, the African monsoon system is not correctly simulated and most of the CMIP5 models underestimate the precipitation. Global Climate Models (GCMs) therefore show significant systematic biases that require bias correction before they can be used in impact studies. Several bias-correction methods have been developed over the years, relying on increasingly complex statistical methods. The aim of this work is to show the interest of the CDFt (Cumulative Distribution Function transform (Michelangeli et al., 2009)) method for reducing the bias in data from 29 CMIP5 GCMs over Africa and to assess the impact of bias-corrected data on crop yield prediction by the end of the 21st century. In this work, we apply the CDFt to daily data covering the period from 1950 to 2099 (Historical and RCP8.5) and we correct the climate variables (temperature, precipitation, solar radiation, wind) using the new daily database from the EU project WATer and global CHange (WATCH), available from 1979 to 2013, as reference data. The performance of the method is assessed in several cases. First, data are corrected based on different calibration periods and are compared, on one hand, with observations to estimate the sensitivity of the method to the calibration period and, on the other hand, with another bias-correction method used in the ISIMIP project. We find that, whatever the calibration period used, CDFt corrects the mean state of the variables well and preserves their trends, as well as daily rainfall occurrence and intensity distributions. However, some differences appear when compared to the outputs obtained with the method used in ISIMIP and show that the quality of the correction is strongly related to the reference data. 
Secondly, we validate the bias correction method with the agronomic simulations (SARRA-H model (Kouressy
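    The core of CDF-transform-style bias correction is quantile mapping between model and reference distributions: a value is mapped through the model's CDF and back through the reference's inverse CDF. A minimal empirical sketch follows; CDFt proper additionally transfers the calibration-period transform to the future-period CDF, which this sketch omits.

    ```python
    def quantile_map(model_vals, ref_vals, x):
        """Empirical quantile mapping: return F_ref^{-1}(F_model(x)),
        using the sorted samples as empirical CDFs."""
        m = sorted(model_vals)
        r = sorted(ref_vals)
        # empirical CDF rank of x within the model sample
        q = sum(1 for v in m if v <= x) / len(m)
        # inverse empirical CDF of the reference sample at quantile q
        idx = min(int(q * len(r)), len(r) - 1)
        return r[idx]
    ```

    If the model is simply the reference shifted by a constant bias, the mapping removes that shift (up to discretization of the empirical CDFs).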

  19. A SAS-macro for estimation of the cumulative incidence using Poisson regression

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    2009-01-01

    the hazard rates, and the hazard rates are often estimated by the Cox regression. This procedure may not be suitable for large studies due to limited computer resources. Instead one uses Poisson regression, which approximates the Cox regression. Rosthøj et al. presented a SAS-macro for the estimation...... of the cumulative incidences based on the Cox regression. I present the functional form of the probabilities and variances when using piecewise constant hazard rates and a SAS-macro for the estimation using Poisson regression. The use of the macro is demonstrated through examples and compared to the macro presented...
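    Under piecewise-constant hazards the cumulative incidence has a closed form per interval, which is the quantity a Poisson-regression fit of the rates plugs into. The sketch below is a generic competing-risks computation, not the SAS macro itself; the argument names are invented for illustration.

    ```python
    import math

    def cumulative_incidence(lengths, hazards_cause, hazards_other):
        """Cumulative incidence of one cause under piecewise-constant hazards.
        lengths: interval lengths; hazards_cause/hazards_other: the cause-specific
        and competing hazard rates on each interval."""
        cif, surv = 0.0, 1.0
        for d, h1, h2 in zip(lengths, hazards_cause, hazards_other):
            h = h1 + h2
            if h > 0:
                # integral of S(u) * h1 over the interval, with S piecewise
                # exponential: surv * (h1/h) * (1 - exp(-h*d))
                cif += surv * (h1 / h) * (1.0 - math.exp(-h * d))
            surv *= math.exp(-h * d)
        return cif
    ```

    With no competing hazard this reduces to the familiar 1 - exp(-h*t), and the incidences of the two causes always sum to at most one.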

  20. Pattern recognition in spaces of probability distributions for the analysis of edge-localized modes in tokamak plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Shabbir, Aqsa

    2016-07-07

    In this doctoral work, pattern recognition techniques are developed and applied to data from tokamak plasmas, in order to contribute to a systematic analysis of edge-localized modes (ELMs). We employ probabilistic models for a quantitative data description geared towards an enhanced systematization of ELM phenomenology. Hence, we start from the point of view that the fundamental object resulting from the observation of a system is a probability distribution, with every single measurement providing a sample from this distribution. In exploring the patterns emerging from the various ELM regimes and relations, we need methods that can handle the intrinsic probabilistic nature of the data. The original contributions of this work are twofold. First, several novel pattern recognition methods in non-Euclidean spaces of probability distribution functions (PDFs) are developed and validated. The second main contribution lies in the application of these and other techniques to a systematic analysis of ELMs in tokamak plasmas. In regard to the methodological aims of the work, we employ the framework of information geometry to develop pattern visualization and classification methods in spaces of probability distributions. In information geometry, a family of probability distributions is considered as a Riemannian manifold. Every point on the manifold represents a single PDF and the distribution parameters provide local coordinates on the manifold. The Fisher information plays the role of a Riemannian metric tensor, enabling calculation of geodesic curves on the surface. The length of such curves yields the geodesic distance (GD) on probabilistic manifolds, which is a natural similarity (distance) measure between PDFs. Equipped with a suitable distance measure, we extrapolate several distance-based pattern recognition methods to the manifold setting. This includes k-nearest neighbor (kNN) and conformal predictor (CP) methods for classification, as well as multidimensional
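    For univariate Gaussians the geodesic distance has a closed form (the Gaussian manifold with the Fisher metric is hyperbolic), which makes a distribution-space kNN easy to sketch. This is illustrative only; the thesis treats general manifolds and further methods such as conformal predictors.

    ```python
    import math

    def fisher_rao_gauss(mu1, s1, mu2, s2):
        """Fisher-Rao geodesic distance between the normal PDFs N(mu1, s1^2)
        and N(mu2, s2^2), via the closed form on the hyperbolic half-plane."""
        delta = ((mu1 - mu2) ** 2 + 2.0 * (s1 - s2) ** 2) / (4.0 * s1 * s2)
        return math.sqrt(2.0) * math.acosh(1.0 + delta)

    def knn_classify(train, query, k=3):
        """k-nearest-neighbour vote in distribution space: each training item
        is ((mu, sigma), label); distances are geodesic, not Euclidean."""
        near = sorted(train, key=lambda t: fisher_rao_gauss(*t[0], *query))[:k]
        labels = [lab for _, lab in near]
        return max(set(labels), key=labels.count)
    ```

    The distance is symmetric and vanishes only for identical distributions, the two properties a similarity measure between PDFs needs.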

  1. Pattern recognition in spaces of probability distributions for the analysis of edge-localized modes in tokamak plasmas

    International Nuclear Information System (INIS)

    Shabbir, Aqsa

    2016-01-01

    In this doctoral work, pattern recognition techniques are developed and applied to data from tokamak plasmas, in order to contribute to a systematic analysis of edge-localized modes (ELMs). We employ probabilistic models for a quantitative data description geared towards an enhanced systematization of ELM phenomenology. Hence, we start from the point of view that the fundamental object resulting from the observation of a system is a probability distribution, with every single measurement providing a sample from this distribution. In exploring the patterns emerging from the various ELM regimes and relations, we need methods that can handle the intrinsic probabilistic nature of the data. The original contributions of this work are twofold. First, several novel pattern recognition methods in non-Euclidean spaces of probability distribution functions (PDFs) are developed and validated. The second main contribution lies in the application of these and other techniques to a systematic analysis of ELMs in tokamak plasmas. In regard to the methodological aims of the work, we employ the framework of information geometry to develop pattern visualization and classification methods in spaces of probability distributions. In information geometry, a family of probability distributions is considered as a Riemannian manifold. Every point on the manifold represents a single PDF and the distribution parameters provide local coordinates on the manifold. The Fisher information plays the role of a Riemannian metric tensor, enabling calculation of geodesic curves on the surface. The length of such curves yields the geodesic distance (GD) on probabilistic manifolds, which is a natural similarity (distance) measure between PDFs. Equipped with a suitable distance measure, we extrapolate several distance-based pattern recognition methods to the manifold setting. This includes k-nearest neighbor (kNN) and conformal predictor (CP) methods for classification, as well as multidimensional

  2. Cumulative human impacts on marine predators

    DEFF Research Database (Denmark)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact...

  3. MOCARS: a Monte Carlo code for determining distribution and simulation limits and ranking system components by importance

    International Nuclear Information System (INIS)

    Matthews, S.D.; Poloski, J.P.

    1978-08-01

    MOCARS is a computer program designed for use on the Idaho National Engineering Laboratory (INEL) CDC CYBER 76-173 computer system that uses Monte Carlo techniques to determine the distribution and simulation limits for a function. In use, the MOCARS program randomly samples data from any of 12 different user-specified probability distributions and either evaluates a user-specified function or cut set system unavailability using the sample data. After data ordering, the values at various quantiles and associated confidence bounds are calculated for output. If the cut set unavailability function is evaluated, MOCARS can determine the importance ranking for components in the unavailability calculation. Frequency and cumulative distribution histograms from the sample data are also available for output on microfilm. 39 figures, 4 tables
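    The sample-evaluate-order-read-quantiles loop that the record describes can be sketched in a few lines. This is a schematic reconstruction in Python, not the original CDC CYBER code; the function names and defaults are invented.

    ```python
    import random

    def mc_quantiles(func, sampler, n=5000, probs=(0.05, 0.5, 0.95), seed=42):
        """MOCARS-style sketch: draw parameter samples, evaluate the
        user-specified function, order the results, and read off the
        requested quantiles from the ordered sample."""
        rng = random.Random(seed)
        vals = sorted(func(sampler(rng)) for _ in range(n))
        return {p: vals[min(int(p * n), n - 1)] for p in probs}
    ```

    With an identity function and uniform sampler the returned quantiles recover the probabilities themselves, a useful self-check before pointing the loop at a real unavailability function.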

  4. Investigation of Probability Distributions Using Dice Rolling Simulation

    Science.gov (United States)

    Lukac, Stanislav; Engel, Radovan

    2010-01-01

    Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…

  5. Effect of cumulative strain on texture characteristics during wire drawing of eutectoid steels

    International Nuclear Information System (INIS)

    Yang, F.; Ma, C.; Jiang, J.Q.; Feng, H.P.; Zhai, S.Y.

    2008-01-01

    The texture characteristics associated with plastic deformation of Fe-C steels of near-eutectoid composition during a continuous cold drawing process were thoroughly investigated by orientation distribution function analysis based on X-ray diffraction. The effect of cumulative drawing strains on the fiber texture in drawn hypereutectoid and hypoeutectoid steel wires is discussed

  6. Performance Probability Distributions for Sediment Control Best Management Practices

    Science.gov (United States)

    Ferrell, L.; Beighley, R.; Walsh, K.

    2007-12-01

    Controlling soil erosion and sediment transport can be a significant challenge during the construction process due to the extent and conditions of bare, disturbed soils. Best Management Practices (BMPs) are used as the framework for the design of sediment discharge prevention systems in stormwater pollution prevention plans, which are typically required for construction sites. This research focuses on commonly-used BMP systems for perimeter control of sediment export: silt fences and fiber rolls. Although these systems are widely used, the physical and engineering parameters describing their performance are not well understood. Performance expectations are based on manufacturer results, but due to the dynamic conditions that exist on a construction site, performance expectations are not always achievable in the field. Based on experimental results, product performance is shown to be highly variable. Experiments using the same installation procedures show inconsistent sediment removal performances ranging from greater than 85 percent to zero. The goal of this research is to improve the determination of off-site sediment yield based on probabilistic performance results of perimeter control BMPs. BMPs are evaluated in the Soil Erosion Research Laboratory (SERL) in the Civil and Environmental Engineering department at San Diego State University. SERL experiments are performed on a 3-m by 10-m tilting soil bed with a soil depth of 0.5 meters and a slope of 33 percent. The simulated storm event consists of 17 mm/hr for 20 minutes followed by 51 mm/hr for 30 minutes. The storm event is based on an ASTM design storm intended to simulate BMP failures. BMP performance is assessed based on experiments where BMPs are installed per manufacturer specifications, less than optimal installations, and no treatment conditions. Preliminary results from 30 experiments are presented and used to develop probability distributions for BMP sediment removal efficiencies. 
The results are then combined with

  7. A discussion on the origin of quantum probabilities

    International Nuclear Information System (INIS)

    Holik, Federico; Sáenz, Manuel; Plastino, Angel

    2014-01-01

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases

  8. DBH Prediction Using Allometry Described by Bivariate Copula Distribution

    Science.gov (United States)

    Xu, Q.; Hou, Z.; Li, B.; Greenberg, J. A.

    2017-12-01

    Forest biomass mapping based on single tree detection from airborne laser scanning (ALS) usually depends on an allometric equation that relates diameter at breast height (DBH) to per-tree aboveground biomass. The incapability of ALS technology to directly measure DBH leads to the need to predict DBH from other ALS-measured tree-level structural parameters. A copula-based method is proposed in this study to predict DBH from ALS-measured tree height and crown diameter, using a dataset measured in the Lassen National Forest in California. Instead of exploring an explicit mathematical equation that explains the underlying relationship between DBH and other structural parameters, the copula-based prediction method utilizes the dependency between the cumulative distributions of these variables, and solves for the DBH based on the assumption that, for a single tree, the cumulative probability of each structural parameter is identical. Results show that, compared with the benchmark least-squares linear regression and the k-MSN imputation, the copula-based method obtains better accuracy in the predicted DBH for the Lassen National Forest. To assess the generalization of the proposed method, prediction uncertainty is quantified using bootstrapping techniques that examine the variability of the RMSE of the predicted DBH. We find that the copula distribution is reliable in describing the allometric relationship between tree-level structural parameters, and it contributes to the reduction of prediction uncertainty.
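    The identical-cumulative-probability assumption amounts to quantile matching: predict DBH as the value sitting at the same empirical quantile as the observed height. The study fits a parametric bivariate copula; the sketch below shows only this core idea with empirical CDFs, and its names are invented for illustration.

    ```python
    def predict_dbh(train_height, train_dbh, height):
        """Quantile-matching sketch of the core assumption:
        DBH_hat = F_dbh^{-1}(F_height(height)), with both CDFs empirical."""
        hs = sorted(train_height)
        ds = sorted(train_dbh)
        # empirical CDF of height evaluated at the query tree
        q = sum(1 for h in hs if h <= height) / len(hs)
        # empirical inverse CDF of DBH at that same cumulative probability
        return ds[min(int(q * len(ds)), len(ds) - 1)]
    ```

    On perfectly rank-correlated training data (DBH exactly twice the height, say) the prediction lands on the corresponding quantile of the DBH sample, up to empirical-CDF discretization.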

  9. An analysis of cumulative risks based on biomonitoring data for six phthalates using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single chemical drives the cumulative risk of an individual exposed to multiple chemicals. Phthalates are a class of chemicals with ubiquitous exposures in the general population that have the potential to cause ...

  10. Wave functions and two-electron probability distributions of the Hooke's-law atom and helium

    International Nuclear Information System (INIS)

    O'Neill, Darragh P.; Gill, Peter M. W.

    2003-01-01

    The Hooke's-law atom (hookium) provides an exactly soluble model for a two-electron atom in which the nuclear-electron Coulombic attraction has been replaced by a harmonic one. Starting from the known exact position-space wave function for the ground state of hookium, we present the momentum-space wave function. We also look at the intracules, two-electron probability distributions, for hookium in position, momentum, and phase space. These are compared with the Hartree-Fock results and the Coulomb holes (the difference between the exact and Hartree-Fock intracules) in position, momentum, and phase space are examined. We then compare these results with analogous results for the ground state of helium using a simple, explicitly correlated wave function
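    For reference, the hookium ground state is known exactly for particular force constants; for confinement frequency ω = 1/2 a.u. (one of the exactly soluble cases) the position-space wave function the abstract starts from takes the closed form

    ```latex
    \psi(\mathbf{r}_1,\mathbf{r}_2) \;\propto\;
    \left(1+\tfrac{1}{2}\,r_{12}\right)
    \exp\!\left[-\tfrac{1}{4}\left(r_1^{2}+r_2^{2}\right)\right],
    \qquad r_{12}=\lvert\mathbf{r}_1-\mathbf{r}_2\rvert .
    ```

    The linear factor in the interelectronic distance is the exact analogue of the correlation cusp that approximate helium wave functions must model.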

  11. Cumulative protons in 12C fragmentation at intermediate energy

    International Nuclear Information System (INIS)

    Abramov, B.M.; Alekseev, P.N.; Borodin, Y.A.; Bulychjov, S.A.; Dukhovskoi, I.A.; Khanov, A.I.; Krutenkova, A.P.; Kulikov, V.V.; Martemianov, M.A.; Matsuk, M.A.; Turdakina, E.N.

    2014-01-01

    In the FRAGM experiment at the heavy ion accelerator complex TWAC-ITEP, proton yields at an angle of 3.5 degrees have been measured in the fragmentation of carbon ions at T0 = 0.3, 0.6, 0.95 and 2.0 GeV/nucleon on a beryllium target. The data are presented as invariant proton yields versus the cumulative variable x in the range 0.9 < x < 2.4. Proton spectra cover six orders of magnitude in invariant cross section. They have been analyzed in the framework of a quark cluster fragmentation model. Fragmentation functions of the quark-gluon string model are used. The probabilities of the existence of multi-quark clusters in carbon nuclei are estimated to be 8-12% for six-quark clusters and 0.2-0.6% for nine-quark clusters. (authors)

  12. The cumulative risk of false-positive screening results across screening centres in the Norwegian Breast Cancer Screening Program

    Energy Technology Data Exchange (ETDEWEB)

    Roman, M., E-mail: Marta.Roman@kreftregisteret.no [Cancer Registry of Norway, Oslo (Norway); Department of Women and Children’s Health, Oslo University Hospital, Oslo (Norway); Skaane, P., E-mail: PERSK@ous-hf.no [Department of Radiology, Oslo University Hospital Ullevaal, University of Oslo, Oslo (Norway); Hofvind, S., E-mail: Solveig.Hofvind@kreftregisteret.no [Cancer Registry of Norway, Oslo (Norway); Oslo and Akershus University College of Applied Sciences, Faculty of Health Science, Oslo (Norway)

    2014-09-15

    Highlights: • We found variation in early performance measures across screening centres. • Radiologists’ performance may play a key role in the variability. • Potential to improve the effectiveness of breast cancer screening programs. • Continuous surveillance of screening centres and radiologists is essential. - Abstract: Background: Recall for assessment in mammographic screening entails an inevitable number of false-positive screening results. This study aimed to investigate the variation in the cumulative risk of a false positive screening result and the positive predictive value across the screening centres in the Norwegian Breast Cancer Screening Program. Methods: We studied 618,636 women aged 50–69 years who underwent 2,090,575 screening exams (1996–2010). Recall rate, positive predictive value, rate of screen-detected cancer, and the cumulative risk of a false positive screening result, without and with invasive procedures, across the screening centres were calculated. Generalized linear models were used to estimate the probability of a false positive screening result and to compute the cumulative false-positive risk for up to ten biennial screening examinations. Results: The cumulative risk of a false-positive screening exam varied from 10.7% (95% CI: 9.4–12.0%) to 41.5% (95% CI: 34.1–48.9%) across screening centres, with a highest to lowest ratio of 3.9 (95% CI: 3.7–4.0). The highest to lowest ratio for the cumulative risk of undergoing an invasive procedure with a benign outcome was 4.3 (95% CI: 4.0–4.6). The positive predictive value of recall varied between 12.0% (95% CI: 11.0–12.9%) and 19.9% (95% CI: 18.3–21.5%), with a highest to lowest ratio of 1.7 (95% CI: 1.5–1.9). Conclusions: A substantial variation in the performance measures across the screening centres in the Norwegian Breast Cancer Screening Program was identified, despite similar administration, procedures, and quality assurance requirements. Differences in the
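    Under an independence assumption, the cumulative risk over repeated screens follows directly from the per-round false-positive probabilities. This is a textbook sketch of that arithmetic, not the paper's generalized-linear-model estimation, which supplies the per-round probabilities.

    ```python
    def cumulative_fp_risk(per_screen_probs):
        """Cumulative risk of at least one false positive across successive
        screening rounds, assuming independence between rounds:
        1 - product(1 - p_i)."""
        risk_free = 1.0
        for p in per_screen_probs:
            risk_free *= (1.0 - p)
        return 1.0 - risk_free
    ```

    For example, ten biennial screens at a constant 3% false-positive rate per round give a cumulative risk of 1 - 0.97^10, roughly 26%, which is why modest per-round rates compound into the large cumulative risks the study reports.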

  13. The cumulative risk of false-positive screening results across screening centres in the Norwegian Breast Cancer Screening Program

    International Nuclear Information System (INIS)

    Roman, M.; Skaane, P.; Hofvind, S.

    2014-01-01

    Highlights: • We found variation in early performance measures across screening centres. • Radiologists’ performance may play a key role in the variability. • Potential to improve the effectiveness of breast cancer screening programs. • Continuous surveillance of screening centres and radiologists is essential. - Abstract: Background: Recall for assessment in mammographic screening entails an inevitable number of false-positive screening results. This study aimed to investigate the variation in the cumulative risk of a false positive screening result and the positive predictive value across the screening centres in the Norwegian Breast Cancer Screening Program. Methods: We studied 618,636 women aged 50–69 years who underwent 2,090,575 screening exams (1996–2010). Recall rate, positive predictive value, rate of screen-detected cancer, and the cumulative risk of a false positive screening result, without and with invasive procedures, across the screening centres were calculated. Generalized linear models were used to estimate the probability of a false positive screening result and to compute the cumulative false-positive risk for up to ten biennial screening examinations. Results: The cumulative risk of a false-positive screening exam varied from 10.7% (95% CI: 9.4–12.0%) to 41.5% (95% CI: 34.1–48.9%) across screening centres, with a highest to lowest ratio of 3.9 (95% CI: 3.7–4.0). The highest to lowest ratio for the cumulative risk of undergoing an invasive procedure with a benign outcome was 4.3 (95% CI: 4.0–4.6). The positive predictive value of recall varied between 12.0% (95% CI: 11.0–12.9%) and 19.9% (95% CI: 18.3–21.5%), with a highest to lowest ratio of 1.7 (95% CI: 1.5–1.9). Conclusions: A substantial variation in the performance measures across the screening centres in the Norwegian Breast Cancer Screening Program was identified, despite similar administration, procedures, and quality assurance requirements. Differences in the

  14. Chapter 19. Cumulative watershed effects and watershed analysis

    Science.gov (United States)

    Leslie M. Reid

    1998-01-01

    Cumulative watershed effects are environmental changes that are affected by more than one land-use activity and that are influenced by processes involving the generation or transport of water. Almost all environmental changes are cumulative effects, and almost all land-use activities contribute to cumulative effects

  15. Modelling Distribution Function of Surface Ozone Concentration for Selected Suburban Areas in Malaysia

    International Nuclear Information System (INIS)

    Muhammad Izwan Zariq Mokhtar; Nurul Adyani Ghazali; Muhammad Yazid Nasir; Norhazlina Suhaimi

    2016-01-01

    Ozone is known as an important secondary pollutant in the atmosphere. The aim of this study is to find the best-fit distribution for calculating the exceedance and return period of ozone for selected suburban areas: Perak (AMS1) and Pulau Pinang (AMS2). Three distributions, namely Gamma, Rayleigh and Laplace, were used to fit 2 years of ozone data (2010 and 2011). The parameters were estimated using Maximum Likelihood Estimation (MLE) in order to plot the probability distribution function (PDF) and cumulative distribution function (CDF). Four performance indicators were used to find the best distribution, namely normalized absolute error (NAE), prediction accuracy (PA), coefficient of determination (R²) and root mean square error (RMSE). The best distribution to represent ozone concentration at both sites in 2010 and 2011 is the Gamma distribution, with the smallest error measures (NAE and RMSE) and the highest adequacy measures (PA and R²). For the 2010 data, AMS1 was predicted to exceed 0.1 ppm for 2 days in 2011 with a return period of one occurrence. (author)
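    Of the three candidate distributions, the Rayleigh has a closed-form CDF, which makes the exceedance and return-period computation easy to sketch without special functions. This is illustrative only (the paper found the Gamma fit best); the function names and threshold are placeholders.

    ```python
    import math

    def rayleigh_mle(xs):
        """MLE of the Rayleigh scale parameter sigma from a sample:
        sigma^2 = sum(x^2) / (2n)."""
        return math.sqrt(sum(x * x for x in xs) / (2 * len(xs)))

    def exceedances_per_year(sigma, threshold, obs_per_year=365):
        """Expected number of exceedances of a threshold per year and the
        return period (observations between exceedances) under a Rayleigh fit,
        using the closed-form tail P(X > c) = exp(-c^2 / (2 sigma^2))."""
        p = math.exp(-threshold ** 2 / (2.0 * sigma ** 2))
        return obs_per_year * p, (math.inf if p == 0.0 else 1.0 / p)
    ```

    The same two-step pattern (fit parameters by MLE, then evaluate the fitted tail at the air-quality threshold) applies to the Gamma fit, only with the regularized incomplete gamma function in place of the exponential tail.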

  16. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    textabstractFor the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they

  17. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    Science.gov (United States)

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome. Copyright © 2015 Elsevier Inc. All rights reserved.
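    As a baseline for the extinction results above, the classical linear birth-death process (no diversification, no HGT) has a simple closed-form extinction probability, which the paper's richer model generalizes. A one-line sketch of that textbook result:

    ```python
    def extinction_probability(birth_rate, death_rate):
        """Extinction probability of a lineage started from a single element
        in a linear birth-death process: certain extinction when deaths
        dominate, death/birth otherwise."""
        if birth_rate <= death_rate:
            return 1.0
        return death_rate / birth_rate
    ```

    Even a supercritical lineage (births exceeding deaths) goes extinct with positive probability, which is consistent with the paper's finding that new MGE families face high extinction risk despite overall growth.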

  18. An Analysis of Cumulative Risks Indicated by Biomonitoring Data of Six Phthalates Using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single component of a chemical mixture drives the cumulative risk of a receptor. This study used the MCR, the Hazard Index (HI) and Hazard Quotient (HQ) to evaluate co-exposures to six phthalates using biomonito...

  19. Optimal design of unit hydrographs using probability distribution and ...

    Indian Academy of Sciences (India)

    optimization formulation is solved using binary-coded genetic algorithms. The number of variables to ... Unit hydrograph; rainfall-runoff; hydrology; genetic algorithms; optimization; probability ..... Application of the model. Data derived from the ...

  20. Probabilistic analysis of glass elements with three-parameter Weibull distribution

    International Nuclear Information System (INIS)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-01-01

    Glass and ceramics present brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used to fit material strength results, although it is not always used correctly. Firstly, in this work, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was carried out. Then, the finite element models made for each type of test, the fitting of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, δ: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. Summarizing, this work aims to generalize the use of the three-parameter Weibull function to structural glass elements with stress distributions that are not analytically described, allowing the proposed probabilistic model to be applied under general loading distributions. (Author)
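    A three-parameter Weibull fit of the kind described above can be sketched with SciPy. All numbers below are illustrative assumptions, not the paper's glass data:

```python
import numpy as np
from scipy import stats

# Hypothetical strength data (MPa): three-parameter Weibull with shape
# beta = 5, location lambda = 20 and scale delta = 45 (assumed values).
rng = np.random.default_rng(0)
strengths = stats.weibull_min.rvs(c=5.0, loc=20.0, scale=45.0,
                                  size=500, random_state=rng)

# Maximum-likelihood fit; scipy returns (shape, loc, scale), i.e.
# (beta, lambda, delta) in the abstract's notation.
beta, lam, delta = stats.weibull_min.fit(strengths)

# Failure probability at a design stress of 40 MPa under the fitted cdf
p_fail = stats.weibull_min.cdf(40.0, beta, loc=lam, scale=delta)
print(beta, lam, delta, p_fail)
```

    The same fitted cdf can then be evaluated over an effective area or any other stress level of interest.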

  1. Optimizing Capacities of Distributed Generation and Energy Storage in a Small Autonomous Power System Considering Uncertainty in Renewables

    Directory of Open Access Journals (Sweden)

    Ying-Yi Hong

    2015-03-01

    Full Text Available This paper explores real power generation planning, considering distributed generation resources and energy storage in a small standalone power system. On account of the Kyoto Protocol and Copenhagen Accord, wind and photovoltaic (PV) powers are considered as clean and renewable energies. In this study, a genetic algorithm (GA) was used to determine the optimal capacities of wind turbine generators, PV, diesel generators and energy storage in a small standalone power system. The investment costs (installation, unit and maintenance costs) of the distributed generation resources and energy storage and the cost of fuel for the diesel generators were minimized while the reliability requirement and CO2 emission limit were fulfilled. The renewable sources and loads were modeled by random variables because of their uncertainties. The equality and inequality constraints in the genetic algorithm were treated by cumulant effects and the cumulative probability of random variables, respectively. The IEEE reliability data for an 8760 h load profile with a 150 kW peak load were used to demonstrate the applicability of the proposed method.

  2. The Bayesian count rate probability distribution in measurement of ionizing radiation by use of a ratemeter

    Energy Technology Data Exchange (ETDEWEB)

    Weise, K.

    2004-06-01

    Recent metrological developments concerning measurement uncertainty, founded on Bayesian statistics, give rise to a revision of several parts of the DIN 25482 and ISO 11929 standard series. These series stipulate detection limits and decision thresholds for ionizing-radiation measurements. Part 3 and part 4 of them, respectively, deal with measurements by use of linear-scale analogue ratemeters. A normal frequency distribution of the momentary ratemeter indication for a fixed count rate value is assumed. The actual distribution, which is first calculated numerically by solving an integral equation, differs considerably from the normal distribution, although the latter approximates it for sufficiently large values of the count rate to be measured. As is shown, this similarly holds true for the Bayesian probability distribution of the count rate for sufficiently large measured values indicated by the ratemeter. This distribution follows from the first one mentioned by means of Bayes' theorem. Its expectation value and variance are needed for the standards to be revised on the basis of Bayesian statistics. Simple expressions are given by the present standards for estimating these parameters and for calculating the detection limit and the decision threshold. As is also shown, the same expressions can similarly be used as sufficient approximations by the revised standards if, roughly, the presently indicated value exceeds the reciprocal of the ratemeter relaxation time constant. (orig.)

  3. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of the waiting time distribution and the short-time propagator of the corresponding random walk, as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  4. Managing cumulative impacts: A key to sustainability?

    Energy Technology Data Exchange (ETDEWEB)

    Hunsaker, C.T.

    1994-12-31

    This paper addresses how science can be more effectively used in creating policy to manage cumulative effects on ecosystems. The paper focuses on the scientific techniques that we have to identify and to assess cumulative impacts on ecosystems. The term "sustainable development" was brought into common use by the World Commission on Environment and Development (the Brundtland Commission) in 1987. The Brundtland Commission report highlighted the need to address developmental and environmental imperatives simultaneously by calling for development that "meets the needs of the present generation without compromising the needs of future generations." We cannot claim to be working toward sustainable development until we can quantitatively assess cumulative impacts on the environment: the two concepts are inextricably linked in that the elusiveness of cumulative effects likely has the greatest potential of keeping us from achieving sustainability. In this paper, assessment and management frameworks relevant to cumulative impacts are discussed along with recent literature on how to improve such assessments. When possible, examples are given for marine ecosystems.

  5. Single-Word Predictions of Upcoming Language During Comprehension: Evidence from the Cumulative Semantic Interference Task

    Science.gov (United States)

    Kleinman, Daniel; Runnqvist, Elin; Ferreira, Victor S.

    2015-01-01

    Comprehenders predict upcoming speech and text on the basis of linguistic input. How many predictions do comprehenders make for an upcoming word? If a listener strongly expects to hear the word “sock”, is the word “shirt” partially expected as well, is it actively inhibited, or is it ignored? The present research addressed these questions by measuring the “downstream” effects of prediction on the processing of subsequently presented stimuli using the cumulative semantic interference paradigm. In three experiments, subjects named pictures (sock) that were presented either in isolation or after strongly constraining sentence frames (“After doing his laundry, Mark always seemed to be missing one…”). Naming sock slowed the subsequent naming of the picture shirt – the standard cumulative semantic interference effect. However, although picture naming was much faster after sentence frames, the interference effect was not modulated by the context (bare vs. sentence) in which either picture was presented. According to the only model of cumulative semantic interference that can account for such a pattern of data, this indicates that comprehenders pre-activated and maintained the pre-activation of best sentence completions (sock) but did not maintain the pre-activation of less likely completions (shirt). Thus, comprehenders predicted only the most probable completion for each sentence. PMID:25917550

  6. The mean distance to the nth neighbour in a uniform distribution of random points: an application of probability theory

    International Nuclear Information System (INIS)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K

    2008-01-01

    We study different ways of determining the mean distance ⟨r_n⟩ between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating ⟨r_n⟩. Next, we describe two alternative means of deriving the exact expression of ⟨r_n⟩: we review the method using absolute probability and develop an alternative method using conditional probability. Finally, we obtain an approximation to ⟨r_n⟩ from the mean volume between the reference point and its nth neighbour and compare it with the heuristic and exact results.
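    The exact result for uniform (Poisson) points, ⟨r_n⟩ = Γ(n + 1/D) / [Γ(n) (ρ V_D)^{1/D}] with V_D the volume of the unit D-ball, can be checked by a small Monte Carlo experiment (D, ρ, n and the box size are arbitrary choices here):

```python
import numpy as np
from math import gamma, pi

# Monte Carlo check of the mean distance <r_n> to the nth neighbour of a
# reference point at the origin, for uniform points in D = 2 dimensions.
rng = np.random.default_rng(1)
D, rho, n = 2, 1.0, 3
L = 8.0                                   # half-width of the box, >> <r_n>
trials = 2000

dists = []
for _ in range(trials):
    N = rng.poisson(rho * (2 * L) ** D)   # Poisson number of points
    pts = rng.uniform(-L, L, size=(N, D))
    r = np.sort(np.linalg.norm(pts, axis=1))
    dists.append(r[n - 1])                # distance to the nth neighbour

V_D = pi ** (D / 2) / gamma(D / 2 + 1)    # volume of the unit D-ball
exact = gamma(n + 1 / D) / gamma(n) / (rho * V_D) ** (1 / D)
print(np.mean(dists), exact)              # the two should agree closely
```

    For D = 2, ρ = 1, n = 3 the exact value reduces to 15/16.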

  7. Probability encoding of hydrologic parameters for basalt. Elicitation of expert opinions from a panel of five consulting hydrologists

    International Nuclear Information System (INIS)

    Runchal, A.K.; Merkhofer, M.W.; Olmsted, E.; Davis, J.D.

    1984-11-01

    The Columbia River basalts underlying the Hanford Site in Washington State are being considered as a possible location for a geologic repository for high-level nuclear waste. To investigate the feasibility of a repository at this site, the hydrologic parameters of the site must be evaluated. Among hydrologic parameters of particular interest are the effective porosity of the Cohassett basalt flow top and flow interior and the vertical-to-horizontal hydraulic conductivity, or anisotropy ratio, of the Cohassett basalt flow interior. The Cohassett basalt flow is the prime candidate horizon for repository studies. Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. To obtain credible, auditable, and independently derived estimates of the specified hydrologic parameters, a panel of five nationally recognized hydrologists was assembled. Their expert judgments were quantified during two rounds of a Delphi process by means of a probability encoding method developed to estimate the probability distributions of the selected hydrologic variables. The results indicate significant differences of expert opinion for cumulative probabilities of less than 10% and greater than 90%, but relatively close agreement in the middle ranges of values. The principal causes of the diversity of opinion are believed to be the lack of site-specific data and the absence of a single, widely accepted, conceptual or theoretical basis for analyzing these variables.

  8. Evaluation of the non-Gaussianity of two-mode entangled states over a bosonic memory channel via cumulant theory and quadrature detection

    Science.gov (United States)

    Xiang, Shao-Hua; Wen, Wei; Zhao, Yu-Jing; Song, Ke-Hui

    2018-04-01

    We study the properties of the cumulants of multimode boson operators and introduce the phase-averaged quadrature cumulants as the measure of the non-Gaussianity of multimode quantum states. Using this measure, we investigate the non-Gaussianity of two classes of two-mode non-Gaussian states: photon-number entangled states and entangled coherent states traveling in a bosonic memory quantum channel. We show that such a channel can skew the distribution of two-mode quadrature variables, giving rise to a strongly non-Gaussian correlation. In addition, we provide a criterion to determine whether the distributions of these states are super- or sub-Gaussian.
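    The general idea behind cumulant-based non-Gaussianity measures can be illustrated numerically (this is a generic sketch, not the paper's phase-averaged quadrature measure): for Gaussian data every cumulant beyond the second vanishes, so higher-order k-statistics, which are unbiased cumulant estimators, signal departure from Gaussianity.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
gauss = rng.normal(size=200_000)
laplace = rng.laplace(size=200_000)   # heavier tails: non-Gaussian

k4_gauss = stats.kstat(gauss, 4)      # ~0 for Gaussian data
k4_laplace = stats.kstat(laplace, 4)  # ~12 for unit-scale Laplace
                                      # (kappa_4 = mu_4 - 3 sigma^4 = 24 - 12)
print(k4_gauss, k4_laplace)
```

    A state whose fourth (and higher) cumulants vanish within statistical error is Gaussian to that order; the sign of the fourth cumulant distinguishes super- from sub-Gaussian distributions.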

  9. Multi-objective optimal power flow for active distribution network considering the stochastic characteristic of photovoltaic

    Science.gov (United States)

    Zhou, Bao-Rong; Liu, Si-Liang; Zhang, Yong-Jun; Yi, Ying-Qi; Lin, Xiao-Ming

    2017-05-01

    To mitigate the impact on distribution networks caused by the stochastic characteristics and high penetration of photovoltaics, a multi-objective optimal power flow model is proposed in this paper. The regulation capability of the capacitors, photovoltaic inverters and energy storage system embedded in an active distribution network is considered to minimize the expected value of the active power loss and the probability of voltage violation in this model. Firstly, a probabilistic power flow based on the cumulant method is introduced to calculate the values of the objectives. Secondly, the NSGA-II algorithm is adopted for optimization to obtain the Pareto optimal solutions. Finally, the best compromise solution is achieved through the fuzzy membership degree method. The multi-objective optimization of the IEEE 34-node distribution network shows that the model can effectively improve the voltage security and economy of the distribution network at different levels of photovoltaic penetration.

  10. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
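    The maximum-entropy argument for exponential velocities and travel times can be illustrated numerically: among non-negative distributions with a fixed mean, the exponential has the largest differential entropy. The comparison below against gamma distributions of equal mean (an arbitrary choice of competitors) is a sketch of that claim, not the paper's derivation:

```python
import numpy as np
from scipy import stats

# Among non-negative distributions with mean fixed at 2, the exponential
# maximizes differential entropy; every equal-mean gamma falls below it.
mean = 2.0
h_exp = stats.expon(scale=mean).entropy()          # equals 1 + ln(mean)
h_gammas = [stats.gamma(a, scale=mean / a).entropy()
            for a in (0.5, 2.0, 5.0)]              # same mean, a != 1
print(h_exp, h_gammas)
```

    The same style of check works for the Laplace distribution, which maximizes entropy for a fixed mean absolute deviation, consistent with the accelerations result quoted above.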

  11. Cumulative Impact Assessment: Approaching Environmental Capacity in Development Area Using Environmental Impact Assessment Information

    Science.gov (United States)

    Cho, N.; Lee, M. J.; Maeng, J. H.

    2017-12-01

    Environmental impact assessment estimates the impact of development as a business unit and establishes mitigation plan. If the development is done, its economic effects can spread to the nearby areas. So that various developments can be distributed at different time intervals. The impact of the new developments can be combined with existing environmental impacts and can have a larger impact. That is, Cumulative impact assessment is needed to consider the environmental capacity of the Nearby area. Cumulative impact assessments require policy tools such as environmental impact assessment information and cumulative impact estimation models. In Korea, environmental information (water quality, air quality, etc.) of the development site is measured for environmental impact assessment and monitored for a certain period (generally 5 years) after the project. In addition, by constructing the environmental information as a spatial database, it is possible to express the environmental impact on a regional basis spatially and to intuitively use it for development site selection. Utilizing a composite model of environmental impact assessment information and Remote Sensing data for cumulative impact estimation, That can be used as a policy decision support tool that provides quantitative information for development area management, such as time series effect and sprawl phenomenon.

  12. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data involve much uncertainty owing to the limitations of the measurement information, material parameters, loads, geometry, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to handle the uncertain transition between a qualitative concept and its quantitative description. An improved algorithm for the cloud probability distribution density, based on a backward cloud generator, was then proposed. It effectively converts batches of accurate data into concepts that can be described by appropriate qualitative linguistic values. Such a qualitative description is expressed through the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was applied to analyze the observation data of a piezometric tube in an earth-rockfill dam, and the experimental results proved that the proposed algorithm is feasible: it reveals the changing regularity of the piezometric tube's water level and allows seepage damage in the dam body to be detected.
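    A common textbook form of the backward cloud generator, which estimates {Ex, En, He} from raw cloud drops, can be sketched as follows. This is the standard formulation from the cloud-model literature, not necessarily the paper's improved algorithm:

```python
import numpy as np

def backward_cloud(x):
    """Textbook backward cloud generator: estimate the cloud numerical
    characteristics {Ex, En, He} from a sample of cloud drops x."""
    x = np.asarray(x, dtype=float)
    Ex = x.mean()
    En = np.sqrt(np.pi / 2.0) * np.abs(x - Ex).mean()
    S2 = x.var(ddof=1)
    He = np.sqrt(max(S2 - En ** 2, 0.0))   # guard against rounding below 0
    return Ex, En, He

# Forward-generate drops from a known cloud (Ex=10, En=2, He=0.5),
# then recover the characteristics with the backward generator.
rng = np.random.default_rng(3)
En_i = rng.normal(2.0, 0.5, size=100_000)  # per-drop entropy
drops = rng.normal(10.0, np.abs(En_i))
print(backward_cloud(drops))
```

    Recovering the generating parameters from synthetic drops is the usual sanity check before applying the generator to field data such as piezometric readings.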

  13. Probability of failure prediction for step-stress fatigue under sine or random stress

    Science.gov (United States)

    Lambert, R. G.

    1979-01-01

    A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.
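    For orientation, the baseline that such cumulative damage laws extend is Miner's linear rule over an S-N curve. The sketch below is generic, with assumed S-N constants, and is not Lambert's corrected cycle ratio law:

```python
# Miner's linear damage rule with an assumed S-N curve N(S) = A * S**(-b):
# failure is predicted when the summed cycle ratios n_i / N(S_i) reach 1.
A, b = 1e12, 3.0                      # assumed S-N curve constants

def cycles_to_failure(stress):
    return A * stress ** (-b)

# Block loading: (stress amplitude, applied cycles)
blocks = [(100.0, 2e5), (150.0, 1e5), (200.0, 5e4)]
damage = sum(n / cycles_to_failure(S) for S, n in blocks)
print(damage)                         # failure predicted once damage >= 1
```

    Replacing the single S-N curve with a scatterband of failure points is what turns the deterministic damage sum into a probability of failure, as in the abstract.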

  14. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and current shortage of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to a bivariate definition of severe dust storms, the joint probability distribution of severe dust storms was established using the observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
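    A joint (AND) return period from a bivariate copula can be sketched as follows. A Gumbel copula and all parameter values are assumed here for illustration; the paper fits its own copula to wind speed and duration:

```python
import numpy as np

# AND-return period from a bivariate copula: with marginal non-exceedance
# probabilities u, v and mean interarrival time mu (years),
#   T_and = mu / (1 - u - v + C(u, v)).
def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, theta = 1 gives independence."""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

mu = 1.0          # mean interarrival time between events (assumed)
u, v = 0.9, 0.9   # marginal non-exceedance probabilities (assumed)
theta = 2.0       # Gumbel dependence parameter (assumed)

joint_T = mu / (1.0 - u - v + gumbel_copula(u, v, theta))
print(joint_T)    # years until both thresholds are exceeded together
```

    Note how positive dependence (theta > 1) shortens the joint return period relative to the independent case, which is the practical point of the bivariate analysis.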

  15. Performance Analysis of 5G Transmission over Fading Channels with Random IG Distributed LOS Components

    Directory of Open Access Journals (Sweden)

    Dejan Jaksic

    2017-01-01

    Full Text Available Mathematical modelling of the behavior of radio propagation at mmWave bands is crucial to the development of transmission and reception algorithms of new 5G systems. In this study we will model 5G propagation in nondeterministic line-of-sight (LOS) conditions, when the random nature of the LOS component ratio will be observed as an Inverse Gamma (IG) distributed process. Closed-form expressions will be presented for the probability density function (PDF) and cumulative distribution function (CDF) of such a random process. Further, closed-form expressions will be provided for important performance measures such as the level crossing rate (LCR) and average fade duration (AFD). Capitalizing on the proposed expressions, the LCR and AFD will be discussed as functions of the transmission parameters.
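    The Inverse Gamma building block of such a model is available directly in SciPy. The shape and scale below are assumed values, not the paper's fitted mmWave parameters:

```python
import numpy as np
from scipy import stats

# Inverse Gamma model for a random (here, the LOS component ratio)
# quantity; a and scale are illustrative.
a, scale = 3.0, 2.0
ig = stats.invgamma(a, scale=scale)

x = np.linspace(0.2, 5.0, 5)
print(ig.pdf(x))    # closed-form PDF values
print(ig.cdf(x))    # closed-form CDF values
print(ig.mean())    # = scale / (a - 1) for a > 1
```

    The CDF equals the regularized upper incomplete gamma function Q(a, scale/x), which is what makes closed-form outage-style expressions tractable.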

  16. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    Science.gov (United States)

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
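    The first weighting scheme compared above, stabilized weights from normal densities, can be sketched on simulated data (the data-generating model and all values are assumptions, not the study's cohort):

```python
import numpy as np
from scipy import stats

# Stabilized inverse probability weights for a continuous exposure A
# given a confounder L, using normal densities (a minimal sketch).
rng = np.random.default_rng(4)
n = 5000
L = rng.normal(size=n)                    # confounder
A = 0.5 * L + rng.normal(size=n)          # continuous exposure

# Denominator: density of A given L from a fitted linear model
X = np.column_stack([np.ones(n), L])
coef, *_ = np.linalg.lstsq(X, A, rcond=None)
resid = A - X @ coef
f_denom = stats.norm.pdf(A, loc=X @ coef, scale=resid.std(ddof=2))

# Numerator: marginal density of A (the stabilization term)
f_num = stats.norm.pdf(A, loc=A.mean(), scale=A.std(ddof=1))

w = f_num / f_denom
print(w.mean())   # stabilized weights should average close to 1
```

    The heteroscedastic, truncated-normal, gamma, t and quantile-binning variants assessed in the study differ only in how `f_denom` (and sometimes `f_num`) is specified.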

  17. El Carreto o Cumulá - Aspidosperma Dugandii Standl

    Directory of Open Access Journals (Sweden)

    Dugand Armando

    1944-03-01

    Full Text Available Common names: Carreto (Atlántico, Bolívar, Magdalena); Cumulá (Cundinamarca, Tolima). According to Dr. Emilio Robledo (Lecciones de Bot. ed. 3, 2: 544. 1939), the name Carreto is also used in Puerto Berrío (Antioquia). The same author (loc. cit.) gives the name Comulá for an undetermined species of Viburnum in Mariquita (Tolima), and J. M. Duque, referring to the same plant and locality (in Bot. Gen. Colomb. 340, 356. 1943), attributes this common name to Aspidosperma ellipticum Rusby. However, the wood samples of Cumulá or Comulá that I have examined, originating from the Mariquita region (one of which was recently sent to me by the distinguished ichthyologist Mr. Cecil Miles), belong without doubt to A. Dugandii Standl. On the other hand, Santiago Cortés (Fl. Colomb. 206. 1898; ed. 2: 239. 1912) cites the Cumulá "of Anapoima and other places on the (río) Magdalena", saying that it belongs to the Leguminosae, but the very brief description this author gives of the wood, "orange-coloured and notable for its density, hardness and resistance to humidity", leads me to believe that it is the same Cumulá recently collected in Tocaima, since this town lies only a few kilometres from Anapoima.

  18. Normal tissue complication probabilities: dependence on choice of biological model and dose-volume histogram reduction scheme

    International Nuclear Information System (INIS)

    Moiseenko, Vitali; Battista, Jerry; Van Dyk, Jake

    2000-01-01

    Purpose: To evaluate the impact of dose-volume histogram (DVH) reduction schemes and models of normal tissue complication probability (NTCP) on ranking of radiation treatment plans. Methods and Materials: Data for liver complications in humans and for spinal cord in rats were used to derive input parameters of four different NTCP models. DVH reduction was performed using two schemes: 'effective volume' and 'preferred Lyman'. DVHs for competing treatment plans were derived from a sample DVH by varying dose uniformity in a high dose region so that the obtained cumulative DVHs intersected. Treatment plans were ranked according to the calculated NTCP values. Results: Whenever the preferred Lyman scheme was used to reduce the DVH, competing plans were indistinguishable as long as the mean dose was constant. The effective volume DVH reduction scheme did allow us to distinguish between these competing treatment plans. However, plan ranking depended on the radiobiological model used and its input parameters. Conclusions: Dose escalation will be a significant part of radiation treatment planning using new technologies, such as 3-D conformal radiotherapy and tomotherapy. Such dose escalation will depend on how the dose distributions in organs at risk are interpreted in terms of expected complication probabilities. The present study indicates considerable variability in predicted NTCP values because of the methods used for DVH reduction and radiobiological models and their input parameters. Animal studies and collection of standardized clinical data are needed to ascertain the effects of non-uniform dose distributions and to test the validity of the models currently in use.
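    One widely used combination of an NTCP model with an effective-volume DVH reduction is the Lyman-Kutcher-Burman formulation, sketched below. The parameter values are illustrative, not the liver or spinal cord fits from the study:

```python
import numpy as np
from scipy.stats import norm

# Lyman-Kutcher-Burman NTCP with the effective-volume (gEUD) reduction:
#   EUD = (sum_i v_i * D_i**(1/n))**n,  NTCP = Phi((EUD - TD50)/(m*TD50)).
def lkb_ntcp(doses, volumes, TD50, m, n):
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()                      # fractional volumes of DVH bins
    eud = np.sum(v * np.asarray(doses) ** (1.0 / n)) ** n
    t = (eud - TD50) / (m * TD50)
    return norm.cdf(t)

# Uniform 45 Gy to the whole organ with TD50 = 45 Gy gives NTCP = 0.5.
print(lkb_ntcp([45.0], [1.0], TD50=45.0, m=0.15, n=0.7))
```

    The sensitivity of plan ranking described in the abstract comes from the fact that different (TD50, m, n) fits and different DVH reductions reshape this curve.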

  19. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distributions for a software component's failure probability influence the number of tests required to obtain adequate confidence in a software component. In the evaluation, both the effect of the shape of the prior distribution and one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components are given. One of the main challenges when assessing systems consisting of multiple software components is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution which correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc). (Author)
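    The single-component idea can be sketched with a conjugate Beta prior: after n failure-free tests, a Beta(a, b) prior on the failure probability p updates to Beta(a, b + n), and one can ask how many tests are needed before P(p < p0) reaches a target confidence. The prior and target values below are illustrative:

```python
from scipy.stats import beta

# With a Beta(a, b) prior on the failure probability p and n failure-free
# tests, the posterior is Beta(a, b + n). Find the smallest n for which
# the posterior puts at least `confidence` below the bound p0.
a, b = 1.0, 1.0           # uniform prior (assumed)
p0, confidence = 1e-3, 0.95

n = 0
while beta.cdf(p0, a, b + n) < confidence:
    n += 1
print(n)                  # failure-free tests required
```

    Repeating this with a more optimistic or pessimistic prior shows exactly the sensitivity to prior shape and prior confidence that the report investigates.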

  20. The effects of radiotherapy treatment uncertainties on the delivered dose distribution and tumour control probability

    International Nuclear Information System (INIS)

    Booth, J.T.; Zavgorodni, S.F.; Royal Adelaide Hospital, SA

    2001-01-01

    Uncertainty in the precise quantity of radiation dose delivered to tumours in external beam radiotherapy is present due to many factors, and can result in either spatially uniform (Gaussian) or spatially non-uniform dose errors. These dose errors are incorporated into the calculation of tumour control probability (TCP) and produce a distribution of possible TCP values over a population. We also study the effect of inter-patient cell sensitivity heterogeneity on the population distribution of patient TCPs. This study aims to investigate the relative importance of these three uncertainties (spatially uniform dose uncertainty, spatially non-uniform dose uncertainty, and inter-patient cell sensitivity heterogeneity) on the delivered dose and TCP distribution following a typical course of fractionated external beam radiotherapy. The dose distributions used for patient treatments are modelled in one dimension. Geometric positioning uncertainties during and before treatment are considered as shifts of a pre-calculated dose distribution. Following the simulation of a population of patients, distributions of dose across the patient population are used to calculate mean treatment dose, standard deviation in mean treatment dose, mean TCP, standard deviation in TCP, and TCP mode. These parameters are calculated with each of the three uncertainties included separately. The calculations show that the dose errors in the tumour volume are dominated by the spatially uniform component of dose uncertainty. This could be related to machine specific parameters, such as linear accelerator calibration. TCP calculation is affected dramatically by inter-patient variation in the cell sensitivity and to a lesser extent by the spatially uniform dose errors. The positioning errors with the 1.5 cm margins used cause dose uncertainty outside the tumour volume and have a small effect on mean treatment dose (in the tumour volume) and tumour control. Copyright (2001) Australasian College of
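    The dominant role of inter-patient cell sensitivity can be illustrated with the standard Poisson TCP model; this is a generic sketch with assumed parameter values, not the paper's one-dimensional treatment simulations:

```python
import numpy as np

# Poisson TCP, TCP = exp(-N0 * exp(-alpha * D)), for a uniform dose D
# (linear-quadratic beta term omitted for brevity; values are assumed).
def tcp_poisson(D, alpha, N0=1e7):
    return np.exp(-N0 * np.exp(-alpha * D))

rng = np.random.default_rng(5)
D = 60.0                                         # prescribed dose, Gy
alpha_pop = rng.normal(0.30, 0.06, size=10_000)  # patient-to-patient alpha
alpha_pop = np.clip(alpha_pop, 1e-3, None)

tcp_individual = tcp_poisson(D, 0.30)
tcp_population = tcp_poisson(D, alpha_pop).mean()
print(tcp_individual, tcp_population)
# Heterogeneity broadens the population dose-response curve, pulling the
# mean TCP below the single-alpha value.
```

    Dose errors can be added to the same sketch by perturbing D per patient, which separates the spatially uniform and non-uniform contributions discussed above.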

  1. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with the physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and in quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. This retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  2. On the Fast and Precise Evaluation of the Outage Probability of Diversity Receivers Over α−μ, κ−μ, and η−μ Fading Channels

    KAUST Repository

    Ben Issaid, Chaouki; Alouini, Mohamed-Slim; Tempone, Raul

    2017-01-01

    In this paper, we are interested in determining the cumulative distribution function of the sum of α−μ, κ−μ, and η−μ random variables in the setting of rare event simulations. To this end, we present a simple and efficient importance sampling approach. The main result of this work is the bounded relative error property of the proposed estimators. Capitalizing on this result, we accurately estimate the outage probability of multibranch maximum ratio combining and equal gain diversity receivers over α−μ, κ−μ, and η−μ fading channels. Selected numerical simulations are discussed to show the robustness of our estimators compared to naive Monte Carlo estimators.
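The bounded-relative-error estimators above are specialized to α−μ, κ−μ and η−μ sums; the general importance-sampling idea they build on can be sketched on a much simpler rare event, a Gaussian tail probability, using a mean-shifted proposal (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Simple rare event: p = P(Z > 4) for Z ~ N(0, 1), exact value ~ 3.167e-5.
t, n = 4.0, 100_000

# Naive Monte Carlo: only a handful of samples fall in the rare region.
z = rng.normal(size=n)
p_naive = np.mean(z > t)

# Importance sampling with a mean-shifted proposal N(t, 1); the likelihood
# ratio is phi(x) / phi(x - t) = exp(-t*x + t^2/2).
x = rng.normal(loc=t, size=n)
w = np.exp(-t * x + t * t / 2.0)
p_is = np.mean((x > t) * w)

print(p_naive, p_is)
```

The naive estimator's relative error blows up as the event gets rarer, while the shifted-proposal estimator stays accurate with the same sample budget, which is the property the paper establishes (in a stronger, bounded form) for its fading-channel estimators.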

  4. 7 CFR 42.132 - Determining cumulative sum values.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 false Determining cumulative sum values. 42.132 Section 42... Determining cumulative sum values. (a) The parameters for the on-line cumulative sum sampling plans for AQL's... [CuSum plan parameter table omitted] (b) At the beginning of the basic inspection period, the CuSum value is set equal to...

  5. The Algebra of the Cumulative Percent Operation.

    Science.gov (United States)

    Berry, Andrew J.

    2002-01-01

    Discusses how to help students avoid some pervasive reasoning errors in solving cumulative percent problems. Discusses the meaning of "%+b%", the additive inverse of "%", and other useful applications. Emphasizes the operational aspect of the cumulative percent concept. (KHR)

  6. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  7. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with single α-value) and normal distributions. The clinically derived TCP data for four tumour types-melanoma, breast, squamous cell carcinoma and nodes-were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets
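A minimal sketch of comparing normal and log-normal inter-patient radiosensitivity heterogeneity, using the standard Poisson TCP model; the mean, spread, dose and clonogen number below are assumed values, not the parameters fitted in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Poisson TCP for one patient with radiosensitivity alpha (illustrative model).
def tcp(alpha, dose=64.0, n0=1e7):
    return np.exp(-n0 * np.exp(-alpha * dose))

mean_alpha, sd_alpha = 0.3, 0.06   # 1/Gy, assumed population mean and spread

# Normal heterogeneity (clipped at zero, since sensitivity cannot be negative).
alpha_norm = np.clip(rng.normal(mean_alpha, sd_alpha, 100_000), 1e-6, None)

# Log-normal heterogeneity with the same mean and standard deviation
# (moment matching: sigma^2 = ln(1 + (s/m)^2), mu = ln(m) - sigma^2/2).
sigma2 = np.log(1 + (sd_alpha / mean_alpha) ** 2)
mu = np.log(mean_alpha) - sigma2 / 2
alpha_logn = rng.lognormal(mu, np.sqrt(sigma2), 100_000)

print("population TCP, normal alpha:    ", tcp(alpha_norm).mean())
print("population TCP, log-normal alpha:", tcp(alpha_logn).mean())
```

With matched first and second moments, the two heterogeneity models still give different population TCP curves, because TCP is highly non-linear in alpha; this is why the choice of distribution matters when fitting clinical data.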

  8. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

    Interesting properties and propositions in many branches of science, such as economics, have been obtained from the property that the cumulative distribution function of a random variable is a concave function. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice differentiable real-valued function g whose domain is an interval on the extended real line. Such a function g is said to be log-concave on the interval (a,b) if ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b). Previous authors have examined log-concavity for distributions such as the normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, established log-concavity for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. The same has been done for the Burr family and each distribution belonging to it. Also, log-concavity results have been obtained for generalized gamma distributions, Feller-Pareto distributions, generalized inverse Gaussian distributions and generalized log-normal distributions.
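The definition is easy to check numerically: g is log-concave on an interval when the second difference of ln(g) is non-positive on a grid. A small sketch (the normal is log-concave everywhere; the Cauchy, whose log-density is convex in the tails, is not):

```python
import numpy as np

# Numerical check of log-concavity: a density g is log-concave iff ln(g) is
# concave, i.e. its discrete second derivative is non-positive everywhere.
def is_log_concave(logpdf, grid):
    d2 = np.diff(logpdf(grid), 2)
    return bool(np.all(d2 <= 1e-12))

x = np.linspace(-10, 10, 2001)

# Standard normal: ln g(x) = -x^2/2 + const, concave everywhere.
normal_logpdf = lambda t: -0.5 * t**2
print(is_log_concave(normal_logpdf, x))    # True

# Cauchy: ln g(x) = -ln(1 + x^2) + const, convex for |x| > 1.
cauchy_logpdf = lambda t: -np.log1p(t**2)
print(is_log_concave(cauchy_logpdf, x))    # False
```

Normalizing constants drop out of the second difference, so only the kernel of the log-density is needed for the check.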

  9. Light Scattering of Rough Orthogonal Anisotropic Surfaces with Secondary Most Probable Slope Distributions

    International Nuclear Information System (INIS)

    Li Hai-Xia; Cheng Chuan-Fu

    2011-01-01

    We study the light scattering of an orthogonal anisotropic rough surface with secondary most probable slope distribution. It is found that the scattered intensity profiles have obvious secondary maxima, and in the direction perpendicular to the plane of incidence, the secondary maxima are oriented in a curve on the observation plane, which is called the orientation curve. By numerical calculation of the scattering wave fields with the height data of the sample, it is validated that the secondary maxima are induced by the side face element, which constitutes the prismoid structure of the anisotropic surface. We derive the equation of the quadratic orientation curve. Experimentally, we construct the system for light scattering measurement using a CCD. The scattered intensity profiles are extracted from the images at different angles of incidence along the orientation curves. The experimental results conform to the theory.

  10. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ(t)². The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  11. Characterizing the Lyα forest flux probability distribution function using Legendre polynomials

    Energy Technology Data Exchange (ETDEWEB)

    Cieplak, Agnieszka M.; Slosar, Anže, E-mail: acieplak@bnl.gov, E-mail: anze@bnl.gov [Brookhaven National Laboratory, Bldg 510, Upton, NY, 11973 (United States)

    2017-10-01

    The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.
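The moment relation can be sketched directly: after mapping flux to [-1, 1], the n-th Legendre coefficient is proportional to the sample mean of P_n, which is itself a linear combination of the first n sample moments. The beta-distributed "flux" below is a synthetic stand-in, not forest data:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(2)

# Synthetic "flux" samples in [0, 1], a stand-in for Ly-alpha flux pixels.
flux = rng.beta(2.0, 5.0, size=200_000)
x = 2.0 * flux - 1.0                     # map [0, 1] -> [-1, 1]

# Expansion p(x) = sum_n c_n P_n(x) with c_n = (2n + 1)/2 * E[P_n(x)];
# E[P_n(x)] is a degree-n polynomial in the moments of x, so the coefficients
# are estimated directly from sample averages.
def legendre_coeffs(samples, nmax):
    coeffs = []
    for n in range(nmax + 1):
        basis = np.zeros(n + 1)
        basis[n] = 1.0                   # coefficient vector selecting P_n
        coeffs.append((2 * n + 1) / 2.0 * legendre.legval(samples, basis).mean())
    return np.array(coeffs)

c = legendre_coeffs(x, 5)
print(c)                                 # c_0 = 1/2 (normalization on [-1, 1])
```

Because each coefficient is a plain sample average, additive pixel noise propagates into it in a controlled way, which is the practical advantage argued for in the abstract.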

  12. Characterizing the Lyα forest flux probability distribution function using Legendre polynomials

    Science.gov (United States)

    Cieplak, Agnieszka M.; Slosar, Anže

    2017-10-01

    The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.

  13. The probability distribution of the delay time of a wave packet in strong overlap of resonance levels

    International Nuclear Information System (INIS)

    Lyuboshitz, V.L.

    1982-01-01

    The time development of nuclear reactions at a large density of levels is investigated using the theory of overlapping resonances. The analytical expression for the function describing the time delay probability distribution of a wave packet is obtained in the framework of the model of n equivalent channels. It is shown that the relative fluctuation of the time delay at the stage of the compound nucleus is small. The possibility is discussed of increasing the duration of nuclear reactions with rising excitation energy

  14. Spectral analysis of growing graphs a quantum probability point of view

    CERN Document Server

    Obata, Nobuaki

    2017-01-01

    This book is designed as a concise introduction to the recent achievements on spectral analysis of graphs or networks from the point of view of quantum (or non-commutative) probability theory. The main topics are spectral distributions of the adjacency matrices of finite or infinite graphs and their limit distributions for growing graphs. The main vehicle is quantum probability, an algebraic extension of the traditional probability theory, which provides a new framework for the analysis of adjacency matrices revealing their non-commutative nature. For example, the method of quantum decomposition makes it possible to study spectral distributions by means of interacting Fock spaces or equivalently by orthogonal polynomials. Various concepts of independence in quantum probability and corresponding central limit theorems are used for the asymptotic study of spectral distributions for product graphs. This book is written for researchers, teachers, and students interested in graph spectra, their (asymptotic) spectr...

  15. Exploration of probability distribution of velocities of saltating sand particles based on the stochastic particle-bed collisions

    International Nuclear Information System (INIS)

    Zheng Xiaojing; Xie Li; Zhou Youhe

    2005-01-01

    The wind-blown saltating movement of sand involves two main mechanical processes: the interaction between the moving sand particles and the wind in the saltation layer, and the collisions of incident particles with the sand bed, the latter producing the lift-off velocity with which a sand particle moves into saltation. In this Letter a phenomenological methodology is presented for obtaining the probability density (distribution) function (pdf) of the lift-off velocity of sand particles leaving the sand bed, based on stochastic particle-bed collisions. Treating the sand particles as uniform circular disks and employing a 2D collision model between an incident particle and the granular bed, we obtain analytical formulas for the lift-off velocity of ejected and rebounding particles in saltation, as functions of random parameters such as the angle and magnitude of the incident velocity of the impacting particles, the impact and contact angles between the colliding particles, and the creeping velocity of sand particles. By introducing the probability density functions (pdf's) of these parameters, covering all possible patterns of the sand bed and all possible particle-bed collisions, and using standard manipulations of multi-dimensional random variables' pdf's, the pdf's of the lift-off velocities are derived and expressed in terms of the pdf's of the random parameters in the collisions. The numerical results for the lift-off velocity distributions agree well with experimental ones

  16. On the Hitting Probability of Max-Stable Processes

    OpenAIRE

    Hofmann, Martin

    2012-01-01

    The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ R with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ R twice and we give a sufficient condition for a positive probability of this event.

  17. Lead in teeth from lead-dosed goats: Microdistribution and relationship to the cumulative lead dose

    International Nuclear Information System (INIS)

    Bellis, David J.; Hetter, Katherine M.; Jones, Joseph; Amarasiriwardena, Dula; Parsons, Patrick J.

    2008-01-01

    Teeth are commonly used as a biomarker of long-term lead exposure. There appear to be few data, however, on the content or distribution of lead in teeth where data on specific lead intake (dose) are also available. This study describes the analysis of a convenience sample of teeth from animals that were dosed with lead for other purposes, i.e., a proficiency testing program for blood lead. The lead concentration of whole teeth obtained from 23 animals, as determined by atomic absorption spectrometry, varied from 0.6 to 80 μg g⁻¹. Linear regression of whole tooth lead (μg g⁻¹) on the cumulative lead dose received by the animal (g) yielded a slope of 1.2, with r² = 0.647 (p ...). The highest lead concentrations (... μg g⁻¹) were found in circumpulpal dentine. Linear regression of circumpulpal lead (μg g⁻¹) on cumulative lead dose (g) yielded a slope of 23 with r² = 0.961 (p = 0.0001). The data indicated that whole tooth lead, and especially circumpulpal lead, of dosed goats increased linearly with cumulative lead exposure. These data suggest that circumpulpal dentine is a better biomarker of cumulative lead exposure than is whole tooth lead, at least for lead-dosed goats

  18. Exact valence bond entanglement entropy and probability distribution in the XXX spin chain and the potts model.

    Science.gov (United States)

    Jacobsen, J L; Saleur, H

    2008-02-29

    We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L ≫ 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy S_VB = N_c ln 2 = (4 ln 2 / π²) ln L, disproving a recent conjecture that this should be related with the von Neumann entropy, and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.

  19. Fitting Statistical Distributions Functions on Ozone Concentration Data at Coastal Areas

    International Nuclear Information System (INIS)

    Muhammad Yazid Nasir; Nurul Adyani Ghazali; Muhammad Izwan Zariq Mokhtar; Norhazlina Suhaimi

    2016-01-01

    Ozone is known as one of the pollutants that contribute to the air pollution problem. Therefore, it is important to carry out studies on ozone. The objective of this study is to find the best statistical distribution for ozone concentration. Three distributions, namely the inverse Gaussian, Weibull and lognormal, were chosen to fit one year of hourly average ozone concentration data recorded in 2010 at Port Dickson and Port Klang. The maximum likelihood estimation (MLE) method was used to estimate the parameters and to develop the probability density function (PDF) and cumulative distribution function (CDF) graphs. Three performance indicators (PI), namely the normalized absolute error (NAE), prediction accuracy (PA) and coefficient of determination (R²), were used to determine the goodness of fit of the distributions. Results show that the Weibull distribution is the best fit, with the smallest error measure (NAE) of 0.08 at Port Klang and 0.31 at Port Dickson. It also scores the highest adequacy measure (PA: 0.99), with R² values of 0.98 (Port Klang) and 0.99 (Port Dickson). These results provide useful information to local authorities for prediction purposes. (author)
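The MLE step is closed-form in the lognormal case (mean and standard deviation of the log data), and an NAE-style goodness-of-fit check can be sketched on synthetic data; the data below are simulated, not the Port Klang or Port Dickson measurements, and the NAE formula used is one common variant:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic hourly "concentration" data for one year (8760 values); a stand-in
# for the station data, which is not reproduced here.
obs = np.sort(rng.lognormal(mean=3.0, sigma=0.4, size=8760))

# MLE for a lognormal is closed-form: the mean and standard deviation of the
# log data estimate the parameters of the underlying normal.
log_obs = np.log(obs)
mu_hat, sigma_hat = log_obs.mean(), log_obs.std()

# QQ-style NAE check: compare the observed order statistics with a sorted
# sample drawn from the fitted distribution (sum of absolute errors
# normalized by the sum of observations).
fit_sample = np.sort(rng.lognormal(mu_hat, sigma_hat, size=obs.size))
nae = np.abs(fit_sample - obs).sum() / obs.sum()
print(f"mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}, NAE = {nae:.4f}")
```

Fitting the Weibull or inverse Gaussian requires iterative MLE rather than a closed form, but the goodness-of-fit comparison proceeds the same way: smaller NAE means a better candidate distribution.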

  20. Interference statistics and capacity analysis for uplink transmission in two-tier small cell networks: A geometric probability approach

    KAUST Repository

    Tabassum, Hina

    2014-07-01

    This paper presents a novel framework to derive the statistics of the interference considering dedicated and shared spectrum access for uplink transmission in two-tier small cell networks such as the macrocell-femtocell networks. The framework exploits the distance distributions from geometric probability theory to characterize the uplink interference while considering a traditional grid-model set-up for macrocells along with the randomly deployed femtocells. The derived expressions capture the impact of path-loss, composite shadowing and fading, uniform and non-uniform traffic loads, spatial distribution of femtocells, and partial and full spectral reuse among femtocells. Considering dedicated spectrum access, first, we derive the statistics of co-tier interference incurred at both femtocell and macrocell base stations (BSs) from a single interferer by approximating the generalized-K composite fading distribution with the tractable Gamma distribution. We then derive the distribution of the number of interferers considering partial spectral reuse and the moment generating function (MGF) of the cumulative interference for both partial and full spectral reuse scenarios. Next, we derive the statistics of the cross-tier interference at both femtocell and macrocell BSs considering shared spectrum access. Finally, we utilize the derived expressions to analyze the capacity in both dedicated and shared spectrum access scenarios. The derived expressions are validated by Monte Carlo simulations. Numerical results are generated to assess the feasibility of shared and dedicated spectrum access in femtocells under varying traffic load and spectral reuse scenarios. © 2014 IEEE.

  1. Cumulative Student Loan Debt in Minnesota, 2015

    Science.gov (United States)

    Williams-Wyche, Shaun

    2016-01-01

    To better understand student debt in Minnesota, the Minnesota Office of Higher Education (the Office) gathers information on cumulative student loan debt from Minnesota degree-granting institutions. These data detail the number of students with loans by institution, the cumulative student loan debt incurred at that institution, and the percentage…

  2. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Estimating cumulative soil accumulation rates with in situ-produced cosmogenic nuclide depth profiles

    International Nuclear Information System (INIS)

    Phillips, William M.

    2000-01-01

    A numerical model relating spatially averaged rates of cumulative soil accumulation and hillslope erosion to cosmogenic nuclide distribution in depth profiles is presented. Model predictions are compared with cosmogenic ²¹Ne and AMS radiocarbon data from soils of the Pajarito Plateau, New Mexico. Rates of soil accumulation and hillslope erosion estimated by cosmogenic ²¹Ne are significantly lower than rates indicated by radiocarbon and regional soil-geomorphic studies. The low apparent cosmogenic erosion rates are artifacts of high nuclide inheritance in cumulative soil parent material produced from erosion of old soils on hillslopes. In addition, ²¹Ne profiles produced under conditions of rapid accumulation (>0.1 cm/a) are difficult to distinguish from bioturbated soil profiles. Modeling indicates that while ¹⁰Be profiles will share this problem, both bioturbation and anomalous inheritance can be identified with measurement of in situ-produced ¹⁴C

  4. Supervised learning of probability distributions by neural networks

    Science.gov (United States)

    Baum, Eric B.; Wilczek, Frank

    1988-01-01

    Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.
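The modification described, following gradients of the log-likelihood with probabilistic outputs, reduces in the simplest case to logistic regression trained by gradient ascent on the mean log-likelihood. A minimal sketch with synthetic data (the "network" here is a single logistic unit, a deliberate simplification):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic binary data generated by a known logistic model.
X = rng.normal(size=(500, 2))
true_w = np.array([1.5, -2.0])
y = (rng.random(500) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

# Ascend the mean log-likelihood instead of descending the squared error:
# the gradient for a logistic output is X^T (y - p) / n.
w = np.zeros(2)
lr = 0.1
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))          # predicted probability of class 1
    w += lr * X.T @ (y - p) / len(y)

print(w)   # should approach true_w
```

The likelihood gradient has the same form as the squared-error gradient with the derivative-of-sigmoid factor cancelled, which is one reason the probabilistic interpretation tends to train more reliably.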

  5. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
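Under the Rayleigh assumption mentioned above, the exceedance probability of an individual wave height h given significant wave height Hs is exp(-2(h/Hs)²), so the encounter probability of the maximum of N individual waves, and the design height for a target encounter probability, follow directly. A sketch with illustrative values:

```python
import math

# Rayleigh assumption for individual wave heights:
# P(H > h | Hs) = exp(-2 (h / Hs)^2).
def exceedance(h, hs):
    return math.exp(-2.0 * (h / hs) ** 2)

# Probability that the largest of n individual waves exceeds h
# (independence assumed, as in the classical derivation).
def max_exceedance(h, hs, n):
    return 1.0 - (1.0 - exceedance(h, hs)) ** n

# Design individual wave height whose exceedance probability over n waves is p.
def design_wave_height(p, hs, n):
    q = 1.0 - (1.0 - p) ** (1.0 / n)      # per-wave exceedance probability
    return hs * math.sqrt(-math.log(q) / 2.0)

hs, n = 5.0, 1000                          # illustrative: Hs = 5 m, 1000 waves
h = design_wave_height(0.1, hs, n)         # 10% encounter probability
print(round(h, 2), max_exceedance(h, hs, n))
```

In the paper's setting Hs is itself a random variable drawn from the long-term extreme significant wave height statistics, so the calculation above would be integrated over that distribution rather than evaluated at a single Hs.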

  6. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1982-01-01

    Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (the injection probability η(F) and the height distribution ψ(Z,F)), which are developed in this study, and one plant-specific parameter (the number of potential missiles N_p). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to the calculation of the probability of common cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets

  7. Characterizing the Lyman-alpha forest flux probability distribution function using Legendre polynomials

    Science.gov (United States)

    Cieplak, Agnieszka; Slosar, Anze

    2018-01-01

    The Lyman-alpha forest has become a powerful cosmological probe at intermediate redshift. It is a highly non-linear field with much information present beyond the power spectrum. The flux probability distribution function (PDF) in particular has been a successful probe of small scale physics. However, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which lead to possibly biased estimators. Here we argue that measuring the coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. Since the n-th Legendre coefficient can be expressed as a linear combination of the first n moments of the field, this allows for the coefficients to be measured in the presence of noise and allows for a clear route towards marginalization over the mean flux. Additionally, in the presence of noise, a finite number of these coefficients are well measured with a very sharp transition into noise dominance. This compresses the information into a small number of well-measured quantities. Finally, we find that measuring fewer quasars with high signal-to-noise produces a higher amount of recoverable information.

  8. Convergence of Transition Probability Matrix in CLV Markov Models

    Science.gov (United States)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its long-run behavior, which is derived from a property of the transition probability matrix for n steps: the convergence of the n-step transition matrix as n tends to infinity. Mathematically, finding the convergence of the transition probability matrix means finding the limit of the matrix raised to the power n as n tends to infinity. This convergence is of interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of a transition probability matrix is the limiting-distribution process. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept from linear algebra: diagonalizing the matrix. This method has a higher level of complexity, because the matrix must be diagonalized, but it has the advantage of yielding a general closed form for the n-th power of the transition probability matrix, which makes it possible to examine the transition matrix before it becomes stationary. Example cases are taken from a CLV model using an MCM, called the CLV-Markov model. Several of its transition probability matrices are examined for their convergence forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
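The diagonalization route can be sketched in a few lines: writing P = V D V⁻¹ gives Pⁿ = V Dⁿ V⁻¹, a closed form for every n, and the limit follows because every eigenvalue with modulus below one vanishes. The 2x2 matrix below is illustrative, not taken from the paper's CLV examples:

```python
import numpy as np

# Row-stochastic transition matrix (illustrative values).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Diagonalize: P = V D V^{-1}, so P^n = V D^n V^{-1}.
eigvals, V = np.linalg.eig(P)

def P_power(n):
    # General closed form for the n-step transition matrix.
    return (V @ np.diag(eigvals**n) @ np.linalg.inv(V)).real

# As n grows, every eigenvalue with |lambda| < 1 vanishes and P^n converges
# to its stationary form: all rows equal to the stationary distribution.
print(P_power(1))       # equals P
print(P_power(200))     # rows approach the stationary distribution [0.8, 0.2]
```

Here the eigenvalues are 1 and 0.5, so the 0.5ⁿ term decays and both rows of Pⁿ converge to the stationary distribution, matching what the limiting-distribution method gives.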

  9. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  10. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  11. Cumulative stress and autonomic dysregulation in a community sample.

    Science.gov (United States)

    Lampert, Rachel; Tuit, Keri; Hong, Kwang-Ik; Donovan, Theresa; Lee, Forrester; Sinha, Rajita

    2016-05-01

    Whether cumulative stress, including both chronic stress and adverse life events, is associated with decreased heart rate variability (HRV), a non-invasive measure of autonomic status that predicts poor cardiovascular outcomes, is unknown. Healthy community-dwelling volunteers (N = 157, mean age 29 years) participated in the Cumulative Stress/Adversity Interview (CAI), a 140-item event interview measuring cumulative adversity including major life events, life trauma, recent life events and chronic stressors, and underwent 24-h ambulatory ECG monitoring. HRV was analyzed in the frequency domain, and the standard deviation of NN intervals (SDNN) was calculated. Initial simple regression analyses revealed that the total cumulative stress score, chronic stressors and cumulative adverse life events (CALE) were all inversely associated with ultra-low-frequency (ULF), very-low-frequency (VLF) and low-frequency (LF) power and SDNN (all p ... accounting for additional appreciable variance. For VLF and LF, both total cumulative stress and chronic stress significantly contributed to the variance alone but were no longer significant after adjusting for race and health behaviors. In summary, total cumulative stress and its components of adverse life events and chronic stress were associated with decreased cardiac autonomic function as measured by HRV. Findings suggest one potential mechanism by which stress may exert adverse effects on mortality in healthy individuals. Primary preventive strategies, including stress management, may prove beneficial.
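
    The SDNN metric used above is simply the sample standard deviation of the normal-to-normal (NN) intervals; a minimal sketch (toy interval values, not study data):

```python
import numpy as np

def sdnn(nn_ms):
    """SDNN: standard deviation of NN intervals, in ms.

    A standard time-domain HRV summary; lower values indicate
    reduced heart rate variability.
    """
    return np.asarray(nn_ms, dtype=float).std(ddof=1)

# Toy NN-interval series (ms); real analyses use 24-h ambulatory recordings.
example = [812, 790, 830, 805, 821, 797, 815]
value = sdnn(example)
```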

  12. Cumulative effects of cascade hydropower stations on total dissolved gas supersaturation.

    Science.gov (United States)

    Ma, Qian; Li, Ran; Feng, Jingjie; Lu, Jingying; Zhou, Qin

    2018-03-01

    Elevated levels of total dissolved gas (TDG) may occur downstream of dams during the spill process. These high levels would increase the incidence of gas bubble disease in fish and cause severe environmental impacts. With increasing numbers of cascade hydropower stations being built or planned, the cumulative effects of TDG supersaturation are becoming increasingly prominent. The TDG saturation distribution in the downstream reaches of the Jinsha River was studied to investigate the cumulative effects of TDG supersaturation resulting from the cascade hydropower stations. A comparison of the effects of the joint operation and the single operation of two hydropower stations (XLD and XJB) was performed to analyze the risk degree to fish posed by TDG supersaturation. The results showed that water with supersaturated TDG generated at the upstream cascade can be transported to the downstream power station, leading to cumulative TDG supersaturation effects. Compared with the single operation of XJB, the joint operation of both stations produced a much higher TDG saturation downstream of XJB, especially during the non-flood discharge period. Moreover, the duration of high TDG saturation and the lengths of the lethal and sub-lethal areas were much higher in the joint operation scenario, posing a greater threat to fish and severely damaging the environment. This work provides a scientific basis for strategies to reduce TDG supersaturation to the permissible level and minimize the potential risk of supersaturated TDG.

  13. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  14. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2015-01-01

    We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with the theory in our experiment...

  15. The Visualization of the Space Probability Distribution for a Particle Moving in a Double Ring-Shaped Coulomb Potential

    Directory of Open Access Journals (Sweden)

    Yuan You

    2018-01-01

    Full Text Available The analytical solutions for a double ring-shaped Coulomb potential (RSCP) are presented. The visualizations of the space probability distribution (SPD) are illustrated for the two-dimensional (contour) and three-dimensional (isosurface) cases. The quantum numbers (n,l,m) are mainly relevant to the quasi-quantum numbers (n′,l′,m′) via the double RSCP parameter c. The SPDs are of circular ring shape in spherical coordinates. The properties of the relative probability values (RPVs) P are also discussed. For example, when we consider the special case (n,l,m)=(6,5,0), the SPD moves towards the two poles of the z-axis as P increases. Finally, we discuss the different cases for the potential parameter b, which takes negative and positive values for c>0. Compared with the particular case b=0, the SPDs are shrunk for b=-0.5, while they are spread out for b=0.5.

  16. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. Probability Distribution of Long-run Indiscriminate Felling of Trees in ...

    African Journals Online (AJOL)

    Bright

    conditionally independent of every prior state given the current state (Obodos, ... of events or experiments in which the probability of occurrence for an event ... represent the exhaustive and mutually exclusive outcomes (states) of a system at.

  18. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.
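
    A core ingredient of cumulative prospect theory is its inverse-S probability weighting function, which captures the overweighting of low-probability, high-consequence events. A minimal sketch using the standard Tversky-Kahneman (1992) form (the parameter value is their published estimate for gains, not a value from this paper):

```python
import numpy as np

def w(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function from
    cumulative prospect theory; gamma = 0.61 is their estimate for gains."""
    p = np.asarray(p, dtype=float)
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Inverse-S pattern: small probabilities are overweighted (w(p) > p),
# moderate-to-large probabilities are underweighted (w(p) < p).
small = float(w(0.01))
large = float(w(0.9))
```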

  19. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition. The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps, whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  20. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  1. Original and cumulative prospect theory: a discussion of empirical differences

    NARCIS (Netherlands)

    Wakker, P.P.; Fennema, H.

    1997-01-01

    This note discusses differences between prospect theory and cumulative prospect theory. It shows that cumulative prospect theory is not merely a formal correction of some theoretical problems in prospect theory, but it also gives different predictions. Experiments are described that favor cumulative

  2. Path probability of stochastic motion: A functional approach

    Science.gov (United States)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of a functional technique, and a general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock prices in finance.

  3. Cumulative radiation dose of multiple trauma patients during their hospitalization

    International Nuclear Information System (INIS)

    Wang Zhikang; Sun Jianzhong; Zhao Zudan

    2012-01-01

    Objective: To study the cumulative radiation dose of multiple trauma patients during their hospitalization and to analyze the factors influencing dose. Methods: The DLP for CT and DR were retrospectively collected for patients treated between June 2009 and April 2011 at a university-affiliated hospital. The cumulative radiation doses were calculated by summing typical effective doses for the anatomic regions scanned. Results: The cumulative radiation doses of 113 patients were collected. The maximum, minimum, and mean cumulative effective doses were 153.3 mSv, 16.48 mSv, and (52.3 ± 26.6) mSv, respectively. Conclusions: Multiple trauma patients have high cumulative radiation exposure. Therefore, the management of cumulative radiation doses should be enhanced. Establishing individualized radiation exposure archives will help clinicians and technicians decide whether to image again and how to select the imaging parameters. (authors)

  4. Properties of truncated multiplicity distributions

    International Nuclear Information System (INIS)

    Lupia, S.

    1995-01-01

    Truncation effects on multiplicity distributions are discussed. Observables sensitive to the tail, like factorial moments, factorial cumulants and their ratio, are shown to be strongly affected by truncation. A possible way to overcome this problem by looking at the head of the distribution is suggested. (author)
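
    A small numerical illustration of the truncation effect described above (a Poisson multiplicity distribution is assumed purely for concreteness; for a Poisson the second factorial moment is exactly λ², and cutting the tail pulls it down):

```python
import numpy as np
from math import exp, factorial

def pmf_poisson(lam, nmax):
    """Poisson multiplicity distribution P(n) for n = 0..nmax."""
    return np.array([exp(-lam) * lam**n / factorial(n) for n in range(nmax + 1)])

def factorial_moment(p, q):
    """F_q = sum_n n(n-1)...(n-q+1) P(n), sensitive to the tail of P(n)."""
    n = np.arange(len(p))
    fall = np.ones_like(p)
    for k in range(q):
        fall = fall * (n - k)
    return float((fall * p).sum())

lam = 5.0
full = pmf_poisson(lam, 60)                        # effectively untruncated
trunc = pmf_poisson(lam, 8)
trunc /= trunc.sum()                               # tail cut at n = 8, renormalized

F2_full = factorial_moment(full, 2)                # ~ lam**2 = 25 for a Poisson
F2_trunc = factorial_moment(trunc, 2)              # reduced by the missing tail
```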

  5. Properties of truncated multiplicity distributions

    Energy Technology Data Exchange (ETDEWEB)

    Lupia, S. [Turin Univ. (Italy). Dipt. di Fisica

    1995-12-31

    Truncation effects on multiplicity distributions are discussed. Observables sensitive to the tail, like factorial moments, factorial cumulants and their ratio, are shown to be strongly affected by truncation. A possible way to overcome this problem by looking at the head of the distribution is suggested. (author)

  6. Perspectives on cumulative risks and impacts.

    Science.gov (United States)

    Faust, John B

    2010-01-01

    Cumulative risks and impacts have taken on different meanings in different regulatory and programmatic contexts at federal and state government levels. Traditional risk assessment methodologies, with considerable limitations, can provide a framework for the evaluation of cumulative risks from chemicals. Under an environmental justice program in California, cumulative impacts are defined to include exposures, public health effects, or environmental effects in a geographic area from the emission or discharge of environmental pollution from all sources, through all media. Furthermore, the evaluation of these effects should take into account sensitive populations and socioeconomic factors where possible and to the extent data are available. Key aspects to this potential approach include the consideration of exposures (versus risk), socioeconomic factors, the geographic or community-level assessment scale, and the inclusion of not only health effects but also environmental effects as contributors to impact. Assessments of this type extend the boundaries of the types of information that toxicologists generally provide for risk management decisions.

  7. Cumulative particle production in the quark recombination model

    International Nuclear Information System (INIS)

    Gavrilov, V.B.; Leksin, G.A.

    1987-01-01

    Production of cumulative particles in hadron-nucleus interactions at high energies is considered within the framework of the recombination quark model. Predictions are made for the inclusive cross sections for production of cumulative particles and different resonances containing quarks in the s state.

  8. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    Science.gov (United States)

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

    Given an N-electron molecule and an exhaustive partition of real space (R³) into m arbitrary regions Ω1, Ω2, …, Ωm (⋃i=1..m Ωi = R³), the edf program computes all the probabilities P(n1, n2, …, nm) of having exactly n1 electrons in Ω1, n2 electrons in Ω2, …, and nm electrons (n1+n2+⋯+nm = N) in Ωm. Each Ωi may correspond to a single basin (atomic domain) or to several such basins (functional group). In the latter case, each atomic domain must belong to a single Ωi. The program can manage both single- and multi-determinant wave functions, which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinant wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals over all the atomic domains between the molecular orbitals (MOs). After the P(n1, n2, …, nm) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n1, n2, …, nm) probabilities into α and β spin components. Program summary: Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer

  9. A stochastic-bayesian model for the fracture probability of PWR pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Francisco, Alexandre S.; Duran, Jorge Alberto R., E-mail: afrancisco@metal.eeimvr.uff.br, E-mail: duran@metal.eeimvr.uff.br [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil). Dept. de Engenharia Mecanica

    2013-07-01

    Fracture probability of pressure vessels containing cracks can be obtained by easily understood methodologies that require a deterministic treatment complemented by statistical methods. However, when more accurate results are required, the methodologies need to be better formulated. This paper presents a new methodology to address this problem. First, a more rigorous methodology is obtained by relating the probability distributions that model crack incidence and nondestructive inspection efficiency using Bayes' theorem. The result is an updated crack incidence distribution. Further, the accuracy of the methodology is improved by using a stochastic model for the crack growth. The stochastic model incorporates the statistical variability of the crack growth process, combining stochastic theory with experimental data. Stochastic differential equations are derived by the randomization of empirical equations. From the solution of this equation, a distribution function related to the crack growth is derived. The fracture probability obtained using both probability distribution functions is in agreement with theory and presents realistic values for pressure vessels. (author)
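
    The Bayesian update of the crack incidence distribution can be sketched on a discretized crack-size grid (the prior shape and probability-of-detection curve below are hypothetical placeholders, not the paper's data):

```python
import numpy as np

# Hypothetical discretized crack-depth grid (mm) and prior incidence distribution.
a = np.linspace(0.5, 10.0, 20)
prior = np.exp(-a / 2.0)
prior /= prior.sum()                    # assumed exponential-shaped prior

# Assumed probability-of-detection (POD) curve for the NDE inspection:
# larger cracks are more likely to be found.
pod = 1.0 - np.exp(-0.4 * a)

# Bayes' theorem: cracks remaining after an inspection with no detection
# are re-weighted by the non-detection likelihood (1 - POD).
posterior = prior * (1.0 - pod)
posterior /= posterior.sum()
```

    Because the non-detection likelihood decreases with crack size, the updated incidence distribution shifts toward smaller cracks, exactly the qualitative effect of folding inspection efficiency into the incidence model.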

  10. A stochastic-bayesian model for the fracture probability of PWR pressure vessels

    International Nuclear Information System (INIS)

    Francisco, Alexandre S.; Duran, Jorge Alberto R.

    2013-01-01

    Fracture probability of pressure vessels containing cracks can be obtained by easily understood methodologies that require a deterministic treatment complemented by statistical methods. However, when more accurate results are required, the methodologies need to be better formulated. This paper presents a new methodology to address this problem. First, a more rigorous methodology is obtained by relating the probability distributions that model crack incidence and nondestructive inspection efficiency using Bayes' theorem. The result is an updated crack incidence distribution. Further, the accuracy of the methodology is improved by using a stochastic model for the crack growth. The stochastic model incorporates the statistical variability of the crack growth process, combining stochastic theory with experimental data. Stochastic differential equations are derived by the randomization of empirical equations. From the solution of this equation, a distribution function related to the crack growth is derived. The fracture probability obtained using both probability distribution functions is in agreement with theory and presents realistic values for pressure vessels. (author)

  11. Mechanisms and risk of cumulative impacts to coastal ecosystem services: An expert elicitation approach.

    Science.gov (United States)

    Singh, Gerald G; Sinner, Jim; Ellis, Joanne; Kandlikar, Milind; Halpern, Benjamin S; Satterfield, Terre; Chan, Kai M A

    2017-09-01

    Coastal environments are some of the most populated on Earth, with greater pressures projected in the future. Managing coastal systems requires the consideration of multiple uses, which both benefit from and threaten multiple ecosystem services. Thus understanding the cumulative impacts of human activities on coastal ecosystem services would seem fundamental to management, yet there is no widely accepted approach for assessing these. This study trials an approach for understanding the cumulative impacts of anthropogenic change, focusing on Tasman and Golden Bays, New Zealand. Using an expert elicitation procedure, we collected information on three aspects of cumulative impacts: the importance and magnitude of impacts by various activities and stressors on ecosystem services, and the causal processes of impact on ecosystem services. We assessed impacts to four ecosystem service benefits (fisheries, shellfish aquaculture, marine recreation, and the existence value of biodiversity), addressing three main research questions: (1) how severe are cumulative impacts on ecosystem services (correspondingly, what potential is there for restoration)?; (2) are threats evenly distributed across activities and stressors, or do a few threats dominate?; (3) do prominent activities mainly operate through direct stressors, or do they often exacerbate other impacts? We found (1) that despite high uncertainty in the threat posed by individual stressors and impacts, total cumulative impact is consistently severe for all four ecosystem services. (2) A subset of drivers and stressors pose important threats across the ecosystem services explored, including climate change, commercial fishing, sedimentation and pollution. (3) Climate change and commercial fishing contribute to prominent indirect impacts across ecosystem services by exacerbating regional impacts, namely sedimentation and pollution. The prevalence and magnitude of these indirect, networked impacts highlights the need for approaches

  12. Mechanisms and risk of cumulative impacts to coastal ecosystem services: An expert elicitation approach

    KAUST Repository

    Singh, Gerald G.

    2017-05-23

    Coastal environments are some of the most populated on Earth, with greater pressures projected in the future. Managing coastal systems requires the consideration of multiple uses, which both benefit from and threaten multiple ecosystem services. Thus understanding the cumulative impacts of human activities on coastal ecosystem services would seem fundamental to management, yet there is no widely accepted approach for assessing these. This study trials an approach for understanding the cumulative impacts of anthropogenic change, focusing on Tasman and Golden Bays, New Zealand. Using an expert elicitation procedure, we collected information on three aspects of cumulative impacts: the importance and magnitude of impacts by various activities and stressors on ecosystem services, and the causal processes of impact on ecosystem services. We assessed impacts to four ecosystem service benefits (fisheries, shellfish aquaculture, marine recreation, and the existence value of biodiversity), addressing three main research questions: (1) how severe are cumulative impacts on ecosystem services (correspondingly, what potential is there for restoration)?; (2) are threats evenly distributed across activities and stressors, or do a few threats dominate?; (3) do prominent activities mainly operate through direct stressors, or do they often exacerbate other impacts? We found (1) that despite high uncertainty in the threat posed by individual stressors and impacts, total cumulative impact is consistently severe for all four ecosystem services. (2) A subset of drivers and stressors pose important threats across the ecosystem services explored, including climate change, commercial fishing, sedimentation and pollution. (3) Climate change and commercial fishing contribute to prominent indirect impacts across ecosystem services by exacerbating regional impacts, namely sedimentation and pollution. The prevalence and magnitude of these indirect, networked impacts highlights the need for

  13. Cumulative radiation exposure in children with cystic fibrosis.

    LENUS (Irish Health Repository)

    O'Reilly, R

    2010-02-01

    This retrospective study calculated the cumulative radiation dose for children with cystic fibrosis (CF) attending a tertiary CF centre. Information on 77 children with a mean age of 9.5 years, a follow-up time of 658 person-years and 1757 studies, including 1485 chest radiographs, 215 abdominal radiographs and 57 computed tomography (CT) scans, of which 51 were thoracic CT scans, was analysed. The average cumulative radiation dose was 6.2 (0.04-25) mSv per CF patient. Cumulative radiation dose increased with increasing age and number of CT scans and was greater in children who presented with meconium ileus. No correlation was identified between cumulative radiation dose and either lung function or patient microbiology cultures. Radiation carries a risk of malignancy, and children are particularly susceptible. Every effort must be made to avoid unnecessary radiation exposure in these patients, whose life expectancy is increasing.

  14. Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions

    International Nuclear Information System (INIS)

    Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.

    2010-01-01

    A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
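
    The Monte Carlo prescription above (sample the input quantities, then report the 0.50 quantile as the recommended rate, the 0.16 and 0.84 quantiles as the low and high rates, and a lognormal fit with parameters μ and σ) can be sketched as follows (toy lognormal inputs, not actual nuclear data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in: a reaction rate proportional to a product of lognormally
# distributed input quantities (values are illustrative, not physical).
samples = (rng.lognormal(mean=0.0, sigma=0.3, size=50_000)
           * rng.lognormal(mean=1.0, sigma=0.2, size=50_000))

# Median (recommended) rate plus the 0.16/0.84 quantiles of the
# cumulative reaction rate distribution.
low, median, high = np.quantile(samples, [0.16, 0.50, 0.84])

# Lognormal approximation of the output rate distribution.
mu = np.log(samples).mean()
sigma = np.log(samples).std()
```

    For this toy case the product of lognormals is itself lognormal (μ = 1.0, σ = √0.13), so the fitted parameters can be checked against the known answer.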

  15. Streamflow distribution maps for the Cannon River drainage basin, southeast Minnesota, and the St. Louis River drainage basin, northeast Minnesota

    Science.gov (United States)

    Smith, Erik A.; Sanocki, Chris A.; Lorenz, David L.; Jacobsen, Katrin E.

    2017-12-27

    Streamflow distribution maps for the Cannon River and St. Louis River drainage basins were developed by the U.S. Geological Survey, in cooperation with the Legislative-Citizen Commission on Minnesota Resources, to illustrate relative and cumulative streamflow distributions. The Cannon River was selected to provide baseline data to assess the effects of potential surficial sand mining, and the St. Louis River was selected to determine the effects of ongoing Mesabi Iron Range mining. Each drainage basin (Cannon, St. Louis) was subdivided into nested drainage basins: the Cannon River was subdivided into 152 nested drainage basins, and the St. Louis River was subdivided into 353 nested drainage basins. For each smaller drainage basin, the estimated volumes of groundwater discharge (as base flow) and surface runoff flowing into all surface-water features were displayed under the following conditions: (1) extreme low-flow conditions, comparable to an exceedance-probability quantile of 0.95; (2) low-flow conditions, comparable to an exceedance-probability quantile of 0.90; (3) a median condition, comparable to an exceedance-probability quantile of 0.50; and (4) a high-flow condition, comparable to an exceedance-probability quantile of 0.02. Streamflow distribution maps were developed using flow-duration curve exceedance-probability quantiles in conjunction with Soil-Water-Balance model outputs; both the flow-duration curve and Soil-Water-Balance models were built upon previously published U.S. Geological Survey reports. The selected streamflow distribution maps provide a proactive water-management tool for State cooperators by illustrating flow rates during a range of hydraulic conditions. Furthermore, after the nested drainage basins are highlighted in terms of surface-water flows, the streamflows can be evaluated in the context of meeting specific ecological flows under different flow regimes and potentially assist with decisions regarding groundwater and surface
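
    An exceedance-probability quantile is simply the complementary quantile of the flow record (a point on the flow-duration curve); a minimal sketch with a synthetic record (not USGS data):

```python
import numpy as np

def exceedance_quantile(flow, p_exceed):
    """Streamflow exceeded with probability p_exceed: the (1 - p_exceed)
    quantile of the flow record (a point on the flow-duration curve)."""
    return np.quantile(np.asarray(flow, dtype=float), 1.0 - p_exceed)

rng = np.random.default_rng(1)
daily_flow = rng.lognormal(mean=3.0, sigma=0.8, size=10_000)  # synthetic record

q95 = exceedance_quantile(daily_flow, 0.95)  # extreme low-flow condition
q50 = exceedance_quantile(daily_flow, 0.50)  # median condition
q02 = exceedance_quantile(daily_flow, 0.02)  # high-flow condition
```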

  16. Probability of crack-initiation and application to NDE

    Energy Technology Data Exchange (ETDEWEB)

    Prantl, G. [Nuclear Safety Inspectorate HSK (Switzerland)]

    1988-12-31

    Fracture toughness is a property with a certain variability. When a statistical distribution is assumed, the probability of crack initiation may be calculated for a given problem defined by its geometry and the applied stress. Experiments have shown that cracks which experience a certain small amount of ductile growth can reliably be detected by acoustic emission measurements. The probability of crack detection by AE techniques may be estimated using this experimental finding and the calculated probability of crack initiation. (author).
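A minimal sketch of the calculation described: if toughness variability is modeled with a distribution, the probability of crack initiation is the probability that toughness falls below the applied stress intensity. The two-parameter Weibull form and all parameter values below are assumptions for illustration; the abstract does not specify the distribution:

```python
import math

def crack_initiation_probability(k_applied, k0, m):
    """P(initiation) = P(K_Ic < K_applied) for Weibull-distributed toughness.

    k0: Weibull scale (toughness at ~63rd percentile), m: Weibull modulus (shape).
    Units of k_applied and k0 must match (e.g. MPa*sqrt(m)).
    """
    if k_applied <= 0.0:
        return 0.0
    return 1.0 - math.exp(-((k_applied / k0) ** m))

def detection_probability(p_init, p_detect_given_growth=0.95):
    """Combine initiation probability with an assumed AE detection reliability."""
    return p_init * p_detect_given_growth

# Illustrative values only.
p = crack_initiation_probability(60.0, 80.0, 4.0)
p_ae = detection_probability(p)
```

The second function mirrors the abstract's argument: once a crack initiates and grows slightly, AE detection is reliable, so the detection probability is driven mainly by the initiation probability.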

  17. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters, and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
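A small Monte Carlo sketch of the kind of problem the article identifies: two groups share an identical latent (here logistic) effect, yet a different spread of the predictor alone changes the fitted linear probability model slope. All names and parameter values are illustrative assumptions, not taken from the article:

```python
import math
import random

def simulate_group(n, beta, x_sd, seed):
    """Binary outcomes from a logistic latent model with slope beta."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        x = rng.gauss(0.0, x_sd)
        p = 1.0 / (1.0 + math.exp(-beta * x))  # same latent effect in both groups
        xs.append(x)
        ys.append(1.0 if rng.random() < p else 0.0)
    return xs, ys

def ols_slope(xs, ys):
    """Slope of the linear probability model (OLS of y on x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Identical beta = 1.0; only the predictor's spread differs between groups.
xa, ya = simulate_group(20000, 1.0, 0.5, seed=1)  # narrow predictor distribution
xb, yb = simulate_group(20000, 1.0, 3.0, seed=2)  # wide predictor distribution
slope_a = ols_slope(xa, ya)
slope_b = ols_slope(xb, yb)
```

With the wider predictor distribution, more observations sit in the flat tails of the logistic curve, so the average marginal effect, and hence the LPM slope, shrinks even though the latent effect is the same. This is the "distributional shape of the predictor variable" component the abstract names.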

  18. On misclassification probabilities of linear and quadratic classifiers ...

    African Journals Online (AJOL)

    We study the theoretical misclassification probability of linear and quadratic classifiers and examine the performance of these classifiers under distributional variations in theory and using simulation. We derive expressions for Bayes errors for some competing distributions from the same family under location shift. Keywords: ...
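For the simplest case the abstract alludes to, two equal-prior Gaussian classes with common covariance separated only by a location shift, the Bayes error has a closed form: Phi(-delta/2), where delta is the Mahalanobis distance between the class means. A sketch under that assumed setup (not the paper's specific derivations):

```python
import math

def bayes_error_equal_gaussians(delta):
    """Bayes misclassification rate for two equal-prior Gaussian classes with
    common covariance and Mahalanobis distance delta between their means.

    Phi(-delta/2) expressed via the complementary error function:
    Phi(z) = 0.5 * erfc(-z / sqrt(2)).
    """
    return 0.5 * math.erfc(delta / (2.0 * math.sqrt(2.0)))

err = bayes_error_equal_gaussians(2.0)  # delta = 2, i.e. means 2 Mahalanobis units apart
```

The error is 0.5 at delta = 0 (indistinguishable classes) and decreases monotonically as the location shift grows, which is the qualitative behavior such expressions capture.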

  19. Optimal execution with price impact under Cumulative Prospect Theory

    Science.gov (United States)

    Zhao, Jingdong; Zhu, Hongliang; Li, Xindan

    2018-01-01

    Optimal execution of a stock (or portfolio) has been widely studied in academia and in practice over the past decade, with minimizing transaction costs as a central concern. However, few researchers consider the psychological factors affecting traders. What are traders truly concerned with: buying low on paper, or buying lower than others? We consider optimal trading strategies in terms of price impact and Cumulative Prospect Theory and identify some specific properties. Our analyses indicate that a large proportion of the execution volume is concentrated at both ends of the transaction window, but the trader's optimal strategies may not be implemented at the same transaction size and speed in different market environments.

  20. Measurement of four-particle cumulants and symmetric cumulants with subevent methods in small collision systems with the ATLAS detector

    CERN Document Server

    Derendarz, Dominik; The ATLAS collaboration

    2018-01-01

    Measurements of symmetric cumulants SC(n,m) = ⟨v_n^2 v_m^2⟩ − ⟨v_n^2⟩⟨v_m^2⟩ for (n,m) = (2,3) and (2,4) and asymmetric cumulants AC(n) are presented in pp, p+Pb and peripheral Pb+Pb collisions at various collision energies, aiming to probe the long-range collective nature of multi-particle production in small systems. Results are obtained using the standard cumulant method, as well as the two-subevent and three-subevent cumulant methods. Results from the standard method are found to be strongly biased by non-flow correlations, as indicated by strong sensitivity to the chosen event class definition. A systematic reduction of non-flow effects is observed when using the two-subevent method, and the results become independent of event class definition when the three-subevent method is used. The measured SC(n,m) shows an anti-correlation between v_2 and v_3, and a positive correlation between v_2 and v_4. The magnitude of SC(n,m) is constant with N_ch in pp collisions, but increases with N_ch in p+Pb and Pb+Pb collisions. ...
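The defining formula for SC(n,m) can be sketched directly from per-event flow coefficients. This is a deliberate simplification: the measurement itself builds the cumulants from multi-particle correlations (with subevents to suppress non-flow), not from pre-computed v_n values, and the toy event sample below is purely illustrative:

```python
import random

def symmetric_cumulant(v_n_sq, v_m_sq):
    """SC(n,m) = <v_n^2 v_m^2> - <v_n^2><v_m^2>, averaged over events.

    Inputs are per-event lists of squared flow coefficients.
    Negative SC means v_n and v_m are anti-correlated event by event.
    """
    n_ev = len(v_n_sq)
    mean_nm = sum(a * b for a, b in zip(v_n_sq, v_m_sq)) / n_ev
    mean_n = sum(v_n_sq) / n_ev
    mean_m = sum(v_m_sq) / n_ev
    return mean_nm - mean_n * mean_m

rng = random.Random(0)
# Toy events with v2^2 and v3^2 anti-correlated, mimicking the sign measured for SC(2,3).
v2sq = [rng.uniform(0.002, 0.01) for _ in range(5000)]
v3sq = [0.012 - v for v in v2sq]
sc23 = symmetric_cumulant(v2sq, v3sq)  # negative by construction
```

With perfectly anti-correlated inputs, SC reduces to minus the variance of v_2^2, hence the negative sign; a variable correlated with itself gives a positive SC.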