Pattern formation, logistics, and maximum path probability
Kirkaldy, J. S.
1985-05-01
The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are
Maximum Entropy and Probability Kinematics Constrained by Conditionals
Stefan Lukits
2015-03-01
Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey's updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.
Maximum-entropy probability distributions under Lp-norm constraints
Dolinar, S.
1991-01-01
Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given Lp norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the Lp norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the Lp norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
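As a check on the closed-form continuous case summarized above, the entropy maximizer for an unconstrained real variable with a fixed Lp norm is a generalized Gaussian, and its entropy is linear in the logarithm of that norm. The sketch below is an illustrative reconstruction under those assumptions (it is not code from the report) and verifies the straight-line relationship numerically.

```python
# Illustrative sketch (not code from the report): maximum differential entropy of a
# real-valued random variable with fixed Lp norm mu_p = (E|X|^p)^(1/p).  The maximizer
# is the generalized Gaussian f(x) = c * exp(-(|x|/alpha)^p), and the entropy equals
# ln(mu_p) plus a constant that depends only on p (a straight line in ln(mu_p)).
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

def max_entropy_lp(p, mu_p):
    """Closed-form maximum differential entropy (nats) for a given Lp norm."""
    return np.log(mu_p) + 1.0 / p + np.log(2.0 * gamma(1.0 / p) * p ** (1.0 / p - 1.0))

def numeric_check(p, mu_p):
    """Entropy of the generalized Gaussian with E|X|^p = mu_p^p, by quadrature."""
    alpha = mu_p * p ** (1.0 / p)                     # scale giving E|X|^p = mu_p^p
    c = p / (2.0 * alpha * gamma(1.0 / p))            # normalization constant
    f = lambda x: c * np.exp(-(abs(x) / alpha) ** p)
    integrand = lambda x: f(x) * ((abs(x) / alpha) ** p - np.log(c))   # = -f*ln(f)
    return quad(integrand, -np.inf, np.inf)[0]

for p in (1, 2, 4):
    print(p, max_entropy_lp(p, 1.7), numeric_check(p, 1.7))
# For p = 2 this reduces to the familiar Gaussian value 0.5*ln(2*pi*e*sigma^2).
```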
A probabilistic approach to the concept of Probable Maximum Precipitation
Papalexiou, S. M.; D. Koutsoyiannis
2006-01-01
The concept of Probable Maximum Precipitation (PMP) is based on the assumptions that (a) there exists an upper physical limit of the precipitation depth over a given area at a particular geographical location at a certain time of year, and (b) that this limit can be estimated based on deterministic considerations. The most representative and widespread estimation method of PMP is the so-called moisture maximization method. This method maximizes observed storms assuming...
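The moisture maximization step named in this abstract has a simple arithmetic core: the observed storm depth is scaled by the ratio of the climatological maximum precipitable water to the storm's precipitable water. A minimal sketch with illustrative numbers follows; the cap on the maximization ratio is an assumption for the example, not something stated in the abstract.

```python
# Minimal sketch of storm moisture maximization (illustrative numbers only):
# maximized depth = observed depth * (max precipitable water / storm precipitable water),
# with the ratio commonly capped to avoid unrealistic amplification.

def moisture_maximize(observed_depth_mm, w_storm_mm, w_max_mm, ratio_cap=2.0):
    """Scale an observed storm depth by the precipitable-water ratio."""
    ratio = min(w_max_mm / w_storm_mm, ratio_cap)
    return observed_depth_mm * ratio

# Example: a 180 mm storm that occurred with 45 mm of precipitable water,
# at a site whose climatological maximum precipitable water is 68 mm.
print(moisture_maximize(180.0, 45.0, 68.0))   # ~272 mm maximized depth
```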
Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains
Erik Van der Straeten
2009-11-01
In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.
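As a toy illustration of the estimation idea, not the spin-system calculations of the paper, one can maximize the entropy rate of a two-state reversible chain subject to a fixed stationary distribution; the choice of constraint and the numbers below are assumptions made only for illustration.

```python
# Toy sketch: maximum-entropy choice of transition probabilities for a 2-state
# reversible Markov chain whose stationary distribution (pi0, 1 - pi0) is given.
import numpy as np
from scipy.optimize import minimize_scalar

def entropy_rate(a, pi0):
    """Entropy rate of the chain with P(0->1) = a and P(1->0) = b fixed by
    detailed balance pi0*a = pi1*b, which also fixes the stationary distribution."""
    b = a * pi0 / (1.0 - pi0)
    h2 = lambda p: -(p * np.log(p) + (1 - p) * np.log(1 - p)) if 0 < p < 1 else 0.0
    return pi0 * h2(a) + (1.0 - pi0) * h2(b)

pi0 = 0.7
a_max = min(1.0, (1.0 - pi0) / pi0)            # keep b = a*pi0/(1-pi0) <= 1
res = minimize_scalar(lambda a: -entropy_rate(a, pi0),
                      bounds=(1e-6, a_max - 1e-6), method="bounded")
a = res.x
print("P(0->1) =", a, " P(1->0) =", a * pi0 / (1.0 - pi0))
```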
An Integrated Modeling Framework for Probable Maximum Precipitation and Flood
Gangrade, S.; Rastogi, D.; Kao, S. C.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.
2015-12-01
With the increasing frequency and magnitude of extreme precipitation and flood events projected in the future climate, there is a strong need to enhance our modeling capabilities to assess the potential risks to critical energy-water infrastructures such as major dams and nuclear power plants. In this study, an integrated modeling framework is developed through high performance computing to investigate the climate change effects on probable maximum precipitation (PMP) and probable maximum flood (PMF). Multiple historical storms from 1981-2012 over the Alabama-Coosa-Tallapoosa River Basin near the Atlanta metropolitan area are simulated by the Weather Research and Forecasting (WRF) model using the Climate Forecast System Reanalysis (CFSR) forcings. After further WRF model tuning, these storms are used to simulate PMP through moisture maximization at initial and lateral boundaries. A high resolution hydrological model, the Distributed Hydrology-Soil-Vegetation Model, implemented at 90 m resolution and calibrated by the U.S. Geological Survey streamflow observations, is then used to simulate the corresponding PMF. In addition to the control simulation that is driven by CFSR, multiple storms from the Community Climate System Model version 4 under the Representative Concentration Pathway 8.5 emission scenario are used to simulate PMP and PMF in the projected future climate conditions. The multiple PMF scenarios developed through this integrated modeling framework may be utilized to evaluate the vulnerability of existing energy-water infrastructures with respect to various aspects associated with PMP and PMF.
Needs to Update Probable Maximum Precipitation for Critical Infrastructure
Pathak, C. S.; England, J. F.
2015-12-01
Probable Maximum Precipitation (PMP) is theoretically the greatest depth of precipitation for a given duration that is physically possible over a given size storm area at a particular geographical location at a certain time of the year. It is used to develop inflow flood hydrographs, known as the Probable Maximum Flood (PMF), as a design standard for high-risk flood-hazard structures, such as dams and nuclear power plants. PMP estimation methodology was developed in the 1930s and 40s when many dams were constructed in the US. The procedures to estimate PMP were later standardized by the World Meteorological Organization (WMO) in 1973 and revised in 1986. In the US, PMP estimates were published in a series of Hydrometeorological Reports (e.g., HMR55A, HMR57, and HMR58/59) by the National Weather Service since the 1950s. In these reports, storm data up to the 1980s were used to establish the current PMP estimates. Since that time, we have acquired an additional 30 to 40 years of meteorological data, including newly available radar- and satellite-based precipitation data. These data sets are expected to have improved quality and availability in both time and space. In addition, a significant number of extreme storms have occurred, and some of these events were close to, or in some cases exceeded, the current PMP estimates. In the last 50 years, climate science has progressed and scientists have a better understanding of the atmospheric physics of extreme storms. However, applied research in the estimation of PMP has been lagging behind. Alternative methods, such as atmospheric numerical modeling, should be investigated for estimating PMP and associated uncertainties. It would be highly desirable if regional atmospheric numerical models could be utilized in the estimation of PMP and their uncertainties, in addition to methods used to originally develop PMP index maps in the existing hydrometeorological reports.
Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation
Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin
2016-12-01
If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
Probable Maximum Earthquake Magnitudes for the Cascadia Subduction
Rong, Y.; Jackson, D. D.; Magistrale, H.; Goldfinger, C.
2013-12-01
The concept of maximum earthquake magnitude (mx) is widely used in seismic hazard and risk analysis. However, absolute mx lacks a precise definition and cannot be determined from a finite earthquake history. The surprising magnitudes of the 2004 Sumatra and the 2011 Tohoku earthquakes showed that most methods for estimating mx underestimate the true maximum if it exists. Thus, we introduced the alternative concept of mp(T), the probable maximum magnitude within a time interval T. The mp(T) can be obtained using theoretical magnitude-frequency distributions such as the tapered Gutenberg-Richter (TGR) distribution. The two TGR parameters, the β-value (which equals 2/3 of the b-value in the GR distribution) and the corner magnitude (mc), can be obtained by applying the maximum likelihood method to earthquake catalogs with an additional constraint from the tectonic moment rate. Here, we integrate the paleoseismic data in the Cascadia subduction zone to estimate mp. The Cascadia subduction zone has been seismically quiescent since at least 1900. Fortunately, turbidite studies have unearthed a 10,000 year record of great earthquakes along the subduction zone. We thoroughly investigate the earthquake magnitude-frequency distribution of the region by combining instrumental and paleoseismic data, and using the tectonic moment rate information. To use the paleoseismic data, we first estimate event magnitudes, which we achieve by using the time interval between events, rupture extent of the events, and turbidite thickness. We estimate three sets of TGR parameters: for the first two sets, we consider a geographically large Cascadia region that includes the subduction zone, and the Explorer, Juan de Fuca, and Gorda plates; for the third set, we consider a narrow geographic region straddling the subduction zone. In the first set, the β-value is derived using the GCMT catalog. In the second and third sets, the β-value is derived using both the GCMT and paleoseismic data. Next, we calculate the corresponding mc
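A rough sketch of how an mp(T) value could be computed from a tapered Gutenberg-Richter rate model is given below. The definition used here (the magnitude whose expected number of exceedances in T years equals one) and all parameter values are assumptions made for illustration, not the study's choices.

```python
# Rough sketch (assumed definition and illustrative parameters, not the study's values):
# mp(T) taken here as the magnitude whose expected number of exceedances in T years is 1,
# under a tapered Gutenberg-Richter (TGR) model for seismic moment.
import numpy as np
from scipy.optimize import brentq

def moment(m):
    """Seismic moment (N*m) from moment magnitude (Hanks & Kanamori relation)."""
    return 10.0 ** (1.5 * m + 9.05)

def tgr_rate(m, rate_t, m_t, beta, m_c):
    """Annual rate of events with magnitude >= m under the TGR survivor function."""
    M, Mt, Mc = moment(m), moment(m_t), moment(m_c)
    return rate_t * (Mt / M) ** beta * np.exp((Mt - M) / Mc)

def probable_max_magnitude(T, rate_t, m_t, beta, m_c):
    """Magnitude with an expected count of one event in T years (assumed definition)."""
    return brentq(lambda m: tgr_rate(m, rate_t, m_t, beta, m_c) * T - 1.0, m_t, 10.5)

# Illustrative numbers only: 0.05 events/yr above magnitude 7, beta = 0.65, corner magnitude 8.8.
print(probable_max_magnitude(T=500.0, rate_t=0.05, m_t=7.0, beta=0.65, m_c=8.8))
```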
Paddle River Dam : review of probable maximum flood
Clark, D. [UMA Engineering Ltd., Edmonton, AB (Canada)]; Neill, C.R. [Northwest Hydraulic Consultants Ltd., Edmonton, AB (Canada)]
2008-07-01
The Paddle River Dam was built in northern Alberta in the mid 1980s for flood control. According to the 1999 Canadian Dam Association (CDA) guidelines, this 35 metre high, zoned earthfill dam with a spillway capacity sized to accommodate a probable maximum flood (PMF) is rated as a very high hazard. At the time of design, the PMF was estimated to have a peak flow rate of 858 m³/s. A review of the PMF in 2002 increased the peak flow rate to 1,890 m³/s. In light of a 2007 revision of the CDA safety guidelines, the PMF was reviewed and the inflow design flood (IDF) was re-evaluated. This paper discussed the levels of uncertainty inherent in PMF determinations and some difficulties encountered with the SSARR hydrologic model and the HEC-RAS hydraulic model in unsteady mode. The paper also presented and discussed the analysis used to determine incremental damages, upon which a new IDF of 840 m³/s was recommended. The paper discussed the PMF review, modelling methodology, hydrograph inputs, and incremental damage of floods. It was concluded that the PMF review, involving hydraulic routing through the valley bottom together with reconsideration of the previous runoff modeling, provides evidence that the peak reservoir inflow could reasonably be reduced by approximately 20 per cent. 8 refs., 5 tabs., 8 figs.
Understanding the Role of Reservoir Size on Probable Maximum Precipitation
Woldemichael, A. T.; Hossain, F.
2011-12-01
This study addresses the question 'Does the surface area of an artificial reservoir matter in the estimation of probable maximum precipitation (PMP) for an impounded basin?' The motivation of the study was based on the notion that the stationarity assumption that is implicit in the PMP for dam design can be undermined in the post-dam era due to an enhancement of extreme precipitation patterns by an artificial reservoir. In addition, the study lays the foundation for use of regional atmospheric models as one way to perform life cycle assessment for planned or existing dams to formulate best management practices. The American River Watershed (ARW) with the Folsom dam at the confluence of the American River was selected as the study region and the Dec-Jan 1996-97 storm event was selected for the study period. The numerical atmospheric model used for the study was the Regional Atmospheric Modeling System (RAMS). First, the numerical modeling system, RAMS, was calibrated and validated with selected station and spatially interpolated precipitation data. Best combinations of parameterization schemes in RAMS were accordingly selected. Second, to mimic the standard method of PMP estimation by the moisture maximization technique, relative humidity terms in the model were raised to 100% from the ground up to the 500 mb level. The obtained model-based maximum 72-hr precipitation values were named extreme precipitation (EP) as a distinction from the PMPs obtained by the standard methods. Third, six hypothetical reservoir size scenarios ranging from no-dam (all-dry) to the reservoir submerging half of the basin were established to test the influence of reservoir size variation on EP. For the case of the ARW, our study clearly demonstrated that the assumption of stationarity that is implicit in the traditional estimation of PMP can be rendered invalid in large part due to the very presence of the artificial reservoir. Cloud tracking procedures performed on the basin also give indication of the
National Aeronautics and Space Administration — Probability Calibration by the Minimum and Maximum Probability Scores in One-Class Bayes Learning for Anomaly Detection. Guichong Li, Nathalie Japkowicz, Ian Hoffman, ...
Site Specific Probable Maximum Precipitation Estimates and Professional Judgement
Hayes, B. D.; Kao, S. C.; Kanney, J. F.; Quinlan, K. R.; DeNeale, S. T.
2015-12-01
State and federal regulatory authorities currently rely upon the US National Weather Service Hydrometeorological Reports (HMRs) to determine probable maximum precipitation (PMP) estimates (i.e., rainfall depths and durations) for estimating flooding hazards for relatively broad regions in the US. PMP estimates for the contributing watersheds upstream of vulnerable facilities are used to estimate riverine flooding hazards, while site-specific estimates for small watersheds are appropriate for individual facilities such as nuclear power plants. The HMRs are often criticized due to their limitations on basin size, questionable applicability in regions affected by orographic effects, their lack of consistent methods, and generally by their age. HMR-51, which provides generalized PMP estimates for the United States east of the 105th meridian, was published in 1978 and is sometimes perceived as overly conservative. The US Nuclear Regulatory Commission (NRC) is currently reviewing several flood hazard evaluation reports that rely on site specific PMP estimates that have been commercially developed. As such, NRC has recently investigated key areas of expert judgement via a generic audit and one in-depth site specific review as they relate to identifying and quantifying actual and potential storm moisture sources, determining storm transposition limits, and adjusting available moisture during storm transposition. Though much of the approach reviewed was considered a logical extension of HMRs, two key points of expert judgement stood out for further in-depth review. The first relates primarily to small storms and the use of a heuristic for storm representative dew point adjustment developed for the Electric Power Research Institute by North American Weather Consultants in 1993 in order to harmonize historic storms for which only 12-hour dew point data were available with more recent storms in a single database. The second issue relates to the use of climatological averages for spatially
Testing for the maximum cell probabilities in multinomial distributions
XIONG Shifeng; LI Guoying
2005-01-01
This paper investigates one-sided hypothesis testing for p1, the largest cell probability of a multinomial distribution. The small-sample test of Ethier (1982) is extended to general cases. Based on an estimator of p1, a class of large-sample tests is proposed. The asymptotic power of the above tests under local alternatives is derived. An example is presented at the end of this paper.
Evaluation for Success Probability of Chaff Centroid Jamming
GAO Dong-hua; SHI Xiu-hua
2008-01-01
As chaff centroid jamming can introduce guidance error into the anti-ship missile's seeker and decrease its hit probability, a new quantitative analysis method and a mathematical model are proposed in this paper to evaluate the probability of successful jamming. Using this method, the optimal decision scheme for chaff centroid jamming in different threat situations can be found, and the success probability of this scheme can be calculated quantitatively. Thus, the operational rules of centroid jamming and the tactical approach for increasing the success probability can be determined.
Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.
Stein, Richard R; Marks, Debora S; Sander, Chris
2015-07-01
Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models in two categories of data types, those with continuous and categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.
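A minimal sketch of the model class reviewed above for categorical (here binary) variables: a small, fully visible pairwise maximum-entropy (Ising-like) model whose fields and couplings are fitted by gradient ascent so that the model's means and pairwise correlations match the data. Exact enumeration is used because the example is tiny; this illustrates the model family, not the protein-contact inference pipelines discussed in the review.

```python
# Sketch: fit a pairwise maximum-entropy (Ising-like) model
# P(s) ∝ exp(sum_i h_i s_i + 0.5 * s^T J s) to +/-1 data by matching first and
# second moments (gradient ascent on the log-likelihood); tiny n, exact enumeration.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4
data = rng.integers(0, 2, size=(500, n)) * 2 - 1            # toy +/-1 dataset
states = np.array(list(itertools.product([-1, 1], repeat=n)))  # all 2^n states

def model_moments(h, J):
    energy = states @ h + 0.5 * np.einsum("si,ij,sj->s", states, J, states)
    p = np.exp(energy - energy.max())
    p /= p.sum()
    mean = p @ states
    corr = states.T @ (states * p[:, None])
    return mean, corr

emp_mean = data.mean(axis=0)
emp_corr = data.T @ data / len(data)

h = np.zeros(n)
J = np.zeros((n, n))
lr = 0.1
for _ in range(2000):                                        # gradient ascent
    mean, corr = model_moments(h, J)
    h += lr * (emp_mean - mean)
    dJ = lr * (emp_corr - corr)
    np.fill_diagonal(dJ, 0.0)                                # no self-couplings
    J += 0.5 * (dJ + dJ.T)                                   # keep J symmetric

mean, corr = model_moments(h, J)
print("max moment mismatch:",
      np.abs(emp_mean - mean).max(), np.abs(emp_corr - corr).max())
```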
On Field Size and Success Probability in Network Coding
Geil, Hans Olav; Matsumoto, Ryutaroh; Thomsen, Casper
2008-01-01
Using tools from algebraic geometry and Gröbner basis theory we solve two problems in network coding. First we present a method to determine the smallest field size for which linear network coding is feasible. Second we derive improved estimates on the success probability of random linear network...
Cluster state preparation using gates operating at arbitrary success probabilities
Kieling, K.; Gross, D.; Eisert, J.
2007-06-01
Several physical architectures allow for measurement-based quantum computing using sequential preparation of cluster states by means of probabilistic quantum gates. In such an approach, the order in which partial resources are combined to form the final cluster state turns out to be crucially important. We determine the influence of this classical decision process on the expected size of the final cluster. Extending earlier work, we consider different quantum gates operating at various probabilities of success. For finite resources, we employ a computer algebra system to obtain the provably optimal classical control strategy and derive symbolic results for the expected final size of the cluster. We identify two regimes: when the success probability of the elementary gates is high, the influence of the classical control strategy is found to be negligible. In that case, other figures of merit become more relevant. In contrast, for small probabilities of success, the choice of an appropriate strategy is crucial.
Unification of Field Theory and Maximum Entropy Methods for Learning Probability Densities
Kinney, Justin B
2014-01-01
Bayesian field theory and maximum entropy are two methods for learning smooth probability distributions (a.k.a. probability densities) from finite sampled data. Both methods were inspired by statistical physics, but the relationship between them has remained unclear. Here I show that Bayesian field theory subsumes maximum entropy density estimation. In particular, the most common maximum entropy methods are shown to be limiting cases of Bayesian inference using field theory priors that impose no boundary conditions on candidate densities. This unification provides a natural way to test the validity of the maximum entropy assumption on one's data. It also provides a better-fitting nonparametric density estimate when the maximum entropy assumption is rejected.
Willink, R.
2010-06-01
The method of uncertainty evaluation discussed in Supplement 1 to the Guide to the Expression of Uncertainty in Measurement generates a coverage interval in which the measurand is said to have a certain probability (the coverage probability) of lying. This communication contains a response to the recent claim that 'when a coverage interval summarizes the resulting state of knowledge, the coverage probability should not be interpreted as a relative frequency of successful intervals in a large series of imagined or simulated intervals' (Lira 2009 Metrologia 46 616-8). First, Bernoulli's law of large numbers is used to prove that the long-run success rate of a methodology used to calculate 95% coverage intervals must be 95%. Second, the usual definition of subjective probability or 'degree of belief' is stated, and the weak law of large numbers is then used to show that this definition—and the corresponding definition of 'state of knowledge'—relies on the concept of long-run behaviour. This provides an alternative proof of the same result.
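The long-run reading of coverage probability invoked here is easy to demonstrate by simulation. The following sketch uses the standard textbook setting of a t-interval for a normal mean (an assumption made for illustration, not taken from the communication) and counts how often 95% intervals cover the true value.

```python
# Sketch: long-run success rate of 95% confidence intervals for a normal mean.
# By the law of large numbers the empirical coverage should approach 0.95.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
true_mean, sigma, n, trials = 10.0, 2.0, 15, 20000

hits = 0
for _ in range(trials):
    x = rng.normal(true_mean, sigma, n)
    half = stats.t.ppf(0.975, df=n - 1) * x.std(ddof=1) / np.sqrt(n)
    if abs(x.mean() - true_mean) <= half:
        hits += 1

print("empirical coverage:", hits / trials)   # close to 0.95
```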
Maximum probability domains for the analysis of the microscopic structure of liquids
Agostini, Federica; Savin, Andreas; Vuilleumier, Rodolphe
2014-01-01
We introduce the concept of maximum probability domains, developed in the context of the analysis of electronic densities, in the study of the microscopic spatial structures of liquids. The idea of locating a particle in a three-dimensional region, by determining the domain where the probability of finding that, and only that, particle is maximum, gives an interesting characterisation of the local structure of the liquid. The optimisation procedure, required for the search of the domain of maximum probability, is carried out by the implementation of the level set method. Results for a few case studies are presented, in particular liquid water at different densities and the solvation shells of Na⁺ in liquid water.
Bayesian probability of success for clinical trials using historical data.
Ibrahim, Joseph G; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F; Heyse, Joseph F
2015-01-30
Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein's work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials.
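A stripped-down sketch of the general idea, often called assurance: average the frequentist power of the planned trial over the posterior of the treatment effect implied by the current data. The normal-normal approximation, the numbers, and the omission of covariates and historical-data priors are simplifying assumptions, not the authors' full methodology.

```python
# Simplified sketch of Bayesian probability of success ("assurance"):
# average the power of the future trial over the posterior of the treatment
# effect given current data (normal-normal approximation, known outcome SD).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# posterior for the treatment effect delta from a completed earlier trial
post_mean, post_sd = 0.30, 0.12          # illustrative posterior summary

# planned confirmatory trial: n per arm, outcome SD, one-sided alpha
n, sigma, alpha = 200, 1.0, 0.025
se = sigma * np.sqrt(2.0 / n)
z_alpha = stats.norm.ppf(1 - alpha)

delta = rng.normal(post_mean, post_sd, 100_000)        # draws from the posterior
power_given_delta = stats.norm.cdf(delta / se - z_alpha)
print("probability of success:", power_given_delta.mean())
```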
Sabarish, R. Mani; Narasimhan, R.; Chandhru, A. R.; Suribabu, C. R.; Sudharsan, J.; Nithiyanantham, S.
2017-05-01
In the design of irrigation and other hydraulic structures, evaluating the magnitude of extreme rainfall for a specific probability of occurrence is of much importance. The capacity of such structures is usually designed to cater to the probability of occurrence of extreme rainfall during its lifetime. In this study, an extreme value analysis of rainfall for Tiruchirapalli City in Tamil Nadu was carried out using 100 years of rainfall data. Statistical methods were used in the analysis. The best-fit probability distribution was evaluated for 1, 2, 3, 4 and 5 days of continuous maximum rainfall. The goodness of fit was evaluated using the Chi-square test. The results of the goodness-of-fit tests indicate that the log-Pearson type III distribution is the overall best fit for the 1-day maximum rainfall and the consecutive 2-, 3-, 4-, 5- and 6-day maximum rainfall series of Tiruchirapalli. For reliability, the forecast maximum rainfalls for the selected return periods are evaluated against the results of the plotting-position method.
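A minimal sketch of the kind of fit and goodness-of-fit check described above: a Pearson type III distribution fitted to the logarithms of annual maxima (log-Pearson III) and a chi-square comparison of binned observed and expected counts. The data are synthetic and the binning choice is an assumption; the station data and the authors' exact test setup are not reproduced.

```python
# Sketch: fit a log-Pearson type III distribution to annual maximum rainfall
# (Pearson III on log-transformed data) and check the fit with a chi-square test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_max = rng.lognormal(mean=4.5, sigma=0.3, size=100)   # synthetic 100-year record (mm)

logs = np.log(annual_max)
skew, loc, scale = stats.pearson3.fit(logs)                  # log-Pearson III parameters

# chi-square goodness of fit on equal-probability bins
k = 8
edges = stats.pearson3.ppf(np.linspace(0.0, 1.0, k + 1), skew, loc, scale)
edges[0], edges[-1] = logs.min() - 1.0, logs.max() + 1.0     # replace infinite tails
observed, _ = np.histogram(logs, bins=edges)
expected = np.full(k, len(logs) / k)
chi2, p = stats.chisquare(observed, expected, ddof=3)        # 3 fitted parameters
print("chi-square:", chi2, "p-value:", p)

# 100-year event (exceedance probability 0.01), back-transformed to mm
print("100-year rainfall:", np.exp(stats.pearson3.ppf(0.99, skew, loc, scale)))
```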
Takara, K. T.
2015-12-01
This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer intensive statistics (CIS) resampling such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than a design level of flood control. Traditional parametric frequency analysis methods of extreme values include the following steps: Step 1: Collecting and checking extreme-value data; Step 2: Enumerating probability distributions that would be fitted well to the data; Step 3: Parameter estimation; Step 4: Testing goodness of fit; Step 5: Checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: Selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years. Overestimation examples are also demonstrated. The bootstrap resampling can do bias correction for the NPM and can also give the estimation accuracy as the bootstrap standard error. This NPM has the advantage of avoiding various difficulties in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound of the hydrological variable. Probable maximum precipitation (PMP) and probable maximum flood (PMF) can serve as a new parameter value combined with the NPM. An idea of how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians and encourages practitioners to consider worst-case disasters in their disaster management planning and practices.
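The non-parametric core of the method, an empirical quantile with a bootstrap standard error, is compact enough to show directly; the sketch below uses synthetic data and one simple quantile rule among several possible choices.

```python
# Sketch: non-parametric estimate of the 100-year annual maximum (99th percentile)
# from a long record, with a bootstrap standard error (no distribution fitted).
import numpy as np

rng = np.random.default_rng(3)
record = rng.gumbel(loc=60.0, scale=15.0, size=120)     # synthetic 120-year record (mm)

def q100(sample):
    return np.quantile(sample, 0.99)                    # empirical 100-year quantile

estimate = q100(record)
boot = np.array([q100(rng.choice(record, size=record.size, replace=True))
                 for _ in range(5000)])
print("100-year estimate:", estimate, "bootstrap SE:", boot.std(ddof=1))
```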
Estimation of the probability of success in petroleum exploration
Davis, J.C.
1977-01-01
A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum
Unification of field theory and maximum entropy methods for learning probability densities.
Kinney, Justin B
2015-09-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
Marco Bee
2012-01-01
This paper deals with the estimation of the lognormal-Pareto and the lognormal-Generalized Pareto mixture distributions. The log-likelihood function is discontinuous, so that Maximum Likelihood Estimation is not asymptotically optimal. For this reason, we develop an alternative method based on Probability Weighted Moments. We show that the standard version of the method can be applied to the first distribution, but not to the latter. Thus, in the lognormal-Generalized Pareto case, we work ou...
Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick
2016-06-01
Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC) since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology; precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are in the same order of magnitude as those obtained with the traditional method and observed data; but are also found to depend strongly on the climate projection used and show spatial variability.
Joint probability of statistical success of multiple phase III trials.
Zhang, Jianliang; Zhang, Jenny J
2013-01-01
In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
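The correlation between the predictive statistics of the two trials can be made concrete with a small simulation: the true effect is drawn once from the common posterior and both planned trials are then tested against it, so the joint probability of success is not the simplistic product of the marginals. The numbers and the normal-normal setup below are assumptions for illustration, not the authors' formulae.

```python
# Sketch: joint probability of statistical success of two future trials that share
# the same posterior for the treatment effect. Because one draw of delta enters
# both trials, the joint PoSS exceeds the product of the marginal PoSS values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
post_mean, post_sd = 0.25, 0.15          # posterior of the effect from earlier data
n, sigma, alpha = 150, 1.0, 0.025        # each future trial: n/arm, SD, one-sided alpha
se = sigma * np.sqrt(2.0 / n)
z_a = stats.norm.ppf(1 - alpha)

delta = rng.normal(post_mean, post_sd, 200_000)      # one draw per simulated "truth"
p_sig = stats.norm.cdf(delta / se - z_a)             # power of each trial given delta
joint = (p_sig ** 2).mean()                          # both trials significant
marginal = p_sig.mean()
print("joint PoSS:", joint, " product of marginals:", marginal ** 2)
```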
The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation
Felder, Guido; Zischg, Andreas; Weingartner, Rolf
2017-07-01
Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and are therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling when it comes to PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs, the resulting peak discharges as well as the reliability and the plausibility of the estimates are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.
Suligowski, Roman
2014-05-01
This study estimates Probable Maximum Precipitation based upon the physical mechanisms of precipitation formation at the Kielce Upland. The estimation stems from meteorological analysis of extremely high precipitation events, which occurred in the area between 1961 and 2007, causing serious flooding from rivers that drain the entire Kielce Upland. The meteorological situation has been assessed drawing on synoptic maps, baric topography charts, satellite and radar images as well as the results of meteorological observations derived from surface weather observation stations. The most significant elements of this research include the comparison between distinctive synoptic situations over Europe and the subsequent determination of the typical rainfall-generating mechanism. This allows the author to identify the source areas of air masses responsible for extremely high precipitation at the Kielce Upland. Analysis of the meteorological situations showed that the source areas for humid air masses which cause the largest rainfalls at the Kielce Upland are the area of the northern Adriatic Sea and the north-eastern coast of the Black Sea. Flood hazard at the Kielce Upland catchments was triggered by daily precipitation of over 60 mm. The highest representative dew point temperature in source areas of warm air masses (those responsible for high precipitation at the Kielce Upland) exceeded 20 degrees Celsius, with a maximum of 24.9 degrees Celsius, while precipitable water amounted to 80 mm. The value of precipitable water is also used for computation of factors featuring the system, namely the mass transformation factor and the system effectiveness factor. The mass transformation factor is computed based on precipitable water in the feeding mass and precipitable water in the source area. The system effectiveness factor (as the indicator of the maximum inflow velocity and the maximum velocity in the zone of front or ascending currents, forced by orography) is computed from the quotient of precipitable water in
Lukeš, Tomáš; Křížek, Pavel; Švindrych, Zdeněk; Benda, Jakub; Ovesný, Martin; Fliegel, Karel; Klíma, Miloš; Hagen, Guy M
2014-12-01
We introduce and demonstrate a new high performance image reconstruction method for super-resolution structured illumination microscopy based on maximum a posteriori probability estimation (MAP-SIM). Imaging performance is demonstrated on a variety of fluorescent samples of different thickness, labeling density and noise levels. The method provides good suppression of out of focus light, improves spatial resolution, and allows reconstruction of both 2D and 3D images of cells even in the case of weak signals. The method can be used to process both optical sectioning and super-resolution structured illumination microscopy data to create high quality super-resolution images.
Bai, Xian-Zong; Ma, Chao-Wei; Chen, Lei; Tang, Guo-Jin
2016-09-01
When engaging in the maximum collision probability (Pcmax) analysis for short-term conjunctions between two orbiting objects, it is important to clarify and understand the assumptions for obtaining Pcmax. Based on Chan's analytical formulae and an analysis of the covariance ellipse's variation of orientation, shape, and size in the two-dimensional conjunction plane, this paper proposes a clear and comprehensive analysis of maximum collision probability when considering these variables. Eight situations are considered when calculating Pcmax according to the varied orientation, shape, and size of the covariance ellipse. Three of the situations are not practical or meaningful; the remaining ones were completely or partially discussed in some of the previous works. These situations are discussed with uniform definitions and symbols and they are derived independently in this paper. The results are compared with and validated against those from previous works. Finally, a practical conjunction event is presented as a test case to demonstrate the effectiveness of the methodology. Comparison of the Pcmax presented in this paper with the empirical results from the curve or surface calculated by the numerical method indicates that the relative error of Pcmax is less than 0.0039%.
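For orientation, the short-term collision probability being maximized here is the integral of a two-dimensional Gaussian for the relative position in the conjunction plane over a disc of the combined object radius. The sketch below evaluates that integral numerically and maximizes it over a single overall covariance scale; the eight-case analysis of orientation, shape and size in the paper is not reproduced, and all numbers are illustrative.

```python
# Sketch: 2-D collision probability in the conjunction plane (Gaussian relative
# position, disc of combined radius R), plus a simple maximization over an overall
# covariance scale factor -- illustrative only, not the paper's 8-case analysis.
import numpy as np
from scipy.integrate import dblquad
from scipy.optimize import minimize_scalar

R = 0.010                   # combined hard-body radius, km
mu = np.array([0.2, 0.1])   # projected miss-distance components, km
sx, sy = 0.3, 0.1           # 1-sigma position uncertainties along the ellipse axes, km

def collision_probability(scale=1.0):
    ax, ay = scale * sx, scale * sy
    pdf = lambda y, x: np.exp(-0.5 * (((x - mu[0]) / ax) ** 2 + ((y - mu[1]) / ay) ** 2)) \
                       / (2.0 * np.pi * ax * ay)
    val, _ = dblquad(pdf, -R, R,
                     lambda x: -np.sqrt(R * R - x * x),
                     lambda x: np.sqrt(R * R - x * x))
    return val

res = minimize_scalar(lambda s: -collision_probability(s), bounds=(0.05, 5.0), method="bounded")
print("Pc at nominal covariance:", collision_probability(1.0))
print("maximum Pc over covariance scale:", -res.fun, "at scale", res.x)
```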
Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan
2013-01-01
This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.
Probable maximum precipitation 24 hours estimation: A case study of Zanjan province of Iran
Azim Shirdeli
2012-10-01
One of the primary concerns in designing civil structures such as water storage dams and irrigation and drainage networks is to find an economic scale based on the possibility of natural incidents such as floods, earthquakes, etc. Probable maximum precipitation (PMP) is one of the well-known methods that help design a civil structure properly. In this paper, we study the maximum one-day precipitation using 17 to 50 years of records at 13 stations located in the province of Zanjan, Iran. The proposed study uses two Hershfield methods, where the first one yields values of 18.17 to 18.48, with PMP24 between 170.14 mm and 255.28 mm. The second method gives values between 2.29 and 4.95, with PMP24 between 62.33 mm and 92.08 mm. In addition, when the out-of-range data were deleted in the second method, values between 2.29 and 4.31 were calculated, with PMP24 between 76.08 mm and 117.28 mm. The preliminary results indicate that the second Hershfield method provides more stable results than the first one.
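The Hershfield statistical approach referred to above estimates PMP from the mean and standard deviation of the annual maximum series via a frequency factor. A minimal sketch of the classical form is given below, with synthetic data and without the published adjustment charts; the use of K = 15 as the fixed alternative is the commonly cited original value and is included only for illustration.

```python
# Sketch of the classical Hershfield frequency-factor estimate of PMP:
#   PMP = mean + K * std  of the annual maximum 1-day rainfall series,
# where K is either a fixed value or estimated from the record by withholding
# the largest observation. Adjustment charts used in practice are omitted.
import numpy as np

rng = np.random.default_rng(11)
annual_max_1day = rng.gumbel(loc=45.0, scale=12.0, size=40)   # mm, synthetic 40-yr record

def hershfield_pmp(series, K=None):
    x = np.sort(series)
    if K is None:                        # station-based K: largest value vs. the rest
        rest = x[:-1]
        K = (x[-1] - rest.mean()) / rest.std(ddof=1)
    return series.mean() + K * series.std(ddof=1), K

pmp, K = hershfield_pmp(annual_max_1day)
print("station K:", K, "PMP24 estimate (mm):", pmp)
print("PMP24 with K = 15 (mm):", hershfield_pmp(annual_max_1day, K=15.0)[0])
```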
A New Maximum Entropy Probability Function for the Surface Elevation of Nonlinear Sea Waves
ZHANG Li-zhen; XU De-lun
2005-01-01
Based on the maximum entropy principle a new probability density function (PDF) f(x) for the surface elevation of nonlinear sea waves, X, is derived through performing a coordinate transform of X and solving a variation problem subject to three constraint conditions of f(x). Compared with the maximum entropy PDFs presented previously, the new PDF has the following merits: (1) it has four parameters to be determined and hence can give more refined fit to observed data and has wider suitability for nonlinear waves in different conditions; (2) these parameters are expressed in terms of distribution moments of X in a relatively simple form and hence are easy to be determined from observed data; (3) the PDF is free of the restriction of weak nonlinearity and possible to be used for sea waves in complicated conditions, such as those in shallow waters with complicated topography; and (4) the PDF is simple in form and hence convenient for theoretical and practical uses. Laboratory wind-wave experiments have been conducted to test the competence of the new PDF for the surface elevation of nonlinear waves. The experimental results manifest that the new PDF gives somewhat better fit to the laboratory wind-wave data than the well-known Gram-Charlier PDF and beta PDF.
Huang, Y X; Zhou, Q; Qiu, X; Shang, X D; Lu, Z M; Liu, Y L
2014-01-01
In this paper, we introduce a new way to estimate the scaling parameter of a self-similar process by considering the maximum of the probability density function (pdf) of its increments. We prove this for H-self-similar processes in general and experimentally investigate it for turbulent velocity and temperature increments. We consider a turbulent velocity database from an experimental homogeneous and nearly isotropic turbulent channel flow, and a temperature data set obtained near the sidewall of a Rayleigh-Bénard convection cell, where the turbulent flow is driven by buoyancy. For the former database, it is found that the maximum value of the increment pdf p_max(τ) is in good agreement with a lognormal distribution. We also obtain a scaling exponent α ≈ 0.37, which is consistent with the scaling exponent for the first-order structure function reported in other studies. For the latter one, we obtain a scaling exponent α_θ ≈ 0.33. This index value is consistent with the Kolmogorov-Ob...
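The estimator proposed here, reading the scaling exponent off the decay of the maximum of the increment pdf with the lag, can be illustrated on ordinary Brownian motion (H = 0.5), for which the increments are Gaussian and the pdf maximum scales as the lag to the power -1/2. The sketch below is that illustration, not the turbulence analysis of the paper.

```python
# Sketch: estimate the self-similarity exponent from the maximum of the increment
# pdf, p_max(tau) ~ tau^(-H), using simulated Brownian motion (H = 0.5).
import numpy as np

rng = np.random.default_rng(5)
x = np.cumsum(rng.normal(size=2_000_000))       # Brownian path (unit-variance steps)

lags = np.array([2 ** k for k in range(1, 11)])
p_max = []
for tau in lags:
    inc = x[tau:] - x[:-tau]
    # Brownian increments are Gaussian, so the pdf maximum is 1/(sqrt(2*pi)*sigma);
    # a histogram-based estimate would work for non-Gaussian increments as well.
    p_max.append(1.0 / (np.sqrt(2.0 * np.pi) * inc.std()))

slope = np.polyfit(np.log(lags), np.log(p_max), 1)[0]
print("estimated exponent H ~", -slope)          # close to 0.5
```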
Sidek, L. M.; Mohd Nor, M. D.; Rakhecha, P. R.; Basri, H.; Jayothisa, W.; Muda, R. S.; Ahmad, M. N.; Razad, A. Z. Abdul
2013-06-01
The Cameron Highland Batang Padang (CHBP) catchment, situated on the main mountain range of Peninsular Malaysia, is of large economic importance, where currently a series of three dams (Sultan Abu Bakar, Jor and Mahang) exists in the development of water resources and hydropower. The prediction of design storm rainfall values for different return periods, including PMP values, can be useful to review the adequacy of the current spillway capacities of these dams. In this paper, estimates of the design storm rainfalls for various return periods and also the PMP values for rainfall stations in the CHBP catchment have been computed for the three different durations of 1, 3 and 5 days. The maximum 1-day, 3-day and 5-day PMP values are found to be 730.08 mm, 966.17 mm and 969.0 mm, respectively, at station number 4513033 Gunung Brinchang. The PMP values obtained were compared with previous study results undertaken by NAHRIM. The highest ratios of 1-day, 3-day and 5-day PMP to the highest observed rainfall are found to be 2.30, 1.94 and 1.82, respectively. This shows that the ratio tends to decrease as the duration increases. Finally, temporal patterns for 1-day, 3-day and 5-day durations have been developed based on observed extreme rainfall at station 4513033 Gunung Brinchang for the generation of the Probable Maximum Flood (PMF) in dam break analysis.
Estimation of the Probable Maximum Flood for a Small Lowland River in Poland
Banasik, K.; Hejduk, L.
2009-04-01
The planning, design and use of hydrotechnical structures often requires the assessment of maximum flood potentials. The most common term applied to this upper limit of flooding is the probable maximum flood (PMF). The PMP/UH (probable maximum precipitation/unit hydrograph) method has been used in the study to predict the PMF for a small agricultural lowland river basin of the Zagozdzonka (left tributary of the Vistula river) in Poland. The river basin, located about 100 km south of Warsaw, with an area - upstream of the gauge of Plachty - of 82 km², has been investigated by the Department of Water Engineering and Environmental Restoration of Warsaw University of Life Sciences - SGGW since 1962. An over 40-year flow record was used in a previous investigation for predicting the T-year flood discharge (Banasik et al., 2003). The objective here was to estimate the PMF using the PMP/UH method and to compare the results with the 100-year flood. A new depth-duration curve of PMP for the local climatic conditions has been developed based on Polish maximum observed rainfall data (Ozga-Zielinska & Ozga-Zielinski, 2003). An exponential formula, with an exponent of 0.47, i.e. close to the exponent in the formula for world PMP and also in the formula of PMP for Great Britain (Wilson, 1993), gives a rainfall depth about 40% lower than Wilson's. The effective rainfall (runoff volume) has been estimated from the PMP of various durations using the CN method (USDA-SCS, 1986). The CN value as well as the parameters of the IUH model (Nash, 1957) have been established from 27 rainfall-runoff events recorded in the river basin in the period 1980-2004. Variability of the parameter values with the size of the events will be discussed in the paper. The results of the analysis have shown that the peak discharge of the PMF is 4.5 times larger than the 100-year flood, and the volume ratio of the respective direct hydrographs caused by rainfall events of critical duration is 4.0. References 1.Banasik K
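The CN step used above to turn a design rainfall into effective rainfall follows the standard SCS curve-number relations; a minimal sketch with the standard formulas and an illustrative CN value (not the calibrated value of the study) is given below.

```python
# Sketch of the SCS curve-number (CN) runoff relation used to convert a design
# rainfall depth into effective rainfall (direct runoff), in millimetres:
#   S = 25400/CN - 254,  Ia = 0.2*S,  Q = (P - Ia)^2 / (P - Ia + S)  for P > Ia.
def scs_runoff(p_mm, cn):
    s = 25400.0 / cn - 254.0          # potential maximum retention, mm
    ia = 0.2 * s                      # initial abstraction, mm
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: a 250 mm design storm on a basin with CN = 75 (illustrative value).
print(scs_runoff(250.0, 75))          # effective rainfall, mm
```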
D. L. Bricker
1997-01-01
The problem of assigning cell probabilities to maximize a multinomial likelihood with order restrictions on the probabilities and/or restrictions on the local odds ratios is modeled as a posynomial geometric program (GP), a class of nonlinear optimization problems with a well-developed duality theory and collection of algorithms. (Local odds ratios provide a measure of association between categorical random variables.) A constrained multinomial MLE example from the literature is solved, and the quality of the solution is compared with that obtained by the iterative method of El Barmi and Dykstra, which is based upon Fenchel duality. Exploiting the proximity of the GP model of MLE problems to linear programming (LP) problems, we also describe as an alternative, in the absence of special-purpose GP software, an easily implemented successive LP approximation method for solving this class of MLE problems using one of the readily available LP solvers.
Jayajit Das
2015-07-01
A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
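For the discrete case with no constraints beyond consistency with P(y), the MaxEnt answer to the inverse of the sum example above has a simple closed form: each probability P(y) is spread uniformly over the preimage of y. The sketch below applies this to Y = X1 + X2 on a small support with assumed illustrative probabilities; the paper's general treatment also covers continuous distributions and avoids explicit Lagrange multipliers.

```python
# Sketch: MaxEnt estimate of Q(x1, x2) given only P(y) for y = x1 + x2, with
# x1, x2 in {0,...,5}. With no further constraints, maximum entropy spreads each
# P(y) uniformly over the preimage {(x1, x2): x1 + x2 = y}.
import itertools

support = range(6)
xs = list(itertools.product(support, support))
ys = sorted({x1 + x2 for x1, x2 in xs})

# known distribution of Y (illustrative numbers, must sum to 1)
p_y = {y: 0.0 for y in ys}
p_y.update({4: 0.2, 5: 0.3, 6: 0.3, 7: 0.2})

preimage_size = {y: sum(1 for x1, x2 in xs if x1 + x2 == y) for y in ys}
q = {(x1, x2): p_y[x1 + x2] / preimage_size[x1 + x2] for x1, x2 in xs}

print("sum of Q:", sum(q.values()))      # 1.0
print("Q(2, 3):", q[(2, 3)])             # P(Y=5)/6
# The entropy of Q equals H(Y) plus the average log preimage size, which is the
# largest value achievable among all Q consistent with P(y).
```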
Evaluation of Probable Maximum Precipitation and Flood under Climate Change in the 21st Century
Gangrade, S.; Kao, S. C.; Rastogi, D.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.
2016-12-01
Critical infrastructures are potentially vulnerable to extreme hydro-climatic events. Under a warming environment, the magnitude and frequency of extreme precipitation and flood are likely to increase, enhancing the need to more accurately quantify the risks due to climate change. In this study, we utilized an integrated modeling framework that includes the Weather Research and Forecasting (WRF) model and a high resolution distributed hydrology soil vegetation model (DHSVM) to simulate probable maximum precipitation (PMP) and flood (PMF) events over the Alabama-Coosa-Tallapoosa River Basin. A total of 120 storms were selected to simulate moisture-maximized PMP under different meteorological forcings, including historical storms driven by Climate Forecast System Reanalysis (CFSR) and baseline (1981-2010), near-term future (2021-2050) and long-term future (2071-2100) storms driven by the Community Climate System Model version 4 (CCSM4) under the Representative Concentration Pathway 8.5 emission scenario. We also analyzed the sensitivity of PMF to various antecedent hydrologic conditions such as initial soil moisture conditions and tested different compulsive approaches. Overall, a statistically significant increase is projected for future PMP and PMF, mainly attributed to the increase of background air temperature. The ensemble of simulated PMP and PMF along with their sensitivity allows us to better quantify the potential risks associated with hydro-climatic extreme events on critical energy-water infrastructures such as major hydropower dams and nuclear power plants.
Wenchao Cui
2013-01-01
Full Text Available This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) estimation and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. This local objective function is then integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In the level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show the desirable performance of our method.
Lussana, C.
2013-04-01
The presented work focuses on the investigation of gridded daily minimum (TN) and maximum (TX) temperature probability density functions (PDFs) with the intent of both characterising a region and detecting extreme values. The empirical PDF estimation procedure has been carried out using the most recent years of gridded temperature analysis fields available at ARPA Lombardia, in Northern Italy. The spatial interpolation is based on an implementation of Optimal Interpolation using observations from a dense surface network of automated weather stations. An effort has been made to identify both the time period and the spatial areas with a stable data density; otherwise the results could be influenced by the changing station distribution. The PDF used in this study is based on the Gaussian distribution; nevertheless, it is designed to have an asymmetrical (skewed) shape in order to enable distinction between warming and cooling events. Once the occurrence of extreme events is properly defined, the information can be delivered to users at the local scale in a concise way, such as: TX extremely cold/hot or TN extremely cold/hot.
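One simple way to realize an asymmetric, Gaussian-based PDF of the kind described here is a two-piece Gaussian with different spreads on either side of the mode; the function name and parameterization below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def two_piece_gaussian_pdf(t, mode, sigma_left, sigma_right):
    """Asymmetric 'skewed Gaussian' density: Gaussian halves with different
    spreads on each side of the mode, joined continuously and normalized."""
    norm_const = np.sqrt(2.0 / np.pi) / (sigma_left + sigma_right)
    sigma = np.where(t < mode, sigma_left, sigma_right)
    return norm_const * np.exp(-0.5 * ((t - mode) / sigma) ** 2)

# Example: a daily maximum temperature PDF with a heavier warm tail.
t = np.linspace(-10.0, 40.0, 501)
pdf = two_piece_gaussian_pdf(t, mode=15.0, sigma_left=4.0, sigma_right=7.0)
print(np.sum(pdf) * (t[1] - t[0]))  # integrates to approximately 1
```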
Gogarten J Peter
2002-02-01
Full Text Available Abstract Background Horizontal gene transfer (HGT) played an important role in shaping microbial genomes. In addition to genes under sporadic selection, HGT also affects housekeeping genes and those involved in information processing, even ribosomal RNA encoding genes. Here we describe tools that provide an assessment and graphic illustration of the mosaic nature of microbial genomes. Results We adapted Maximum Likelihood (ML) mapping to the analyses of all detected quartets of orthologous genes found in four genomes. We have automated the assembly and analyses of these quartets of orthologs given the selection of four genomes. We compared the ML-mapping approach to more rigorous Bayesian probability and Bootstrap mapping techniques. The latter two approaches appear to be more conservative than the ML-mapping approach, but qualitatively all three approaches give equivalent results. All three tools were tested on mitochondrial genomes, which presumably were inherited as a single linkage group. Conclusions In some instances of interphylum relationships we find nearly equal numbers of quartets strongly supporting the three possible topologies. In contrast, our analyses of genome quartets containing the cyanobacterium Synechocystis sp. indicate that a large part of the cyanobacterial genome is related to that of low-GC Gram positives. Other groups that had been suggested as sister groups to the cyanobacteria contain many fewer genes that group with the Synechocystis orthologs. Interdomain comparisons of genome quartets containing the archaeon Halobacterium sp. revealed that Halobacterium sp. shares more genes with Bacteria that live in the same environment than with Bacteria that are more closely related based on rRNA phylogeny. Many of these genes encode proteins involved in substrate transport and metabolism and in information storage and processing. The performed analyses demonstrate that relationships among prokaryotes cannot be accurately
Some uses of predictive probability of success in clinical drug development
Mauro Gasparini
2013-03-01
Full Text Available Predictive probability of success is a (subjective) Bayesian evaluation of the probability of a future successful event in a given state of information. In the context of pharmaceutical clinical drug development, successful events relate to the accrual of positive evidence on the therapy which is being developed, like demonstration of superior efficacy or ascertainment of safety. Positive evidence will usually be obtained via standard frequentist tools, according to the regulations imposed in the world of pharmaceutical development. Within a single trial, predictive probability of success can be identified with expected power, i.e. the evaluation of the success probability of the trial. Success means, for example, obtaining a significant result of a standard superiority test. Across trials, predictive probability of success can be the probability of a successful completion of an entire part of clinical development, for example a successful phase III development in the presence of phase II data. Calculations of predictive probability of success in the presence of normal data with known variance will be illustrated, both for within-trial and across-trial predictions.
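A minimal Monte Carlo sketch of within-trial predictive probability of success (expected power) for normally distributed data with known variance; the prior and design values are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

# Design: two-arm trial, n patients per arm, known SD sigma, one-sided alpha.
n, sigma, alpha = 100, 10.0, 0.025
se = sigma * np.sqrt(2.0 / n)             # standard error of the mean difference
z_alpha = norm.ppf(1.0 - alpha)

def power(delta):
    """Frequentist power of the one-sided superiority test for true effect delta."""
    return norm.sf(z_alpha - delta / se)

# Prior on the treatment effect (e.g. from an earlier trial): mean and SD are assumptions.
prior_mean, prior_sd = 3.0, 2.0

rng = np.random.default_rng(1)
delta_draws = rng.normal(prior_mean, prior_sd, size=200_000)
assurance = power(delta_draws).mean()     # predictive probability of success
print(f"power at prior mean: {power(prior_mean):.3f}, assurance: {assurance:.3f}")
```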
Malhis, Nawar; Butterfield, Yaron S N; Ester, Martin; Jones, Steven J M
2009-01-01
A plethora of alignment tools have been created that are designed to best fit different types of alignment conditions. While some of these are made for aligning Illumina Sequence Analyzer reads, none of these are fully utilizing its probability (prb) output. In this article, we will introduce a new alignment approach (Slider) that reduces the alignment problem space by utilizing each read base's probabilities given in the prb files. Compared with other aligners, Slider has higher alignment accuracy and efficiency. In addition, given that Slider matches bases with probabilities other than the most probable, it significantly reduces the percentage of base mismatches. The result is that its SNP predictions are more accurate than other SNP prediction approaches used today that start from the most probable sequence, including those using base quality.
Dai, Huanping; Micheyl, Christophe
2015-05-01
Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
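The underlying idea, that the ideal observer decides by likelihood ratio so maximum Pc can be estimated by simulation for arbitrary noise distributions, can be sketched for a two-interval forced-choice task as follows; this is an illustrative Monte Carlo check, not the authors' closed-form formula or their MATLAB code.

```python
import numpy as np
from scipy.stats import norm, laplace

rng = np.random.default_rng(0)
n_trials = 500_000

def max_pc_2afc(signal_dist, noise_dist):
    """Monte Carlo estimate of maximum proportion correct in 2I-2AFC:
    the ideal observer picks the interval with the larger likelihood ratio,
    guessing (probability 1/2) when the two likelihood ratios tie."""
    xs = signal_dist.rvs(n_trials, random_state=rng)  # observation from the signal interval
    xn = noise_dist.rvs(n_trials, random_state=rng)   # observation from the noise interval
    lr_s = signal_dist.pdf(xs) / noise_dist.pdf(xs)
    lr_n = signal_dist.pdf(xn) / noise_dist.pdf(xn)
    return np.mean(lr_s > lr_n) + 0.5 * np.mean(lr_s == lr_n)

# Gaussian internal noise: should approach the textbook value Phi(d'/sqrt(2)).
d_prime = 1.0
print(max_pc_2afc(norm(d_prime, 1.0), norm(0.0, 1.0)), norm.cdf(d_prime / np.sqrt(2.0)))

# Non-Gaussian (Laplace) internal noise: same machinery, no closed form required.
print(max_pc_2afc(laplace(1.0, 1.0), laplace(0.0, 1.0)))
```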
Probability of success for phase III after exploratory biomarker analysis in phase II.
Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver
2017-02-23
The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in the probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs.
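A compact simulation of the selection mechanism discussed here, with made-up numbers rather than the authors' design: carrying forward the more extreme of two phase II subgroup estimates inflates the effect estimate and hence the estimated probability of success, while the unselected estimate shows the slight negative bias mentioned in the abstract.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n_sims = 100_000

true_effect = 1.0          # same true effect assumed in both biomarker subgroups
se_phase2 = 1.0            # standard error of each subgroup estimate in phase II
se_phase3 = 0.5            # standard error of the phase III estimate
z_alpha = norm.ppf(0.975)

def assurance(effect_estimate, prior_sd):
    """Probability of success: power averaged over a normal prior centred
    at the phase II estimate with standard deviation prior_sd."""
    return norm.cdf((effect_estimate - z_alpha * se_phase3)
                    / np.sqrt(se_phase3**2 + prior_sd**2))

# Phase II: one estimate per subgroup; the most extreme one is carried into phase III.
estimates = rng.normal(true_effect, se_phase2, size=(n_sims, 2))
selected = estimates.max(axis=1)

true_power = norm.sf(z_alpha - true_effect / se_phase3)
print("true phase III power:            ", round(true_power, 3))
print("mean PoS without selection:      ", round(assurance(estimates[:, 0], se_phase2).mean(), 3))
print("mean PoS with subgroup selection:", round(assurance(selected, se_phase2).mean(), 3))
print("mean selected effect estimate:   ", round(selected.mean(), 3), "(true effect 1.0)")
```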
Rius, Jordi
2006-09-01
The maximum-likelihood method is applied to direct methods to derive a more general probability density function of the triple-phase sums which is capable of predicting negative values. This study also proves that maximization of the origin-free modulus sum function S yields, within the limitations imposed by the assumed approximations, the maximum-likelihood estimates of the phases. It thus represents the formal theoretical justification of the S function that was initially derived from Patterson-function arguments [Rius (1993). Acta Cryst. A49, 406-409].
Some uses of predictive probability of success in clinical drug development
Mauro Gasparini; Lilla Di Scala; Frank Bretz; Amy Racine-Poon
2013-01-01
Predictive probability of success is a (subjective) Bayesian evaluation of the probability of a future successful event in a given state of information. In the context of pharmaceutical clinical drug development, successful events relate to the accrual of positive evidence on the therapy which is being developed, like demonstration of superior efficacy or ascertainment of safety. Positive evidence will usually be obtained via standard frequentist tools, according to the regulations impose...
Properties of the Probability Maximum Value for the Negative Binomial Distribution
丁勇
2016-01-01
The properties of the probability maximum value for the negative binomial distribution were explored. The probability maximum value for the negative binomial distribution is a function of p and r, where p is the probability of success in each trial and r is the required number of successes. For a given r, it is a monotonically increasing continuous function of p whose derivative fails to exist only when (r-1)/p is an integer; for a given p, it is a monotonically decreasing function of r.
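A small numerical check of the claimed monotonicity, assuming SciPy's negative binomial parameterization (failures before the r-th success; shifting the support does not change the maximum pmf value):

```python
import numpy as np
from scipy.stats import nbinom

def max_prob(r, p):
    """Largest single-outcome probability of a negative binomial distribution."""
    k = np.arange(0, 2000)            # support large enough for the cases below
    return nbinom.pmf(k, r, p).max()

r = 5
for p in (0.2, 0.4, 0.6, 0.8):
    print(p, round(max_prob(r, p), 4))   # increases with p for fixed r

p = 0.5
for r in (1, 2, 5, 10, 20):
    print(r, round(max_prob(r, p), 4))   # decreases with r for fixed p
```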
[No author listed]
2009-01-01
[Example sentences] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Usage note] Used as an adverb meaning "probably; very likely", it indicates a strong possibility, usually a positive inference or judgment based on the present situation.
Review of the Probable Maximum Flood (PMF) Snowmelt Analysis for Success Dam
2015-11-01
watershed; and the USACE reports (1998, 2013) further describe the watershed characteristics, vegetation, climate, precipitation, and flooding. The ... distribution using HEC-SSP (USACE 2010b). The statistics are listed in Table 5 and shown in Figures 20–22. Figure 23 shows a comparison of the SWE
Rousseau, Alain N.; Klein, Iris M.; Freudiger, Daphné; Gagnon, Patrick; Frigon, Anne; Ratté-Fortin, Claudie
2014-11-01
Climate change (CC) needs to be accounted for in the estimation of probable maximum floods (PMFs). However, there is no unique way to estimate PMFs and, furthermore, the challenge in estimating them is that they should neither be underestimated, for safety reasons, nor overestimated, for economic ones. By estimating PMFs without accounting for CC, the risk of underestimation could be high for Quebec, Canada, since future climate simulations indicate that in all likelihood extreme precipitation events will intensify. In this paper, simulation outputs from the Canadian Regional Climate Model (CRCM) are used to develop a methodology to estimate probable maximum precipitations (PMPs) while accounting for changing climate conditions for the southern region of the Province of Quebec, Canada. The Kénogami and Yamaska watersheds are herein of particular interest, since dam failures could lead to major downstream impacts. Precipitable water (w) represents one of the key variables in the estimation process of PMPs. Results of stationarity tests indicate that CC will not only affect precipitation and temperature but also the monthly maximum precipitable water, wmax, and the ensuing maximization ratio used for the estimation of PMPs. An up-to-date computational method is developed to maximize w using a non-stationary frequency analysis and then calculate the maximization ratios. The ratios estimated this way are deemed reliable since they rarely exceed threshold values set for Quebec and, therefore, provide consistent PMP estimates. The results show an overall significant increase of the PMPs throughout the current century compared to the recent past.
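The core arithmetic of moisture maximization referred to here can be sketched as follows; the numbers and the cap on the ratio are placeholders, not values from the study.

```python
def maximized_precipitation(storm_precip_mm, storm_pw_mm, max_pw_mm, ratio_cap=None):
    """Moisture maximization: scale storm precipitation by the maximization ratio
    r = (maximum precipitable water for the month) / (storm precipitable water)."""
    ratio = max_pw_mm / storm_pw_mm
    if ratio_cap is not None:
        ratio = min(ratio, ratio_cap)    # threshold guarding against unrealistic ratios
    return ratio * storm_precip_mm, ratio

# Placeholder example: a 180 mm storm with 30 mm precipitable water, a monthly
# 100-year precipitable water of 45 mm, and an (assumed) cap of 2.0 on the ratio.
pmp_estimate, r = maximized_precipitation(180.0, 30.0, 45.0, ratio_cap=2.0)
print(r, pmp_estimate)    # ratio 1.5 -> 270 mm maximized storm depth
```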
Estimation of probable maximum typhoon wave for coastal nuclear power plant
丁赟
2011-01-01
The third-generation wave model SWAN (Simulating Waves Nearshore) was employed to estimate the probable maximum typhoon wave at a coastal nuclear power engineering site, and the relationship between the development of the probable maximum typhoon wave and that of the accompanying probable maximum storm surge was analyzed. It is shown that the probable maximum typhoon wave usually lags the peak of the probable maximum storm surge. The estimated probable maximum typhoon wave is higher than the maximum wave height observed at the Zhelang ocean station. The approach utilized in this study to estimate the probable maximum typhoon wave could provide valuable information for the design of coastal nuclear power engineering.
Minimum best success probability by classical strategies for quantum pseudo-telepathy
Gao, Fei; Fang, Wei; Wen, QiaoYan
2014-07-01
Quantum pseudo-telepathy (QPT) is a new type of game in which the quantum team can win with certainty while the classical one cannot; it demonstrates the advantage of quantum participants over classical ones in games. However, there had been no systematic and formal analysis of the QPT game before. Here we present a formal description of the QPT game and a definition of the most simplified QPT. Based on these definitions, we simplify a famous QPT game, the Cabello game. Then, using several instances, we analyze the minimum best success probability achievable by classical strategies in the two-player QPT, which reflects the advantage of the quantum strategies. Finally, we prove that the best success probability achievable by classical strategies for the most simplified QPT depends entirely on the number of all possible question combinations.
Smail, Linda
2016-06-01
The basic task of any probabilistic inference system in Bayesian networks is computing the posterior probability distribution for a subset or subsets of random variables, given values or evidence for some other variables from the same Bayesian network. Many methods and algorithms have been developed for exact and approximate inference in Bayesian networks. This work compares two exact inference methods in Bayesian networks, the Lauritzen-Spiegelhalter algorithm and the successive restrictions algorithm, from the perspective of computational efficiency. The two methods were applied to the Chest Clinic Bayesian network for comparison. Results indicate that the successive restrictions algorithm shows greater computational efficiency than the Lauritzen-Spiegelhalter algorithm.
Ignorance is not bliss: Statistical power is not probability of trial success.
Zierhut, M L; Bycott, P; Gibbs, M A; Smith, B P; Vicini, P
2016-04-01
The purpose of this commentary is to place probability of trial success, or assurance, in the context of decision making in drug development, and to illustrate its properties in an intuitive manner for the readers of Clinical Pharmacology and Therapeutics. The hope is that this will stimulate a dialog on how assurance should be incorporated into a quantitative decision approach for clinical development and trial design that uses all available information.
Probability of successful larval dispersal declines fivefold over 1 km in a coral reef fish.
Buston, Peter M; Jones, Geoffrey P; Planes, Serge; Thorrold, Simon R
2012-05-22
A central question of marine ecology is, how far do larvae disperse? Coupled biophysical models predict that the probability of successful dispersal declines as a function of distance between populations. Estimates of genetic isolation-by-distance and self-recruitment provide indirect support for this prediction. Here, we conduct the first direct test of this prediction, using data from the well-studied system of clown anemonefish (Amphiprion percula) at Kimbe Island, in Papua New Guinea. Amphiprion percula live in small breeding groups that inhabit sea anemones. These groups can be thought of as populations within a metapopulation. We use the x- and y-coordinates of each anemone to determine the expected distribution of dispersal distances (the distribution of distances between each and every population in the metapopulation). We use parentage analyses to trace recruits back to parents and determine the observed distribution of dispersal distances. Then, we employ a logistic model to (i) compare the observed and expected dispersal distance distributions and (ii) determine the relationship between the probability of successful dispersal and the distance between populations. The observed and expected dispersal distance distributions are significantly different, and the probability of successful dispersal between populations decreases fivefold over 1 km. This study provides a framework for quantitative investigations of larval dispersal that can be applied to other species. Further, the approach facilitates testing biological and physical hypotheses for the factors influencing larval dispersal in unison, which will advance our understanding of marine population connectivity.
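A sketch of the kind of logistic model described, fitted to simulated data rather than the Kimbe Island measurements: dispersal success is modeled as a logistic function of distance, and a negative distance coefficient captures the decline.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Simulated pairs of populations: distance (km) and whether a successful
# dispersal event between them was observed (a decline is built into the data).
distance_km = rng.uniform(0.0, 2.0, size=400)
true_prob = 1.0 / (1.0 + np.exp(-(0.5 - 2.5 * distance_km)))
dispersed = rng.binomial(1, true_prob)

X = sm.add_constant(distance_km)
fit = sm.Logit(dispersed, X).fit(disp=False)
print(fit.params)                 # intercept and (negative) distance coefficient

# Fitted decline: compare predicted success probability at 0 km and at 1 km.
p0, p1 = fit.predict(sm.add_constant(np.array([0.0, 1.0])))
print(p0 / p1)                    # roughly a fivefold drop with these simulated settings
```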
Yigzaw, W. Y.; Hossain, F.
2014-12-01
Unanticipated peak inflows that can exceed the inflow design flood (IDF) for spillways, together with possible storage loss in reservoirs from increased sedimentation rates, lead to a greater risk of downstream floods. Probable maximum precipitation (PMP) and probable maximum flood (PMF) are mostly used to determine the IDF. Any possible change of PMP and PMF due to future land use and land cover (LULC) change therefore requires a methodical investigation. However, the consequential sediment yield due to altered precipitation and flow patterns into the reservoir has not been addressed in the literature. Thus, this study answers the following question: "What is the combined impact of a modified PMP on PMF and sediment yield for an artificial reservoir?" The Owyhee dam of the Owyhee River watershed (ORW) in Oregon is selected as a case study area for understanding the impact of LULC change on PMF and sedimentation rates. The Variable Infiltration Capacity (VIC) model is used for simulating stream flow (PMF) and the Revised Universal Soil Loss Equation (RUSLE) for estimating sediment yield over the ORW as a result of changes in precipitation intensity and LULC. Scenarios that represent pre-Owyhee dam (Pre-Dam) and post-Owyhee dam (Non-Irrigation, Control, 1992, 2001, 2006) conditions are used to simulate PMFs and the consequential sediment yield. The peak PMF for the Pre-Dam scenario is found to increase by 26 m3 s-1 (1%) and 81 m3 s-1 (3%) relative to the Non-Irrigation and Control scenarios, respectively. Considering only LULC change, sediment yield decreased over the ORW due to the transformation of LULC from grassland to shrubland (from the Pre-Dam period to the post-Dam years). However, the increase in precipitation intensity caused a significant increase in sediment yield (0.1% storage loss over the 21-day storm period), resulting largely in reservoir sedimentation. This study underscores the need to consider the future impact of LULC change on IDF calculation as well as on sedimentation rates for more robust reservoir operations and planning.
Lavé, Thierry; Caruso, Antonello; Parrott, Neil; Walz, Antje
In this review we present ways in which translational PK/PD modeling can address opportunities to enhance probability of success in drug discovery and early development. This is achieved by impacting efficacy and safety-driven attrition rates, through increased focus on the quantitative understanding and modeling of translational PK/PD. Application of the proposed principles early in the discovery and development phases is anticipated to bolster confidence of successfully evaluating proof of mechanism in humans and ultimately improve Phase II success. The present review is centered on the application of predictive modeling and simulation approaches during drug discovery and early development, and more specifically of mechanism-based PK/PD modeling. Case studies are presented, focused on the relevance of M&S contributions to real-world questions and the impact on decision making.
Rastogi, Deeksha; Kao, Shih-Chieh; Ashfaq, Moetasim; Mei, Rui; Kabela, Erik D.; Gangrade, Sudershan; Naz, Bibi S.; Preston, Benjamin L.; Singh, Nagendra; Anantharaj, Valentine G.
2017-05-01
Probable maximum precipitation (PMP), defined as the largest rainfall depth that could physically occur under a series of adverse atmospheric conditions, has been an important design criterion for critical infrastructures such as dams and nuclear power plants. To understand how PMP may respond to projected future climate forcings, we used a physics-based numerical weather simulation model to estimate PMP across various durations and areas over the Alabama-Coosa-Tallapoosa (ACT) River Basin in the southeastern United States. Six sets of Weather Research and Forecasting (WRF) model experiments driven by both reanalysis and global climate model projections, with a total of 120 storms, were conducted. The depth-area-duration relationship was derived for each set of WRF simulations and compared with the conventional PMP estimates. Our results showed that PMP driven by projected future climate forcings is higher than 1981-2010 baseline values by around 20% in the 2021-2050 near-future and 44% in the 2071-2100 far-future periods. The additional sensitivity simulations of background air temperature warming also showed an enhancement of PMP, suggesting that atmospheric warming could be one important factor controlling the increase in PMP. In light of the projected increase in precipitation extremes under a warming environment, the reasonableness and role of PMP deserve more in-depth examination.
Reconstructing phylogenies from noisy quartets in polynomial time with a high success probability
Wu Gang
2008-01-01
Full Text Available Abstract Background In recent years, quartet-based phylogeny reconstruction methods have received considerable attention in the computational biology community. Traditionally, the accuracy of a phylogeny reconstruction method is measured by simulations on synthetic datasets with known "true" phylogenies, while little theoretical analysis has been done. In this paper, we present a new model-based approach to measuring the accuracy of a quartet-based phylogeny reconstruction method. Under this model, we propose three efficient algorithms to reconstruct the "true" phylogeny with a high success probability. Results The first algorithm can reconstruct the "true" phylogeny from the input quartet topology set without quartet errors in O(n^2) time by querying at most (n - 4) log(n - 1) quartet topologies, where n is the number of taxa. When the input quartet topology set contains errors, the second algorithm can reconstruct the "true" phylogeny with a probability approximately 1 - p in O(n^4 log n) time, where p is the probability of a quartet topology being an error. This probability is improved by the third algorithm to approximately 1/(1 + q^2 + q^4/2 + q^5/16), where q = p/(1 - p), with
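For reference, the success-probability expression reconstructed above is straightforward to evaluate; the short check below (purely illustrative) shows that it exceeds 1 - p for small quartet error rates p.

```python
def third_algorithm_success(p):
    """Approximate success probability of the third algorithm,
    1 / (1 + q^2 + q^4/2 + q^5/16) with q = p / (1 - p)."""
    q = p / (1.0 - p)
    return 1.0 / (1.0 + q**2 + q**4 / 2.0 + q**5 / 16.0)

for p in (0.01, 0.05, 0.1, 0.2):
    print(p, round(1.0 - p, 4), round(third_algorithm_success(p), 4))
```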
Maximum Potential Score (MPS): An operating model for a successful customer-focused strategy.
Cabello González, José Manuel
2015-11-01
Full Text Available One of marketers' chief objectives is to achieve customer loyalty, which is a key factor for profitable growth. Therefore, they need to develop a strategy that attracts and maintains customers, giving them adequate motives, both tangible (prices and promotions) and intangible (personalized service and treatment), to satisfy a customer and make him loyal to the company. Finding a way to accurately measure satisfaction and customer loyalty is very important. With regard to typical Relationship Marketing measures, we can consider listening to customers, which can help to achieve a sustainable competitive advantage. Customer satisfaction surveys are essential tools for listening to customers. Short questionnaires have gained considerable acceptance among marketers as a means to achieve a customer satisfaction measure. Our research provides an indication of the benefits of a short questionnaire (one to three questions). We find that the number of questions in a survey is significantly related to participation in the survey (Net Promoter Score, or NPS). We also show that a three-question survey (Maximum Potential Score, or MPS) is more likely to have more participants than a traditional survey. Our main goal is to analyse one method as a potential predictor of customer loyalty. Using surveys, we attempt to empirically establish the causal factors in determining the satisfaction of customers. This paper describes a maximum potential operating model that captures, with a three-question survey, important elements for a successful customer-focused strategy. MPS may give us lower participation rates than NPS, but it yields important information that helps to convert unhappy or merely satisfied customers into loyal customers.
Ironside, Kirsten E.; Mattson, David J.; Choate, David; Stoner, David; Arundel, Terry; Hansen, Jered R.; Theimer, Tad; Holton, Brandon; Jansen, Brian; Sexton, Joseph O.; Longshore, Kathleen; Edwards, Thomas C.; Peters, Michael
2017-01-01
Studies using global positioning system (GPS) telemetry rarely result in 100% fix success rates (FSR), which may bias datasets because data loss is systematic rather than a random process. Previous spatially explicit models developed to correct for sampling bias have been limited to small study areas, a small range of data loss, or were study-area specific. We modeled environmental effects on FSR from desert to alpine biomes, investigated the full range of potential data loss (0–100% FSR), and evaluated whether animal body position can contribute to lower FSR because of changes in antenna orientation, based on GPS detection rates for 4 focal species: cougars (Puma concolor), desert bighorn sheep (Ovis canadensis nelsoni), Rocky Mountain elk (Cervus elaphus nelsoni), and mule deer (Odocoileus hemionus). Terrain exposure and height of overstory vegetation were the most influential factors affecting FSR. Model evaluation showed a strong correlation (0.88) between observed and predicted FSR and no significant differences between predicted and observed FSRs using 2 independent validation datasets. We found that cougars and canyon-dwelling bighorn sheep may select for environmental features that influence their detectability by GPS technology, mule deer may select against these features, and elk appear to be nonselective. We observed temporal patterns in missed fixes only for cougars. We provide a model for cougars predicting fix success by time of day, which is likely due to circadian changes in collar orientation and selection of daybed sites. We also provide a model predicting the probability of GPS fix acquisition given environmental conditions, which had a strong relationship (r^2 = 0.82) with deployed collar FSRs across species.
Bulat, Tanja; Smidak, Roman; Sialana, Fernando J; Jung, Gangsoo; Rattei, Thomas; Bilban, Martin; Sattmann, Helmut; Lubec, Gert; Aradska, Jana
2016-01-01
The Spanish slug, Arion vulgaris, is considered one of the hundred most invasive species in Central Europe. The immense and very successful adaptation and spreading of A. vulgaris suggest that it developed highly effective mechanisms to deal with infections and natural predators. Current transcriptomic and proteomic studies on gastropods have been restricted mainly to marine and freshwater gastropods. No transcriptomic or proteomic study on A. vulgaris has been carried out so far, and in the current study, the first transcriptomic database from an adult specimen of A. vulgaris is reported. To facilitate and enable proteomics in this non-model organism, an mRNA-derived protein database was constructed for protein identification. A gel-based proteomic approach was used to obtain the first generation of a comprehensive slug mantle proteome. A total of 2128 proteins were unambiguously identified; 48 proteins represent novel proteins with no significant homology in the NCBI non-redundant database. Combined transcriptomic and proteomic analysis revealed an extensive repertoire of novel proteins with a role in innate immunity, including many associated with pattern recognition, effector proteins, and cytokine-like proteins. The number and diversity of gene families encoding lectins point to a complex defense system, probably a result of adaptation to a pathogen-rich environment. These results provide a fundamental and important resource for subsequent studies on molluscs as well as for putative antimicrobial compounds for drug discovery and biomedical applications.
Perceived probability of success and motivational basis for family planning programme. Part I.
Kar, S B
1968-11-01
The role of family planning programs in the context of total national developmental efforts is reviewed. It is suggested that the effective implementation of family planning programs should be supplemented by maintaining the progress made in other developmental areas. In-depth studies should determine what will constitute effective incentives (other than monetary) for the adoption of family planning by the masses where no tangible improvement in the standard of living is immediately possible. The perceived probability of success of program goals can significantly influence the workers' dedication to the work and their actual performance. The present empirical pilot study indicates that the majority of the respondents to the questionnaire felt that the reduction in the birthrate from 41 to 25/1000 is not likely to be achieved in 10 years, but that one may expect a significant decline in the birthrate in the next 20 years. It should be determined whether it is desirable to relax the program goals or to orient the workers to develop in them the conviction that the present program goals are attainable.
Chang, Xiao-Wen; Xie, Xiaohu
2012-01-01
The common method to estimate an unknown integer parameter vector in a linear model is to solve an integer least squares (ILS) problem. A typical approach to solving an ILS problem is sphere decoding. To make a sphere decoder faster, the well-known LLL reduction is often used as preprocessing. The Babai point produced by the Babai nearest plane algorithm is a suboptimal solution of the ILS problem. First we prove that the success probability of the Babai point as a lower bound on the success probability of the ILS estimator is sharper than the lower bound given by Hassibi and Boyd [1]. Then we show rigorously that applying the LLL reduction algorithm will increase the success probability of the Babai point. Finally we show rigorously that applying the LLL reduction algorithm will also reduce the computational complexity of sphere decoders, which is measured approximately by the number of nodes in the search tree in the literature
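A sketch of the success-probability computation discussed here, for the standard linear model y = Ax + v with v ~ N(0, sigma^2 I) and integer x: the product over the diagonal of the R factor of A = QR is the usual expression for the success probability of the Babai (nearest-plane) point, and applying a reduction such as LLL changes R and hence this value. The LLL step itself is omitted; the helper only evaluates the formula for a given A and sigma.

```python
import numpy as np
from scipy.stats import norm

def babai_success_probability(A, sigma):
    """Success probability of the Babai nearest-plane point for y = A x + v,
    v ~ N(0, sigma^2 I): product over i of (2 * Phi(|r_ii| / (2 sigma)) - 1),
    where r_ii are the diagonal entries of the R factor of A = QR."""
    R = np.linalg.qr(A, mode="r")
    r_diag = np.abs(np.diag(R))
    return np.prod(2.0 * norm.cdf(r_diag / (2.0 * sigma)) - 1.0)

rng = np.random.default_rng(4)
A = rng.normal(size=(6, 6))                  # illustrative random model matrix
print(babai_success_probability(A, sigma=0.05))
# Reducing the basis (e.g. with LLL) before the QR step typically increases
# this value, which is the effect proved in the paper.
```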
Gholamreza Norouzi
2015-01-01
Full Text Available In the project management context, time management is one of the most important factors affecting project success. This paper proposes a new method to solve research project scheduling problems (RPSP) containing Fuzzy Graphical Evaluation and Review Technique (FGERT) networks. Through the deliverables of this method, a proper estimation of project completion time (PCT) and success probability can be achieved. Algorithms were developed to cover all features of the problem based on three main parameters: duration, occurrence probability, and success probability. These algorithms are known as PR-FGERT (Parallel and Reversible Fuzzy GERT) networks. The main framework includes simplifying the project network and taking regular steps to determine PCT and success probability. Simplifications include (1) equivalent-making of parallel and series branches in the fuzzy network considering the concepts of probabilistic nodes, (2) equivalent-making of delay or reversible-to-itself branches and the impact of changing the parameters of time and probability based on removing related branches, (3) equivalent-making of simple and complex loops, and (4) an algorithm provided to resolve the no-loop fuzzy network after equivalent-making. Finally, the performance of the models was compared with existing methods. The results showed the sound, realistic performance of the models in comparison with existing methods.
Tolsma, J.; Need, A.; Jong, U. de
2010-01-01
In this article we examine whether subjective estimates of success probabilities explain the effect of social origin, sex, and ethnicity on students' choices between different school tracks in Dutch higher education. The educational options analysed differ in level (i.e. university versus profession
Maximum probability mechanisms of pyrolysis of corn stalk
李小民; 林其钊
2012-01-01
The mechanisms of corn stalk pyrolysis were investigated by means of both a model-based method and a model-free method, following at the same time the idea of the dual extrapolation method. First, thermogravimetric experiments were performed to investigate the pyrolysis of corn stalk in nitrogen at various heating rates. Second, an analysis combining the model-based and model-free methods was used to understand the mechanisms of corn stalk pyrolysis. The candidate mechanisms were screened by means of the Popescu method. The kinetic parameters were then obtained by the integral formula proposed by Tang and by iso-conversional methods, respectively. Moreover, the aforementioned results were all extrapolated following the idea of the dual extrapolation method. The comparison shows that two mechanisms are both plausible for corn stalk pyrolysis. Furthermore, a preliminary comparison was carried out between the results from the two models and the experimental data. According to this comparison, combined with the status of biomass pyrolysis research, it was concluded that corn stalk pyrolysis follows a 2.5-order reaction, with an activation energy of 97.72 kJ·mol-1 and a frequency factor of 3661231 s-1. Finally, an analysis was conducted concerning the discrepancy between the calculation and the experiment.
Weiser, Deborah Anne
Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.
Furbish, David J.; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan L.
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
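A small numerical check of the first claim (exponential velocities), illustrative only: among non-negative distributions with matched means, the exponential attains the largest differential entropy, here compared against two alternatives.

```python
import numpy as np
from scipy.stats import expon, halfnorm, uniform

mu = 1.0  # common mean (e.g. mean particle velocity)

candidates = {
    "exponential": expon(scale=mu),                              # mean = mu
    "half-normal": halfnorm(scale=mu * np.sqrt(np.pi / 2.0)),    # mean = mu
    "uniform":     uniform(loc=0.0, scale=2.0 * mu),             # mean = mu
}

for name, dist in candidates.items():
    # .entropy() gives the differential entropy of the frozen distribution.
    print(f"{name:12s} mean={float(dist.mean()):.3f}  entropy={float(dist.entropy()):.3f}")
# The exponential attains the largest entropy among these mean-matched candidates,
# consistent with the maximum-entropy argument for particle velocities.
```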
A Framework for the Statistical Analysis of Probability of Mission Success Based on Bayesian Theory
2014-06-01
times, launch points and other controlled events, it is possible to optimise the result to find the values of the fixed events for which there is the...the function. To reduce the number of iterations in the function, a method was devised to assess whether each product in the summation was equal to...zero based on whether any of the individual probabilities calculated were equal to zero. These products are unnecessary for the summation, and thus
Yi-Hua Zhu; Ding-Hua Shi; Yong Xiong; Ji Gao; He-Zhi Luo
2004-01-01
Mobility management is a challenging topic in the mobile computing environment. Studying the situation of mobiles crossing the boundaries of location areas is significant for evaluating the costs and performances of various location management strategies. Hitherto, several formulae have been derived to describe the probability of the number of location area boundaries crossed by a mobile. Some of them are widely used in analyzing the costs and performances of mobility management strategies. Utilizing the density evolution method of vector Markov processes, we propose a general probability formula for the number of location area boundaries crossed by a mobile between two successive calls. Fortunately, several widely used formulae are special cases of the proposed formula.
Nguyen, Truong-Huy; El Outayek, Sarah; Lim, Sun Hee; Nguyen, Van-Thanh-Van
2017-10-01
Many probability distributions have been developed to model the annual maximum rainfall series (AMS). However, there is no general agreement as to which distribution should be used, due to the lack of a suitable evaluation method. This paper hence presents a general procedure for assessing systematically the performance of ten commonly used probability distributions in rainfall frequency analyses based on their descriptive as well as predictive abilities. This assessment procedure relies on an extensive set of graphical and numerical performance criteria to identify the most suitable models that could provide the most accurate and most robust extreme rainfall estimates. The proposed systematic assessment approach has been shown to be more efficient and more robust than the traditional model selection method based on only limited goodness-of-fit criteria. To test the feasibility of the proposed procedure, an illustrative application was carried out using 5-min, 1-h, and 24-h annual maximum rainfall data from a network of 21 raingages located in the Ontario region of Canada. Results indicate that the GEV, GNO, and PE3 models were the best models for describing the distribution of daily and sub-daily annual maximum rainfalls in this region. The GEV distribution, however, was preferred to the GNO and PE3 because it is based on a more solid theoretical basis for representing the distribution of extreme random variables.
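A minimal sketch of fitting a GEV distribution to an annual maximum rainfall series with SciPy, using synthetic data in place of the raingage records; note that SciPy's shape parameter c has the opposite sign to the usual GEV shape parameter.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)

# Synthetic annual maximum 24-h rainfall series (mm), standing in for station data.
ams = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=70, random_state=rng)

# Maximum likelihood fit of the GEV distribution.
c, loc, scale = genextreme.fit(ams)
print(c, loc, scale)

# Return levels: rainfall depth exceeded on average once every T years.
for T in (10, 50, 100):
    print(T, genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale))
```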
Casas-Castillo, M. Carmen; Rodríguez-Solà, Raúl; Navarro, Xavier; Russo, Beniamino; Lastra, Antonio; González, Paula; Redaño, Angel
2016-11-01
The fractal behavior of extreme rainfall intensities registered between 1940 and 2012 by the Retiro Observatory of Madrid (Spain) has been examined, and a simple scaling regime ranging from 25 min to 3 days of duration has been identified. Thus, an intensity-duration-frequency (IDF) master equation of the location has been constructed in terms of the simple scaling formulation. The scaling behavior of probable maximum precipitation (PMP) for durations between 5 min and 24 h has also been verified. For the statistical estimation of the PMP, an envelope curve of the frequency factor (k_m) based on a total of 10,194 station-years of annual maximum rainfall from 258 stations in Spain has been developed. This curve could be useful for estimating suitable values of PMP at any point of the Iberian Peninsula from basic statistical parameters (mean and standard deviation) of its rainfall series.
Klein, I. M.; Rousseau, A. N.; Gagnon, P.; Frigon, A.
2012-12-01
Probable Maximum Snow Accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood. A robust methodology for evaluating the PMSA is imperative so that the resulting spring probable maximum flood is neither overestimated, which would mean financial losses, nor underestimated, which could affect public safety. In addition, the impact of climate change needs to be considered, since it is known that solid precipitation in some Nordic landscapes will in all likelihood intensify over the next century. In this paper, outputs from different simulations produced by the Canadian Regional Climate Model are used to estimate PMSAs for southern Quebec, Canada (44.1°N - 49.1°N; 68.2°W - 75.5°W). Moisture maximization represents the core concept of the proposed methodology, precipitable water being the key variable. Results of stationarity tests indicate that climate change will not only affect precipitation and temperature but also the monthly maximum precipitable water and the ensuing maximization ratio r. The maximization ratio r is used to maximize "efficient" snowfall events and represents the ratio of the 100-year precipitable water of a given month to the snowstorm precipitable water. A computational method was developed to maximize precipitable water using a non-stationary frequency analysis. The method was carefully adapted to the spatial and temporal constraints embedded in the resolution of the available simulation data. For example, for a given grid cell and time step, snow and rain may occur simultaneously. In this case, the focus is restricted to snow and snowstorm conditions only; thus rainfall, and humidity that could lead to rainfall, are neglected. Also, the temporal resolution cannot necessarily capture the full duration of actual snow storms. The threshold for a snowstorm to be maximized and the duration resulting from considered time steps are adjusted in order to obtain a high percentage of maximization ratios below
Global Trends in Alzheimer Disease Clinical Development: Increasing the Probability of Success.
Sugino, Haruhiko; Watanabe, Akihito; Amada, Naoki; Yamamoto, Miho; Ohgi, Yuta; Kostic, Dusan; Sanchez, Raymond
2015-08-01
Alzheimer disease (AD) is a growing global health and economic issue as elderly populations increase dramatically across the world. Despite the many clinical trials conducted, currently no approved disease-modifying treatment exists. In this commentary, the present status of AD drug development and the grounds for collaborations between government, academia, and industry to accelerate the development of disease-modifying AD therapies are discussed. Official government documents, literature, and news releases were surveyed by MEDLINE and website research. Currently approved anti-AD drugs provide only short-lived symptomatic improvements, which have no effect on the underlying pathogenic mechanisms or progression of the disease. The failure to approve a disease-modifying drug for AD may be because the progression of AD in the patient populations enrolled in clinical studies was too advanced for drugs to demonstrate cognitive and functional improvements. The US Food and Drug Administration and the European Medicines Agency recently published draft guidance for industry which discusses approaches for conducting clinical studies with patients in early AD stages. For successful clinical trials in early-stage AD, however, it will be necessary to identify biomarkers highly correlated with the clinical onset and the longitudinal progress of AD. In addition, because of the high cost and length of clinical AD studies, support in the form of global initiatives and collaborations between government, industry, and academia is needed. In response to this situation, national guidance and international collaborations have been established. Global initiatives are focusing on 2025 as a goal to provide new treatment options, and early signs of success in biomarker and drug development are already emerging. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.
Livingston, Richard A.; Jin, Shuang
2005-05-01
Bridges and other civil structures can exhibit nonlinear and/or chaotic behavior under ambient traffic or wind loadings. The probability density function (pdf) of the observed structural responses thus plays an important role for long-term structural health monitoring, LRFR and fatigue life analysis. However, the actual pdf of such structural response data often has a very complicated shape due to its fractal nature. Various conventional methods to approximate it can often lead to biased estimates. This paper presents recent research progress at the Turner-Fairbank Highway Research Center of the FHWA in applying a novel probabilistic scaling scheme for enhanced maximum entropy evaluation to find the most unbiased pdf. The maximum entropy method is applied with a fractal interpolation formulation based on contraction mappings through an iterated function system (IFS). Based on a fractal dimension determined from the entire response data set by an algorithm involving the information dimension, a characteristic uncertainty parameter, called the probabilistic scaling factor, can be introduced. This allows significantly enhanced maximum entropy evaluation through the added inferences about the fine scale fluctuations in the response data. Case studies using the dynamic response data sets collected from a real world bridge (Commodore Barry Bridge, PA) and from the simulation of a classical nonlinear chaotic system (the Lorenz system) are presented in this paper. The results illustrate the advantages of the probabilistic scaling method over conventional approaches for finding the unbiased pdf especially in the critical tail region that contains the larger structural responses.
OL-DEC-MDP Model for Multiagent Online Scheduling with a Time-Dependent Probability of Success
Cheng Zhu
2014-01-01
Full Text Available Focusing on the online multiagent scheduling problem, this paper considers the time-dependent probability of success and processing duration and proposes an OL-DEC-MDP (opportunity loss-decentralized Markov decision processes) model that includes opportunity loss in scheduling decisions to improve overall performance. The success probability of job processing, as well as the processing duration, depends on the time at which the processing is started. The probability of completing the assigned job by an agent is higher when the process is started earlier, but the opportunity loss could also be high due to the longer engaging duration. As a result, the OL-DEC-MDP model introduces a reward function considering the opportunity loss, which is estimated based on a prediction of the upcoming jobs by a sampling method on job arrivals. Heuristic strategies are introduced for computing the best starting time for an incoming job by each agent, and an incoming job will always be scheduled to the agent with the highest reward among all agents under their best starting policies. The simulation experiments show that the OL-DEC-MDP model improves the overall scheduling performance compared with models not considering opportunity loss in a heavy-load environment.
Bar-Sela, Gil; Abu-Amna, Mahmoud; Hadad, Salim; Haim, Nissim; Shahar, Eduardo
2015-09-01
Vemurafenib and dabrafenib are both orally bioavailable small-molecule agents that block mitogen-activated protein kinase signalling in patients with melanoma and the BRAF(V600E) mutation. Generalized hypersensitivity reactions to vemurafenib or dabrafenib have not been described. Continuing vemurafenib or dabrafenib therapy despite a hypersensitivity reaction is especially important in patients with melanoma and the BRAF(V600E) mutation, in whom this mutation plays a critical role in tumour growth. Desensitization protocols to overcome hypersensitivity reactions by gradual reintroduction of small amounts of the offending drug up to full therapeutic doses are available for many anti-cancer agents, including vemurafenib, but, to the best of our knowledge, have not been reported for dabrafenib. We describe a patient with metastatic melanoma who developed a Type I hypersensitivity reaction to vemurafenib and to subsequent treatment with dabrafenib, and who was successfully treated by drug desensitization, which allowed safe prolonged continuation of dabrafenib. The development of hypersensitivity reactions to both dabrafenib and vemurafenib in the current case could be because these drugs have a similar chemical structure and cause cross-reactivity. However, a hypersensitivity reaction to a non-medicinal ingredient shared by the two drugs is also possible. Oral desensitization appears to be an option for patients with Type I hypersensitivity to dabrafenib. This approach may permit clinicians to safely administer dabrafenib to patients who experience hypersensitivity reactions to this life-prolonging medication.
Gian Paolo Beretta
2008-08-01
Full Text Available A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as a part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
Beretta, Gian P.
2008-09-01
A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as a part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
Rufibach, Kaspar; Burger, Hans Ulrich; Abt, Markus
2016-09-01
Bayesian predictive power, the expectation of the power function with respect to a prior distribution for the true underlying effect size, is routinely used in drug development to quantify the probability of success of a clinical trial. Choosing the prior is crucial for the properties and interpretability of Bayesian predictive power. We review recommendations on the choice of prior for Bayesian predictive power and explore its features as a function of the prior. The density of power values induced by a given prior is derived analytically and its shape characterized. We find that for a typical clinical trial scenario, this density has a u-shape very similar, but not equal, to a β-distribution. Alternative priors are discussed, and practical recommendations to assess the sensitivity of Bayesian predictive power to its input parameters are provided. Copyright © 2016 John Wiley & Sons, Ltd.
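As a worked illustration of the quantity being discussed, the following sketch computes Bayesian predictive power by Monte Carlo for a two-arm trial with known variance; the prior, sample size, and significance level are assumptions for illustration, not the settings of the paper.

```python
# Hedged sketch: Monte-Carlo Bayesian predictive power for a two-arm trial
# with known variance and a one-sided test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_per_arm, sigma, alpha = 100, 1.0, 0.025
se = sigma * np.sqrt(2 / n_per_arm)
z_alpha = stats.norm.ppf(1 - alpha)

# prior on the true effect size delta (e.g. informed by a phase II estimate)
delta = rng.normal(loc=0.25, scale=0.10, size=200_000)

power = stats.norm.sf(z_alpha - delta / se)      # power conditional on each delta
print("Bayesian predictive power:", power.mean())
# the distribution of `power` itself is the u-shaped density discussed above
```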
Davis, Matthew M; Butchart, Amy T; Wheeler, John R C; Coleman, Margaret S; Singer, Dianne C; Freed, Gary L
2011-11-28
Research and development of prophylactic vaccines carries a high risk of failure. In the past, industry experts have asserted that vaccines are riskier to produce than other pharmaceuticals. This assertion has not been critically examined. We assessed outcomes in pharmaceutical research and development from 1995 to 2011, using a global pharmaceutical database to identify prophylactic vaccines versus other pharmaceuticals in preclinical, Phase I, Phase II, or Phase III stages of development. Over 16 years of follow-up for 4367 products (132 prophylactic vaccines; 4235 other pharmaceuticals), we determined the failure-to-success ratios for prophylactic vaccines versus all other products. The overall ratio of failures to successes for prophylactic vaccines for the 1995 cohort over 16 years of follow-up was 8.3 (116/14) versus 7.7 (3650/475) for other pharmaceuticals. The probability of advancing through the development pipeline at each point was not significantly different for prophylactic vaccines than for other pharmaceuticals. Phase length was significantly longer for prophylactic vaccines than other pharmaceuticals for preclinical development (3.70 years vs 2.80 years). Overall, these findings indicate that development risks for prophylactic vaccines are not significantly different from those of other pharmaceutical products, which may partially explain rapidly growing interest in prophylactic vaccines among major pharmaceutical manufacturers.
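The quoted ratios follow from simple arithmetic on the reported counts:

```python
# Worked check of the failure-to-success ratios quoted above (no assumptions).
vaccine_failures, vaccine_successes = 116, 14
other_failures, other_successes = 3650, 475
print(vaccine_failures / vaccine_successes)   # ~8.3
print(other_failures / other_successes)       # ~7.7
```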
Wright, Alison J; Aveyard, Paul; Guo, Boliang; Murphy, Michael; Brown, Karen; Marteau, Theresa M
2007-10-01
Pharmacogenetic smoking cessation interventions would involve smokers being given information about the influence of genes on their behaviour. However, attributing smoking to genetic causes may reduce perceived control over smoking, reducing quit attempt success. This study examines whether attributing smoking to genetic influences is associated with reduced quitting and whether this effect is mediated by perceived control over smoking. Cohort study. A total of 792 smokers, participating in a trial of nicotine replacement therapy (NRT)-assisted smoking cessation. Participants were informed that the trial investigated relationships between genetic markers and smoking behaviour, but personalized genetic feedback was not provided. Primary care in Oxfordshire and Buckinghamshire, UK. Perceived control over smoking and perceived importance of genetic factors in causing smoking assessed pre-quit; abstinence 4, 12, 26 and 52 weeks after the start of treatment. A total of 515 smokers (65.0%) viewed genetic factors as playing some role in causing their smoking. They had lower perceived control over smoking than smokers who viewed genetic factors as having no role in causing their smoking. Attributing smoking to genetic causes was not associated significantly with a lower probability of quit attempt success. Attributing smoking to genetic factors was associated with lower levels of perceived control over smoking but not lower quit rates. This suggests that learning of one's genetic predisposition to smoking during a pharmacogenetically tailored smoking cessation intervention may not deter quitting. Further research should examine whether the lack of impact of genetic attributions on quit attempt success is also found in smokers provided with personalized genetic feedback.
T. Gnanasekaran
2008-01-01
Full Text Available Problem statement: In this study we propose a method to improve the performance of the Maximum A-Posteriori Probability (MAP) algorithm, which is used in the turbo decoder. Previously, the performance of the turbo decoder was improved by scaling the channel reliability value. Approach: A modification of the MAP algorithm is proposed in this study, which achieves further improvement in forward error correction by scaling the extrinsic information in both decoders without introducing any complexity. The encoder is modified with a new puncturing matrix, which yields Unequal Error Protection (UEP). This modified MAP algorithm is analyzed with the traditional turbo code system Equal Error Protection (EEP) and also with Unequal Error Protection (UEP), both in the AWGN channel and in the fading channel. Result: MAP and modified MAP achieve a coding gain of 0.6 dB over EEP in the AWGN channel. MAP and modified MAP achieve coding gains of 0.4 dB and 0.9 dB over EEP, respectively, in the Rayleigh fading channel. Modified MAP in UEP class 1 and class 2 gained 0.8 dB and 0.6 dB, respectively, in the AWGN channel, whereas in the fading channel class 1 and class 2 gained 0.4 dB and 0.6 dB, respectively. Conclusion/Recommendations: The modified MAP algorithm improves the Bit Error Rate (BER) performance in EEP as well as UEP, both in AWGN and fading channels. We propose the modified MAP error correction algorithm with UEP for broadband communication.
Ennis, Erin J; Foley, Joe P
2016-07-15
A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20-300, and 0.017-1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: probability obtained with the isocratic stochastic approach ≤ probability obtained with the gradient stochastic approach ≤ probability predicted by Davis and Stoll. When the stochastic results are applied to conventional HPLC and sequential elution liquid chromatography (SE-LC), the latter is shown to provide much greater probabilities of success for moderately complex samples (e.g., PHPLC=31.2% versus PSE-LC=69.1% for 12 components and the same analysis time). For a given number of components, the density of probability data provided over the range of peak capacities is sufficient to allow accurate interpolation of probabilities for peak capacities not reported. Extensions of the stochastic approach include isothermal and programmed-temperature gas chromatography.
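A minimal sketch of the constant-peak-width ("gradient") stochastic estimate is given below: component retention times are dropped uniformly on a unit time window, and a run counts as a success when every adjacent pair is at least one peak width (1/peak capacity) apart. This illustrates the general idea only, not Ennis and Foley's exact procedure.

```python
# Hedged sketch: Monte-Carlo probability that all components are resolved
# under constant peak width ("gradient") conditions.
import numpy as np

def p_success(m, n_c, trials=100_000, rng=np.random.default_rng(2)):
    t = np.sort(rng.random((trials, m)), axis=1)   # m retention times per trial
    gaps = np.diff(t, axis=1)
    return (gaps >= 1.0 / n_c).all(axis=1).mean()  # all adjacent gaps >= 1/n_c

print(p_success(m=12, n_c=100))   # probability all 12 components resolve
```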
Zhao, W.; Cella, M.; Pasqua, O. Della; Burger, D.M.; Jacqz-Aigrain, E.
2012-01-01
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT: Abacavir is used to treat HIV infection in both adults and children. The recommended paediatric dose is 8 mg kg(-1) twice daily up to a maximum of 300 mg twice daily. Weight was identified as the central covariate influencing pharmacokinetics of abacavir in
GH. ŞERBAN
2016-03-01
Full Text Available The purpose of the paper is to identify and locate some species related to habitats in the Pricop-Huta-Certeze and Upper Tisa Natura 2000 protected areas (PHCTS) and to determine whether they are vulnerable to risks induced by maximum flow phases. The first chapter gives a few references about the morphometric parameters of the hydrographic networks within the study area, as well as some references on the frequency of maximum flow phases. After the second chapter, where the methods and databases used in the study are described, we proceed to the identification of the areas that are covered by water during floods, as well as to determining the risk level associated with these areas. The GIS modeling reveals a small extent of high flood risk for the natural environment related to the protected areas and a greater extent for the anthropic environment. The last chapter refers to several species of fish and batrachians, as well as to the amphibious mammals identified in the study area, that are vulnerable to floods (high turbidity, reduction of dissolved oxygen, habitat destruction, etc.).
Blackwood, R.L.
1980-05-15
There are now available sufficient data from in-situ, pre-mining stress measurements to allow a first attempt at predicting the maximum stress magnitudes likely to occur in a given mining context. The sub-horizontal (lateral) stress generally dominates the stress field, becoming critical to stope stability in many cases. For cut-and-fill mining in particular, where developed fill pressures are influenced by lateral displacement of pillars or stope backs, extraction maximization planning by mathematical modelling techniques demands the best available estimate of pre-mining stresses. While field measurements are still essential for this purpose, in the present paper it is suggested that the worst stress case can be predicted for preliminary design or feasibility study purposes. On the European continent the vertical component of pre-mining stress may be estimated by adding 2 MPa to the pressure due to overburden weight. The maximum lateral stress likely to be encountered is about 57 MPa at depths of some 800 m to 1000 m below the surface.
He, Weili; Cao, Xiting; Xu, Lu
2012-02-28
The evaluation of clinical proof of concept, optimal dose selection, and phase III probability of success has traditionally been conducted by a subjective and qualitative assessment of the efficacy and safety data. This, in part, was responsible for the numerous failed phase III programs in the past. The need to utilize more quantitative approaches to assess efficacy and safety profiles has never been greater. In this paper, we propose a framework that incorporates efficacy and safety data simultaneously for the joint evaluation of clinical proof of concept, optimal dose selection, and phase III probability of success. Simulation studies were conducted to evaluate the properties of our proposed methods. The proposed approach was applied to two real clinical studies. On the basis of the true outcome of the two clinical studies, the assessment based on our proposed approach suggested a reasonable path forward for both clinical programs.
Wright, Alison J.; Aveyard, Paul; Guo, Boliang; Murphy, Michael; Brown, Karen; Marteau, Theresa M.
2007-01-01
Aims Pharmacogenetic smoking cessation interventions would involve smokers being given information about the influence of genes on their behaviour. However, attributing smoking to genetic causes may reduce perceived control over smoking, reducing quit attempt success. This study examines whether attributing smoking to genetic influences is associated with reduced quitting and whether this effect is mediated by perceived control over smoking. Design Cohort study. Participants A total of 792 sm...
Mahjoub, Reza; Ødegaard, Fredrik; Zaric, Gregory S
2017-06-19
We analyze a game-theoretic model of a risk-sharing agreement between a payer and a pharmaceutical firm. The drug manufacturer chooses the price while the payer sets the rebate rate and decides which patients are eligible for treatment. The manufacturer provides the payer with a rebate for nonresponding patients. We generalize on the existing literature, by making both price and rebate rate decision variables, allowing the rebate rate to be different from 100%, and incorporating 2 types of administrative costs. We identify a threshold for the expected probability of response for classifying the drug as a mass-market or niche type and investigate the optimal solutions for both types. We also identify a threshold for the rebate rate at which the net benefits become equal for responding and nonresponding patients. Through numerical examples, we examine how various parameters impact the drug manufacturer's and the payer's optimal solution. Copyright © 2017 John Wiley & Sons, Ltd.
Kinkhabwala, Ali
2013-01-01
The most fundamental problem in statistics is the inference of an unknown probability distribution from a finite number of samples. For a specific observed data set, answers to the following questions would be desirable: (1) Estimation: Which candidate distribution provides the best fit to the observed data?, (2) Goodness-of-fit: How concordant is this distribution with the observed data?, and (3) Uncertainty: How concordant are other candidate distributions with the observed data? A simple unified approach for univariate data that addresses these traditionally distinct statistical notions is presented called "maximum fidelity". Maximum fidelity is a strict frequentist approach that is fundamentally based on model concordance with the observed data. The fidelity statistic is a general information measure based on the coordinate-independent cumulative distribution and critical yet previously neglected symmetry considerations. An approximation for the null distribution of the fidelity allows its direct conversi...
梁莉; 赵琳娜; 巩远发; 包红军; 王成鑫; 王志
2011-01-01
The probability density function of the gamma distribution, to a certain extent, overcomes the influence of random sample fluctuations on the estimation of the daily precipitation probability distribution. The probability distribution of maximum daily precipitation in 1 to 20 days derived from the gamma distribution function is reasonable. The precipitation probability density curve for 1 day is monotonically decreasing, taking the shape of a reverse "J". The peak of the probability distribution of the maximum daily precipitation in 10 or 20 days tilts toward the side of large precipitation as the number of days increases. The Huaihe Basin's probability distributions of maximum daily precipitation exceeding 10 mm, 25 mm, and 50 mm in 10 or 20 days indicate that the probabilities for the upper reaches of the Huaihe River, the Huaihe River downstream of Hongze Lake, and the Yishu River watershed are evidently higher than in the remaining regions of the five catchments, which means that the maximum daily precipitation in these areas is more likely to exceed 10 mm, 25 mm, or 50 mm in 10 or 20 days. The high values of the probability of maximum daily precipitation over 50 mm in 10 days are located in the eastern part of the Yishu River watershed and the upper reaches of the Huaihe River, while the high values of the probability of maximum daily precipitation over 50 mm in 20 days are located at the junction of the lower Huaihe River and the eastern side of Hongze Lake, indicating that large precipitation events are more likely to occur in these areas. This approach can provide practical applications such as decision support for hydro-meteorological forecasting, agriculture, and water resources management.
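A hedged sketch of the underlying calculation: fit a gamma distribution to daily precipitation and, assuming independent days, evaluate the probability that the maximum over N days exceeds a threshold. The synthetic record and thresholds below are illustrative, not the Huaihe Basin data.

```python
# Hedged sketch: gamma fit to daily precipitation and exceedance probability
# of the N-day maximum, assuming independent days.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
daily_mm = rng.gamma(shape=0.7, scale=12.0, size=3000)   # stand-in daily record

shape, loc, scale = stats.gamma.fit(daily_mm, floc=0)
for n_days in (1, 10, 20):
    for thresh in (10, 25, 50):
        p = 1 - stats.gamma.cdf(thresh, shape, loc=loc, scale=scale) ** n_days
        print(f"P(max over {n_days:2d} days > {thresh} mm) = {p:.3f}")
```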
刘泽龙; 刘文彦; 丁宏; 黄晓涛
2012-01-01
In dense multipath channels, a practical ultra-wideband (UWB) ranging system usually adopts a time-of-arrival-based (TOA-based) energy detection (ED) receiver. The accuracy of the TOA estimation determines the accuracy of ranging. Threshold comparison is widely used in ED, and the design of the threshold strongly affects the accuracy of TOA estimation. In this paper, we apply the maximum probability of detection (MPD) method to energy-based TOA estimators. By improving the way the probability of detection is calculated, a new criterion to determine the threshold value and detect the direct path (DP) is proposed. The algorithm calculates the probability of detecting the DP while searching for the optimal threshold. The DP is determined when the difference of the DP detection probability reaches its maximum, and the corresponding threshold is taken as the optimal threshold. The paper gives a theoretical analysis and presents the procedure of the proposed TOA estimation method. In simulations, we compare our method with other methods. Simulation results show that our method outperforms the others and verify its effectiveness.
陈海涛; 黄鑫; 邱林; 王文川
2013-01-01
The evaluation index of drought degree, which comprehensively considers the quantitative relationship between the crop growing period and natural factors, is presented in this paper. The distribution density function of drought degree has been established based on the maximum entropy principle. This avoids the arbitrariness of previously constructed probability distributions and achieves the goal of quantitative evaluation of the regional agricultural drought degree. Firstly, the quantitative evaluation index of drought degree was established according to the yield reduction rate under deficit irrigation conditions. Secondly, a long series of rainfall data was generated by the Monte Carlo method and the drought degree index of each year was calculated. Finally, the probability density function of the agricultural drought degree distribution was constructed using the maximum entropy principle. As an example, results for the drought degree distribution in the Qucun irrigation area of Puyang City, Henan Province, are presented. The results show that the model provides a good evaluation method with a clear concept, a simple and practical approach, and reasonable outcomes.
王浩; 张权益; 方宝富; 方帅
2013-01-01
Robot emotion modeling is a hot issue in emotion robot research. Based on knowledge from emotion psychology, a dynamic emotion transfer model of an emotion robot with different personalities under different external stimulation is presented. The influences of personality and external stimulation are discussed. An emotion model based on state space is used to describe the emotion states of the robot. The emotion transfer process is simulated by a hidden Markov model (HMM) process. However, the HMM process can only work out the current probability of the emotion state. To obtain the concrete emotion state, a maximum-similarity-matching emotion transfer model based on a mapping between state space and probability space is proposed. Firstly, the current emotion probability is calculated by the HMM process. Then, the current concrete emotion state is obtained by maximum similarity matching. Different personalities and stimulations can be built by adjusting the parameters of the model. The proposed model simulates the transformation process effectively. The experimental results show that the emotion transfer process simulated by the proposed model corresponds with the general rules of human emotion transformation.
Vilberg, Kaia L; Rugg, Michael D
2009-04-01
The present event-related fMRI study addressed the question whether retrieval-related neural activity in lateral parietal cortex is affected by the relative probability of test items. We varied the proportion of old to new items across two test blocks, with 25% of the items being old in one block and 75% being old in the other. Prior to each block, participants (N=18) completed one of two types of study judgment on each of 108 object images. They then performed a source memory test with four response options: studied in task 1, studied in task 2, old but unsure of the study task, and new. Retrieval-related activity in regions previously identified as recollection-sensitive, including the left inferior lateral parietal cortex and bilateral medial temporal cortex, was unaffected by old/new ratio. Generic retrieval success effects--retrieval-related effects common to recognized items attracting either a correct or an incorrect source judgment--were identified in several regions of left superior parietal cortex. These effects dissociated between a middle region of the intraparietal sulcus (IPS), where activity did not interact with ratio, and regions anterior and posterior to the middle IPS where activity was sensitive to old/new ratio. The findings are inconsistent with prior proposals that retrieval-related activity in and around the left middle IPS reflects the relative salience of old and new test items. Rather, they suggest that, as in the case of more inferior left parietal regions, retrieval-related activity in this region reflects processes directly linked to retrieval.
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in ... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
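As a sanity check on the conditions CP1-CP3, the sketch below verifies them for the ordinary ratio definition of conditional probability on a small finite space; it is only an illustration and does not reproduce the paper's lexicographic or nonstandard constructions.

```python
# Hedged sketch: check CP1-CP3 for mu(V | U) = P(V ∩ U) / P(U) on a small
# finite space with a uniform base measure.
from itertools import combinations
from fractions import Fraction

W = frozenset(range(4))
P = {w: Fraction(1, 4) for w in W}                       # uniform base measure

def prob(A): return sum(P[w] for w in A)
def mu(V, U): return prob(V & U) / prob(U)               # defined when prob(U) > 0

def subsets(S): return [frozenset(c) for r in range(len(S) + 1)
                        for c in combinations(S, r)]

events = subsets(W)
conditioning = [U for U in events if prob(U) > 0]        # plays the role of F'

assert all(mu(U, U) == 1 for U in conditioning)                          # CP1
assert all(mu(V1 | V2, U) == mu(V1, U) + mu(V2, U)
           for U in conditioning for V1 in events for V2 in events
           if not (V1 & V2))                                             # CP2
assert all(mu(V, U) == mu(V, X) * mu(X, U)
           for U in conditioning for X in conditioning for V in events
           if V <= X <= U)                                               # CP3
print("CP1-CP3 hold for ratio-defined conditional probability")
```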
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando
Cheeseman, Peter; Stutz, John
2005-01-01
A long standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty, and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
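A minimal sketch of the idea, using the familiar die example: the constraint (the mean) is treated as Gaussian-uncertain, and each sampled constraint value is pushed through the classic MaxEnt solution, yielding a distribution over MaxEnt probabilities. The numbers are assumptions for illustration, not taken from the paper.

```python
# Hedged sketch: propagate Gaussian uncertainty on a MaxEnt constraint value
# (the mean of a die) through the classic MaxEnt solution.
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)

def maxent_probs(mean):
    # classic MaxEnt for a die with a mean constraint: p_i ∝ exp(lam * i)
    def gap(lam):
        w = np.exp(lam * faces)
        return (faces * w).sum() / w.sum() - mean
    lam = brentq(gap, -10, 10)
    w = np.exp(lam * faces)
    return w / w.sum()

rng = np.random.default_rng(4)
mean_samples = rng.normal(4.5, 0.2, size=5000)          # uncertain constraint value
mean_samples = np.clip(mean_samples, 1.05, 5.95)        # keep the constraint feasible
p6 = np.array([maxent_probs(m)[5] for m in mean_samples])
print("P(face 6): point MaxEnt =", maxent_probs(4.5)[5],
      "| mean over uncertainty =", p6.mean(), "| sd =", p6.std())
```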
Ito, Tetsuya; Chika, Noriyasu; Yamamoto, Azusa; Ogura, Toshiro; Amano, Kunihiko; Ishiguro, Toru; Fukuchi, Minoru; Kumagai, Youichi; Ishibashi, Keiichiro; Eguchi, Hidetaka; Okazaki, Yasushi; Mochiki, Erito; Ishida, Hideyuki
2016-11-01
A 44-year-old man with familial adenomatous polyposis underwent laparoscopic-assisted total proctocolectomy with ileal pouch anal anastomosis (IPAA). Computed tomography conducted 21 months after IPAA demonstrated bilateral hydronephrosis and an intra-abdominal mass with a maximal diameter of 22 cm, leading to a diagnosis of stage IV desmoid disease, according to the classification by Church and associates. Six courses of combination chemotherapy with doxorubicin plus dacarbazine were administered. Computed tomography after chemotherapy demonstrated marked shrinkage of the desmoid tumor with intra-abdominal air and fluid collection extending just below the skin of the ileostomy closure site. Stool-like fluid overflowed spontaneously through the site of the ileostomy closure and the abscess cavity was successfully drained. The patient was discharged 30 days after the start of drainage. The patient is doing well 10 months after the drainage without regrowth of the desmoid tumor, even though a cavity-like lesion encapsulated by a thick wall remains.
李延杰; 刘晓东; 付雅芳
2012-01-01
In order to deal with the evaluation of aircraft conceptual/preliminary designs affected by a large amount of uncertainty, and to make design decisions more scientific and robust than with traditional methods, a new decision-making method was developed that regards the joint probability of success of the design as the overall evaluation index. Based on multivariate probability theory, the mathematical formula of the joint probability of success was presented, and a specific algorithm, the joint probability model, was introduced to compute the joint probability density function. Using the joint probability distribution, the traditional probabilistic design method and a multi-criteria decision-making technique were combined, and the evaluation procedure for an aircraft conceptual design based on the joint probability of success was outlined. Finally, the application of this method in practice was demonstrated via an example, and the rationality of the joint probability model was verified.
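A hedged sketch of a joint probability of success computed by Monte Carlo for several correlated design criteria; the metrics, thresholds, and covariance below are illustrative assumptions rather than the paper's model.

```python
# Hedged sketch: Monte-Carlo joint probability that several correlated design
# criteria are met simultaneously.
import numpy as np

rng = np.random.default_rng(7)
# three design metrics (e.g. range, payload, cost margin), each expressed as
# predicted value / requirement, so "success" means the ratio is >= 1
mean = np.array([1.05, 1.10, 1.02])
cov = np.array([[0.010, 0.004, 0.002],
                [0.004, 0.015, 0.003],
                [0.002, 0.003, 0.008]])
samples = rng.multivariate_normal(mean, cov, size=200_000)
joint_ps = (samples >= 1.0).all(axis=1).mean()   # all requirements satisfied
marginal = (samples >= 1.0).mean(axis=0)
print("marginal probabilities of success:", marginal)
print("joint probability of success:    ", joint_ps)
```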
Paziewski, Jacek
2016-02-01
The mitigation of ionospheric delay is still of crucial interest in GNSS positioning, especially in precise solutions such as instantaneous RTK positioning. Thus, several effective algorithms and functional models were developed, and also numerous investigations of ionospheric correction properties in RTK positioning have been performed so far. One of the most highly effective approaches in precise relative positioning is the application of the ionosphere-weighted model with network-derived corrections. This contribution investigates the impact of the accuracy of the network ionospheric corrections on time-to-fix in RTK-OTF positioning. Also, an attempt has been made to estimate the desirable accuracy of the network ionospheric corrections, allowing for reliable instantaneous ambiguity resolution. The experiment is based on a multi-baseline GPS RTK positioning supported with network-derived ionospheric corrections for medium length baselines. The results show that in such scenario, the double-differenced ionospheric correction residuals should not exceed ∼1/3 of the L1 wavelength for successful single-epoch ambiguity resolution.
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
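For flavour, here is one simple way to extrapolate a tail probability beyond the data by fitting an exponential to the exceedances over a high threshold; this is a generic illustration, not one of the specific weighted estimators compared in the paper.

```python
# Hedged sketch: exponential-tail extrapolation of P(X > x) from exceedances
# over a high empirical threshold.
import numpy as np

def tail_prob(data, x, frac=0.1):
    data = np.sort(data)
    k = max(int(frac * len(data)), 5)
    u = data[-k]                          # threshold = upper (1-frac) quantile
    mean_excess = (data[-k:] - u).mean()  # exponential tail-scale estimate
    return (k / len(data)) * np.exp(-(x - u) / mean_excess)

rng = np.random.default_rng(5)
sample = rng.exponential(scale=2.0, size=2000)
print("estimate P(X > 20):", tail_prob(sample, 20.0))
print("true     P(X > 20):", np.exp(-20.0 / 2.0))
```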
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
马东升; 胡佑德; 戴凤智
2002-01-01
In an actual control system, it is often difficult to locate faults based only on the outside fault phenomena frequently acquired from a faulty system. Fault diagnosis from outside fault phenomena is therefore considered. Based on the theory of fuzzy recognition and fault diagnosis, this method depends only on experience and statistical data to set up a fuzzy query relationship between the outside phenomena (fault characters) and the fault sources (fault patterns). From this relationship the most probable fault sources can be obtained, to attain the goal of quick diagnosis. Based on the above approach, the standard fuzzy relationship matrix is stored in the computer as a system database, and experimental data are given to show the fault diagnosis results. The important parameters can be sampled and analyzed on line; when faults occur, they can be identified, an alarm is given and the controller output is regulated.
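A minimal sketch of the kind of lookup described above: a fuzzy relation matrix links observed fault characters to fault sources, and the most probable source is the one with the largest composed membership. The matrix values, symptom labels, and the max-min composition are illustrative assumptions, not the paper's exact scheme.

```python
# Hedged sketch: most-probable fault source from a fuzzy relation matrix via
# max-min composition of observed symptom memberships.
import numpy as np

sources = ["sensor drift", "actuator wear", "controller bug"]
# rows: fault characters (symptoms); columns: fault sources
R = np.array([[0.9, 0.2, 0.1],     # symptom: slow response
              [0.3, 0.8, 0.2],     # symptom: oscillation
              [0.1, 0.3, 0.7]])    # symptom: steady-state error

observed = np.array([0.2, 0.9, 0.4])           # membership of each observed symptom

score = np.max(np.minimum(observed[:, None], R), axis=0)   # max-min composition
print(dict(zip(sources, score.round(2))))
print("most probable fault source:", sources[int(score.argmax())])
```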
Font, R.G. (Geoscience Data Management, Dallas, TX (United States))
1994-12-01
In engineering geology, people often labor to standardize and quantify techniques and methods utilized in the solution of critical problems. Applications related to the field of petroleum technology are certainly no exception. For example, risk analysis of petroleum exploratory projects and prospects is often arbitrary and biased. The calculation of prospect risk introduced in this article is designed to remove "subjectivity" by establishing quantitative "standards." The technique allows the same method of analysis to be applied to prospects worldwide. It may be used as introduced in this paper (as employed by the author) or as a model, since it possesses the flexibility to be modified by individual users as long as standards are internally defined and utilized by all members of an exploration team. Risk analysis has been discussed in much detail. However, an original method for establishing quantitative standards for risk assessment is addressed and introduced in this paper. As defined here, prospect risk (described as the "probability of success" or "Ps") is the product of the following: Ps = (Trap) × (Reservoir) × (Source) × (Recovery) × (Timing). Well-defined, quantitative standards must be established if one is to remove subjectivity from risk assessment. In order to establish such standards to be used uniformly by explorationists, each of the above-referenced factors is individually evaluated and numerically defined utilizing the category and ranking system outlined in Table 1. If all five independent parameters had individual values of 1.00, then the prospective venture would have an overall probability of success of 100 percent. Similarly, if all parameters exhibited values of 0.90, the overall chance of success would equal 77 percent. Values of 0.80 equate to 33 percent, values of 0.70 to 17 percent, values of 0.60 to 8 percent, values of 0.50 to 1 percent, and values of 0.30 to 0.2 percent.
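The product formula can be checked with a couple of lines; using uniform factor values of 0.80 reproduces the 33 percent figure quoted above.

```python
# Worked example of the multiplicative prospect-risk formula quoted above.
trap = reservoir = source = recovery = timing = 0.80
ps = trap * reservoir * source * recovery * timing
print(f"Ps = {ps:.3f}")   # about 0.33, i.e. the 33 percent quoted for uniform 0.80 factors
```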
A (simplified) Bluetooth Maximum A posteriori Probability (MAP) receiver
Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.
2003-01-01
In our software-defined radio project we aim at combining two standards: Bluetooth and HiperLAN/2. The HiperLAN/2 receiver requires the most computation power in comparison with Bluetooth. We choose to use this computational power also for Bluetooth and look for more advanced demodulation
A (Simplified) Bluetooth Maximum a Posteriori Probability (Map) Receiver
Schiphorst, R.; Hoeksema, F.W.; Slump, C.H.; Barbarossa, Sergio; Scutari, Gesualdo
2003-01-01
In our software-defined radio project, we aim at combining two standards: Bluetooth and HIPERLAN/2. The HIPERLAN/2 receiver requires more computational power than Bluetooth. We choose to use this computational power also for Bluetooth and look for more advanced demodulation algorithms such as a maximum a posteriori probability (MAP) receiver.
A (simplified) Bluetooth Maximum A posteriori Probability (MAP) receiver
Schiphorst, R.; Hoeksema, F.W.; Slump, C.H.
2003-01-01
In our software-defined radio project we aim at combining two standards: Bluetooth and HiperLAN/2. The HiperLAN/2 receiver requires the most computation power in comparison with Bluetooth. We choose to use this computational power also for Bluetooth and look for more advanced demodulation algorithms.
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of probability aggregates...
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Remizov, Ivan D
2009-01-01
In this note, we represent a subdifferential of a maximum functional defined on the space of all real-valued continuous functions on a given metric compact set. For a given argument $f$, it coincides with the set of all probability measures on the set of points maximizing $f$ on the initial compact set. This complete characterization lies at the heart of several important identities in microeconomics, such as Roy's identity, Shephard's lemma, as well as duality theory in production and linear programming.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
Contributions to quantum probability
Fritz, Tobias
2010-06-25
finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
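One classic example of such a 1/e problem is the chance that a random permutation has no fixed point (a derangement); a quick simulation, included here as an illustration and not taken from the article, shows the convergence.

```python
# Hedged sketch: simulated probability that a random permutation of n items
# has no fixed point; for moderate n it is already very close to 1/e.
import numpy as np

rng = np.random.default_rng(6)
n, trials = 20, 200_000
perms = np.argsort(rng.random((trials, n)), axis=1)     # random permutations
no_fixed_point = (perms != np.arange(n)).all(axis=1).mean()
print("simulated:", no_fixed_point, " 1/e =", np.exp(-1.0))
```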
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Agreeing Probability Measures for Comparative Probability Structures
P.P. Wakker (Peter)
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid
Maximum Autocorrelation Factorial Kriging
Nielsen, Allan Aasbjerg; Conradsen, Knut; Pedersen, John L.
2000-01-01
This paper describes maximum autocorrelation factor (MAF) analysis, maximum autocorrelation factorial kriging, and its application to irregularly sampled stream sediment geochemical data from South Greenland. Kriged MAF images are compared with kriged images of varimax rotated factors from...
The Testability of Maximum Magnitude
Clements, R.; Schorlemmer, D.; Gonzalez, A.; Zoeller, G.; Schneider, M.
2012-12-01
Recent disasters caused by earthquakes of unexpectedly large magnitude (such as Tohoku) illustrate the need for reliable assessments of the seismic hazard. Estimates of the maximum possible magnitude M at a given fault or in a particular zone are essential parameters in probabilistic seismic hazard assessment (PSHA), but their accuracy remains untested. In this study, we discuss the testability of long-term and short-term M estimates and the limitations that arise from testing such rare events. Of considerable importance is whether or not those limitations imply a lack of testability of a useful maximum magnitude estimate, and whether this should have any influence on current PSHA methodology. We use a simple extreme value theory approach to derive a probability distribution for the expected maximum magnitude in a future time interval, and we perform a sensitivity analysis on this distribution to determine if there is a reasonable avenue available for testing M estimates as they are commonly reported today: devoid of an appropriate probability distribution of their own and estimated only for infinite time (or relatively large untestable periods). Our results imply that any attempt at testing such estimates is futile, and that the distribution is highly sensitive to M estimates only under certain optimal conditions that are rarely observed in practice. In the future we suggest that PSHA modelers be brutally honest about the uncertainty of M estimates, or must find a way to decrease its influence on the estimated hazard.
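A hedged sketch of the kind of distribution discussed: assuming Poisson earthquake occurrences with a Gutenberg-Richter rate, the probability that the maximum magnitude over a future interval T stays below m is exp(-N(m)·T), where N(m) is the annual rate of events of magnitude at least m. The a-value, b-value, and T below are illustrative assumptions, not estimates from the study.

```python
# Hedged sketch: CDF of the largest magnitude in a future interval T under a
# Poisson/Gutenberg-Richter model.
import numpy as np

a, b, T = 4.0, 1.0, 50.0                      # log10 annual rate of M>=0, b-value, years

def cdf_max_magnitude(m):
    rate_above_m = 10.0 ** (a - b * m)        # expected events/yr with magnitude >= m
    return np.exp(-rate_above_m * T)          # P(no such event in T years)

for m in (6.0, 7.0, 8.0):
    print(f"P(max magnitude over {T:.0f} yr <= {m}) = {cdf_max_magnitude(m):.3f}")
```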
Probability and Relative Frequency
Drieschner, Michael
2016-01-01
The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the 'prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
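As a concrete example of the classical scoring rules this paper moves beyond, the Brier score of a set of probability forecasts is just the mean squared difference between forecast probabilities and observed outcomes:

```python
# Hedged sketch: the Brier score, a standard scoring rule for probability forecasts.
import numpy as np

forecasts = np.array([0.9, 0.7, 0.2, 0.5, 0.1])   # predicted event probabilities
outcomes  = np.array([1,   1,   0,   0,   0  ])   # observed occurrences
brier = np.mean((forecasts - outcomes) ** 2)
print("Brier score:", brier)                      # lower is better; 0 is perfect
```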
Maximum likelihood for genome phylogeny on gene content.
Zhang, Hongmei; Gu, Xun
2004-01-01
With the rapid growth of entire genome data, reconstructing the phylogenetic relationships among different genomes has become a hot topic in comparative genomics. The maximum likelihood approach is one of the various approaches and has been very successful. However, there has been no reported application to genome tree-making, mainly due to the lack of an analytical form of a probability model and/or the complicated calculation burden. In this paper we studied the mathematical structure of the stochastic model of genome evolution, and then developed a simplified likelihood function for observing a specific phylogenetic pattern in a four-genome situation using gene content information. We use the maximum likelihood approach to identify phylogenetic trees. Simulation results indicate that the proposed method works well and can identify trees with a high correction rate. Application to real data provides satisfactory results. The approach developed in this paper can serve as the basis for reconstructing phylogenies of more than four genomes.
Maximum information entropy: a foundation for ecological theory.
Harte, John; Newman, Erica A
2014-07-01
The maximum information entropy (MaxEnt) principle is a successful method of statistical inference that has recently been applied to ecology. Here, we show how MaxEnt can accurately predict patterns such as species-area relationships (SARs) and abundance distributions in macroecology and be a foundation for ecological theory. We discuss the conceptual foundation of the principle, why it often produces accurate predictions of probability distributions in science despite not incorporating explicit mechanisms, and how mismatches between predictions and data can shed light on driving mechanisms in ecology. We also review possible future extensions of the maximum entropy theory of ecology (METE), a potentially important foundation for future developments in ecological theory.
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Maximum Entropy in Drug Discovery
Chih-Yuan Tseng
2014-07-01
Full Text Available Drug discovery applies multidisciplinary approaches either experimentally, computationally or both ways to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes' pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2013-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob
Dynamical Simulation of Probabilities
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Some New Results on Transition Probability
Yu Quan XIE
2008-01-01
In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.
Exact computation of the maximum-entropy potential of spiking neural-network models.
Cofré, R; Cessac, B
2014-05-01
Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The maximum-entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuromimetic models) provide a probabilistic mapping between the stimulus, network architecture, and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuromimetic and maximum-entropy models.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...
CORA - emission line fitting with Maximum Likelihood
Ness, J.-U.; Wichmann, R.
2002-07-01
The advent of pipeline-processed data both from space- and ground-based observatories often obviates the need for full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.
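As a concrete illustration of the Poisson-likelihood approach described above, the sketch below fits a Gaussian emission line plus a flat background to a simulated low-count spectrum by minimizing the negative Poisson log-likelihood. The model form, parameter values and the use of scipy's Nelder-Mead optimizer are illustrative assumptions, not CORA's actual implementation (which, per the abstract, derives a fixed-point equation for the line flux).

```python
# Minimal sketch of Poisson maximum-likelihood emission-line fitting:
# Gaussian line plus flat background, counts per wavelength bin (assumed model).
import numpy as np
from scipy.optimize import minimize

def model_counts(wave, flux, center, sigma, background):
    """Expected counts per bin: Gaussian line of given total flux plus flat background."""
    line = flux * np.exp(-0.5 * ((wave - center) / sigma) ** 2)
    line /= sigma * np.sqrt(2.0 * np.pi)
    return line * (wave[1] - wave[0]) + background

def neg_log_likelihood(params, wave, counts):
    """Negative Poisson log-likelihood (up to the constant log(counts!))."""
    flux, center, sigma, background = params
    mu = model_counts(wave, flux, center, sigma, background)
    mu = np.clip(mu, 1e-12, None)            # keep the logarithm well defined
    return np.sum(mu - counts * np.log(mu))

# Simulated low-count spectrum around a hypothetical 13.5 A line
rng = np.random.default_rng(1)
wave = np.linspace(13.3, 13.7, 200)
truth = model_counts(wave, flux=40.0, center=13.5, sigma=0.01, background=0.2)
counts = rng.poisson(truth)

fit = minimize(neg_log_likelihood, x0=[30.0, 13.49, 0.02, 0.1],
               args=(wave, counts), method="Nelder-Mead")
print("fitted flux, center, sigma, background:", fit.x)
```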
(*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Maximum Autocorrelation Factorial Kriging
Nielsen, Allan Aasbjerg; Conradsen, Knut; Pedersen, John L.; Steenfelt, Agnete
2000-01-01
This paper describes maximum autocorrelation factor (MAF) analysis, maximum autocorrelation factorial kriging, and its application to irregularly sampled stream sediment geochemical data from South Greenland. Kriged MAF images are compared with kriged images of varimax rotated factors from an ordinary non-spatial factor analysis, and they are interpreted in a geological context. It is demonstrated that MAF analysis contrary to ordinary non-spatial factor analysis gives an objective discrimina...
A Thermodynamical Approach for Probability Estimation
Isozaki, Takashi
2012-01-01
The issue of discrete probability estimation for samples of small size is addressed in this study. The maximum likelihood method often suffers over-fitting when insufficient data is available. Although the Bayesian approach can avoid over-fitting by using prior distributions, it still has problems with objective analysis. In response to these drawbacks, a new theoretical framework based on thermodynamics, where energy and temperature are introduced, was developed. Entropy and likelihood are placed at the center of this method. The key principle of inference for probability mass functions is the minimum free energy, which is shown to unify the two principles of maximum likelihood and maximum entropy. Our method can robustly estimate probability functions from small size data.
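A minimal sketch of the kind of trade-off described (not the paper's exact formulation): estimate a discrete probability mass function from a handful of counts by minimizing a free-energy-like objective, negative log-likelihood minus temperature times entropy, so that T = 0 recovers maximum likelihood and larger T pulls the estimate toward the maximum-entropy (uniform) distribution. The counts and temperatures below are made up.

```python
# Free-energy-style pmf estimation sketch: F(p) = -sum_k n_k log p_k - T * H(p)
import numpy as np
from scipy.optimize import minimize

def free_energy(theta, counts, temperature):
    p = np.exp(theta - theta.max())
    p /= p.sum()                               # softmax keeps p on the probability simplex
    nll = -np.sum(counts * np.log(p + 1e-12))  # "energy": negative log-likelihood
    entropy = -np.sum(p * np.log(p + 1e-12))
    return nll - temperature * entropy

counts = np.array([5.0, 2.0, 1.0, 0.0])        # small-sample counts with one empty cell
for T in (0.0, 1.0, 5.0):
    res = minimize(free_energy, x0=np.zeros(len(counts)), args=(counts, T))
    p = np.exp(res.x - res.x.max())
    p /= p.sum()
    print(f"T = {T}: p = {np.round(p, 3)}")
```

At T = 0 the empty cell is driven towards zero probability (over-fitting), while a positive temperature keeps it strictly positive, which is the robustness effect the abstract alludes to.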
On Quantum Conditional Probability
Isabel Guerra Bobo
2013-02-01
Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
Choice Probability Generating Functions
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
Probability, Nondeterminism and Concurrency
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Exact Probability Distribution versus Entropy
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
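The central quantity is easy to reproduce at toy scale. The sketch below builds a first-order (independent-letter) word model with made-up letter probabilities, guesses words in decreasing order of probability, and compares the expected number of guesses with the word entropy.

```python
# Expected number of guesses when words are guessed in decreasing order of probability.
import itertools
import numpy as np

letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # illustrative letter probabilities
word_length = 4

words = itertools.product(letter_probs, repeat=word_length)
p = np.array([np.prod([letter_probs[ch] for ch in w]) for w in words])
p = np.sort(p)[::-1]                            # guess the most probable words first

ranks = np.arange(1, len(p) + 1)
expected_guesses = np.sum(ranks * p)            # E[number of guesses]
entropy_bits = -np.sum(p * np.log2(p))          # word entropy, for comparison

print(f"{len(p)} words, E[guesses] = {expected_guesses:.2f}, "
      f"entropy = {entropy_bits:.2f} bits (2^H = {2**entropy_bits:.1f})")
```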
Integrated statistical modelling of spatial landslide probability
Mergili, M.; Chu, H.-J.
2015-09-01
Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
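Step (v) reduces to a simple per-pixel combination rule, sketched below with made-up raster values; the statistical models behind the three input probabilities are not reproduced here.

```python
# Per-pixel combination rule from step (v), on a tiny illustrative raster.
import numpy as np

release_prob = np.array([[0.05, 0.20],          # pixel-based landslide release probability
                         [0.10, 0.02]])
impact_prob = np.array([[0.60, 0.10],           # probability of being hit from upslope releases
                        [0.30, 0.50]])
zonal_release_prob = np.array([[0.40, 0.40],    # P(at least one release pixel within the zone)
                               [0.40, 0.40]])

integrated = np.maximum(release_prob, impact_prob * zonal_release_prob)
print(integrated)
```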
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Maximum likely scale estimation
Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo
2005-01-01
A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or ...
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
THE NUCLEAR ENCOUNTER PROBABILITY
SMULDERS, PJM
1994-01-01
This Letter discusses the nuclear encounter probability as used in ion channeling analysis. A formulation is given, incorporating effects of large beam angles and beam divergence. A critical examination of previous definitions is made.
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Probability in quantum mechanics
J. G. Gilson
1982-01-01
Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Sekiguchi, Yuki; Hashimoto, Saki; Kobayashi, Amane; Oroguchi, Tomotaka; Nakasako, Masayoshi
2017-09-01
Coherent X-ray diffraction imaging (CXDI) is a technique for visualizing the structures of non-crystalline particles with size in the submicrometer to micrometer range in material sciences and biology. In the structural analysis of CXDI, the electron density map of a specimen particle projected along the direction of the incident X-rays can be reconstructed only from the diffraction pattern by using phase-retrieval (PR) algorithms. However, in practice, the reconstruction, relying entirely on the computational procedure, sometimes fails because diffraction patterns miss the data in small-angle regions owing to the beam stop and saturation of the detector pixels, and are modified by Poisson noise in X-ray detection. To date, X-ray free-electron lasers have allowed us to collect a large number of diffraction patterns within a short period of time. Therefore, the reconstruction of correct electron density maps is the bottleneck for efficiently conducting structure analyses of non-crystalline particles. To automatically address the correctness of retrieved electron density maps, a data analysis protocol to extract the most probable electron density maps from a set of maps retrieved from 1000 different random seeds for a single diffraction pattern is proposed. Through monitoring the variations of the phase values during PR calculations, the tendency for the PR calculations to succeed when the retrieved phase sets converged on a certain value was found. On the other hand, if the phase set was in persistent variation, the PR calculation tended to fail to yield the correct electron density map. To quantify this tendency, here a figure of merit for the variation of the phase values during PR calculation is introduced. In addition, a PR protocol to evaluate the similarity between a map of the highest figure of merit and other independently reconstructed maps is proposed. The protocol is implemented and practically examined in the structure analyses for diffraction patterns
Maximum information photoelectron metrology
Hockett, P; Wollenhaupt, M; Baumert, T
2015-01-01
Photoelectron interferograms, manifested in photoelectron angular distributions (PADs), are a high-information, coherent observable. In order to obtain the maximum information from angle-resolved photoionization experiments it is desirable to record the full, 3D, photoelectron momentum distribution. Here we apply tomographic reconstruction techniques to obtain such 3D distributions from multiphoton ionization of potassium atoms, and fully analyse the energy and angular content of the 3D data. The PADs obtained as a function of energy indicate good agreement with previous 2D data and detailed analysis [Hockett et. al., Phys. Rev. Lett. 112, 223001 (2014)] over the main spectral features, but also indicate unexpected symmetry-breaking in certain regions of momentum space, thus revealing additional continuum interferences which cannot otherwise be observed. These observations reflect the presence of additional ionization pathways and, most generally, illustrate the power of maximum information measurements of th...
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K
2012-04-05
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.
Zurek, W H
2004-01-01
I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, p_k ∝ |ψ_k|². Probabilities derived in this manner are an objective reflection of the underlying state of the system: they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
Collision Probability Analysis
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds … probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
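For the best-known special case, the multinomial logit model, the CPGF is the log-sum-exp of the utilities and its gradient returns the logit choice probabilities; the short numerical check below illustrates this property (the utility values are arbitrary).

```python
# Gradient of the logit "surplus" function (log-sum-exp) equals the softmax choice probabilities.
import numpy as np

def cpgf_logit(v):
    """Log-sum-exp surplus function for the multinomial logit model."""
    m = v.max()
    return m + np.log(np.sum(np.exp(v - m)))

def choice_probs(v):
    """Logit choice probabilities (softmax of the utilities)."""
    e = np.exp(v - v.max())
    return e / e.sum()

v = np.array([1.0, 0.2, -0.5])
eps = 1e-6
numeric_grad = np.array([
    (cpgf_logit(v + eps * np.eye(3)[j]) - cpgf_logit(v - eps * np.eye(3)[j])) / (2 * eps)
    for j in range(3)
])
print("gradient of CPGF:   ", np.round(numeric_grad, 6))
print("logit probabilities:", np.round(choice_probs(v), 6))
```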
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Maximum Likelihood Associative Memories
Gripon, Vincent; Rabbat, Michael
2013-01-01
Associative memories are structures that store data in such a way that it can later be retrieved given only a part of its content -- a sort-of error/erasure-resilience property. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. We derive minimum residual error rates when the data stored comes from a uniform binary source. Second, we determine the minimum amo...
Maximum likely scale estimation
Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo
2005-01-01
A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or having different derivative orders. Although the principle is applicable to a wide variety of image models, the main focus here is on the Brownian model and its use for scale selection in natural images. Furthermore, in the examples provided, the simplifying assumption is made that the behavior of the measurements is completely characterized by all moments up to second order.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Varga, Tamas
This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…
Collision Probability Analysis
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Frič, Roman; Papčo, Martin
2010-12-01
Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
F. Topsøe
2001-09-01
Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
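For the Mean Energy Model mentioned above, the entropy-maximizing distribution under a mean constraint is the Gibbs/exponential form p_i ∝ exp(-β E_i), with β chosen so that the constraint holds. A minimal numerical sketch follows; the energies and target mean are made up.

```python
# Maximum entropy on a finite set subject to a mean-"energy" constraint.
import numpy as np
from scipy.optimize import brentq

energies = np.array([0.0, 1.0, 2.0, 4.0])       # illustrative energy levels
target_mean = 1.2                               # illustrative moment constraint

def gibbs(beta):
    w = np.exp(-beta * energies)
    return w / w.sum()

def mean_gap(beta):
    return gibbs(beta) @ energies - target_mean

beta = brentq(mean_gap, -50.0, 50.0)            # root of the moment constraint
p = gibbs(beta)
entropy = -np.sum(p * np.log(p))
print(f"beta = {beta:.4f}, p = {np.round(p, 4)}, H(p) = {entropy:.4f} nats")
```

Root-finding with brentq works here because the mean of the Gibbs distribution is monotone in β, so the constraint has a single solution within the bracket.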
Encounter Probability of Individual Wave Height
Liu, Z.; Burcharth, H. F.
1998-01-01
wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
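The Rayleigh step of the argument is easy to sketch: given a significant wave height Hs and N individual waves, P(H > h) = exp(-2(h/Hs)²) yields both the exceedance probability of the largest individual wave and its characteristic value. The numbers below are illustrative, not taken from the paper.

```python
# Exceedance probability of the largest of N Rayleigh-distributed individual wave heights.
import numpy as np

Hs = 8.0            # design significant wave height [m] (illustrative)
N = 1.0e6           # number of individual waves within the structure lifetime (illustrative)

def exceedance_of_maximum(h, Hs, N):
    """P(largest of N Rayleigh-distributed individual wave heights exceeds h)."""
    single = np.exp(-2.0 * (h / Hs) ** 2)       # P(H > h) for one wave
    return 1.0 - (1.0 - single) ** N

h_char = Hs * np.sqrt(np.log(N) / 2.0)          # characteristic largest wave height
print(f"characteristic H_max = {h_char:.1f} m")
for h in (2.6 * Hs, 2.8 * Hs, 3.0 * Hs):
    print(f"P(H_max > {h:.1f} m) = {exceedance_of_maximum(h, Hs, N):.4f}")
```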
Survival probability in patients with liver trauma.
Buci, Skender; Kukeli, Agim
2016-08-01
Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
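The reported probabilities come from a fitted logistic model; the sketch below shows how such a model maps patient covariates to a survival probability. The covariates and coefficients are hypothetical placeholders, not the estimates from the study.

```python
# Turning a fitted logistic model into a predicted survival probability.
import numpy as np

def survival_probability(x, coef, intercept):
    """Logistic model: P(survive) = 1 / (1 + exp(-(intercept + coef . x)))."""
    return 1.0 / (1.0 + np.exp(-(intercept + np.dot(coef, x))))

# hypothetical covariates: [injury grade, other organs injured (0/1),
#                           days hospitalized, operative treatment (0/1)]
coef = np.array([-0.9, -0.7, 0.05, -0.4])       # made-up coefficients
intercept = 4.0                                 # made-up intercept

patient = np.array([3.0, 1.0, 10.0, 1.0])
print(f"predicted survival probability: {survival_probability(patient, coef, intercept):.3f}")
```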
Regularized maximum correntropy machine
Wang, Jim Jing-Yan
2015-02-12
In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing transitional loss functions are sensitive to the noisy and outlying labels of training samples, because the transitional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with transitional loss functions.
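A minimal sketch of the underlying idea (not the paper's algorithm): fit a linear predictor by gradient ascent on a correntropy objective with an L2 penalty, so that samples with large errors, such as flipped labels, receive exponentially small influence. The data, kernel width and step size below are made up.

```python
# Regularized correntropy objective: J(w) = sum_i exp(-(y_i - w.x_i)^2 / (2 s^2)) - lam * ||w||^2
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)
y[:20] = -y[:20]                                 # flip some targets to mimic label noise

sigma, lam, lr = 1.0, 0.1, 0.01
w = np.zeros(d)
for _ in range(500):
    err = y - X @ w
    weight = np.exp(-err**2 / (2 * sigma**2))    # per-sample correntropy weight
    grad = (weight * err / sigma**2) @ X - 2 * lam * w
    w += lr * grad                               # gradient ascent on J(w)

print("estimated w:", np.round(w, 3), " true w:", w_true)
```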
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Superpositions of probability distributions
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
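A standard example of such a superposition, useful as a sanity check though not addressing the paper's semigroup question, is the Student-t law obtained by smearing the Gaussian variance v with an inverse-gamma density; the Monte-Carlo/Kolmogorov-Smirnov check below illustrates it.

```python
# Student-t as a variance mixture of Gaussians: v ~ InvGamma(nu/2, scale=nu/2), x|v ~ N(0, v).
import numpy as np
from scipy import stats

nu = 4.0                                         # degrees of freedom of the target t-law
rng = np.random.default_rng(0)

v = stats.invgamma(a=nu / 2.0, scale=nu / 2.0).rvs(size=200_000, random_state=rng)
x = rng.normal(scale=np.sqrt(v))                 # draw each sample from N(0, v)

ks = stats.kstest(x, "t", args=(nu,))
print(f"KS statistic vs Student-t({nu:g}): {ks.statistic:.4f} (p = {ks.pvalue:.2f})")
```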
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Eliazar, Iddo; Klafter, Joseph
2008-06-01
We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
A Partially Annotated Bibliography on Prediction of Parole Success and Delinquency
The bibliography was undertaken to review studies that would be useful in estimating the probability of an offender's post-release success. Offenders ... correctional treatment and returning a maximum number of offenders to duty with the potential for successful adjustment to Army life. In order to attain this end, offenders who will not or cannot respond to correctional treatment need to be identified and separately processed. Effective correctional
Equalized near maximum likelihood detector
2012-01-01
This paper presents a new detector that is used to mitigate intersymbol interference introduced by bandlimited channels. This detector, named the equalized near maximum likelihood detector, combines a nonlinear equalizer and a near maximum likelihood detector. Simulation results show that the performance of the equalized near maximum likelihood detector is better than that of the nonlinear equalizer but worse than that of the near maximum likelihood detector.
The constraint rule of the maximum entropy principle
Uffink, J.
2001-01-01
The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distribut
Exact probability distribution functions for Parrondo's games
Zadourian, Rubina; Saakian, David B.; Klümper, Andreas
2016-12-01
We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
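The paper computes the capital distribution exactly with Fourier methods; the Monte-Carlo sketch below only illustrates the setting, simulating the capital-dependent games A and B and their random mixture with the standard textbook parameters (game A is a slightly losing coin, game B uses a bad coin when the capital is divisible by 3 and a good coin otherwise).

```python
# Monte-Carlo simulation of the capital-dependent Parrondo games.
import numpy as np

eps = 0.005
rng = np.random.default_rng(42)

def final_capitals(game, n_rounds=500, n_runs=50_000):
    """Capital after n_rounds for game 'A', 'B' or the random mixture 'AB'."""
    caps = np.zeros(n_runs, dtype=int)
    for _ in range(n_rounds):
        if game == "AB":
            play_b = rng.random(n_runs) < 0.5        # pick A or B at random each round
        else:
            play_b = np.full(n_runs, game == "B")
        p_win = np.where(play_b,
                         np.where(caps % 3 == 0, 0.10 - eps, 0.75 - eps),
                         0.50 - eps)
        caps += np.where(rng.random(n_runs) < p_win, 1, -1)
    return caps

for game in ("A", "B", "AB"):
    caps = final_capitals(game)
    print(f"game {game:>2}: mean capital = {caps.mean():+6.2f}, "
          f"P(capital > 0) = {np.mean(caps > 0):.3f}")
```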
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, given the same probability of false alarm and the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability in domains other than data management, we conjecture that a system that can implement subspace-based detectors will be more effective than a system that implements set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.
Combining Experiments and Simulations Using the Maximum Entropy Principle
Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten
2014-01-01
are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy...
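The following sketch illustrates the general idea of maximum-entropy reweighting of a simulated ensemble so that its average matches an experimental value; the observable and target are invented for illustration, and this is not the authors' implementation:

```python
import numpy as np

def maxent_reweight(obs, target, lam_bounds=(-50.0, 50.0), tol=1e-10):
    """Find weights w_i proportional to exp(lam * obs_i) such that the
    weighted mean of obs equals target; lam is found by bisection."""
    def weighted_mean(lam):
        t = lam * obs
        w = np.exp(t - t.max())   # shift for numerical stability
        w /= w.sum()
        return w, float(np.dot(w, obs))

    lo, hi = lam_bounds
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        w, m = weighted_mean(mid)
        if abs(m - target) < tol:
            break
        # the weighted mean is increasing in lam, so bisect accordingly
        if m < target:
            lo = mid
        else:
            hi = mid
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    simulated = rng.normal(1.0, 0.5, size=5000)   # hypothetical simulated observable
    weights = maxent_reweight(simulated, target=1.2)
    print("reweighted mean:", np.dot(weights, simulated))
```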
Distribution of maximum loss of fractional Brownian motion with drift
Çağlar, Mine; Vardar-Acar, Ceren
2013-01-01
In this paper, we find bounds on the distribution of the maximum loss of fractional Brownian motion with Hurst index H ≥ 1/2 and derive estimates on its tail probability. Asymptotically, the tail of the distribution of maximum loss over [0, t] behaves like the tail of the marginal distribution at time t.
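For orientation, the maximum loss (maximum drawdown) studied in results of this kind is usually defined as below; the notation is generic rather than the paper's own:

```latex
% Maximum loss (drawdown) of X over [0, t]:
M(t) \;=\; \sup_{0 \le u \le v \le t}\bigl(X_u - X_v\bigr),
\qquad X_s = B_H(s) + \mu s \;\;\text{(fBm with drift, Hurst index } H\text{)} .
```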
Whiting, Alan B
2014-01-01
Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.
Probably Almost Bayes Decisions
Anoulova, S.; Fischer, Paul; Poelt, S.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Generalized Probability Functions
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetric hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
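A minimal numeric sketch of one standard one-parameter family of this kind (a generalized logarithm and its inverse that reduce to ln and exp as the parameter tends to zero) is shown below; the exact parametrization is assumed for illustration and may differ in detail from the paper's:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm: (x**q - 1)/q, -> ln(x) as q -> 0."""
    if abs(q) < 1e-12:
        return math.log(x)
    return (x**q - 1.0) / q

def gen_exp(y, q):
    """Inverse of gen_log: (1 + q*y)**(1/q), -> exp(y) as q -> 0."""
    if abs(q) < 1e-12:
        return math.exp(y)
    return (1.0 + q * y) ** (1.0 / q)

if __name__ == "__main__":
    for q in (0.5, 0.1, 1e-13):
        x = 2.0
        print(f"q={q:g}: gen_log={gen_log(x, q):.6f}  ln={math.log(x):.6f}  "
              f"round-trip={gen_exp(gen_log(x, q), q):.6f}")
```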
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Probabilities for Solar Siblings
Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.
2015-02-01
We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.
Emptiness Formation Probability
Crawford, Nicholas; Ng, Stephen; Starr, Shannon
2016-08-01
We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.
Learning unbelievable marginal probabilities
Pitkow, Xaq; Miller, Ken D
2011-01-01
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...
Maximizing the Detection Probability of Kilonovae Associated with Gravitational Wave Observations
Chan, Man Leong; Hu, Yi-Ming; Messenger, Chris; Hendry, Martin; Heng, Ik Siong
2017-01-01
Estimates of the source sky location for gravitational wave signals are likely to span areas of up to hundreds of square degrees or more, making it very challenging for most telescopes to search for counterpart signals in the electromagnetic spectrum. To boost the chance of successfully observing such counterparts, we have developed an algorithm that optimizes the number of observing fields and their corresponding time allocations by maximizing the detection probability. As a proof-of-concept demonstration, we optimize follow-up observations targeting kilonovae using telescopes including the CTIO-Dark Energy Camera, Subaru-HyperSuprimeCam, Pan-STARRS, and the Palomar Transient Factory. We consider three simulated gravitational wave events with 90% credible error regions spanning areas from ∼30 deg² to ∼300 deg². Assuming a source at 200 Mpc, we demonstrate that to obtain a maximum detection probability, there is an optimized number of fields for any particular event that a telescope should observe. To inform future telescope design studies, we present the maximum detection probability and corresponding number of observing fields for a combination of limiting magnitudes and fields of view over a range of parameters. We show that for large gravitational wave error regions, telescope sensitivity rather than field of view is the dominating factor in maximizing the detection probability.
Maximum phytoplankton concentrations in the sea
Jackson, G.A.; Kiørboe, Thomas
2008-01-01
A simplification of plankton dynamics using coagulation theory provides predictions of the maximum algal concentration sustainable in aquatic systems. These predictions have previously been tested successfully against results from iron fertilization experiments. We extend the test to data collected in the North Atlantic as part of the Bermuda Atlantic Time Series program as well as data collected off Southern California as part of the Southern California Bight Study program. The observed maximum particulate organic carbon and volumetric particle concentrations are consistent with the predictions...
People's conditional probability judgments follow probability theory (plus noise).
Costello, Fintan; Watts, Paul
2016-09-01
A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
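One elementary identity of the kind such experiments can probe, written so that symmetric noise contributions cancel in expectation, is the addition law of probability:

```latex
P(A) + P(B) - P(A \wedge B) - P(A \vee B) \;=\; 0 .
```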
Grandy, W. T., Jr.; Milonni, P. W.
2004-12-01
Preface; 1. Recollection of an independent thinker Joel A. Snow; 2. A look back: early applications of maximum entropy estimation to quantum statistical mechanics D. J. Scalapino; 3. The Jaynes-Cummings revival B. W. Shore and P. L. Knight; 4. The Jaynes-Cummings model and the one-atom maser H. Walther; 5. The Jaynes-Cummings model is alive and well P. Meystre; 6. Self-consistent radiation reaction in quantum optics - Jaynes' influence and a new example in cavity QED J. H. Eberly; 7. Enhancing the index of refraction in a nonabsorbing medium: phaseonium versus a mixture of two-level atoms M. O. Scully, T. W. Hänsch, M. Fleischhauer, C. H. Keitel and Shi-Yao Zhu; 8. Ed Jaynes' steak dinner problem II Michael D. Crisp; 9. Source theory of vacuum field effects Peter W. Milonni; 10. The natural line shape Edwin A. Power; 11. An operational approach to Schrödinger's cat L. Mandel; 12. The classical limit of an atom C. R. Stroud, Jr.; 13. Mutual radiation reaction in spontaneous emission Richard J. Cook; 14. A model of neutron star dynamics F. W. Cummings; 15. The kinematic origin of complex wave function David Hestenes; 16. On radar target identification C. Ray Smith; 17. On the difference in means G. Larry Bretthorst; 18. Bayesian analysis, model selection and prediction Arnold Zellner and Chung-ki Min; 19. Bayesian numerical analysis John Skilling; 20. Quantum statistical inference R. N. Silver; 21. Application of the maximum entropy principle to nonlinear systems far from equilibrium H. Haken; 22. Nonequilibrium statistical mechanics Baldwin Robertson; 23. A backward look to the future E. T. Jaynes; Appendix. Vita and bibliography of Edwin T. Jaynes; Index.
Savage s Concept of Probability
熊卫
2003-01-01
Starting with personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative ones. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...
Probability Theory without Bayes' Rule
Rodriques, Samuel G.
2014-01-01
Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...
Maximum entropy production and the fluctuation theorem
Dewar, R C [Unite EPHYSE, INRA Centre de Bordeaux-Aquitaine, BP 81, 33883 Villenave d' Ornon Cedex (France)
2005-05-27
Recently the author used an information theoretical formulation of non-equilibrium statistical mechanics (MaxEnt) to derive the fluctuation theorem (FT) concerning the probability of second law violating phase-space paths. A less rigorous argument leading to the variational principle of maximum entropy production (MEP) was also given. Here a more rigorous and general mathematical derivation of MEP from MaxEnt is presented, and the relationship between MEP and the FT is thereby clarified. Specifically, it is shown that the FT allows a general orthogonality property of maximum information entropy to be extended to entropy production itself, from which MEP then follows. The new derivation highlights MEP and the FT as generic properties of MaxEnt probability distributions involving anti-symmetric constraints, independently of any physical interpretation. Physically, MEP applies to the entropy production of those macroscopic fluxes that are free to vary under the imposed constraints, and corresponds to selection of the most probable macroscopic flux configuration. In special cases MaxEnt also leads to various upper bound transport principles. The relationship between MaxEnt and previous theories of irreversible processes due to Onsager, Prigogine and Ziegler is also clarified in the light of these results. (letter to the editor)
Maximum Parsimony and the Skewness Test: A Simulation Study of the Limits of Applicability
Määttä, Jussi; Roos, Teemu
2016-01-01
The maximum parsimony (MP) method for inferring phylogenies is widely used, but little is known about its limitations in non-asymptotic situations. This study employs large-scale computations with simulated phylogenetic data to estimate the probability that MP succeeds in finding the true phylogeny for up to twelve taxa and 256 characters. The set of candidate phylogenies are taken to be unrooted binary trees; for each simulated data set, the tree lengths of all (2n − 5)!! candidates are computed to evaluate quantities related to the performance of MP, such as the probability of finding the true phylogeny, the probability that the tree with the shortest length is unique, the probability that the true phylogeny has the shortest tree length, and the expected inverse of the number of trees sharing the shortest length. The tree length distributions are also used to evaluate and extend the skewness test of Hillis for distinguishing between random and phylogenetic data. The results indicate, for example, that the critical point after which MP achieves a success probability of at least 0.9 is roughly around 128 characters. The skewness test is found to perform well on simulated data and the study extends its scope to up to twelve taxa. PMID:27035667
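For scale, the number of candidate unrooted binary tree topologies evaluated for n taxa is the double factorial (2n − 5)!!; a short sketch of that count (illustrative only) is:

```python
def num_unrooted_binary_trees(n_taxa):
    """(2n - 5)!! = 1 * 3 * 5 * ... * (2n - 5), the number of unrooted
    binary tree topologies on n_taxa labelled leaves (n_taxa >= 3)."""
    count = 1
    for k in range(3, 2 * n_taxa - 4, 2):
        count *= k
    return count

if __name__ == "__main__":
    for n in range(4, 13):
        print(n, num_unrooted_binary_trees(n))
```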
Probability landscapes for integrative genomics
Benecke Arndt
2008-05-01
Full Text Available Abstract Background The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches as well as the prediction tools are only starting to become available and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decade(s long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results We introduce here a concept for such a framework, based entirely on systematic annotation in terms of probability profiles of genomic sequence using any type of relevant experimental and theoretical information and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
Probability distributions for magnetotellurics
Stodt, John A.
1982-11-01
Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
RANDOM VARIABLE WITH FUZZY PROBABILITY
吕恩琳; 钟佑明
2003-01-01
A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a fuzzy-number probability set was given; going a step further, the definition and characteristics of a random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, with the closure operation of fuzzy probability, was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. All the mathematical descriptions of the RVFP are closed under the fuzzy probability operations; as a result, a foundation is laid for perfecting fuzzy probability operation methods.
Maximum entropy principle and texture formation
Arminjon, M; Arminjon, Mayeul; Imbault, Didier
2006-01-01
The macro-to-micro transition in a heterogeneous material is envisaged as the selection of a probability distribution by the Principle of Maximum Entropy (MAXENT). The material is made of constituents, e.g. given crystal orientations. Each constituent is itself made of a large number of elementary constituents. The relevant probability is the volume fraction of the elementary constituents that belong to a given constituent and undergo a given stimulus. Assuming only obvious constraints in MAXENT means describing a maximally disordered material. This is proved to have the same average stimulus in each constituent. By adding a constraint in MAXENT, a new model, potentially interesting e.g. for texture prediction, is obtained.
Yang, Hui; Zhu, Xiaoxu; Bai, Wei; Zhao, Yongli; Zhang, Jie; Liu, Zhu; Zhou, Ziguan; Ou, Qinghai
2016-09-01
Virtualization is considered to be a promising solution to support various emerging applications. This paper illustrates the problem of virtual mapping from a new perspective, and mainly focuses on survivable mapping of virtual networks and the potential trade-off between spectral resource usage effectiveness and failure resilience level. We design an optimum shared protection mapping (OSPM) scheme in elastic optical networks. A differentiable maximum shared capacity of each frequency slot is defined to share protection resources more efficiently. In order to satisfy various assessment standards, a metric called ambiguity similitude is defined for the first time to give insight into the optimization difficulty. Simulation results are presented to compare the outcome of the novel OSPM algorithm with traditional dedicated link protection and maximum shared protection mapping. Overall, OSPM outperforms the other two schemes, striking a balance among blocking probability, resource utilization, protection success rate, and spectrum redundancy.
Regions of constrained maximum likelihood parameter identifiability
Lee, C.-H.; Herget, C. J.
1975-01-01
This paper considers the parameter identification problem of general discrete-time, nonlinear, multiple-input/multiple-output dynamic systems with Gaussian-white distributed measurement errors. Knowledge of the system parameterization is assumed to be known. Regions of constrained maximum likelihood (CML) parameter identifiability are established. A computation procedure employing interval arithmetic is proposed for finding explicit regions of parameter identifiability for the case of linear systems. It is shown that if the vector of true parameters is locally CML identifiable, then with probability one, the vector of true parameters is a unique maximal point of the maximum likelihood function in the region of parameter identifiability and the CML estimation sequence will converge to the true parameters.
CORA: Emission Line Fitting with Maximum Likelihood
Ness, Jan-Uwe; Wichmann, Rainer
2011-12-01
CORA analyzes emission line spectra with low count numbers and fits them to a line using the maximum likelihood technique. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise, the software derives the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. CORA has been applied to an X-ray spectrum with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory.
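A schematic of the underlying idea, a Poisson log-likelihood maximized over a line amplitude on binned counts with made-up data, not CORA's actual code, might look like this:

```python
import numpy as np

def poisson_loglike(counts, model):
    """Poisson log-likelihood up to the constant -sum(log(counts!))."""
    model = np.clip(model, 1e-12, None)
    return float(np.sum(counts * np.log(model) - model))

def fit_line_amplitude(wave, counts, background, center, sigma, amps):
    """Grid search over the line amplitude A for model = background + A * Gaussian."""
    profile = np.exp(-0.5 * ((wave - center) / sigma) ** 2)
    lls = [poisson_loglike(counts, background + a * profile) for a in amps]
    best = int(np.argmax(lls))
    return amps[best], lls[best]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    wave = np.linspace(20.0, 22.0, 200)                 # hypothetical wavelength grid
    true_model = 0.5 + 8.0 * np.exp(-0.5 * ((wave - 21.0) / 0.05) ** 2)
    counts = rng.poisson(true_model)                    # simulated low-count spectrum
    amp, ll = fit_line_amplitude(wave, counts, 0.5, 21.0, 0.05,
                                 amps=np.linspace(0.0, 20.0, 401))
    print(f"best-fit amplitude: {amp:.2f} (log-likelihood {ll:.1f})")
```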
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
Conditionals, probability, and belief revision
Voorbraak, F.
1989-01-01
A famous result obtained in the mid-seventies by David Lewis shows that a straightforward interpretation of probabilities of conditionals as conditional probabilities runs into serious trouble. In this paper we try to circumvent this trouble by defining extensions of probability functions, called
Wu Fuxian; Wen Weidong
2016-01-01
The classic maximum entropy quantile function method (CMEQFM), based on probability weighted moments (PWMs), can accurately estimate the quantile function of a random variable from small samples, but not from very small samples. To overcome this weakness, the least square maximum entropy quantile function method (LSMEQFM) and its variant with a constraint condition (LSMEQFMCC) are proposed. To improve the confidence level of quantile function estimation, the scatter factor method is combined with the maximum entropy method to estimate the confidence interval of the quantile function. Comparisons of these methods on two common probability distributions and one engineering application show that CMEQFM can estimate the quantile function accurately on small samples but inaccurately on very small samples (10 samples); LSMEQFM and LSMEQFMCC can be successfully applied to very small samples; with consideration of the constraint condition on the quantile function, LSMEQFMCC is more stable and computationally accurate than LSMEQFM; and the scatter factor confidence interval estimation method based on LSMEQFM or LSMEQFMCC has good estimation accuracy for the confidence interval of the quantile function, with the LSMEQFMCC-based version being the most stable and accurate on very small samples (10 samples).
Calculation Model and Simulation of Warship Damage Probability
TENG Zhao-xin; ZHANG Xu; YANG Shi-xing; ZHU Xiao-ping
2008-01-01
The combat efficiency of a mine obstacle is the focus of the present research. Based on the main factors through which a mine obstacle affects the target warship damage probability, such as the features of mines with maneuverability, the success rate of mine-laying, the hit probability, mine reliability, and action probability, a calculation model of the target warship mine-encounter probability is put forward under the condition that the route selection of target warships follows a uniform distribution and the course of target warships follows a normal distribution. A damage probability model of mines with maneuverability against target warships is then set up, and a simulation study shows the model to be highly practical.
Wolf Attack Probability: A Theoretical Security Measure in Biometric Authentication Systems
Une, Masashi; Otsuka, Akira; Imai, Hideki
This paper will propose a wolf attack probability (WAP) as a new measure for evaluating security of biometric authentication systems. The wolf attack is an attempt to impersonate a victim by feeding “wolves” into the system to be attacked. The “wolf” means an input value which can be falsely accepted as a match with multiple templates. WAP is defined as a maximum success probability of the wolf attack with one wolf sample. In this paper, we give a rigorous definition of the new security measure which gives strength estimation of an individual biometric authentication system against impersonation attacks. We show that if one reestimates using our WAP measure, a typical fingerprint algorithm turns out to be much weaker than theoretically estimated by Ratha et al. Moreover, we apply the wolf attack to a finger-vein-pattern based algorithm. Surprisingly, we show that there exists an extremely strong wolf which falsely matches all templates for any threshold value.
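In schematic form (the notation here is assumed for illustration and is not quoted from the paper), the measure is the best single-sample impersonation success probability over all candidate inputs:

```latex
\mathrm{WAP} \;=\; \max_{x \in \mathcal{X}} \; \Pr_{T}\bigl[\,\mathrm{match}(x, T)\,\bigr],
```

where T denotes a template drawn from the enrolled population and match is the matching decision at the operating threshold.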
Maximising the detection probability of kilonovae associated with gravitational wave observations
Chan, Man Leong; Messenger, Chris; Hendry, Martin; Heng, Ik Siong
2015-01-01
The source sky location estimate for the first detected gravitational wave signals will likely be poor, typically spanning areas more than hundreds of square degrees. It is an enormous task for most telescopes to search such large sky regions for counterpart signals in the electromagnetic spectrum. To maximise the chance of successfully observing the desired counterpart signal, we have developed an algorithm which maximises the detection probability by optimising the number of observing fields, and the time allocation for those fields. As a proof-of-concept demonstration, we use the algorithm to optimise the follow-up observations of the Palomar Transient Factory for a simulated gravitational wave event. We show that the optimal numbers for the Palomar Transient Factory are 24 and 68 for observation times of 1800 s and 5400 s, respectively, with a maximum detection probability of about 65% for a kilonova at 200 Mpc.
The Art of Probability Assignment
Dimitrov, Vesselin I
2012-01-01
The problem of assigning probabilities when little is known is analyzed in the case where the quantities of interest are physical observables, i.e., can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data is available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that the probabilities be assigned such that they are the least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Renyi distance between the original and the shifted distribution.
Probability workshop to be better in probability topic
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level, which means that a higher level of statistics anxiety will not cause a lower score in probability topic performance. The study also revealed that motivated students benefited from the probability workshop, with their performance in the probability topic showing a positive improvement compared with before the workshop. In addition, there exists a significant difference in students' performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Maximum Likelihood Analysis in the PEN Experiment
Lehman, Martin
2013-10-01
The experimental determination of the π+ → e+ ν (γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world average experimental precision of 3.3 × 10^-3 to 5 × 10^-4 using a stopped beam approach. During runs in 2008-10, PEN has acquired over 2 × 10^7 πe2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π+ → e+ ν, π+ → μ+ ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc.). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
Laboratory-tutorial activities for teaching probability
Roger E. Feeley
2006-08-01
Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Maximum Likelihood Under Response Biased Sampling
Chambers, Raymond; Dorfman, Alan; Wang, Suojin
2003-01-01
Informative sampling occurs when the probability of inclusion in sample depends on the value of the survey response variable. Response or size biased sampling is a particular case of informative sampling where the inclusion probability is proportional to the value of this variable. In this paper we describe a general model for response biased sampling, which we call array sampling, and develop maximum likelihood and estimating equation theory appropriate to this situation. The ...
Exponential convergence rates for weighted sums in noncommutative probability space
Choi, Byoung Jin; Ji, Un Cig
2016-11-01
We study exponential convergence rates for weighted sums of successive independent random variables in a noncommutative probability space of which the weights are in a von Neumann algebra. Then we prove a noncommutative extension of the result for the exponential convergence rate by Baum, Katz and Read. As applications, we first study a large deviation type inequality for weighted sums in a noncommutative probability space, and secondly we study exponential convergence rates for weighted free additive convolution sums of probability measures.
Projection of Korean Probable Maximum Precipitation under Future Climate Change Scenarios
Okjeong Lee
2016-01-01
Full Text Available According to the IPCC Fifth Assessment Report, air temperature and humidity of the future are expected to gradually increase over current levels. In this study, future PMPs are estimated by using future dew point temperature projection data which are obtained from RCM data provided by the Korea Meteorological Administration. First, bias included in future dew point temperature projection data which is provided on a daily basis is corrected through a quantile-mapping method. Next, using a scale-invariance technique, 12-hour duration 100-year return period dew point temperatures which are essential input data for PMP estimation are estimated from bias-corrected future dew point temperature data. After estimating future PMPs, it can be shown that PMPs in all future climate change scenarios (AR5 RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5) are very likely to increase.
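A minimal empirical quantile-mapping sketch of the bias-correction step described above (generic arrays invented for illustration, not the study's data or code) could be:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: map each future model value to the observed
    value at the same empirical quantile of the historical model distribution."""
    model_sorted = np.sort(model_hist)
    # empirical non-exceedance probability of each future value in the model climate
    ranks = np.searchsorted(model_sorted, model_future, side="right") / len(model_sorted)
    ranks = np.clip(ranks, 0.0, 1.0)
    return np.quantile(obs_hist, ranks)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    obs_hist = rng.normal(16.0, 3.0, 5000)       # hypothetical observed dew points
    model_hist = rng.normal(14.5, 2.5, 5000)     # biased historical model output
    model_future = rng.normal(16.0, 2.5, 5000)   # warmer future model output
    corrected = quantile_map(model_hist, obs_hist, model_future)
    print("future mean, raw vs bias-corrected:",
          round(model_future.mean(), 2), round(corrected.mean(), 2))
```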
Hidden Variables or Positive Probabilities?
Rothman, T; Rothman, Tony
2001-01-01
Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...
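For reference, the CHSH inequality referred to here combines correlation functions for two measurement settings on each side; local (positive-probability) models obey the bound of 2, while quantum mechanics can reach 2√2:

```latex
S \;=\; E(a,b) - E(a,b') + E(a',b) + E(a',b'),
\qquad |S| \le 2 \;\;\text{(local models)},
\qquad |S| \le 2\sqrt{2} \;\;\text{(quantum mechanics)} .
```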
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Maximum likelihood pedigree reconstruction using integer linear programming.
Cussens, James; Bartlett, Mark; Jones, Elinor M; Sheehan, Nuala A
2013-01-01
Large population biobanks of unrelated individuals have been highly successful in detecting common genetic variants affecting diseases of public health concern. However, they lack the statistical power to detect more modest gene-gene and gene-environment interaction effects or the effects of rare variants for which related individuals are ideally required. In reality, most large population studies will undoubtedly contain sets of undeclared relatives, or pedigrees. Although a crude measure of relatedness might sometimes suffice, having a good estimate of the true pedigree would be much more informative if this could be obtained efficiently. Relatives are more likely to share longer haplotypes around disease susceptibility loci and are hence biologically more informative for rare variants than unrelated cases and controls. Distant relatives are arguably more useful for detecting variants with small effects because they are less likely to share masking environmental effects. Moreover, the identification of relatives enables appropriate adjustments of statistical analyses that typically assume unrelatedness. We propose to exploit an integer linear programming optimisation approach to pedigree learning, which is adapted to find valid pedigrees by imposing appropriate constraints. Our method is not restricted to small pedigrees and is guaranteed to return a maximum likelihood pedigree. With additional constraints, we can also search for multiple high-probability pedigrees and thus account for the inherent uncertainty in any particular pedigree reconstruction. The true pedigree is found very quickly by comparison with other methods when all individuals are observed. Extensions to more complex problems seem feasible.
OECD Maximum Residue Limit Calculator
With the goal of harmonizing the calculation of maximum residue limits (MRLs) across the Organisation for Economic Cooperation and Development, the OECD has developed an MRL Calculator. View the calculator.
Understanding Students' Beliefs about Probability.
Konold, Clifford
The concept of probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes the students' alternative frameworks from the viewpoint of constructivism. There are various interpretations of probability through mathematical history: classical, frequentist, and subjectivist interpretation.…
Expected utility with lower probabilities
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Varieties of Belief and Probability
D.J.N. van Eijck (Jan); S. Ghosh; J. Szymanik
2015-01-01
For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for
Landau-Zener Probability Reviewed
Valencia, C
2008-01-01
We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant in the interpretation of the neutrino flux from a supernova explosion.
Probability and Statistics: 5 Questions
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...
A graduate course in probability
Tucker, Howard G
2014-01-01
Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.
Maximum Likelihood Estimation of the Identification Parameters and Its Correction
无
2002-01-01
By taking the subsequence out of the input-output sequence of a system polluted by white noise, an independent observation sequence and its probability density are obtained and then a maximum likelihood estimation of the identification parameters is given. In order to decrease the asymptotic error, a corrector of maximum likelihood (CML) estimation with its recursive algorithm is given. It has been proved that the corrector has smaller asymptotic error than the least square methods. A simulation example shows that the corrector of maximum likelihood estimation is of higher approximating precision to the true parameters than the least square methods.
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Linear Positivity and Virtual Probability
Hartle, J B
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements; 1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and 2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...
Survival probability and ruin probability of a risk model
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of the exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained in terms of some techniques from martingale theory.
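For orientation, the classical Lundberg inequality referred to here bounds the ruin probability ψ(u) for initial surplus u exponentially in the adjustment coefficient R (standard notation, not necessarily the paper's):

```latex
\psi(u) \;\le\; e^{-R u}, \qquad u \ge 0,
```

where R > 0 is the adjustment (Lundberg) coefficient determined by the premium rate and the claim-size distribution.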
The Analysis of Insulation Breakdown Probabilities by the Up-And-Down Method
Vibholm, Svend; Thyregod, Poul
1986-01-01
This paper discusses the assessment of breakdown probability by means of the up-and-down method. The Dixon and Mood approximation to the maximum-likelihood estimate is compared with the exact maximum-likelihood estimate for a number of response patterns. Estimates of the 50% probability breakdown...
无
2005-01-01
People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.
Exploiting the Maximum Entropy Principle to Increase Retrieval Effectiveness.
Cooper, William S.
1983-01-01
Presents an information retrieval design approach in which queries of a computer-based system consist of sets of terms, either unweighted or weighted with subjective term precision estimates, and retrieval outputs are ranked by probability of usefulness estimated by the "maximum entropy principle." Boolean and weighted request systems are discussed.…
On the maximum entropy principle in non-extensive thermostatistics
Naudts, Jan
2004-01-01
It is possible to derive the maximum entropy principle from thermodynamic stability requirements. Using as a starting point the equilibrium probability distribution, currently used in non-extensive thermostatistics, it turns out that the relevant entropy function is Renyi's alpha-entropy, and not Tsallis' entropy.
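For reference, the two entropy functionals being contrasted are, in standard form, the Rényi $\alpha$-entropy $S^{R}_{\alpha}(p) = \frac{1}{1-\alpha}\ln\sum_i p_i^{\alpha}$ and the Tsallis entropy $S^{T}_{q}(p) = \frac{1}{q-1}\big(1-\sum_i p_i^{q}\big)$, both of which reduce to the Shannon entropy $-\sum_i p_i \ln p_i$ in the limit $\alpha, q \to 1$; these definitions are added here only for context.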
Maximum margin Bayesian network classifiers.
Pernkopf, Franz; Wohlmayr, Michael; Tschiatschek, Sebastian
2012-03-01
We present a maximum margin parameter learning algorithm for Bayesian network classifiers using a conjugate gradient (CG) method for optimization. In contrast to previous approaches, we maintain the normalization constraints on the parameters of the Bayesian network during optimization, i.e., the probabilistic interpretation of the model is not lost. This enables us to handle missing features in discriminatively optimized Bayesian networks. In experiments, we compare the classification performance of maximum margin parameter learning to conditional likelihood and maximum likelihood learning approaches. Discriminative parameter learning significantly outperforms generative maximum likelihood estimation for naive Bayes and tree augmented naive Bayes structures on all considered data sets. Furthermore, maximizing the margin dominates the conditional likelihood approach in terms of classification performance in most cases. We provide results for a recently proposed maximum margin optimization approach based on convex relaxation. While the classification results are highly similar, our CG-based optimization is computationally up to orders of magnitude faster. Margin-optimized Bayesian network classifiers achieve classification performance comparable to support vector machines (SVMs) using fewer parameters. Moreover, we show that unanticipated missing feature values during classification can be easily processed by discriminatively optimized Bayesian network classifiers, a case where discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
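A Monte Carlo sketch of the effect described above, with made-up numbers rather than the paper's exact setting: losses are log-normal, parameters are estimated from a small sample, and the control threshold is set at the estimated quantile; averaged over estimation samples, the realized failure frequency exceeds the nominal level.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu, sigma = 0.0, 1.0       # true (unknown to the decision maker) parameters of the log-loss
n_data, eps = 20, 0.01     # sample size and nominal failure probability
n_rep = 50_000

failures = 0
for _ in range(n_rep):
    log_losses = rng.normal(mu, sigma, n_data)
    mu_hat, s_hat = log_losses.mean(), log_losses.std(ddof=1)
    threshold = mu_hat + s_hat * norm.ppf(1 - eps)   # estimated (1 - eps) quantile
    new_loss = rng.normal(mu, sigma)                 # next period's log-loss
    failures += new_loss > threshold

print("nominal failure probability :", eps)
print("realized failure frequency  :", failures / n_rep)   # typically exceeds eps
```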
Maximum Information and Quantum Prediction Algorithms
McElwaine, J N
1997-01-01
This paper describes an algorithm for selecting a consistent set within the consistent histories approach to quantum mechanics and investigates its properties. The algorithm uses a maximum information principle to select from among the consistent sets formed by projections defined by the Schmidt decomposition. The algorithm unconditionally predicts the possible events in closed quantum systems and ascribes probabilities to these events. A simple spin model is described and a complete classification of all exactly consistent sets of histories formed from Schmidt projections in the model is proved. This result is used to show that for this example the algorithm selects a physically realistic set. Other tentative suggestions in the literature for set selection algorithms using ideas from information theory are discussed.
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
Probability Ranking in Vector Spaces
Melucci, Massimo
2011-01-01
The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.
Holographic probabilities in eternal inflation.
Bousso, Raphael
2006-11-10
In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.
Local Causality, Probability and Explanation
Healey, Richard A
2016-01-01
In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Stationary properties of maximum-entropy random walks.
Dixit, Purushottam D
2015-10-01
Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
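A sketch of the unconstrained limiting case, assuming only an undirected graph with adjacency matrix A: the walk that maximizes path entropy has transitions $P_{ij} = A_{ij}\psi_j/(\lambda\psi_i)$, with $(\lambda,\psi)$ the leading eigenpair of A, and stationary distribution $\pi_i \propto \psi_i^2$; the state- and path-dependent constraints treated in the paper modify this construction.

```python
import numpy as np

def maximum_path_entropy_walk(A):
    """Transition matrix and stationary distribution of the maximum
    path entropy random walk on a connected undirected graph."""
    A = np.asarray(A, dtype=float)
    eigvals, eigvecs = np.linalg.eigh(A)
    lam = eigvals[-1]                       # Perron eigenvalue
    psi = np.abs(eigvecs[:, -1])            # Perron eigenvector (non-negative)
    P = A * psi[None, :] / (lam * psi[:, None])
    pi = psi**2 / np.sum(psi**2)            # stationary distribution
    return P, pi

# Small example: a path graph on four nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
P, pi = maximum_path_entropy_walk(A)
print(P.sum(axis=1))   # each row sums to 1
print(pi)              # mass concentrates on the central nodes
```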
Quantifying extrinsic noise in gene expression using the maximum entropy framework.
Dixit, Purushottam D
2013-06-18
We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed.
Diurnal distribution of sunshine probability
Aydinli, S.
1982-01-01
The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a 'sidescene effect' of the clouds, can be calculated. The asymmetric components of the sunshine probability, which depend on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.
Probability representation of classical states
Man'ko, OV; Man'ko; Pilyavets, OV
2005-01-01
Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
The probabilities of unique events.
Sangeet S Khemlani
Full Text Available Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
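A worked instance of the predicted violation, with made-up numbers: if $P(A)=0.8$ and $P(B)=0.4$, splitting the difference gives an estimate of $\tfrac{0.8+0.4}{2}=0.6$ for the conjunction, yet the probability calculus requires $P(A\wedge B)\le\min\{P(A),P(B)\}=0.4$.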
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
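For context, the standard Merton-style quantities behind such an analysis (the authors' exact definitions may differ in detail): with distance to default $d_i$ and asset-return correlation $\rho^{A}_{ij}$, the individual and joint default probabilities are $p_i=\Phi(-d_i)$ and $p_{ij}=\Phi_2(-d_i,-d_j;\rho^{A}_{ij})$, and the implied default correlation is $\rho^{D}_{ij}=\big(p_{ij}-p_i p_j\big)/\sqrt{p_i(1-p_i)\,p_j(1-p_j)}$.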
Joint probabilities and quantum cognition
de Barros, J Acacio
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Three lectures on free probability
2012-01-01
These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.
Stone, Wesley W.; Gilliom, Robert J.; Crawford, Charles G.
2008-01-01
Regression models were developed for predicting annual maximum and selected annual maximum moving-average concentrations of atrazine in streams using the Watershed Regressions for Pesticides (WARP) methodology developed by the National Water-Quality Assessment Program (NAWQA) of the U.S. Geological Survey (USGS). The current effort builds on the original WARP models, which were based on the annual mean and selected percentiles of the annual frequency distribution of atrazine concentrations. Estimates of annual maximum and annual maximum moving-average concentrations for selected durations are needed to characterize the levels of atrazine and other pesticides for comparison to specific water-quality benchmarks for evaluation of potential concerns regarding human health or aquatic life. Separate regression models were derived for the annual maximum and annual maximum 21-day, 60-day, and 90-day moving-average concentrations. Development of the regression models used the same explanatory variables, transformations, model development data, model validation data, and regression methods as those used in the original development of WARP. The models accounted for 72 to 75 percent of the variability in the concentration statistics among the 112 sampling sites used for model development. Predicted concentration statistics from the four models were within a factor of 10 of the observed concentration statistics for most of the model development and validation sites. Overall, performance of the models for the development and validation sites supports the application of the WARP models for predicting annual maximum and selected annual maximum moving-average atrazine concentration in streams and provides a framework to interpret the predictions in terms of uncertainty. For streams with inadequate direct measurements of atrazine concentrations, the WARP model predictions for the annual maximum and the annual maximum moving-average atrazine concentrations can be used to characterize
Unturbe, Jesús; Corominas, Josep
2007-09-01
Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically disperse manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
Greenslade, Thomas B., Jr.
1985-01-01
Discusses a series of experiments performed by Thomas Hope in 1805 which show the temperature at which water has its maximum density. Early data cast into a modern form as well as guidelines and recent data collected from the author provide background for duplicating Hope's experiments in the classroom. (JN)
Abolishing the maximum tension principle
Dabrowski, Mariusz P
2015-01-01
We find the series of example theories for which the relativistic limit of maximum tension $F_{max} = c^4/4G$ represented by the entropic force can be abolished. Among them the varying constants theories, some generalized entropy models applied both for cosmological and black hole horizons as well as some generalized uncertainty principle models.
Abolishing the maximum tension principle
Mariusz P. Da̧browski
2015-09-01
Full Text Available We find the series of example theories for which the relativistic limit of maximum tension F_max = c^4/4G represented by the entropic force can be abolished. Among them the varying constants theories, some generalized entropy models applied both for cosmological and black hole horizons as well as some generalized uncertainty principle models.
Establishment probability in newly founded populations
Gusset Markus
2012-06-01
Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
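A small sketch of the plotting recipe described above, using purely illustrative values of c1 and ω1 in place of output from a stochastic population model: plot –ln(1 – P0(t)) against t, fit a line to the late (linear) part, and read the establishment criterion off the intercept, which equals –ln(c1).

```python
import numpy as np

# Hypothetical extinction probabilities P0(t), generated here directly from
# the relation P0(t) = 1 - c1 * exp(-omega1 * t) with illustrative constants.
t = np.arange(1.0, 51.0)
c1, omega1 = 0.7, 0.02
P0 = 1.0 - c1 * np.exp(-omega1 * t)

y = -np.log(1.0 - P0)                      # ordinate of the "Wissel plot"
late = t > 25                              # restrict the fit to the linear part
omega1_hat, intercept = np.polyfit(t[late], y[late], 1)

print("estimated omega1        :", omega1_hat)
print("intercept, i.e. -ln(c1) :", intercept)
# The establishment criterion described in the abstract is read off
# from the sign of this intercept.
```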
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Bülow, Morten Hillgaard; Söderqvist, Thomas
2014-01-01
Since the late 1980s, the concept of 'successful ageing' has set the frame for discourse about contemporary ageing research. Through an analysis of the reception to John W. Rowe and Robert L. Kahn's launch of the concept of 'successful ageing' in 1987, this article maps out the important themes and discussions that have emerged from the interdisciplinary field of ageing research. These include an emphasis on interdisciplinarity; the interaction between biology, psycho-social contexts and lifestyle choices; the experiences of elderly people; life-course perspectives; optimisation and prevention strategies; and the importance of individual, societal and scientific conceptualisations and understandings of ageing. By presenting an account of the recent historical uses, interpretations and critiques of the concept, the article unfolds the practical and normative complexities of 'successful ageing'.
Vaio, Gianfranco Di; Waldenström, Daniel; Weisdorf, Jacob Louis
2012-01-01
This study examines the determinants of citation success among authors who have recently published their work in economic history journals. Besides offering clues about how to improve one's scientific impact, our citation analysis also sheds light on the state of the field of economic history. Consistent with our expectations, we find that full professors, authors appointed at economics and history departments, and authors working in Anglo-Saxon and German countries are more likely to receive citations than other scholars. Long and co-authored articles are also a factor for citation success. We find similar patterns when assessing the same authors' citation success in economics journals. As a novel feature, we demonstrate that the diffusion of research (publication of working papers, as well as conference and workshop presentations) has a first-order positive impact on the citation rate.
STOCHASTIC ANALYSIS OF RANDOM AD HOC NETWORKS WITH MAXIMUM ENTROPY DEPLOYMENTS
Thomas Bourgeois
2014-10-01
Full Text Available In this paper, we present the first stochastic analysis of the link performance of an ad hoc network modelled by a single homogeneous Poisson point process (HPPP). According to the maximum entropy principle, the single HPPP model is mathematically the best model for random deployments with a given node density. However, previous works in the literature only consider a modified model which shows a discrepancy in the interference distribution with respect to the more suitable single HPPP model. The main contributions of this paper are as follows. (1) It presents a new mathematical framework leading to closed-form expressions of the probability of success of both one-way transmissions and handshakes for a deployment modelled by a single HPPP. Our approach, based on stochastic geometry, can be extended to complex protocols. (2) From the obtained results, all confirmed by comparison to simulated data, optimal PHY and MAC layer parameters are determined and the relations between them are described in detail. (3) The influence of the routing protocol on handshake performance is taken into account in a realistic manner, leading to the confirmation of the intuitive result that the effect of imperfect feedback on the probability of success of a handshake is only negligible for transmissions to the first neighbour node.
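A minimal stochastic-geometry sketch under common textbook assumptions rather than the paper's exact model: interferers form a single HPPP in a disc, the tagged link has fixed length, fading is Rayleigh, path loss is a power law, and the probability of success of a one-way transmission is estimated as P(SIR > θ) by Monte Carlo; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def success_probability(density=0.01, link_dist=5.0, alpha=4.0,
                        theta=1.0, radius=200.0, n_trials=5000):
    """Monte Carlo estimate of P(SIR > theta) for a link of length link_dist,
    with Rayleigh fading and interferers drawn from an HPPP of the given
    density inside a disc of the given radius (noise neglected)."""
    successes = 0
    area = np.pi * radius**2
    for _ in range(n_trials):
        n = rng.poisson(density * area)               # number of interferers
        r = radius * np.sqrt(rng.random(n))           # uniform locations in the disc
        fading = rng.exponential(1.0, n)              # Rayleigh power fading
        interference = np.sum(fading * r**(-alpha)) if n > 0 else 0.0
        signal = rng.exponential(1.0) * link_dist**(-alpha)
        if interference == 0.0 or signal / interference > theta:
            successes += 1
    return successes / n_trials

print("estimated link success probability:", success_probability())
```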
Di Vaio, Gianfranco; Waldenström, Daniel; Weisdorf, Jacob Louis
This study analyses determinants of citation success among authors publishing in economic history journals. Bibliometric features, like article length and number of authors, are positively correlated with the citation rate up to a certain point. Remarkably, publishing in top-ranked journals hardly affects citations. In regard to author-specific characteristics, male authors, full professors, authors working in economics or history departments, and authors employed in Anglo-Saxon countries are more likely to get cited than others. As a 'shortcut' to citation success, we find that research diffusion...
Maximum Genus of Strong Embeddings
Er-ling Wei; Yan-pei Liu; Han Ren
2003-01-01
The strong embedding conjecture states that any 2-connected graph has a strong embedding on some surface. It implies the circuit double cover conjecture: any 2-connected graph has a circuit double cover. The converse, however, is not true. But for a 3-regular graph, the two conjectures are equivalent. In this paper, a characterization of graphs having a strong embedding with exactly 3 faces, which is the strong embedding of maximum genus, is given. In addition, some graphs with this property are provided. More generally, an upper bound on the maximum genus of strong embeddings of a graph is presented as well. Lastly, it is shown that the interpolation theorem holds for planar Halin graphs.
Cluster Membership Probability: Polarimetric Approach
Medhi, Biman J
2013-01-01
Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
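A rough sketch of one way such simultaneous intervals can be approximated, assuming Monte Carlo calibration rather than the paper's analytical construction: envelopes for the order statistics of a standard normal sample are built from simulated samples, and the pointwise level is tightened until the simulated joint coverage reaches 1 − α.

```python
import numpy as np

rng = np.random.default_rng(4)
n, alpha, B = 30, 0.05, 20000

# Reference distribution of the order statistics of a standard normal sample.
sims = np.sort(rng.normal(size=(B, n)), axis=1)

def envelope(gamma):
    lo = np.quantile(sims, gamma / 2, axis=0)
    hi = np.quantile(sims, 1 - gamma / 2, axis=0)
    return lo, hi

def simultaneous_coverage(gamma):
    lo, hi = envelope(gamma)
    inside = np.all((sims >= lo) & (sims <= hi), axis=1)
    return inside.mean()

# Shrink the pointwise level until the joint coverage is about 1 - alpha.
gamma = alpha
while simultaneous_coverage(gamma) < 1 - alpha:
    gamma *= 0.9

lo, hi = envelope(gamma)
x = np.sort(rng.normal(size=n))            # the sample to be assessed
print("all points inside the simultaneous band:",
      bool(np.all((x >= lo) & (x <= hi))))
```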
Alternative Multiview Maximum Entropy Discrimination.
Chao, Guoqing; Sun, Shiliang
2016-07-01
Maximum entropy discrimination (MED) is a general framework for discriminative estimation based on maximum entropy and maximum margin principles, and can produce hard-margin support vector machines under some assumptions. Recently, a multiview version of MED, multiview MED (MVMED), was proposed. In this paper, we try to explore a more natural MVMED framework by assuming two separate distributions p1(Θ1) over the first-view classifier parameter Θ1 and p2(Θ2) over the second-view classifier parameter Θ2. We name the new MVMED framework alternative MVMED (AMVMED), which enforces the posteriors of the two view margins to be equal. The proposed AMVMED is more flexible than the existing MVMED because, compared with MVMED, which optimizes one relative entropy, AMVMED assigns one relative entropy term to each of the two views, thus incorporating a tradeoff between the two views. We give the detailed solving procedure, which can be divided into two steps. The first step is solving our optimization problem without considering the equal margin posteriors from the two views, and then, in the second step, we consider the equal posteriors. Experimental results on multiple real-world data sets verify the effectiveness of the AMVMED, and comparisons with MVMED are also reported.
Probability Model of Hangzhou Bay Bridge Vehicle Loads Using Weigh-in-Motion Data
Dezhang Sun
2015-01-01
Full Text Available To study the vehicle load characteristics of bay bridges in China, especially truck loads, we performed a statistical analysis of the vehicle loads on Hangzhou Bay Bridge using more than 3 months of weigh-in-motion data from the site. The results showed that when all the vehicle samples were included in the statistical analysis, the histogram of the vehicles exhibited a multimodal distribution, which could not be fitted successfully by a familiar single probability distribution model. When only the truck samples were analyzed, a characteristic multiple-peaked distribution with a main peak was obtained. The probability distribution of all vehicles was fitted using a weighting function of five normal distributions, and the truck loads were modeled by a single normal distribution; the results demonstrated good fits to the histograms. The histograms of different time periods were also analyzed. The results showed that the traffic mainly comprised two-axle small vehicles during the rush hours in the morning and the evening, and the histogram could be fitted approximately using three normal distribution functions. Finally, the maximum value distributions of vehicle loads during the design life of the bay bridge were predicted using maximum value theory.
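A short sketch of the mixture-fitting step, using hypothetical gross-vehicle-weight data and scikit-learn's GaussianMixture with five components, echoing the five-normal weighting function reported above; the real analysis would of course start from the weigh-in-motion records themselves.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)

# Hypothetical weigh-in-motion gross vehicle weights (kN).
weights = np.concatenate([
    rng.normal(20, 4, 6000),     # light two-axle vehicles
    rng.normal(80, 15, 2000),    # medium vehicles
    rng.normal(300, 60, 1500),   # heavy trucks
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=5, random_state=0).fit(weights)
print("component means   :", gmm.means_.ravel())
print("component weights :", gmm.weights_)
```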
Detonation probabilities of high explosives
Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.
1995-07-01
The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are then derived so that the probability of exceeding the vibration criteria VC-E and VC-D is below 0.04.
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Di Vaio, Gianfranco; Waldenström, Daniel; Weisdorf, Jacob Louis
This study analyses determinants of citation success among authors publishing in economic history journals. Bibliometric features, like article length and number of authors, are positively correlated with the citation rate up to a certain point. Remarkably, publishing in top-ranked journals hardl...
Vaio, Gianfranco Di; Waldenström, Daniel; Weisdorf, Jacob Louis
2012-01-01
This study examines the determinants of citation success among authors who have recently published their work in economic history journals. Besides offering clues about how to improve one's scientific impact, our citation analysis also sheds light on the state of the field of economic history. Co...
Kusumastuti, Sasmita; Derks, Marloes G. M.; Tellier, Siri;
2016-01-01
METHODS: We performed a novel, hypothesis-free and quantitative analysis of citation networks exploring the literature on successful ageing that exists in the Web of Science Core Collection Database using the CitNetExplorer software. Outcomes were visualized using timeline-based citation patterns...
Combining Experiments and Simulations Using the Maximum Entropy Principle
Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten
2014-01-01
Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights. We present the maximum entropy procedure in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the procedure has recently provided new insight. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Innovation and social probable knowledge
Marco Crocco
2000-01-01
In this paper some elements of Keynes's theory of probability are used to understand the process of diffusion of an innovation. Based on work done elsewhere (Crocco 1999, 2000), we argue that this process can be viewed as a process of dealing with the collective uncertainty about how to sort out a technological problem. Expanding the concepts of weight of argument and probable knowledge to deal with this kind of uncertainty, we argue that the concepts of social weight of argument and social prob...
Knowledge typology for imprecise probabilities.
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Modelling streambank erosion potential using maximum entropy in a central Appalachian watershed
Pitchford, J.; Strager, M.; Riley, A.; Lin, L.; Anderson, J.
2015-03-01
We used maximum entropy to model streambank erosion potential (SEP) in a central Appalachian watershed to help prioritize sites for management. Model development included measuring erosion rates, application of a quantitative approach to locate Target Eroding Areas (TEAs), and creation of maps of boundary conditions. We successfully constructed a probability distribution of TEAs using the program Maxent. All model evaluation procedures indicated that the model was an excellent predictor, and that the major environmental variables controlling these processes were streambank slope, soil characteristics, bank position, and underlying geology. A classification scheme with low, moderate, and high levels of SEP derived from logistic model output was able to differentiate sites with low erosion potential from sites with moderate and high erosion potential. A major application of this type of modelling framework is to address uncertainty in stream restoration planning, ultimately helping to bridge the gap between restoration science and practice.
SureTrak Probability of Impact Display
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
The special status of mathematical probability: a historical sketch
De Scheemaekere, Xavier; Szafarz, Ariane
2008-01-01
The history of mathematical probability includes two phases: (1) from Pascal and Fermat to Laplace, the theory gained in fields of application; (2) in the first half of the 20th century, two competing axiomatic systems were proposed by von Mises in 1919 and by Kolmogorov in 1933. This paper places this historical sketch in the context of the philosophical complexity of the probability concept and explains the resounding success of Kolmogorov's theory through its ability to avoid dire...
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Fuzzy Markov chains: uncertain probabilities
2002-01-01
We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
DECOFF Probabilities of Failed Operations
Gintautas, Tomas
A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha...
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; in reality, however, the probability of the empty box is always the highest. This fact stands in contradistinction to a sparse system, in which the number of balls is smaller than the number of boxes (e.g., the energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
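A brute-force check of the claim for a small dense case, assuming that all configurations (weak compositions of P into L boxes) are equally probable: the number of configurations with exactly k balls in a fixed box is C(P − k + L − 2, L − 2), which is largest at k = 0.

```python
from math import comb

P, L = 20, 4   # dense case: many more balls than boxes

# Configurations with exactly k balls in a fixed box: the remaining
# P - k balls are distributed among the other L - 1 boxes.
counts = [comb(P - k + L - 2, L - 2) for k in range(P + 1)]
total = sum(counts)
assert total == comb(P + L - 1, L - 1)      # all weak compositions accounted for

probs = [c / total for c in counts]
print("P(k = 0)    :", probs[0])            # the empty box is the most probable
print("P(k = P/L)  :", probs[P // L])       # the 'average' occupancy is less probable
print("argmax k    :", probs.index(max(probs)))
```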
Probability representations of fuzzy systems
LI Hongxing
2006-01-01
In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.
Cacti with maximum Kirchhoff index
Wang, Wen-Rui; Pan, Xiang-Feng
2015-01-01
The concept of resistance distance was first proposed by Klein and Randić. The Kirchhoff index $Kf(G)$ of a graph $G$ is the sum of resistance distances between all pairs of vertices in $G$. A connected graph $G$ is called a cactus if each block of $G$ is either an edge or a cycle. Let $Cat(n;t)$ be the set of connected cacti possessing $n$ vertices and $t$ cycles, where $0\leq t \leq \lfloor\frac{n-1}{2}\rfloor$. In this paper, the maximum Kirchhoff index of cacti is characterized, as well...
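As background to the quantity being maximized, a small sketch that computes the Kirchhoff index of any connected graph from its Laplacian spectrum via $Kf(G) = n\sum_{\lambda_i>0} 1/\lambda_i$; the extremal characterization over cacti is, of course, the paper's contribution.

```python
import numpy as np

def kirchhoff_index(A):
    """Kirchhoff index of a connected graph with adjacency matrix A,
    via the nonzero Laplacian eigenvalues: Kf = n * sum(1 / lambda_i)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A
    eig = np.linalg.eigvalsh(L)
    nonzero = eig[eig > 1e-9]
    return n * np.sum(1.0 / nonzero)

# Example: a 4-cycle, the smallest cactus containing a cycle.
C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]])
print(kirchhoff_index(C4))   # 5.0: resistance 3/4 for adjacent pairs, 1 for opposite pairs
```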
Generic maximum likely scale selection
Pedersen, Kim Steenstrup; Loog, Marco; Markussen, Bo
2007-01-01
The fundamental problem of local scale selection is addressed by means of a novel principle, which is based on maximum likelihood estimation. The principle is generally applicable to a broad variety of image models and descriptors, and provides a generic scale estimation methodology. The focus is on second-order moments of multiple measurement outputs at a fixed location. These measurements, which reflect local image structure, consist in the cases considered here of Gaussian derivatives taken at several scales and/or having different derivative orders.
OIL MONITORING DIAGNOSTIC CRITERIONS BASED ON MAXIMUM ENTROPY PRINCIPLE
Huo Hua; Li Zhuguo; Xia Yanchun
2005-01-01
A method is presented for applying the maximum entropy probability density estimation approach to constructing diagnostic criteria from oil monitoring data. The method improves the precision of diagnostic criteria for evaluating the wear state of mechanical equipment and for identifying abnormal data. From the critical boundary points so defined, a new measure for monitoring the wear state and identifying probable wear faults can be obtained. The method can be applied to spectrometric analysis and direct-reading ferrographic analysis. Analysis and discussion of two examples of 8NVD48A-2U diesel engines show that it is a practical and effective method for oil monitoring.
Volcano shapes, entropies, and eruption probabilities
Gudmundsson, Agust; Mohajeri, Nahid
2014-05-01
assess the probability of eruptions in relation to the shape of the volcano. These methods can also be applied to the probability of injected dykes reaching the surface in a volcano. We show how the thickness distributions of dykes can be used to estimated their height (dip-dimension) distributions and, for a given magma source and volcano geometry, their probability of erupting. From the calculated energy (mainly elastic and thermal) of the host volcano, and other constraints, the maximum-entropy method can be used to improve the reliability of the assessment of the likelihood of eruption during an unrest period. Becerril, L., Galindo, I., Gudmundsson, A., Morales, J.M., 2013. Depth of origin of magma in eruptions. Sci. Rep., 3 : 2762, doi: 10.1038/srep02762 Gudmundsson, A., 2012 Strengths and strain energies of volcanic edifices: implications for eruptions, collapse calderas, and landslides. Nat. Hazards Earth Syst. Sci., 12, 2241-2258. Gudmundsson, A., Mohajeri, N., 2013. Relations between the scaling exponents, entropies, and energies of fracture networks. Bull. Geol. Soc. France, 184, 377-387. Mohajeri, N., Gudmundsson, A., 2012. Entropies and scaling exponents of street and fracture networks. Entropy, 14, 800-833.
"Success"ful Reading Instruction.
George, Carol J.
1986-01-01
The Success in Reading and Writing Program at a K-2 school in Fort Jackson, South Carolina, teaches children of varied races and abilities to read and write using newspapers, dictionaries, library books, magazines, and telephone directories. These materials help students develop language skills in a failure-free atmosphere. Includes two…
Maximum process problems in optimal control theory
Goran Peskir
2005-01-01
Full Text Available Given a standard Brownian motion $(B_t)_{t\geq 0}$ and the equation of motion $dX_t = v_t\,dt + 2\,dB_t$, we set $S_t = \max_{0\leq s\leq t} X_s$ and consider the optimal control problem $\sup_v E(S_\tau - c\tau)$, where $c>0$ and the supremum is taken over all admissible controls $v$ satisfying $v_t \in [\mu_0,\mu_1]$ for all $t$ up to $\tau = \inf\{t>0 \mid X_t \notin (\ell_0,\ell_1)\}$ with $\mu_0 < 0 < \mu_1$. The optimal control switches between these two extremes across the boundary $X_t = g_*(S_t)$, where $s \mapsto g_*(s)$ is a switching curve that is determined explicitly (as the unique solution to a nonlinear differential equation). The solution found demonstrates that problem formulations based on a maximum functional can be successfully included in optimal control theory (calculus of variations), in addition to the classic problem formulations due to Lagrange, Mayer, and Bolza.
Video segmentation using Maximum Entropy Model
QIN Li-juan; ZHUANG Yue-ting; PAN Yun-he; WU Fei
2005-01-01
Detecting objects of interest from a video sequence is a fundamental and critical task in automated visual surveillance. Most current approaches focus only on discriminating moving objects by background subtraction, even though the objects of interest may be either moving or stationary. In this paper, we propose layers segmentation to detect both moving and stationary target objects from surveillance video. We extend the Maximum Entropy (ME) statistical model to segment layers with features, which are collected by constructing a codebook with a set of codewords for each pixel. We also indicate how the training models are used for the discrimination of target objects in surveillance video. Our experimental results are presented in terms of the success rate and the segmenting precision.
Maximum-entropy distributions of correlated variables with prespecified marginals.
Larralde, Hernán
2012-12-01
The problem of determining the joint probability distributions for correlated random variables with prespecified marginals is considered. When the joint distribution satisfying all the required conditions is not unique, the "most unbiased" choice corresponds to the distribution of maximum entropy. The calculation of the maximum-entropy distribution requires the solution of rather complicated nonlinear coupled integral equations, exact solutions to which are obtained for the case of Gaussian marginals; otherwise, the solution can be expressed as a perturbation around the product of the marginals if the marginal moments exist.
Parameter estimation in X-ray astronomy using maximum likelihood
Wachter, K.; Leach, R.; Kellogg, E.
1979-01-01
Methods of estimation of parameter values and confidence regions by maximum likelihood and Fisher efficient scores, starting from Poisson probabilities, are developed for the nonlinear spectral functions commonly encountered in X-ray astronomy. It is argued that these methods offer significant advantages over the commonly used minimum chi-squared alternatives because they rely on less pervasive statistical approximations and so may be expected to remain valid for data of poorer quality. Extensive numerical simulations of the maximum likelihood method are reported which verify that the best-fit parameter value and confidence region calculations are correct over a wide range of input spectra.
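As a minimal sketch (not the authors' code) of the kind of fit described, Poisson counts can be fit by minimizing the negative Poisson log-likelihood with a general-purpose optimizer; the power-law model, bin grid, and simulated counts below are illustrative assumptions.

```python
# Sketch: maximum-likelihood fit of a toy spectral model to Poisson counts.
# The power-law model and the simulated data are illustrative, not from the paper.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

energies = np.linspace(1.0, 10.0, 30)                         # assumed bin centres
counts = np.random.default_rng(0).poisson(50 * energies**-1.7)

def model(params, e):
    norm, index = params
    return norm * e**(-index)

def neg_log_like(params):
    mu = model(params, energies)
    if np.any(mu <= 0):
        return np.inf
    # Poisson negative log-likelihood (includes the parameter-independent term)
    return float(np.sum(mu - counts * np.log(mu) + gammaln(counts + 1)))

fit = minimize(neg_log_like, x0=[10.0, 1.0], method="Nelder-Mead")
print(fit.x)   # best-fit (norm, index); confidence regions follow from delta log-likelihood
```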
Economics and Maximum Entropy Production
Lorenz, R. D.
2003-04-01
Price differentials, sales volume and profit can be seen as analogues of temperature difference, heat flow and work or entropy production in the climate system. One aspect in which economic systems exhibit more clarity than the climate is that the empirical and/or statistical mechanical tendency for systems to seek a maximum in production is very evident in economics, in that the profit motive is very clear. Noting the common link between 1/f noise, power laws and Self-Organized Criticality with Maximum Entropy Production, the power law fluctuations in security and commodity prices are not inconsistent with the analogy. There is an additional thermodynamic analogy, in that scarcity is valued. A commodity concentrated among a few traders is valued highly by the many who do not have it. The market therefore encourages, via prices, the spreading of those goods among a wider group, just as heat tends to diffuse, increasing entropy. I explore some empirical price-volume relationships of metals and meteorites in this context.
Origin of probabilities and their application to the multiverse
Albrecht, Andreas; Phillips, Daniel
2014-12-01
We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of purely classical probabilities to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation.
Walker, H. F.
1976-01-01
Likelihood equations determined by the two types of samples which are necessary conditions for a maximum-likelihood estimate are considered. These equations suggest certain successive-approximations iterative procedures for obtaining maximum-likelihood estimates. These are generalized steepest ascent (deflected gradient) procedures. It is shown that, with probability 1 as N sub 0 approaches infinity (regardless of the relative sizes of N sub 0 and N sub i, i=1,...,m), these procedures converge locally to the strongly consistent maximum-likelihood estimates whenever the step size is between 0 and 2. Furthermore, the value of the step size which yields optimal local convergence rates is bounded from below by a number which always lies between 1 and 2.
Understanding Y haplotype matching probability.
Brenner, Charles H
2014-01-01
The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary raised in rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
Resolution of overlapping ambiguity strings based on maximum entropy model
ZHANG Feng; FAN Xiao-zhong
2006-01-01
The resolution of overlapping ambiguity strings (OAS) is studied based on the maximum entropy model. There are two model outputs, where either the first two characters form a word or the last two characters form a word. The features of the model include one word in the context of the OAS, the current OAS, and the word probability relation of the two kinds of segmentation results. OAS in the training text are found by the combination of the FMM and BMM segmentation methods. After feature tagging they are used to train the maximum entropy model. The People's Daily corpus of January 1998 is used in training and testing. Experimental results show a closed test precision of 98.64% and an open test precision of 95.01%. The open test precision is 3.76% better than that of the common word probability method.
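For orientation only: a two-output maximum entropy model over binary indicator features is equivalent to logistic regression, so the setup can be sketched with scikit-learn as a stand-in. The feature vectors and labels below are invented toy data, not the paper's feature set or corpus.

```python
# Sketch: a two-class maximum entropy (logistic regression) disambiguator.
# Features and labels are invented toy data, not the People's Daily setup.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: binary indicator features extracted around an overlapping ambiguity
# string (context word present, word-probability comparison of the two splits, ...).
X = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 1], [0, 0, 1], [1, 0, 0], [0, 1, 1]])
y = np.array([1, 0, 1, 0, 1, 0])   # 1: first two characters form a word, 0: last two

maxent = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)
print(maxent.predict_proba([[1, 0, 1]]))   # P(segmentation | features)
```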
Probability biases as Bayesian inference
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the conditions in which they were observed, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Cluster pre-existence probability
Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)
2011-10-15
Pre-existence probability of the fragments for the complete binary spectrum of different systems such as {sup 56}Ni, {sup 116}Ba, {sup 226}Ra and {sup 256}Fm are calculated, from the overlapping part of the interaction potential using the WKB approximation. The role of reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, even for negative Q -value systems, the pre-existence probability is calculated. The calculations reveal rich structural information. The calculated results are compared with the values of preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of relative separation motion. (orig.)
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Sm Transition Probabilities and Abundances
Lawler, J E; Sneden, C; Cowan, J J
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).
Knot probabilities in random diagrams
Cantarella, Jason; Chapman, Harrison; Mastin, Matt
2016-10-01
We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
Probability distributions for multimeric systems.
Albert, Jaroslav; Rooman, Marianne
2016-01-01
We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising the sum of the squared differences between the left and the right hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
Asbestos and Probable Microscopic Polyangiitis
George S Rashed Philteos
2004-01-01
Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.
Logic, Probability, and Human Reasoning
2015-01-01
[3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... Logic, probability, and human reasoning. P.N. Johnson-Laird, Sangeet S. Khemlani, and Geoffrey P. Goodwin (Princeton University, Princeton, NJ)
Probability and statistics: A reminder
Clément Benoit
2013-07-01
Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Objects of maximum electromagnetic chirality
Fernandez-Corbaton, Ivan
2015-01-01
We introduce a definition of the electromagnetic chirality of an object and show that it has an upper bound. The upper bound is attained if and only if the object is transparent for fields of one handedness (helicity). Additionally, electromagnetic duality symmetry, i.e. helicity preservation upon scattering, turns out to be a necessary condition for reciprocal scatterers to attain the upper bound. We use these results to provide requirements for the design of such extremal scatterers. The requirements can be formulated as constraints on the polarizability tensors for dipolar scatterers or as material constitutive relations. We also outline two applications for objects of maximum electromagnetic chirality: A twofold resonantly enhanced and background free circular dichroism measurement setup, and angle independent helicity filtering glasses.
Maximum mutual information regularized classification
Wang, Jim Jing-Yan
2014-09-07
In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize the mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced as much as possible by knowing its classification response. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
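A rough sketch of the idea, not the paper's entropy-estimation objective or its gradient optimizer: estimate the mutual information between (discrete) classifier responses and true labels from their joint histogram, and combine it with an error term of the form error minus lambda times MI. All data below are invented.

```python
# Sketch: evaluate an objective of the form  error - lambda * MI(response; label).
# The MI estimate uses a simple joint histogram of discrete predictions and labels;
# the paper's differentiable entropy estimation is not reproduced here.
import numpy as np

def mutual_information(pred, label):
    classes_p, classes_l = np.unique(pred), np.unique(label)
    joint = np.zeros((len(classes_p), len(classes_l)))
    for i, p in enumerate(classes_p):
        for j, l in enumerate(classes_l):
            joint[i, j] = np.mean((pred == p) & (label == l))
    px, py = joint.sum(axis=1, keepdims=True), joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

pred  = np.array([0, 1, 1, 0, 1, 0, 1, 1])   # invented classifier responses
label = np.array([0, 1, 1, 0, 1, 1, 1, 0])   # invented true labels
lam = 0.5
objective = np.mean(pred != label) - lam * mutual_information(pred, label)
print(objective)
```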
The strong maximum principle revisited
Pucci, Patrizia; Serrin, James
In this paper we first present the classical maximum principle due to E. Hopf, together with an extended commentary and discussion of Hopf's paper. We emphasize the comparison technique invented by Hopf to prove this principle, which has since become a main mathematical tool for the study of second order elliptic partial differential equations and has generated an enormous number of important applications. While Hopf's principle is generally understood to apply to linear equations, it is in fact also crucial in nonlinear theories, such as those under consideration here. In particular, we shall treat and discuss recent generalizations of the strong maximum principle, and also the compact support principle, for the case of singular quasilinear elliptic differential inequalities, under generally weak assumptions on the quasilinear operators and the nonlinearities involved. Our principal interest is in necessary and sufficient conditions for the validity of both principles; in exposing and simplifying earlier proofs of corresponding results; and in extending the conclusions to wider classes of singular operators than previously considered. The results have unexpected ramifications for other problems, as will develop from the exposition, e.g. two point boundary value problems for singular quasilinear ordinary differential equations (Sections 3 and 4); the exterior Dirichlet boundary value problem (Section 5); the existence of dead cores and compact support solutions, i.e. dead cores at infinity (Section 7); Euler-Lagrange inequalities on a Riemannian manifold (Section 9); comparison and uniqueness theorems for solutions of singular quasilinear differential inequalities (Section 10). The case of p-regular elliptic inequalities is briefly considered in Section 11.
Objective probability and quantum fuzziness
Mohrhoff, U
2007-01-01
This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In this paper the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states, voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...
Maximum-entropy principle as Galerkin modelling paradigm
Noack, Bernd R.; Niven, Robert K.; Rowley, Clarence W.
2012-11-01
We show how the empirical Galerkin method, leading e.g. to POD models, can be derived from maximum-entropy principles building on Noack & Niven 2012 JFM. In particular, principles are proposed (1) for the Galerkin expansion, (2) for the Galerkin system identification, and (3) for the probability distribution of the attractor. Examples will illustrate the advantages of the entropic modelling paradigm. Partially supported by the ANR Chair of Excellence TUCOROM and an ADFA/UNSW Visiting Fellowship.
Empirical and Computational Tsunami Probability
Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.
2008-12-01
A key component in assessing the hazard posed by tsunamis is quantification of tsunami likelihood or probability. To determine tsunami probability, one needs to know the distribution of tsunami sizes and the distribution of inter-event times. Both empirical and computational methods can be used to determine these distributions. Empirical methods rely on an extensive tsunami catalog and hence, the historical data must be carefully analyzed to determine whether the catalog is complete for a given runup or wave height range. Where site-specific historical records are sparse, spatial binning techniques can be used to perform a regional, empirical analysis. Global and site-specific tsunami catalogs suggest that tsunami sizes are distributed according to a truncated or tapered power law and inter-event times are distributed according to an exponential distribution modified to account for clustering of events in time. Computational methods closely follow Probabilistic Seismic Hazard Analysis (PSHA), where size and inter-event distributions are determined for tsunami sources, rather than tsunamis themselves as with empirical analysis. In comparison to PSHA, a critical difference in the computational approach to tsunami probabilities is the need to account for far-field sources. The three basic steps in computational analysis are (1) determination of parameter space for all potential sources (earthquakes, landslides, etc.), including size and inter-event distributions; (2) calculation of wave heights or runup at coastal locations, typically performed using numerical propagation models; and (3) aggregation of probabilities from all sources and incorporation of uncertainty. It is convenient to classify two different types of uncertainty: epistemic (or knowledge-based) and aleatory (or natural variability). Correspondingly, different methods have been traditionally used to incorporate uncertainty during aggregation, including logic trees and direct integration. Critical
Maximum entropy production and plant optimization theories.
Dewar, Roderick C
2010-05-12
Plant ecologists have proposed a variety of optimization theories to explain the adaptive behaviour and evolution of plants from the perspective of natural selection ('survival of the fittest'). Optimization theories identify some objective function--such as shoot or canopy photosynthesis, or growth rate--which is maximized with respect to one or more plant functional traits. However, the link between these objective functions and individual plant fitness is seldom quantified and there remains some uncertainty about the most appropriate choice of objective function to use. Here, plants are viewed from an alternative thermodynamic perspective, as members of a wider class of non-equilibrium systems for which maximum entropy production (MEP) has been proposed as a common theoretical principle. I show how MEP unifies different plant optimization theories that have been proposed previously on the basis of ad hoc measures of individual fitness--the different objective functions of these theories emerge as examples of entropy production on different spatio-temporal scales. The proposed statistical explanation of MEP, that states of MEP are by far the most probable ones, suggests a new and extended paradigm for biological evolution--'survival of the likeliest'--which applies from biomacromolecules to ecosystems, not just to individuals.
Maximum likelihood continuity mapping for fraud detection
Hogden, J.
1997-05-01
The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints. CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction--important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.
Adaptive Statistical Language Modeling; A Maximum Entropy Approach
1994-04-19
recognition systems were built that could recognize vowels or digits, but they could not be successfully extended to handle more realistic language... maximum likelihood of generating the training data. The identity of the ML and ME solutions, apart from being aesthetically pleasing, is extremely
Maximum caliber inference and the stochastic Ising model
Cafaro, Carlo; Ali, Sean Alan
2016-11-01
We investigate the maximum caliber variational principle as an inference algorithm used to predict dynamical properties of complex nonequilibrium, stationary, statistical systems in the presence of incomplete information. Specifically, we maximize the path entropy over discrete time step trajectories subject to normalization, stationarity, and detailed balance constraints together with a path-dependent dynamical information constraint reflecting a given average global behavior of the complex system. A general expression for the transition probability values associated with the stationary random Markov processes describing the nonequilibrium stationary system is computed. By virtue of our analysis, we uncover that a convenient choice of the dynamical information constraint together with a perturbative asymptotic expansion with respect to its corresponding Lagrange multiplier of the general expression for the transition probability leads to a formal overlap with the well-known Glauber hyperbolic tangent rule for the transition probability for the stochastic Ising model in the limit of very high temperatures of the heat reservoir.
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access too, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Recursive estimation of prior probabilities using the mixture approach
Kazakos, D.
1974-01-01
The problem of estimating the prior probabilities q sub k of a mixture of known density functions f sub k(X), based on a sequence of N statistically independent observations, is considered. It is shown that for very mild restrictions on f sub k(X), the maximum likelihood estimate of Q is asymptotically efficient. A recursive algorithm for estimating Q is proposed, analyzed, and optimized. For the M = 2 case, it is possible for the recursive algorithm to achieve the same performance as the maximum likelihood one. For M > 2, slightly inferior performance is the price for having a recursive algorithm. However, the loss is computable and tolerable.
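A minimal sketch of one recursive scheme in this spirit (a stochastic-approximation update of the mixture weights from posterior responsibilities); the Gaussian components, true weights, and 1/n step size are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch: recursive estimation of mixture prior probabilities q_k from a stream
# of observations, given known component densities f_k.
import numpy as np
from scipy.stats import norm

components = [norm(-2.0, 1.0), norm(2.0, 1.0)]       # known component densities f_k
true_q = np.array([0.3, 0.7])                        # weights used to simulate data

rng = np.random.default_rng(1)
q = np.full(len(components), 1.0 / len(components))  # initial estimate

for n in range(1, 5001):
    k = rng.choice(len(components), p=true_q)        # draw one mixture sample
    x = components[k].rvs(random_state=rng)
    f = np.array([c.pdf(x) for c in components])
    resp = q * f / np.dot(q, f)                      # posterior responsibilities
    q += (1.0 / n) * (resp - q)                      # recursive update, step ~ 1/n

print(q)   # should approach true_q = [0.3, 0.7]
```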
Maximum entropy production in daisyworld
Maunu, Haley A.; Knuth, Kevin H.
2012-05-01
Daisyworld was first introduced in 1983 by Watson and Lovelock as a model that illustrates how life can influence a planet's climate. These models typically involve modeling a planetary surface on which black and white daisies can grow thus influencing the local surface albedo and therefore also the temperature distribution. Since then, variations of daisyworld have been applied to study problems ranging from ecological systems to global climate. Much of the interest in daisyworld models is due to the fact that they enable one to study self-regulating systems. These models are nonlinear, and as such they exhibit sensitive dependence on initial conditions, and depending on the specifics of the model they can also exhibit feedback loops, oscillations, and chaotic behavior. Many daisyworld models are thermodynamic in nature in that they rely on heat flux and temperature gradients. However, what is not well-known is whether, or even why, a daisyworld model might settle into a maximum entropy production (MEP) state. With the aim to better understand these systems, this paper will discuss what is known about the role of MEP in daisyworld models.
Maximum stellar iron core mass
F W Giacobbe
2003-03-01
An analytical method of estimating the mass of a stellar iron core, just prior to core collapse, is described in this paper. The method employed depends, in part, upon an estimate of the true relativistic mass increase experienced by electrons within a highly compressed iron core, just prior to core collapse, and is significantly different from a more typical Chandrasekhar mass limit approach. This technique produced a maximum stellar iron core mass value of 2.69 × 10^{30} kg (1.35 solar masses). This mass value is very near to the typical mass values found for neutron stars in a recent survey of actual neutron star masses. Although slightly lower and higher neutron star masses may also be found, lower mass neutron stars are believed to be formed as a result of enhanced iron core compression due to the weight of non-ferrous matter overlying the iron cores within large stars. And, higher mass neutron stars are likely to be formed as a result of fallback or accretion of additional matter after an initial collapse event involving an iron core having a mass no greater than 2.69 × 10^{30} kg.
Maximum Matchings via Glauber Dynamics
Jindal, Anant; Pal, Manjish
2011-01-01
In this paper we study the classic problem of computing a maximum cardinality matching in general graphs $G = (V, E)$. The best known algorithm for this problem till date runs in $O(m \\sqrt{n})$ time due to Micali and Vazirani \\cite{MV80}. Even for general bipartite graphs this is the best known running time (the algorithm of Karp and Hopcroft \\cite{HK73} also achieves this bound). For regular bipartite graphs one can achieve an $O(m)$ time algorithm which, following a series of papers, has been recently improved to $O(n \\log n)$ by Goel, Kapralov and Khanna (STOC 2010) \\cite{GKK10}. In this paper we present a randomized algorithm based on the Markov Chain Monte Carlo paradigm which runs in $O(m \\log^2 n)$ time, thereby obtaining a significant improvement over \\cite{MV80}. We use a Markov chain similar to the \\emph{hard-core model} for Glauber Dynamics with \\emph{fugacity} parameter $\\lambda$, which is used to sample independent sets in a graph from the Gibbs Distribution \\cite{V99}, to design a faster algori...
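For orientation only, the Markov chain ingredient mentioned here (Glauber dynamics for the hard-core model on the line graph, i.e., the monomer-dimer model on matchings) can be sketched as below; this toy sampler with an invented fugacity is not the paper's full matching algorithm or its running-time analysis.

```python
# Sketch: Glauber dynamics sampling matchings of a graph from the Gibbs
# distribution with fugacity lambda (the monomer-dimer model).
import random
import networkx as nx

def glauber_matching(G, lam=2.0, steps=20000, seed=0):
    rng = random.Random(seed)
    edges = list(G.edges())
    matched = set()        # vertices currently covered by the matching
    matching = set()       # current matching (set of edges)
    for _ in range(steps):
        e = rng.choice(edges)
        u, v = e
        blocked = (u in matched or v in matched) and e not in matching
        if blocked:
            continue       # e is excluded by another matching edge; state unchanged
        # Heat-bath update: e present with prob lam/(1+lam), absent otherwise.
        if rng.random() < lam / (1.0 + lam):
            if e not in matching:
                matching.add(e); matched.update(e)
        else:
            if e in matching:
                matching.remove(e); matched.difference_update(e)
    return matching

print(len(glauber_matching(nx.complete_bipartite_graph(5, 5))))
```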
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.
PROBABILITY MODEL OF GUNTHER GENERATOR
Anonymous
2007-01-01
This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is an independent and uniformly distributed 0-1 random variable sequence. It gives the theoretical foundation for why the Gunther generator can avoid the statistical weakness of the output sequence of the stop-and-go generator, and analyzes the coincidence between the output sequence and the input sequences of the Gunther generator. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.
Probability of Detection Demonstration Transferability
Parker, Bradford H.
2008-01-01
The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.
2011-01-10
...: Establishing Maximum Allowable Operating Pressure or Maximum Operating Pressure Using Record Evidence, and... facilities of their responsibilities, under Federal integrity management (IM) regulations, to perform... system, especially when calculating Maximum Allowable Operating Pressure (MAOP) or Maximum Operating...
Changes in context and perception of maximum reaching height.
Wagman, Jeffrey B; Day, Brian M
2014-01-01
Successfully performing a given behavior requires flexibility in both perception and behavior. In particular, doing so requires perceiving whether that behavior is possible across the variety of contexts in which it might be performed. Three experiments investigated how (changes in) context (ie point of observation and intended reaching task) influenced perception of maximum reaching height. The results of experiment 1 showed that perceived maximum reaching height more closely reflected actual reaching ability when perceivers occupied a point of observation that was compatible with that required for the reaching task. The results of experiments 2 and 3 showed that practice perceiving maximum reaching height from a given point of observation improved perception of maximum reaching height from a different point of observation, regardless of whether such practice occurred at a compatible or incompatible point of observation. In general, such findings show bounded flexibility in perception of affordances and are thus consistent with a description of perceptual systems as smart perceptual devices.
Maximum path information and the principle of least action for chaotic system
Wang, Qiuping A.
2004-01-01
A path information is defined in connection with the different possible paths of a chaotic system moving in its phase space between two cells. On the basis of the assumption that the paths are differentiated by their actions, we show that the maximum path information leads to a path probability distribution as a function of action, from which the well known transition probability of Brownian motion can be easily derived. An interesting result is that the most probable paths ar...
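For orientation, a standard maximum-entropy sketch (not a quotation from the paper) of how an exponential-of-action path distribution arises:

```latex
% Sketch: maximize the path information S = -\sum_k p_k \ln p_k over the path
% probabilities p_k, subject to \sum_k p_k = 1 and a fixed mean action
% \sum_k p_k A_k = \bar{A}, where A_k is the action of path k.
% Stationarity of the Lagrangian gives
\[
  p_k = \frac{e^{-\eta A_k}}{Z(\eta)}, \qquad Z(\eta) = \sum_k e^{-\eta A_k},
\]
% so the most probable paths are those of smallest action, and a Gaussian
% (Brownian-motion-like) transition probability follows when the action is
% quadratic in the displacement.
```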
Thermospheric density model biases at the 23rd sunspot maximum
Pardini, C.; Moe, K.; Anselmo, L.
2012-07-01
statistically significant. The minimum average biases were obtained with JB2008, NRLMSISE-00 and GOST-2004. Above 500 km, where only one satellite was analyzed (at 630 km), and errors tend to increase with altitude, it cannot be asserted that the calculated biases are significant. Nevertheless, they are presented to show how the various models diverge at higher altitudes. Around 630 km, NRLMSISE-00 had a negligible average bias, while the other models underestimated (GOST-2004) or overestimated the average density, by amounts varying between 6% and 16%. However, in terms of semi-major axis root mean square residuals, JB2006 and JB2008 were the best in any case. Below 500 km, the short-term behavior of the models was also investigated by fitting the semi-major axis decay over 30-day arcs. The resulting fitted drag coefficients displayed a significant variability, probably associated with mismodeled density variations, but JB2008, followed by JB2006, provided the smallest semi-major axis residuals and a reduced short-term variability of the density bias at just a few frequencies, having been probably successful in removing a significant fraction of the mismodeling sources.
Archery: Success through Classroom Instruction.
Hensley, Ralph W.
1982-01-01
For maximum early success in mastering the sport of archery, the first few days of instruction should be taken in the classroom. Two positions, the grip and the anchor, which can be taught and rehearsed in the classroom, are described. (JN)
People's Intuitions about Randomness and Probability: An Empirical Study
Lecoutre, Marie-Paule; Rovira, Katia; Lecoutre, Bruno; Poitevineau, Jacques
2006-01-01
What people mean by randomness should be taken into account when teaching statistical inference. This experiment explored subjective beliefs about randomness and probability through two successive tasks. Subjects were asked to categorize 16 familiar items: 8 real items from everyday life experiences, and 8 stochastic items involving a repeatable…
Farrell, A P; Steffensen, J F
1987-01-01
The maximum aerobic swimming speed of Chinook salmon (Oncorhynchus tshawytscha) was measured before and after ligation of the coronary artery. Coronary artery ligation prevented blood flow to the compact layer of the ventricular myocardium, which represents 30% of the ventricular mass, and produced...... a statistically significant 35.5% reduction in maximum swimming speed. We conclude that the coronary circulation is important for maximum aerobic swimming and implicit in this conclusion is that maximum cardiac performance is probably necessary for maximum aerobic swimming performance....
Hf Transition Probabilities and Abundances
Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I
2006-01-01
Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...
Gd Transition Probabilities and Abundances
Den Hartog, E A; Sneden, C; Cowan, J J
2006-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...
Efficient Geometric Probabilities of Multi-Transiting Exoplanetary Systems from CORBITS
Brakensiek, Joshua; Ragozzine, Darin
2016-04-01
NASA’s Kepler Space Telescope has successfully discovered thousands of exoplanet candidates using the transit method, including hundreds of stars with multiple transiting planets. In order to estimate the frequency of these valuable systems, it is essential to account for the unique geometric probabilities of detecting multiple transiting extrasolar planets around the same parent star. In order to improve on previous studies that used numerical methods, we have constructed an efficient, semi-analytical algorithm called the Computed Occurrence of Revolving Bodies for the Investigation of Transiting Systems (CORBITS), which, given a collection of conjectured exoplanets orbiting a star, computes the probability that any particular group of exoplanets can be observed to transit. The algorithm applies theorems of elementary differential geometry to compute the areas bounded by circular curves on the surface of a sphere. The implemented algorithm is more accurate and orders of magnitude faster than previous algorithms, based on comparisons with Monte Carlo simulations. We use CORBITS to show that the present solar system would only show a maximum of three transiting planets, but that this varies over time due to dynamical evolution. We also used CORBITS to geometrically debias the period ratio and mutual Hill sphere distributions of Kepler's multi-transiting planet candidates, which results in shifting these distributions toward slightly larger values. In an Appendix, we present additional semi-analytical methods for determining the frequency of exoplanet mutual events, i.e., the geometric probability that two planets will transit each other (planet-planet occultation, relevant to transiting circumbinary planets) and the probability that this transit occurs simultaneously as they transit their star. The CORBITS algorithms and several worked examples are publicly available.
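As a hedged cross-check of the quantity CORBITS computes semi-analytically (not the CORBITS algorithm itself), the single and joint geometric transit probabilities can be estimated by Monte Carlo over isotropic observer directions; the stellar radius, semi-major axes, and mutual inclination below are illustrative values.

```python
# Sketch: Monte Carlo estimate of single and joint transit probabilities for two
# planets on circular orbits; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
r_star = 0.005                        # stellar radius in AU (~1 R_sun)
a = np.array([0.1, 0.2])              # semi-major axes [AU]
mutual_inc = np.radians(2.0)          # planet 2's orbit tilted relative to planet 1

# Unit normals of the two orbital planes.
n1 = np.array([0.0, 0.0, 1.0])
n2 = np.array([0.0, np.sin(mutual_inc), np.cos(mutual_inc)])

# Isotropic lines of sight.
los = rng.normal(size=(200000, 3))
los /= np.linalg.norm(los, axis=1, keepdims=True)

# Planet k transits if the line of sight lies within r_star/a_k of its orbit plane.
t1 = np.abs(los @ n1) < r_star / a[0]
t2 = np.abs(los @ n2) < r_star / a[1]

print("P(planet 1 transits) ~", t1.mean(), " analytic:", r_star / a[0])
print("P(planet 2 transits) ~", t2.mean(), " analytic:", r_star / a[1])
print("P(both transit)      ~", (t1 & t2).mean())
```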
The Sherpa Maximum Likelihood Estimator
Nguyen, D.; Doe, S.; Evans, I.; Hain, R.; Primini, F.
2011-07-01
A primary goal for the second release of the Chandra Source Catalog (CSC) is to include X-ray sources with as few as 5 photon counts detected in stacked observations of the same field, while maintaining acceptable detection efficiency and false source rates. Aggressive source detection methods will result in detection of many false positive source candidates. Candidate detections will then be sent to a new tool, the Maximum Likelihood Estimator (MLE), to evaluate the likelihood that a detection is a real source. MLE uses the Sherpa modeling and fitting engine to fit a model of a background and source to multiple overlapping candidate source regions. A background model is calculated by simultaneously fitting the observed photon flux in multiple background regions. This model is used to determine the quality of the fit statistic for a background-only hypothesis in the potential source region. The statistic for a background-plus-source hypothesis is calculated by adding a Gaussian source model convolved with the appropriate Chandra point spread function (PSF) and simultaneously fitting the observed photon flux in each observation in the stack. Since a candidate source may be located anywhere in the field of view of each stacked observation, a different PSF must be used for each observation because of the strong spatial dependence of the Chandra PSF. The likelihood of a valid source being detected is a function of the two statistics (for background alone, and for background-plus-source). The MLE tool is an extensible Python module with potential for use by the general Chandra user.
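A stripped-down illustration of the likelihood comparison described (Poisson likelihood of a background-only model versus background plus a Gaussian source), not the Sherpa implementation: the 1-D grid, counts, and starting values are invented, and Sherpa itself fits 2-D data with per-observation PSFs.

```python
# Sketch: compare Poisson log-likelihoods of "background only" vs
# "background + Gaussian source" for a toy 1-D counts profile.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

x = np.arange(-10, 11)
rng = np.random.default_rng(3)
counts = rng.poisson(2.0 + 8.0 * np.exp(-0.5 * (x / 1.5) ** 2))

def loglike(mu):
    if np.any(mu <= 0):
        return -np.inf
    return float(np.sum(counts * np.log(mu) - mu - gammaln(counts + 1)))

def fit(model, x0):
    res = minimize(lambda p: -loglike(model(p)), x0, method="Nelder-Mead")
    return -res.fun

bkg_only = fit(lambda p: np.full_like(x, p[0], dtype=float), x0=[3.0])
bkg_src  = fit(lambda p: p[0] + p[1] * np.exp(-0.5 * (x / p[2]) ** 2),
               x0=[3.0, 5.0, 2.0])
print("Delta log-likelihood (source vs no source):", bkg_src - bkg_only)
```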
Vestige: Maximum likelihood phylogenetic footprinting
Maxwell Peter
2005-05-01
Full Text Available Abstract Background Phylogenetic footprinting is the identification of functional regions of DNA by their evolutionary conservation. This is achieved by comparing orthologous regions from multiple species and identifying the DNA regions that have diverged less than neutral DNA. Vestige is a phylogenetic footprinting package built on the PyEvolve toolkit that uses probabilistic molecular evolutionary modelling to represent aspects of sequence evolution, including the conventional divergence measure employed by other footprinting approaches. In addition to measuring the divergence, Vestige allows the expansion of the definition of a phylogenetic footprint to include variation in the distribution of any molecular evolutionary processes. This is achieved by displaying the distribution of model parameters that represent partitions of molecular evolutionary substitutions. Examination of the spatial incidence of these effects across regions of the genome can identify DNA segments that differ in the nature of the evolutionary process. Results Vestige was applied to a reference dataset of the SCL locus from four species and provided clear identification of the known conserved regions in this dataset. To demonstrate the flexibility to use diverse models of molecular evolution and dissect the nature of the evolutionary process Vestige was used to footprint the Ka/Ks ratio in primate BRCA1 with a codon model of evolution. Two regions of putative adaptive evolution were identified illustrating the ability of Vestige to represent the spatial distribution of distinct molecular evolutionary processes. Conclusion Vestige provides a flexible, open platform for phylogenetic footprinting. Underpinned by the PyEvolve toolkit, Vestige provides a framework for visualising the signatures of evolutionary processes across the genome of numerous organisms simultaneously. By exploiting the maximum-likelihood statistical framework, the complex interplay between mutational
Fibonacci Sequence, Recurrence Relations, Discrete Probability Distributions and Linear Convolution
Rajan, Arulalan; Rao, Ashok; Jamadagni, H S
2012-01-01
The classical Fibonacci sequence is known to exhibit many fascinating properties. In this paper, we explore the Fibonacci sequence and integer sequences generated by second order linear recurrence relations with positive integer coefficients from the point of view of the probability distributions that they induce. We obtain generalizations of some of the known limiting properties of these probability distributions and present certain optimal properties of the classical Fibonacci sequence in this context. In addition, we also look at the self linear convolution of linear recurrence relations with positive integer coefficients. Analysis of self linear convolution is focused towards locating the maximum in the resulting sequence. This analysis also highlights the influence that the largest positive real root of the "characteristic equation" of the linear recurrence relations with positive integer coefficients has on the location of the maximum. In particular, when the largest positive real root is 2, the locatio...
Probability of inflation in loop quantum cosmology
Ashtekar, Abhay; Sloan, David
2011-12-01
Inflationary models of the early universe provide a natural mechanism for the formation of large scale structure. This success brings to the forefront the question of naturalness: Does a sufficiently long slow roll inflation occur generically or does it require a careful fine tuning of initial parameters? In recent years there has been considerable controversy on this issue (Hollands and Wald in Gen Relativ Gravit 34:2043, 2002; Kofman et al. in J High Energy Phys 10:057, 2002; Gibbons and Turok in Phys Rev D 77:063516, 2008). In particular, for a quadratic potential, Kofman et al. (J High Energy Phys 10:057, 2002) have argued that the probability of inflation with at least 65 e-foldings is close to one, while Gibbons and Turok (Phys Rev D 77:063516, 2008) have argued that this probability is suppressed by a factor of ~$10^{-85}$. We first clarify that such dramatically different predictions can arise because the required measure on the space of solutions is intrinsically ambiguous in general relativity. We then show that this ambiguity can be naturally resolved in loop quantum cosmology (LQC) because the big bang is replaced by a big bounce and the bounce surface can be used to introduce the structure necessary to specify a satisfactory measure. The second goal of the paper is to present a detailed analysis of the inflationary dynamics of LQC using analytical and numerical methods. By combining this information with the measure on the space of solutions, we address a sharper question than those investigated in Kofman et al. (J High Energy Phys 10:057, 2002), Gibbons and Turok (Phys Rev D 77:063516, 2008), and Ashtekar and Sloan (Phys Lett B 694:108, 2010): What is the probability of a sufficiently long slow roll inflation which is compatible with the seven year WMAP data? We show that the probability is very close to 1. The material is so organized that cosmologists who may be more interested in the inflationary dynamics in LQC than in the subtleties associated with
Taufiqurrahman Nasihun
2015-06-01
Full Text Available The emerging concept of successful aging is based on evidence that in healthy individuals, as they age, there are considerable variations in the alteration of physiological functions: some people exhibit great age-related alterations, but others very few or none. The first is called poor aging and the latter is called the successful pattern of aging (Lambert SW, 2008). Thus, in simple words, the successful aging concept is defined as the opportunity for old people to stay in an active and productive condition despite aging chronologically. Aging itself might be defined as the progressive accumulation of changes with time, associated with or responsible for the ever-increasing susceptibility to disease and death which accompanies advancing age (Harman D, 1981). The time needed to accumulate changes is attributable to the aging process. The marked emerging questions are: how does aging happen, and where does aging start? To answer these questions, and because of the complexity of the aging process, more than 300 aging theories have been proposed to explain how and where aging occurs and starts, respectively. There are too many theories and classifications of the aging process to enumerate. In summary, all of these aging theories can be grouped into three clusters: 1. Genetic program theory, which suggests that aging results from a program directed by the genes; 2. Epigenetic theory, in which aging results from environmental random events not determined by the genes; 3. Evolutionary theory, which proposes that aging is a medium for disposal of the mortal soma in order to avoid competition between organisms and their progeny for food and space, and which does not try to explain how aging occurs but possibly answers why aging occurs (De la Fuente, 2009). Among the three groups of aging theories, the epigenetic theory is useful to explain and to try to solve the enigma of aging, which is prominently caused by internal and external environmental influences
A robust conditional approximation of marginal tail probabilities.
Brazzale, A. R.; Ventura, L.
2001-01-01
The aim of this contribution is to derive a robust approximate conditional procedure used to eliminate nuisance parameters in regression and scale models. Unlike the approximations to exact conditional solutions based on the likelihood function and on the maximum likelihood estimator, the robust conditional approximation of marginal tail probabilities does not suffer from lack of robustness to model misspecification. To assess the performance of the proposed robust conditional procedure the r...
Likelihood Principle and Maximum Likelihood Estimator of Location Parameter for Cauchy Distribution.
1986-05-01
Consistency (or strong consistency) of the maximum likelihood estimator has been studied by many researchers, for example, Wald (1949) and Wolfowitz (1953, 1965). References cited include: Wald, A. (1949). Note on the consistency of the maximum likelihood estimate. Ann. Math. Statist., Vol. 20, 595-601; Wolfowitz, J. (1953). The method of maximum likelihood and the Wald theory of decision functions. Indag. Math., Vol. 15, 114-119; and an article in Probability Letters, Vol. 1, No. 3, 197-202.
The preference of probability over negative values in action selection.
Neyedli, Heather F; Welsh, Timothy N
2015-01-01
It has previously been found that when participants are presented with a pair of motor prospects, they can select the prospect with the largest maximum expected gain (MEG). Many of those decisions, however, were trivial because of large differences in MEG between the prospects. The purpose of the present study was to explore participants' preferences when making non-trivial decisions between two motor prospects. Participants were presented with pairs of prospects that: 1) differed in MEG with either only the values or only the probabilities differing between the prospects; and 2) had similar MEG with one prospect having a larger probability of hitting the target and a higher penalty value and the other prospect a smaller probability of hitting the target but a lower penalty value. In different experiments, participants either had 400 ms or 2000 ms to decide between the prospects. It was found that participants chose the configuration with the larger MEG more often when the probability varied between prospects than when the value varied. In pairs with similar MEGs, participants preferred a larger probability of hitting the target over a smaller penalty value. These results indicate that participants prefer probability information over negative value information in a motor selection task.
Success and decisiveness on proper symmetric games
Freixas Bosch, Josep; Pons Vallès, Montserrat
2015-01-01
The final publication is available at Springer via http://dx.doi.org/10.1007/s10100-013-0332-5 This paper provides a complete study for the possible rankings of success and decisiveness for individuals in symmetric voting systems, assuming anonymous and independent probability distributions. It is proved that for any pair of symmetric voting systems it is always possible to rank success and decisiveness in opposite order whenever the common probability of voting for “acceptance...
Determinants of students' success at university
Danilowicz-Gösele, Kamila; Meya, Johannes; Schwager, Robert; Suntheim, Katharina
2014-01-01
This paper studies the determinants of academic success using a unique administrative data set of a German university. We show that high school grades are strongly associated with both graduation probabilities and final grades, whereas variables measuring social origin or income have only a smaller impact. Moreover, the link between high school performance and university success is shown to vary substantially across faculties. In some fields of study, the probability of graduating is rather l...
Barengoltz, Jack
2016-07-01
Monte Carlo (MC) is a common method to estimate probability, effectively by a simulation. For planetary protection, it may be used to estimate the probability of impact P_I of a protected planet by a launch vehicle (upper stage). The object of the analysis is to provide a value for P_I with a given level of confidence (LOC) that the true value does not exceed the maximum allowed value of P_I. In order to determine the number of MC histories required, one must also guess the maximum number of hits that will occur in the analysis. This extra parameter is needed because a LOC is desired. If more hits occur, the MC analysis would indicate that the true value may exceed the specification value with a higher probability than the LOC. (In the worst case, even the mean value of the estimated P_I might exceed the specification value.) After the analysis is conducted, the actual number of hits is, of course, the mean. The number of hits arises from a small probability per history and a large number of histories; these are the classic requirements for a Poisson distribution. For a known Poisson distribution (the mean is the only parameter), the probability for some interval in the number of hits is calculable; before the analysis, this is not possible. Fortunately, there are methods that can bound the unknown mean of a Poisson distribution. F. Garwood (1936, "Fiduciary limits for the Poisson distribution," Biometrika 28, 437-442) published an appropriate method that uses the inverse of the chi-squared function (the integral chi-squared function would yield the probability α as a function of the mean μ and an actual value n). This formula for the upper and lower limits of the mean μ with two-tailed probability 1-α depends on the LOC α and an estimated value of the number of "successes" n. In a MC analysis for planetary protection, only the upper limit is of interest, i.e., the single
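As a concrete illustration of the Garwood bound described above, the following sketch (Python with SciPy; the function name and the example values are illustrative, not taken from the abstract) computes the one-sided upper confidence limit on the Poisson mean from an observed number of hits and converts it to an upper limit on the impact probability per history.

```python
# Sketch of the Garwood (1936) chi-squared bound on an unknown Poisson mean.
# Assumes SciPy; n_hits, loc and n_histories are illustrative values.
from scipy.stats import chi2

def poisson_upper_limit(n_hits, loc):
    """One-sided upper confidence limit on the Poisson mean,
    given n_hits observed events and level of confidence `loc`."""
    return 0.5 * chi2.ppf(loc, 2 * (n_hits + 1))

n_hits, loc, n_histories = 3, 0.95, 1_000_000
mu_upper = poisson_upper_limit(n_hits, loc)   # upper limit on the expected number of hits
p_impact_upper = mu_upper / n_histories       # corresponding upper limit on P_I per history
print(mu_upper, p_impact_upper)
```

The corresponding two-sided Garwood interval uses 0.5·χ²(α/2; 2n) and 0.5·χ²(1-α/2; 2(n+1)) as lower and upper limits, where χ²(p; df) denotes the p-quantile of the chi-squared distribution.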
Post-Classical Probability Theory
Barnum, Howard
2012-01-01
This paper offers a brief introduction to the framework of "general probabilistic theories", otherwise known as the "convex-operational" approach to the foundations of quantum mechanics. Broadly speaking, the goal of research in this vein is to locate quantum mechanics within a very much more general, but conceptually very straightforward, generalization of classical probability theory. The hope is that, by viewing quantum mechanics "from the outside", we may be able better to understand it. We illustrate several respects in which this has proved to be the case, reviewing work on cloning and broadcasting, teleportation and entanglement swapping, key distribution, and ensemble steering in this general framework. We also discuss a recent derivation of the Jordan-algebraic structure of finite-dimensional quantum theory from operationally reasonable postulates.
Associativity and normative credal probability.
Snow, P
2002-01-01
Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
The Inductive Applications of Probability Calculus
Corrado Gini
2015-06-01
Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions that arise when probability distributions are used to investigate the causes of events.
How Life History Can Sway the Fixation Probability of Mutants.
Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne
2016-07-01
In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected.
Probability of an Error in Estimation of States of a Modulated Synchronous Flow of Physical Events
Gortsev, A. M.; Sirotina, M. N.
2016-11-01
A flow of physical events (photons, electrons, etc.) is considered. One of the mathematical models of such flows is a modulated synchronous doubly stochastic flow of events. Analytical results are presented for the conditional and unconditional probabilities of an erroneous decision in optimal estimation of the flow states under the maximum a posteriori probability criterion.
Evidence for Truncated Exponential Probability Distribution of Earthquake Slip
Thingbaijam, Kiran K. S.
2016-07-13
Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
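For orientation, a truncated exponential law of the kind described can be written (in illustrative notation; the paper's parametrization may differ) as

\[
f(s) \;=\; \frac{\exp(-s/s_c)}{s_c \left(1 - \exp(-s_{\max}/s_c)\right)}, \qquad 0 \le s \le s_{\max},
\]

where $s$ is the coseismic slip, $s_c$ is a scale parameter, and $s_{\max}$ is the physically constrained maximum slip; the denominator restores unit total probability after truncation.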
Fluctuating States: What is the Probability of a Thermodynamical Transition?
Alhambra, Álvaro M.; Oppenheim, Jonathan; Perry, Christopher
2016-10-01
If the second law of thermodynamics forbids a transition from one state to another, then it is still possible to make the transition happen by using a sufficient amount of work. But if we do not have access to this amount of work, can the transition happen probabilistically? In the thermodynamic limit, this probability tends to zero, but here we find that for finite-sized and quantum systems it can be finite. We compute the maximum probability of a transition or a thermodynamical fluctuation from any initial state to any final state and show that this maximum can be achieved for any final state that is block diagonal in the energy eigenbasis. We also find upper and lower bounds on this transition probability, in terms of the work of transition. As a by-product, we introduce a finite set of thermodynamical monotones related to the thermomajorization criteria which governs state transitions and compute the work of transition in terms of them. The trade-off between the probability of a transition and any partial work added to aid in that transition is also considered. Our results have applications in entanglement theory, and we find the amount of entanglement required (or gained) when transforming one pure entangled state into any other.
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Statistical convergence of order $\\alpha$ in probability
Pratulananda Das; Sanjoy Ghosal; Sumit Som
2016-01-01
In this paper, different types of convergence of a sequence of random variables in probability, namely statistical convergence of order $\alpha$ in probability, strong $p$-Cesàro summability of order $\alpha$ in probability, lacunary statistical convergence or $S_{\theta}$-convergence of order $\alpha$ in probability, and $N_{\theta}$-convergence of order $\alpha$ in probability, are introduced and certain of their basic properties are studied.
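One standard form of the first of these notions (stated here for orientation; the paper's exact formulation may differ in detail) is: a sequence of random variables $(X_k)$ converges statistically of order $\alpha$ ($0 < \alpha \le 1$) in probability to $X$ if, for every $\varepsilon, \delta > 0$,

\[
\lim_{n\to\infty} \frac{1}{n^{\alpha}}\,\bigl|\{\, k \le n : P(|X_k - X| \ge \varepsilon) \ge \delta \,\}\bigr| \;=\; 0,
\]

where $|\cdot|$ denotes the number of elements of the enclosed set.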
Probability of rupture of multiple fault segments
Andrews, D.J.; Schwerer, E.
2000-01-01
Fault segments identified from geologic and historic evidence have sometimes been adopted as features limiting the likely extents of earthquake ruptures. There is no doubt that individual segments can sometimes join together to produce larger earthquakes. This work is a trial of an objective method to determine the probability of multisegment ruptures. The frequency of occurrence of events on all conjectured combinations of adjacent segments in northern California is found by fitting to both geologic slip rates and an assumed distribution of event sizes for the region as a whole. Uncertainty in the shape of the distribution near the maximum magnitude has a large effect on the solution. Frequencies of individual events cannot be determined, but it is possible to find a set of frequencies that fits a model closely. A robust conclusion for the San Francisco Bay region is that large multisegment events occur on the San Andreas and San Gregorio faults, but single-segment events predominate on the extended Hayward and Calaveras strands of segments.
Probability seismic hazard maps of Southern Thailand
Chinda Sutiwanich
2012-09-01
Full Text Available Seismic hazard maps of southern Thailand were obtained from the integration of crustal fault, areal and subduction source models using probability seismic hazard analysis and the application of a logic tree approach. The hazard maps show the mean peak ground and spectral accelerations at 0.2, 0.3 and 1.0 second periods with a 10%, 5%, 2% and 0.5% probability of exceedance in 50-year hazard levels. The highest hazard areas were revealed to be in the Muang, Phanom, and Viphavadi districts of Surat Thani province, the Thap Put district of Phang Nga province, and the Plai Phraya district of Krabi province. The lowest hazard areas are in the southernmost part of Thailand, e.g. Yala, Pattani and Narathiwat provinces. The maximum values of the mean peak ground acceleration for the 475-9,975 yr return period are 0.28-0.52 g and the maximum spectral accelerations at 0.2 seconds for the same return period are 0.52-0.80 g. Similar hazard is also obtained for different return periods. The presented seismic hazard maps are useful as a guideline for the future design of buildings, bridges or dams, and for rock sites to resist earthquake forces.
How long do centenarians survive? Life expectancy and maximum lifespan.
Modig, K; Andersson, T; Vaupel, J; Rau, R; Ahlbom, A
2017-08-01
The purpose of this study was to explore the pattern of mortality above the age of 100 years. In particular, we aimed to examine whether Scandinavian data support the theory that mortality reaches a plateau at particularly old ages. Whether the maximum length of life increases with time was also investigated. The analyses were based on individual level data on all Swedish and Danish centenarians born from 1870 to 1901; in total 3006 men and 10 963 women were included. Birth cohort-specific probabilities of dying were calculated. Exact ages were used for calculations of maximum length of life. Whether maximum age changed over time was analysed taking into account increases in cohort size. The results confirm that there has not been any improvement in mortality amongst centenarians in the past 30 years and that the current rise in life expectancy is driven by reductions in mortality below the age of 100 years. The death risks seem to reach a plateau of around 50% at the age 103 years for men and 107 years for women. Despite the rising life expectancy, the maximum age does not appear to increase, in particular after accounting for the increasing number of individuals of advanced age. Mortality amongst centenarians is not changing despite improvements at younger ages. An extension of the maximum lifespan and a sizeable extension of life expectancy both require reductions in mortality above the age of 100 years. © 2017 The Association for the Publication of the Journal of Internal Medicine.
An improved probability mapping approach to assess genome mosaicism
Gogarten J Peter
2003-09-01
Full Text Available Abstract Background Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling, four-taxon analyses suffer from sensitivity to the long-branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows further analyses to focus on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion Posterior probability is a non-conservative measure of support, and posterior probability mapping only provides a quick estimation of the phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events.
Fusion probability in heavy nuclei
Banerjee, Tathagata; Nath, S.; Pal, Santanu
2015-03-01
Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, Z_p Z_t), as well as with the fissility of the CN, χ_CN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections
Lamb, Jennifer Y.; Waddle, J. Hardin; Qualls, Carl P.
2017-01-01
Large gaps exist in our knowledge of the ecology of stream-breeding plethodontid salamanders in the Gulf Coastal Plain. Data describing where these salamanders are likely to occur along environmental gradients, as well as their likelihood of detection, are important for the prevention and management of amphibian declines. We used presence/absence data from leaf litter bag surveys and a hierarchical Bayesian multispecies single-season occupancy model to estimate the occurrence of five species of plethodontids across reaches in headwater streams in the Gulf Coastal Plain. Average detection probabilities were high (range = 0.432–0.942) and unaffected by sampling covariates specific to the use of litter bags (i.e., bag submergence, sampling season, in-stream cover). Estimates of occurrence probabilities differed substantially between species (range = 0.092–0.703) and were influenced by the size of the upstream drainage area and by the maximum proportion of the reach that dried. The effects of these two factors were not equivalent across species. Our results demonstrate that hierarchical multispecies models successfully estimate occurrence parameters for both rare and common stream-breeding plethodontids. The resulting models clarify how species are distributed within stream networks, and they provide baseline values that will be useful in evaluating the conservation statuses of plethodontid species within lotic systems in the Gulf Coastal Plain.
Probable Linezolid-Induced Pancytopenia
Nita Lakhani
2005-01-01
Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5×10^12/L, leukocytes 2.9×10^9/L, platelets 59×10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.
The Black Hole Formation Probability
Clausen, Drew; Ott, Christian D
2014-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...
Avoiding Negative Probabilities in Quantum Mechanics
Nyambuya, Golden Gadzirayi
2013-01-01
As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0<α<1; w(0)=0, w(1/e)=1/e, w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
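A minimal numerical sketch of the weighting function as reconstructed above (Python/NumPy; the value of the exponent alpha is purely illustrative):

```python
# Prelec's (1998) one-parameter probability weighting function
# w(p) = exp(-(-ln p)**alpha); the alpha value here is illustrative only.
import numpy as np

def prelec_weight(p, alpha=0.65):
    """Prelec weighting: w(0+)=0, w(1/e)=1/e, w(1)=1 for any alpha in (0, 1)."""
    p = np.asarray(p, dtype=float)
    return np.exp(-(-np.log(p)) ** alpha)

probs = np.array([0.01, 1 / np.e, 0.5, 0.9, 1.0])
print(prelec_weight(probs))   # note that w(1/e) equals 1/e regardless of alpha
```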
Measures, Probability and Holography in Cosmology
Phillips, Daniel
This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40 ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3 ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We
Detection probabilities for time-domain velocity estimation
Jensen, Jørgen Arendt
1991-01-01
Estimation of blood velocities by time-domain cross-correlation of successive high frequency sampled ultrasound signals is investigated. It is shown that any velocity can result from the estimator regardless of the true velocity due to the nonlinear technique employed. Using a simple simulation...... as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the cross-correlation estimated. This makes it possible to assess...... the reliability of the velocity estimate in real time...
Receiver function estimated by maximum entropy deconvolution
吴庆举; 田小波; 张乃铃; 李卫平; 曾融生
2003-01-01
Maximum entropy deconvolution is presented to estimate the receiver function, with maximum entropy as the rule for determining the auto-correlation and cross-correlation functions. The Toeplitz equation and the Levinson algorithm are used to calculate the iterative formula of the error-predicting filter, and the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps the maximum entropy deconvolution stable. The maximum-entropy treatment of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method to measure receiver functions in the time domain.
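The error-predicting (deconvolution) filter referred to above can be obtained by solving the Toeplitz normal equations built from the autocorrelation; a minimal sketch follows (Python/SciPy). The trace, filter order, and helper name are illustrative assumptions, and SciPy's Toeplitz solver plays the role of the Levinson recursion here.

```python
# Sketch: prediction-error filter from the autocorrelation via the Toeplitz
# normal equations (scipy.linalg.solve_toeplitz uses Levinson-Durbin internally).
# The signal `z` and the filter order are illustrative, not from the paper.
import numpy as np
from scipy.linalg import solve_toeplitz

def prediction_error_filter(z, order):
    """Return [1, -a_1, ..., -a_order]: the error-predicting filter of `z`."""
    n = len(z)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(z[:n - k], z[k:]) / n for k in range(order + 1)])
    # Solve the symmetric Toeplitz system R a = r[1:] for the prediction coefficients
    a = solve_toeplitz(r[:order], r[1:order + 1])
    return np.concatenate(([1.0], -a))

rng = np.random.default_rng(0)
z = rng.standard_normal(1024)                 # stand-in for a seismogram trace
pef = prediction_error_filter(z, order=20)
whitened = np.convolve(z, pef, mode="same")   # deconvolved (whitened) trace
```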
Fluoridation: strategies for success.
Isman, R
1981-07-01
Of 19 referenda on community water fluoridation held in the first six months of 1980, 17 were defeated. Among the postulated reasons are a growing distrust of government and the health establishment. The public remains largely ignorant of the purpose and benefits of fluoridation. The emotionalism surrounding the issue has made it difficult to generate public support outside of the health professions. Opponents have also learned to fight fluoridation with increasingly sophisticated techniques. Some of the strategies used in recent successful campaigns in Oakland, California, and Portland, Oregon are described; recommendations that can be applied to communities considering fluoridation include careful wording of ballot measures so they are unequivocally clear and simple; timing ballot measures with elections likely to draw the largest voter turnout; broadening the base of political and financial support; using a figurehead if possible; and making maximum use of the media.
Methods for fitting a parametric probability distribution to most probable number data.
Williams, Michael S; Ebel, Eric D
2012-07-01
Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
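A minimal sketch of the first approach, taking the real-valued MPN concentration estimates directly as data for a maximum likelihood fit (Python/SciPy); the lognormal choice and the example values are assumptions for illustration, not specifics from the abstract.

```python
# Maximum-likelihood fit of a parametric distribution to MPN concentration estimates.
# The lognormal form and the data values are illustrative assumptions.
import numpy as np
from scipy import stats

mpn_estimates = np.array([0.3, 0.4, 0.9, 1.1, 2.3, 4.6, 7.5, 11.0])   # hypothetical MPN/mL values
shape, loc, scale = stats.lognorm.fit(mpn_estimates, floc=0)           # MLE with location fixed at 0
mu, sigma = np.log(scale), shape                                       # parameters of ln(concentration)
print(f"ln-mean = {mu:.3f}, ln-sd = {sigma:.3f}")
```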
Maximum Power from a Solar Panel
Michael Miller
2010-01-01
Full Text Available Solar energy has become a promising alternative to conventional fossil fuel sources. Solar panels are used to collect solar radiation and convert it into electricity. One of the techniques used to maximize the effectiveness of this energy alternative is to maximize the power output of the solar collector. In this project the maximum power is calculated by determining the voltage and the current of maximum power. These quantities are determined by finding the maximum value for the equation for power using differentiation. After the maximum values are found for each time of day, each individual quantity, voltage of maximum power, current of maximum power, and maximum power is plotted as a function of the time of day.
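As a rough illustration of the differentiation step described above, the sketch below (Python/SciPy) maximizes P(V) = V·I(V) for a single-diode cell model; the model form and all parameter values are assumptions made for illustration, not data from the article.

```python
# Hypothetical single-diode model: maximize P(V) = V * I(V).
# All parameter values are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

I_SC, I_0, N, V_T = 5.0, 1e-9, 1.3, 0.02585   # short-circuit current, saturation current, ideality, thermal voltage

def current(v):
    return I_SC - I_0 * (np.exp(v / (N * V_T)) - 1.0)

def neg_power(v):
    return -v * current(v)

res = minimize_scalar(neg_power, bounds=(0.0, 0.8), method="bounded")
v_mp = res.x                      # voltage of maximum power
i_mp = current(v_mp)              # current of maximum power
print(v_mp, i_mp, v_mp * i_mp)    # maximum power point
```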
Brain MR image segmentation improved algorithm based on probability
Liao, Hengxu; Liu, Gang; Guo, Xiantang
2017-08-01
Local weighted voting is a current mainstream segmentation algorithm. It takes full account of the influence of the image likelihood and of the prior probabilities of the labels on the segmentation result. The method can still be improved, however, because in essence it simply selects the label with the maximum probability: a label probability of 70% may be acceptable mathematically but can still be wrong in the actual segmentation. We therefore use a matrix completion algorithm as a supplement. When the probability produced by local weighted voting is larger, its result is adopted; when the probability produced by matrix completion is larger, that result is adopted instead. This is equivalent to adding an automatic algorithm selection switch that can theoretically ensure that the accuracy of the proposed algorithm is superior to the local weighted voting algorithm. At the same time, we propose an improved matrix completion algorithm based on an enumeration method. In addition, this paper also uses a multi-parameter registration model to reduce the influence of registration on the segmentation. The experimental results show that the accuracy of the algorithm is better than that of common segmentation algorithms.
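The selection switch described above can be expressed very simply; the sketch below (Python/NumPy, with hypothetical per-voxel probability maps p_lwv and p_mc and label maps lab_lwv and lab_mc) picks, at each voxel, the label whose algorithm reports the higher confidence.

```python
# Per-voxel switch between two segmentation results, keyed on which algorithm
# assigns the higher maximum label probability. All array names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
shape = (4, 4)                                                              # stand-in for an image volume
p_lwv, lab_lwv = rng.uniform(0.4, 1.0, shape), rng.integers(0, 3, shape)   # local weighted voting
p_mc,  lab_mc  = rng.uniform(0.4, 1.0, shape), rng.integers(0, 3, shape)   # matrix completion

labels = np.where(p_lwv >= p_mc, lab_lwv, lab_mc)   # adopt the more confident label
print(labels)
```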
Predicting Maximum Sunspot Number in Solar Cycle 24
Nipa J Bhatt; Rajmal Jain; Malini Aggarwal
2009-03-01
A few prediction methods have been developed based on the precursor technique which is found to be successful for forecasting the solar activity. Considering the geomagnetic activity aa indices during the descending phase of the preceding solar cycle as the precursor, we predict the maximum amplitude of annual mean sunspot number in cycle 24 to be 111 ± 21. This suggests that the maximum amplitude of the upcoming cycle 24 will be less than cycles 21–22. Further, we have estimated the annual mean geomagnetic activity aa index for the solar maximum year in cycle 24 to be 20.6 ± 4.7 and the average of the annual mean sunspot number during the descending phase of cycle 24 is estimated to be 48 ± 16.8.
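Precursor methods of this kind are often implemented as a regression of the next cycle's maximum sunspot number on the descending-phase aa index; the toy sketch below (Python/NumPy) assumes a linear relation and uses invented numbers purely for illustration, not values from the paper.

```python
# Toy precursor regression: next-cycle maximum sunspot number vs. the preceding
# descending-phase aa index. All data values are invented for illustration.
import numpy as np

aa_descending = np.array([15.0, 18.2, 20.1, 22.4, 24.9])   # hypothetical precursor values
r_max         = np.array([105., 120., 135., 150., 165.])   # hypothetical cycle maxima

slope, intercept = np.polyfit(aa_descending, r_max, 1)
predicted_max = slope * 19.0 + intercept     # prediction for an assumed aa value of 19.0
print(predicted_max)
```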
Resource-constrained maximum network throughput on space networks
Yanling Xing; Ning Ge; Youzheng Wang
2015-01-01
This paper investigates the maximum network throughput for resource-constrained space networks based on the delay- and disruption-tolerant networking (DTN) architecture. Specifically, this paper proposes a methodology for calculating the maximum network throughput of multiple transmission tasks under storage and delay constraints over a space network. A mixed-integer linear program (MILP) is formulated to solve this problem. Simulation results show that the proposed methodology can successfully calculate the optimal throughput of a space network under storage and delay constraints, as well as a clear, monotonic relationship between end-to-end delay and the maximum network throughput under storage constraints. At the same time, the optimization results shed light on routing and transport protocol design in space communication, which can be used to obtain the optimal network throughput.
Gaussian maximum likelihood and contextual classification algorithms for multicrop classification
Di Zenzo, Silvano; Bernstein, Ralph; Kolsky, Harwood G.; Degloria, Stephen D.
1987-01-01
The paper reviews some of the ways in which context has been handled in the remote-sensing literature, and additional possibilities are introduced. The problem of computing exhaustive and normalized class-membership probabilities from the likelihoods provided by the Gaussian maximum likelihood classifier (to be used as initial probability estimates to start relaxation) is discussed. An efficient implementation of probabilistic relaxation is proposed, suiting the needs of actual remote-sensing applications. A modified fuzzy-relaxation algorithm using generalized operations between fuzzy sets is presented. Combined use of the two relaxation algorithms is proposed to exploit context in multispectral classification of remotely sensed data. Results on both one artificially created image and one MSS data set are reported.
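The step of turning Gaussian maximum-likelihood outputs into exhaustive, normalized class-membership probabilities (the initial estimates used to start relaxation) can be sketched as below (Python/SciPy); the class statistics and the pixel vector are invented for illustration.

```python
# Normalized class-membership probabilities from Gaussian class log-likelihoods,
# assuming equal priors. Class means, covariances and the pixel are illustrative.
import numpy as np
from scipy.stats import multivariate_normal

means = [np.array([30.0, 40.0]), np.array([60.0, 55.0]), np.array([90.0, 70.0])]
covs  = [np.eye(2) * 25.0, np.eye(2) * 30.0, np.eye(2) * 20.0]
pixel = np.array([58.0, 52.0])                  # one multispectral pixel

loglik = np.array([multivariate_normal.logpdf(pixel, m, c) for m, c in zip(means, covs)])
loglik -= loglik.max()                          # stabilize before exponentiating
probs = np.exp(loglik) / np.exp(loglik).sum()   # normalized membership probabilities
print(probs)
```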
Maximum entropy reconstruction of spin densities involving non uniform prior
Schweizer, J.; Ressouche, E. [DRFMC/SPSMS/MDN CEA-Grenoble (France); Papoular, R.J. [CEA-Saclay, Gif sur Yvette (France). Lab. Leon Brillouin; Tasset, F. [Inst. Laue Langevin, Grenoble (France); Zheludev, A.I. [Brookhaven National Lab., Upton, NY (United States). Physics Dept.
1997-09-01
Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one which has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data, as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases, before the measurements are performed, some knowledge exists about the distribution which is investigated. It can range from the simple information of the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the maximum entropy formalism through a model m(r), via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing.
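The entropy relative to a non-uniform model m(r) referred to above is commonly written in the Skilling form (quoted here for orientation; the paper's exact convention may differ):

\[
S[\rho; m] \;=\; \int \Bigl[\, \rho(\vec r) - m(\vec r) - \rho(\vec r)\,\ln\frac{\rho(\vec r)}{m(\vec r)} \,\Bigr]\, d^3 r,
\]

which is maximized, in the absence of data, by $\rho(\vec r) = m(\vec r)$ and reduces to the flat-prior (Boltzmann) form when $m$ is constant.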
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Conditional probability modulates visual search efficiency.
Cort, Bryan; Anderson, Britt
2013-01-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability-the likelihood of a particular color given a particular combination of two cues-varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Conditional Probability Modulates Visual Search Efficiency
Bryan Cort
2013-10-01
Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Peters, B. C., Jr.; Walker, H. F.
1976-01-01
The problem of obtaining numerically maximum likelihood estimates of the parameters for a mixture of normal distributions is addressed. In recent literature, a certain successive approximations procedure, based on the likelihood equations, is shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. With probability 1 as the sample size grows large, it is shown that this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.
Peters, B. C., Jr.; Walker, H. F.
1978-01-01
This paper addresses the problem of obtaining numerically maximum-likelihood estimates of the parameters for a mixture of normal distributions. In recent literature, a certain successive-approximations procedure, based on the likelihood equations, was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, we introduce a general iterative procedure, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. We show that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. We also show that the step-size which yields optimal local convergence rates for large samples is determined in a sense by the 'separation' of the component normal densities and is bounded below by a number between 1 and 2.
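A minimal sketch of a step-size-relaxed successive-approximations iteration of this kind, written for a two-component univariate normal mixture (Python/NumPy); the data, the starting values, and the step-size are illustrative, and the update shown is the familiar EM-style fixed point rather than the authors' exact formulation.

```python
# Step-size-relaxed fixed-point (EM-type) iteration for a two-component
# univariate normal mixture. Data, initial guesses, and omega are illustrative.
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.5, 200)])

w, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
omega = 1.0          # step-size; 1.0 recovers the plain successive-approximation step

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(200):
    # Responsibilities of each component for each observation
    dens = np.stack([w[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)
    # Usual likelihood-equation fixed point
    w_new = resp.mean(axis=1)
    mu_new = (resp * x).sum(axis=1) / resp.sum(axis=1)
    sigma_new = np.sqrt((resp * (x - mu_new[:, None]) ** 2).sum(axis=1) / resp.sum(axis=1))
    # Deflected-gradient style relaxation with step-size omega
    w, mu, sigma = (1 - omega) * w + omega * w_new, \
                   (1 - omega) * mu + omega * mu_new, \
                   (1 - omega) * sigma + omega * sigma_new

print(w, mu, sigma)
```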
Smoothed log-concave maximum likelihood estimation with applications
Chen, Yining
2011-01-01
We study the smoothed log-concave maximum likelihood estimator of a probability distribution on $\\mathbb{R}^d$. This is a fully automatic nonparametric density estimator, obtained as a canonical smoothing of the log-concave maximum likelihood estimator. We demonstrate its attractive features both through an analysis of its theoretical properties and a simulation study. Moreover, we show how the estimator can be used as an intermediate stage of more involved procedures, such as constructing a classifier or estimating a functional of the density. Here again, the use of the estimator can be justified both on theoretical grounds and through its finite sample performance, and we illustrate its use in a breast cancer diagnosis (classification) problem.
A Maximum Entropy Modelling of the Rain Drop Size Distribution
Francisco J. Tapiador
2011-01-01
Full Text Available This paper presents a maximum entropy approach to Rain Drop Size Distribution (RDSD) modelling. It is shown that this approach allows (1) a physically consistent rationale to be used to select a particular probability density function (pdf), (2) an alternative method for parameter estimation based on expectations of the population instead of sample moments, and (3) a progressive method of modelling in which the pdf is updated as new empirical information becomes available. The method is illustrated with both synthetic and real RDSD data, the latter coming from a laser disdrometer network specifically designed to measure the spatial variability of the RDSD.
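For orientation, the generic maximum entropy solution under $n$ moment-type constraints (not the paper's specific parametrization) takes the exponential-family form

\[
f(D) \;=\; \exp\!\Bigl(-\lambda_0 - \sum_{i=1}^{n} \lambda_i\, g_i(D)\Bigr),
\]

where the $g_i$ are the constrained functions of the drop diameter $D$ and the multipliers $\lambda_i$ are fixed by requiring the expectations $\int g_i(D)\, f(D)\, dD$ to equal their prescribed values.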
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
47 CFR 1.1623 - Probability calculation.
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623... Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...... of subjective probabilities in subjects with certain Non-Expected Utility preference representations that satisfy weak conditions that we identify....
Inferring Beliefs as Subjectively Imprecise Probabilities
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
2012-01-01
We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e...... probabilities are indeed best characterized as probability distributions with non-zero variance....
Scoring Rules for Subjective Probability Distributions
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;
report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...
The trajectory of the target probability effect.
Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B
2013-05-01
The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.
Cluster formation probability in the trans-tin and trans-lead nuclei
Santhosh, K.P. [School of Pure and Applied Physics, Kannur University, Payyanur Campus, Payyanur 670 327 (India)], E-mail: drkpsanthosh@gmail.com; Biju, R.K.; Sahadevan, Sabina [P.G. Department of Physics and Research Centre, Payyanur College, Payyanur 670 327 (India)
2010-07-01
Within our fission model, the Coulomb and proximity potential model (CPPM), cluster formation probabilities are calculated for different clusters ranging from carbon to silicon for parents in the trans-tin and trans-lead regions. It is found that in the trans-tin region the ^{12}C, ^{16}O, ^{20}Ne and ^{24}Mg clusters have the maximum cluster formation probability and the lowest half-lives as compared to other clusters. In the trans-lead region the ^{14}C, ^{18,20}O, ^{23}F, ^{24,26}Ne, ^{28,30}Mg and ^{34}Si clusters have the maximum cluster formation probability and minimum half-life, which shows that alpha-like clusters are most probable for emission from the trans-tin region while non-alpha clusters are probable from the trans-lead region. These results stress the role of neutron-proton symmetry and asymmetry of the daughter nuclei in these two cases.
Cluster formation probability in the trans-tin and trans-lead nuclei
Santhosh, K P; Sahadevan, Sabina; 10.1016/j.nuclphysa.2010.03.004
2010-01-01
Within our fission model, the Coulomb and proximity potential model (CPPM), cluster formation probabilities are calculated for different clusters ranging from carbon to silicon for parents in the trans-tin and trans-lead regions. It is found that in the trans-tin region the ^{12}C, ^{16}O, ^{20}Ne and ^{24}Mg clusters have the maximum cluster formation probability and the lowest half-lives as compared to other clusters. In the trans-lead region the ^{14}C, ^{18,20}O, ^{23}F, ^{24,26}Ne, ^{28,30}Mg and ^{34}Si clusters have the maximum cluster formation probability and minimum half-life, which shows that alpha-like clusters are most probable for emission from the trans-tin region while non-alpha clusters are probable from the trans-lead region. These results stress the role of neutron-proton symmetry and asymmetry of the daughter nuclei in these two cases.
Failure-probability driven dose painting
Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Berthelsen, Anne K. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Bentzen, Søren M. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Departments of Human Oncology and Medical Physics, University of Wisconsin, Madison, Wisconsin 53792 (United States)
2013-08-15
Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: For patients treated at our center, five tumor subvolumes are delineated, extending from the center of the tumor (the PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose-response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.
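As an illustration of the optimization step described above, the following Python sketch maximizes the product of per-compartment tumor control probabilities over five subvolume doses at constant mean dose, with an 85 Gy cap. The logistic dose-response form, the D50 values, the slope, and the subvolume sizes are hypothetical stand-ins, not the study's fitted five-compartment model.

```python
# Minimal sketch (hypothetical parameters, not the study's fitted model): optimize the
# dose to five tumor subvolumes so that the product of per-compartment tumor control
# probabilities (TCP) is maximized at constant mean dose, capped at 85 Gy.
import numpy as np
from scipy.optimize import minimize

vol   = np.array([0.05, 0.10, 0.15, 0.30, 0.40])   # relative subvolume sizes (assumed)
d50   = np.array([78.0, 74.0, 70.0, 55.0, 45.0])   # dose for 50% control, central to elective (assumed)
gamma = 2.0                                         # dose-response slope parameter (assumed)
d_ref = np.full(5, 66.0)                            # uniform reference prescription (assumed)

def tcp(d):
    # Logistic dose-response per compartment; total TCP is the product.
    per_compartment = 1.0 / (1.0 + np.exp(4.0 * gamma * (1.0 - d / d50)))
    return per_compartment.prod()

cons = ({"type": "eq", "fun": lambda d: vol @ d - vol @ d_ref},)   # constant mean dose
res = minimize(lambda d: -tcp(d), d_ref, bounds=[(40.0, 85.0)] * 5, constraints=cons)
print("optimized doses [Gy]:", np.round(res.x, 1), " TCP:", round(tcp(res.x), 3))
```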
Successive quadratic programming multiuser detector
Mu Xuewen; Zhang Yaling; Liu Sanyang
2007-01-01
Based on the semidefinite programming relaxation of the CDMA maximum likelihood multiuser detection problem, a detection strategy using the successive quadratic programming algorithm is presented. Coupled with the randomized cut generation scheme, a suboptimal solution of the multiuser detection problem is obtained. Compared to the previously reported interior point methods based on semidefinite programming, simulations demonstrate that the successive quadratic programming algorithm often yields similar BER performance for the multiuser detection problem, while the average CPU time of this approach is significantly reduced.
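For orientation, the sketch below spells out the exact maximum-likelihood detection problem that the semidefinite and successive quadratic programming relaxations approximate, solved by brute force for a small synthetic system. It is not the paper's algorithm, and the spreading codes and noise level are arbitrary.

```python
# Minimal sketch (not the paper's algorithm): the exact maximum-likelihood CDMA
# multiuser detection problem, solved by exhaustive search for a few users.
import itertools
import numpy as np

rng = np.random.default_rng(1)
K, N = 4, 16                                              # users, spreading length (assumed)
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)     # spreading codes
b_true = rng.choice([-1.0, 1.0], size=K)                  # transmitted bits
y = S @ b_true + 0.3 * rng.standard_normal(N)             # received chip-rate signal

# Exhaustive ML search: minimize ||y - S b||^2 over b in {-1, +1}^K.
candidates = np.array(list(itertools.product([-1.0, 1.0], repeat=K)))
errors = ((y[:, None] - S @ candidates.T) ** 2).sum(axis=0)
b_ml = candidates[errors.argmin()]
print("true bits:", b_true, " ML estimate:", b_ml)
```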
Philip D. GINGERICH
2010-01-01
The late Professor Minchen Chow started his career in vertebrate paleontology with a team of scholars working to clarify the biostratigraphy and paleogeography of Paleocene-Eocene change in mammalian faunas. The Paleocene-Eocene transition is the time of first appearance of artiodactyls, perissodactyls, and primates (APP taxa) in the fossil record. Recognition of the Paleocene as an epoch separate from the Eocene started to be accepted in 1911 following discovery of a new Clarkforkian latest-Paleocene mammalian fauna at Polecat Bench in western North America. Later the Paleocene-Eocene boundary was shown to include an interval with dwarfed mammalian lineages. A Paleocene-Eocene carbon isotope excursion (CIE) was discovered coincident with both mammalian dwarfing and the first appearance of APP taxa. This enabled global generalization of the CIE, which was linked in turn to the Paleocene-Eocene thermal maximum (PETM). The PETM is a global greenhouse warming event that had transient effects on the earth's climate and biota, but also profound lasting effects on the biota. Much of what we know about mammals in relation to the CIE and PETM we have learned through high-resolution study of the exceptional stratigraphic record flanking Polecat Bench, where Professor Chow worked early in his career and where his ashes now lie.
Probable flood predictions in ungauged coastal basins of El Salvador
Friedel, M.J.; Smith, M.E.; Chica, A.M.E.; Litke, D.
2008-01-01
A regionalization procedure is presented and used to predict probable flooding in four ungauged coastal river basins of El Salvador: Paz, Jiboa, Grande de San Miguel, and Goascoran. The flood-prediction problem is sequentially solved for two regions: upstream mountains and downstream alluvial plains. In the upstream mountains, a set of rainfall-runoff parameter values and recurrent peak-flow discharge hydrographs are simultaneously estimated for 20 tributary-basin models. Application of dissimilarity equations among tributary basins (soft prior information) permitted development of a parsimonious parameter structure subject to information content in the recurrent peak-flow discharge values derived using regression equations based on measurements recorded outside the ungauged study basins. The estimated joint set of parameter values formed the basis from which probable minimum and maximum peak-flow discharge limits were then estimated, revealing that prediction uncertainty increases with basin size. In the downstream alluvial plain, model application of the estimated minimum and maximum peak-flow hydrographs facilitated simulation of probable 100-year flood-flow depths in confined canyons and across unconfined coastal alluvial plains. The regionalization procedure provides a tool for hydrologic risk assessment and flood protection planning that is not restricted to the case presented herein. © 2008 ASCE.
The inverse maximum dynamic flow problem
Bagherian, Mehri
2010-01-01
We consider the inverse maximum dynamic flow (IMDF) problem. The IMDF problem can be described as follows: how to change the capacity vector of a dynamic network as little as possible so that a given feasible dynamic flow becomes a maximum dynamic flow. After discussing some characteristics of this problem, it is converted to a constrained minimum dynamic cut problem. An efficient algorithm which uses two maximum dynamic flow algorithms is then proposed to solve the problem.
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Juliana Bueno-Soler
2016-09-01
Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma…
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t…
Probability of Failure in Random Vibration
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
Maximum permissible voltage of YBCO coated conductors
Wen, J.; Lin, B.; Sheng, J.; Xu, J.; Jin, Z. [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Hong, Z., E-mail: zhiyong.hong@sjtu.edu.cn [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Wang, D.; Zhou, H.; Shen, X.; Shen, C. [Qingpu Power Supply Company, State Grid Shanghai Municipal Electric Power Company, Shanghai (China)
2014-06-15
Highlights: • We examine the maximum permissible voltage of three kinds of tapes. • We examine the relationship between quenching duration and maximum permissible voltage. • Continuous I_c degradation occurs under repetitive quenching when tapes reach the maximum permissible voltage. • We examine the relationship between maximum permissible voltage, resistance, and temperature. - Abstract: A superconducting fault current limiter (SFCL) can reduce short circuit currents in an electrical power system. One of the most important things in developing an SFCL is to find the maximum permissible voltage of each limiting element. The maximum permissible voltage is defined as the maximum voltage per unit length at which the YBCO coated conductors (CC) do not suffer from critical current (I_c) degradation or burnout. In this research, the duration of the quenching process is varied and the voltage is raised until I_c degradation or burnout occurs. The YBCO coated conductors tested in the experiment are from American Superconductor (AMSC) and Shanghai Jiao Tong University (SJTU). As the quenching duration increases, the maximum permissible voltage of the CC decreases. When the quenching duration is 100 ms, the maximum permissible voltages of the SJTU CC, 12 mm AMSC CC and 4 mm AMSC CC are 0.72 V/cm, 0.52 V/cm and 1.2 V/cm, respectively. Based on the results for these samples, the whole length of CC used in the design of an SFCL can be determined.
Manage Toward Success - Utilization of Analytics in Acquisition Decision Making
2015-04-01
...this research to build the inference network for evidential reasoning. Its basis is the Bayesian approach of probability and statistics, which ... views inference as belief dynamics and uses probability to quantify rational degrees of belief. Bayesian networks are directed acyclic graphs that contain ... known as the DBS Acquisition Probability of Success (DAPS) model. Keywords: defense business system, acquisition, analytics, evidential reasoning
Statistical learning of action: the role of conditional probability.
Meyer, Meredith; Baldwin, Dare
2011-12-01
Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults, namely those more successful at identifying actions that had been seen more frequently than comparison sequences, were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
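A toy illustration of the distinction at issue: joint versus conditional probability of adjacent elements in a symbolic stream. The sequence below is an arbitrary placeholder, not the action stimuli used in the experiments.

```python
# Toy illustration (not the experiments' stimuli): joint vs. conditional probability
# of adjacent elements in a symbolic "action stream".
from collections import Counter

stream = list("ABCABCXYZABCXYZXYZABC")          # hypothetical sequence of action elements
pairs = list(zip(stream, stream[1:]))

pair_counts = Counter(pairs)
first_counts = Counter(stream[:-1])
n_pairs = len(pairs)

for pair in [("A", "B"), ("C", "X")]:
    joint = pair_counts[pair] / n_pairs                       # P(x, y)
    conditional = pair_counts[pair] / first_counts[pair[0]]   # P(y | x)
    print(pair, "joint =", round(joint, 3), "conditional =", round(conditional, 3))
```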
On the computability of conditional probability
Ackerman, Nathanael L; Roy, Daniel M
2010-01-01
We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In the abstract setting, conditional probability is defined axiomatically and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of cert...
Ensuring a successful family business management succession
Desbois, Joris
2016-01-01
Succession is the biggest long-term challenge that most family businesses face. Indeed, leaders' disposition to plan for their succession is frequently the key factor determining whether their family business survives or ceases. The research seeks to find out how to manage the business management succession successfully, based on its main principles. This work project aims at researching the key points relevant to almost all family firms for achieving a viable succession transition and positioni...
Maximum-Entropy Method for Evaluating the Slope Stability of Earth Dams
Shuai Wang
2012-10-01
Full Text Available Slope stability is a very important problem in geotechnical engineering. This paper presents an approach for slope reliability analysis based on the maximum-entropy method. The key idea is to implement the maximum entropy principle in estimating the probability density function. The performance function is formulated using the Simplified Bishop's method to estimate the slope failure probability. The maximum-entropy method is used to estimate the probability density function (PDF) of the performance function subject to the moment constraints. A numerical example is calculated and compared to Monte Carlo simulation (MCS) and the Advanced First Order Second Moment Method (AFOSM). The results show the accuracy and efficiency of the proposed method. The proposed method should be valuable for performing probabilistic analyses.
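The core step, estimating a maximum-entropy density from moment constraints and reading off the failure probability, can be sketched as follows. The performance-function sample is a placeholder (in the paper it would come from the Simplified Bishop formulation), and the convex-dual minimization used here is one standard way to fit the multipliers, not necessarily the authors' implementation.

```python
# Illustrative sketch: fit a maximum-entropy density exp(-(l1*x + ... + l4*x^4))/Z to
# the first four moments of a (standardized) performance function g, then estimate
# the failure probability P(g < 0) as the tail integral.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

rng = np.random.default_rng(2)
g = rng.normal(1.5, 0.6, 20000)                 # placeholder sample of the performance function
m, s = g.mean(), g.std()
z_sample = (g - m) / s                          # standardize for numerical stability
mu = np.array([np.mean(z_sample ** k) for k in range(1, 5)])   # moment constraints
t = (0.0 - m) / s                               # failure threshold g < 0 in standardized units

def unnorm(x, lam):
    return np.exp(-sum(l * x ** (k + 1) for k, l in enumerate(lam)))

def dual(lam):
    # Convex dual of the max-entropy problem: log Z(lam) + lam . mu
    z, _ = quad(lambda x: unnorm(x, lam), -6, 6)
    return np.log(z) + lam @ mu

bounds = [(-5, 5), (-5, 5), (-5, 5), (1e-3, 5)]   # keep the quartic term positive
lam = minimize(dual, x0=[0.0, 0.5, 0.0, 0.01], method="L-BFGS-B", bounds=bounds).x
z, _ = quad(lambda x: unnorm(x, lam), -6, 6)
pf, _ = quad(lambda x: unnorm(x, lam) / z, -6, t)
print("max-entropy estimate of P(g < 0):", pf)
```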
Generalised maximum entropy and heterogeneous technologies
Oude Lansink, A.G.J.M.
1999-01-01
Generalised maximum entropy methods are used to estimate a dual model of production on panel data of Dutch cash crop farms over the period 1970-1992. The generalised maximum entropy approach allows a coherent system of input demand and output supply equations to be estimated for each farm in the sample.
20 CFR 229.48 - Family maximum.
2010-04-01
... month on one person's earnings record is limited. This limited amount is called the family maximum. The family maximum used to adjust the social security overall minimum rate is based on the employee's Overall..., when any of the persons entitled to benefits on the insured individual's compensation would, except...
The maximum rotation of a galactic disc
Bottema, R
1997-01-01
The observed stellar velocity dispersions of galactic discs show that the maximum rotation of a disc is on average 63% of the observed maximum rotation. This criterion cannot, however, be applied to small or low surface brightness (LSB) galaxies, because such systems show, in general, a continuously...
Bell Could Become the Copernicus of Probability
Khrennikov, Andrei
2016-07-01
Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
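The CHSH test mentioned at the end can be illustrated numerically: with singlet-state correlations E(a, b) = -cos(a - b), the standard angle choices exceed the classical (Kolmogorovian, local) bound of 2.

```python
# Numeric illustration of the CHSH test: singlet-state correlations E(a, b) = -cos(a - b)
# at the standard angle choices violate the classical bound |S| <= 2.
import numpy as np

def E(a, b):
    return -np.cos(a - b)

a, a_, b, b_ = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b_) + E(a_, b) + E(a_, b_)
print("CHSH |S| =", abs(S), " classical bound = 2, quantum maximum =", 2 * np.sqrt(2))
```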
Metabolic networks evolve towards states of maximum entropy production.
Unrean, Pornkamol; Srienc, Friedrich
2011-11-01
A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such a state, we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such a reduced metabolic network, metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations the specific growth rate of the strain continuously increased together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted from the state when the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles.
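A toy numerical companion to the Boltzmann-distribution claim: given hypothetical per-mode entropy production rates, Boltzmann-like weights give the mode usage probabilities and the expected entropy production of the network. The numbers and the weighting parameter are illustrative; the exact functional form used in the original work may differ.

```python
# Toy illustration (hypothetical numbers): Boltzmann-like usage probabilities over a
# small set of elementary modes, and the resulting expected entropy production rate.
import numpy as np

sigma = np.array([1.2, 0.8, 2.1, 0.3, 1.7])   # per-mode entropy production rates (assumed units)
beta = 1.0                                    # hypothetical weighting parameter

weights = np.exp(beta * sigma)
p = weights / weights.sum()                   # Boltzmann-distributed mode usage probabilities

expected_production = p @ sigma
usage_entropy = -(p * np.log(p)).sum()
print("mode usage probabilities:", np.round(p, 3))
print("expected entropy production:", round(expected_production, 3),
      " usage entropy:", round(usage_entropy, 3))
```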
Estimating the probability of rare events: addressing zero failure data.
Quigley, John; Revie, Matthew
2011-07-01
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials.
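A small numerical companion to the closing approximation: for n trials with zero observed events, 1/(2.5n) can be compared with two common zero-failure conventions (not necessarily the approaches criticized in the article), the Jeffreys posterior mean 0.5/(n + 1) and the "rule of three" 95% upper bound 3/n.

```python
# Compare the paper's approximate zero-failure estimate 1/(2.5 n) with two common
# conventions: the Jeffreys posterior mean and the rule-of-three upper bound.
for n in (10, 50, 100, 500, 1000):
    approx = 1.0 / (2.5 * n)
    jeffreys_mean = 0.5 / (n + 1)
    rule_of_three = 3.0 / n
    print(f"n={n:5d}  1/(2.5n)={approx:.5f}  Jeffreys mean={jeffreys_mean:.5f}  3/n={rule_of_three:.5f}")
```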
Duality of Maximum Entropy and Minimum Divergence
Shinto Eguchi
2014-06-01
Full Text Available We discuss a special class of generalized divergence measures by the use of generator functions. Any divergence measure in the class is separated into the difference between cross and diagonal entropy. The diagonal entropy measure in the class associates with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via minimization, for an arbitrarily given statistical model. The dualistic relationship between the maximum entropy model and the minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized as totally geodesic with respect to the linear connection associated with the divergence. A natural extension of the classical theory of the maximum likelihood method under the maximum entropy model, in terms of the Boltzmann-Gibbs-Shannon entropy, is given. We discuss the duality in detail for Tsallis entropy as a typical example.
Probability of graphs with large spectral gap by multicanonical Monte Carlo
Saito, Nen; Iba, Yukito
2011-01-01
Graphs with large spectral gap are important in various fields such as biology, sociology and computer science. In designing such graphs, an important question is how the probability of graphs with large spectral gap behaves. A method based on multicanonical Monte Carlo is introduced to quantify the behavior of this probability, which enables us to calculate extreme tails of the distribution. The proposed method is successfully applied to random 3-regular graphs and large deviation probability is estimated.
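For context, a direct-sampling sketch of the quantity in question, the spectral gap of random 3-regular graphs, is shown below using networkx and numpy. Plain sampling like this only probes the bulk of the distribution; reaching the extreme tails is precisely what the multicanonical method is for.

```python
# Minimal sketch (not the paper's multicanonical method): direct sampling of the
# adjacency spectral gap of random 3-regular graphs.
import networkx as nx
import numpy as np

gaps = []
for seed in range(200):
    g = nx.random_regular_graph(3, 50, seed=seed)
    eig = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(g)))
    gaps.append(eig[-1] - eig[-2])            # largest minus second-largest eigenvalue
gaps = np.array(gaps)
print("mean gap:", gaps.mean().round(3), " 5th/95th percentiles:",
      np.percentile(gaps, [5, 95]).round(3))
```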
Probability distribution of extreme share returns in Malaysia
Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin
2014-09-01
The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share price data obtained from Bursa Malaysia over the period 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson type III (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitting distributions to represent the weekly and monthly maximum share returns in the Malaysian stock market during the studied period, respectively.
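A minimal sketch of the workflow on simulated data: form weekly maxima of daily returns and fit one of the candidate extreme-value distributions. scipy's maximum-likelihood fit is used here instead of the study's L-moments estimation, and the GEV is fitted purely as an example; the study found the Generalized Pareto and Pearson type III to fit best for weekly and monthly maxima, respectively.

```python
# Sketch of the block-maxima workflow (simulated data, MLE instead of L-moments).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
daily_returns = 0.01 * rng.standard_t(df=4, size=5 * 252)      # placeholder heavy-tailed returns
weekly_maxima = daily_returns.reshape(-1, 5).max(axis=1)        # weekly (5-day) maxima

shape, loc, scale = stats.genextreme.fit(weekly_maxima)         # GEV as an example candidate
ks = stats.kstest(weekly_maxima, "genextreme", args=(shape, loc, scale))
print("GEV fit: shape", round(shape, 3), "loc", round(loc, 4), "scale", round(scale, 4))
print("Kolmogorov-Smirnov p-value:", round(ks.pvalue, 3))
```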
Assigning probability distributions to input parameters of performance assessment models
Mishra, Srikanta [INTERA Inc., Austin, TX (United States)
2002-02-01
This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
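As a minimal sketch of one technique listed above, the snippet below fits a continuous distribution to data by maximum likelihood and runs a goodness-of-fit check. The data are placeholders, not performance-assessment measurements, and the lognormal choice is only an example.

```python
# Minimal sketch: maximum-likelihood fit of a continuous distribution plus a
# goodness-of-fit check, on placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.lognormal(mean=-2.0, sigma=0.7, size=200)     # e.g. a positive-valued input parameter

shape, loc, scale = stats.lognorm.fit(data, floc=0)      # MLE with location fixed at zero
ks = stats.kstest(data, "lognorm", args=(shape, loc, scale))
print("fitted sigma:", round(shape, 3), " median:", round(scale, 4))
print("KS statistic:", round(ks.statistic, 3), " p-value:", round(ks.pvalue, 3))
```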
Probability to retrieve testicular spermatozoa in azoospermic patients
H.-J.Glander; L.-C.Horn; W.Dorschner; U.Paasch; J.Kratzsch
2000-01-01
Aim: The degree of probability of retrieving spermatozoa from testicular tissue for intracytoplasmic sperm injection into oocytes is of interest for counselling infertility patients. We investigated the relation of sperm retrieval to clinical data and histological pattern in testicular biopsies from azoospermic patients. Methods: In 264 testicular biopsies from 142 azoospermic patients, the testicular tissue was shredded to separate the spermatozoa, and histological semi-thin sections were evaluated using the Johnsen score. Results: The retrieval of spermatozoa correlated significantly (P ... 18 U/L, testicular volume < 5 mL, mean Johnsen score < 5, and maximum Johnsen score < 7.
Parameter space for successful soccer kicks
Cook, Brandon G.; Goff, John Eric
2006-07-01
A computational model of two important types of soccer kicks, the free kick and the corner kick, is developed with the goal of determining the success rate for each type of kick. What is meant by 'success rate' is the probability of getting an unassisted goal via a free kick and the probability of having a corner kick reach an optimum location so that a teammate's chance of scoring a goal is increased. Success rates are determined through the use of four-dimensional parameter space volumes. A one-in-ten success rate is found for the free kick while the corner-kick success rate is found to be one in four.
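A heavily simplified sketch of the parameter-space idea: Monte Carlo sample the launch conditions of a free kick and count the fraction that clears a defensive wall and finishes inside the goal mouth. The trajectory here is drag- and spin-free, so both the model and the resulting rate are illustrative only, not the paper's four-dimensional computation.

```python
# Simplified parameter-space sampling for a free kick 20 m from goal (no drag, no
# Magnus force; all ranges are assumed, so the resulting rate is illustrative only).
import numpy as np

rng = np.random.default_rng(6)
g, d_goal, d_wall = 9.81, 20.0, 9.15            # gravity [m/s^2], distances [m]
goal_h, goal_half_w, wall_h = 2.44, 3.66, 2.0   # goal height, half width, wall height [m]

n = 200_000
v = rng.uniform(15.0, 35.0, n)                  # launch speed [m/s]
theta = np.radians(rng.uniform(0.0, 30.0, n))   # elevation angle
phi = np.radians(rng.uniform(-10.0, 10.0, n))   # horizontal aim angle

vx = v * np.cos(theta) * np.cos(phi)
t_wall, t_goal = d_wall / vx, d_goal / vx
z_wall = v * np.sin(theta) * t_wall - 0.5 * g * t_wall ** 2
z_goal = v * np.sin(theta) * t_goal - 0.5 * g * t_goal ** 2
y_goal = v * np.cos(theta) * np.sin(phi) * t_goal

success = (z_wall > wall_h) & (z_goal > 0) & (z_goal < goal_h) & (np.abs(y_goal) < goal_half_w)
print("fraction of sampled kicks on target:", success.mean())
```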
Probability Judgements in Multi-Stage Problems : Experimental Evidence of Systematic Biases
Gneezy, U.
1996-01-01
We report empirical evidence that in problems of random walk with positive drift, bounded rationality leads individuals to under-estimate the probability of success in the long run. In particular, individuals who were given the stage-by-stage probability distribution failed to aggregate this information...
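A worked example of the aggregation that participants reportedly fail to perform (the numbers are illustrative, not the experiment's): if each step of the walk goes up with probability p = 0.55, the chance of being ahead of the starting point after n steps grows well beyond p itself.

```python
# Aggregating stage-wise probabilities of a drifting random walk: probability of
# being above the start after n steps, with illustrative p = 0.55.
from math import comb

def prob_above_start(n, p):
    # P(more up-steps than down-steps in n steps)
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n // 2 + 1, n + 1))

for n in (1, 10, 50, 100):
    print(f"n={n:3d}  P(ahead) = {prob_above_start(n, 0.55):.3f}")
```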
Towards a Categorical Account of Conditional Probability
Robert Furber
2015-11-01
Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL) probability wheel
Sheng-Cheng Huang
2016-01-01
Full Text Available A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant”, about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
Total variation denoising of probability measures using iterated function systems with probabilities
La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.
2017-01-01
In this paper we present a total variation denoising problem for probability measures using the set of fixed-point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.
Bayesian Probabilities and the Histories Algebra
Marlow, Thomas
2006-01-01
We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.
Non-Boolean probabilities and quantum measurement
Niestegge, Gerd
2001-08-03
A non-Boolean extension of the classical probability model is proposed. The non-Boolean probabilities reproduce typical quantum phenomena. The proposed model is more general and more abstract, but easier to interpret, than the quantum mechanical Hilbert space formalism and exhibits a particular phenomenon (state-independent conditional probabilities) which may provide new opportunities for an understanding of the quantum measurement process. Examples of the proposed model are provided, using Jordan operator algebras. (author)