WorldWideScience

Sample records for generated probability curves

  1. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  2. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  4. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...

  5. PROBABILITY MODEL OF GUNTHER GENERATOR

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is a sequence of independent, uniformly distributed {0,1} random variables. This gives the theoretical foundation for why the Gunther generator can avoid the statistical weaknesses of the output sequence of the stop-and-go generator, and the coincidence between the output sequence and the input sequences of the Gunther generator is analyzed. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.

  6. Generating artificial light curves: Revisited and updated

    CERN Document Server

    Emmanoulopoulos, D; Papadakis, I E

    2013-01-01

    The production of artificial light curves with known statistical and variability properties is of great importance in astrophysics. Consolidating the confidence levels during cross-correlation studies, understanding the artefacts induced by sampling irregularities, and establishing detection limits for future observatories are just some of the applications of simulated data sets. Currently, the widely used methodology of amplitude and phase randomisation is able to produce artificial light curves which have a given underlying power spectral density (PSD) but which are strictly Gaussian distributed. This restriction is a significant limitation, since the majority of light curves, e.g. from active galactic nuclei, X-ray binaries and gamma-ray bursts, show strong deviations from Gaussianity, exhibiting `burst-like' events that yield long-tailed probability distribution functions (PDFs). In this study we propose a simple method which is able to precisely reproduce light curves which match both the PSD an...
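
    The "amplitude and phase randomisation" step that this abstract contrasts its method against can be sketched briefly. The snippet below is a minimal Gaussian simulator in that spirit (a Timmer & König style draw), assuming an illustrative power-law PSD; it is not the authors' non-Gaussian algorithm, which additionally matches a prescribed PDF.

```python
import numpy as np

def simulate_gaussian_lightcurve(n, dt, psd_func, seed=None):
    """Draw one Gaussian light curve whose power spectrum follows psd_func.

    Standard amplitude/phase randomisation: each Fourier coefficient gets
    independent Gaussian real/imaginary parts with variance proportional
    to the target PSD.
    """
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=dt)
    amp = np.zeros(freqs.size, dtype=complex)
    sigma = np.sqrt(psd_func(freqs[1:]) / 2.0)    # skip the zero frequency
    amp[1:] = rng.normal(0, sigma) + 1j * rng.normal(0, sigma)
    if n % 2 == 0:                                # Nyquist bin must be real
        amp[-1] = amp[-1].real
    return np.fft.irfft(amp, n=n)

# Example: a red-noise PSD, S(f) ~ f^-2 (an illustrative choice).
lc = simulate_gaussian_lightcurve(1024, 1.0, lambda f: f**-2.0, seed=42)
```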

  7. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, ..., p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
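
    As a point of orientation (not a reproduction of the article's iid, normalization and trigonometric routines), the sketch below shows two standard ways of drawing a probability vector p = (p_1, ..., p_d) with NumPy; the dimension and seed are arbitrary. The second construction is the unbiased one in the sense of being uniform on the probability simplex.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # dimension of the probability vector p = (p_1, ..., p_d)

# Naive normalization of iid uniforms: a valid probability vector,
# but NOT uniform on the simplex (it concentrates near the centre).
u = rng.uniform(size=d)
p_naive = u / u.sum()

# Unbiased (uniform on the simplex): normalize iid exponentials,
# i.e. a flat Dirichlet(1, ..., 1) draw.
e = rng.exponential(size=d)
p_uniform = e / e.sum()
# Equivalently: rng.dirichlet(np.ones(d))
```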

  8. Three-generation neutrino oscillations in curved spacetime

    Science.gov (United States)

    Zhang, Yu-Hao; Li, Xue-Qian

    2016-10-01

    The three-generation MSW effect in curved spacetime is studied, and a brief discussion of the gravitational correction to the neutrino self-energy is given. The modified mixing parameters and corresponding conversion probabilities of neutrinos after traveling through celestial objects of constant density are obtained. The method to distinguish between the normal hierarchy and the inverted hierarchy is discussed in this framework. Due to the gravitational redshift of energy, in some extreme situations the resonance energy of neutrinos might be shifted noticeably, and the gravitational effect on the self-energy of the neutrino becomes significant in the vicinity of spacetime singularities.

  9. Three-generation neutrino oscillations in curved spacetime

    CERN Document Server

    Zhang, Yu-Hao

    2016-01-01

    The three-generation MSW effect in curved spacetime is studied, and a brief discussion of the gravitational correction to the neutrino self-energy is given. The modified mixing parameters and corresponding conversion probabilities of neutrinos after traveling through celestial objects of constant density are obtained. The method to distinguish between the normal hierarchy and the inverted hierarchy is discussed in this framework. Due to the gravitational redshift of energy, in some extreme situations the resonance energy of neutrinos might be shifted noticeably, and the gravitational effect on the self-energy of the neutrino becomes significant in the vicinity of spacetime singularities.

  10. Three-generation neutrino oscillations in curved spacetime

    Directory of Open Access Journals (Sweden)

    Yu-Hao Zhang

    2016-10-01

    The three-generation MSW effect in curved spacetime is studied, and a brief discussion of the gravitational correction to the neutrino self-energy is given. The modified mixing parameters and corresponding conversion probabilities of neutrinos after traveling through celestial objects of constant density are obtained. The method to distinguish between the normal hierarchy and the inverted hierarchy is discussed in this framework. Due to the gravitational redshift of energy, in some extreme situations the resonance energy of neutrinos might be shifted noticeably, and the gravitational effect on the self-energy of the neutrino becomes significant in the vicinity of spacetime singularities.

  11. Estimating probability curves of rock variables using orthogonal polynomials and sample moments

    Institute of Scientific and Technical Information of China (English)

    DENG Jian; BIAN Li

    2005-01-01

    A new algorithm using orthogonal polynomials and sample moments was presented for estimating probability curves directly from experimental or field data of rock variables. The moments estimated directly from a sample of observed values of a random variable could be conventional moments (moments about the origin or central moments) and probability-weighted moments (PWMs). Probability curves derived from orthogonal polynomials and conventional moments are probability density functions (PDF), and probability curves derived from orthogonal polynomials and PWMs are inverse cumulative distribution functions (CDF) of random variables. The proposed approach is verified by the two most commonly used theoretical standard distributions: the normal and the exponential distribution. Examples from observed data of the uniaxial compressive strength of a rock and concrete strength data are presented for illustrative purposes. The results show that probability curves of rock variables can be accurately derived from orthogonal polynomials and sample moments. Orthogonal polynomials and PWMs enable more secure inferences to be made from relatively small samples about an underlying probability curve.
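
    For readers who want to experiment, the sample probability-weighted moments mentioned in the abstract can be estimated as below; this is the standard unbiased PWM estimator, with synthetic gamma-distributed "strength" values standing in for real rock test data. The orthogonal-polynomial expansion of the paper is not reproduced here.

```python
import numpy as np

def pwm(sample, r):
    """Unbiased estimator of the probability-weighted moment
    beta_r = E[X * F(X)^r] from an ordered sample (Landwehr/Hosking form)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    j = np.arange(1, n + 1)            # ranks 1..n
    w = np.ones(n)
    for k in range(1, r + 1):          # product (j-1)...(j-r) / ((n-1)...(n-r))
        w *= (j - k) / (n - k)
    return np.mean(w * x)

# Example: beta_0 is the sample mean; beta_1, beta_2 feed L-moment type fits.
data = np.random.default_rng(1).gamma(shape=2.0, scale=10.0, size=200)
b0, b1, b2 = (pwm(data, r) for r in range(3))
```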

  12. ON THE TOPOLOGY OF MECHANISMS DESIGNED FOR CURVES GENERATION

    Directory of Open Access Journals (Sweden)

    MEREUTA Elena

    2008-07-01

    The paper presents some mechanisms used for generating simple or complex curves. The mechanisms are shown in different positions, and for some special curves the demonstrations are performed.

  13. Algorithm for Automatic Generation of Curved and Compound Twills

    Institute of Scientific and Technical Information of China (English)

    WANG Mei-zhen; WANG Fu-mei; WANG Shan-yuan

    2005-01-01

    A new algorithm using matrix left-shift functions for the quicker generation of curved and compound twills is introduced in this paper. A matrix model for the generation of regular, curved and compound twill structures is established, and its computational simulation and realization are elaborated. Examples of applying the algorithm to the simulation and automatic generation of curved and compound twills in fabric CAD are given.

  14. Generating Probability Distributions using Multivalued Stochastic Relay Circuits

    CERN Document Server

    Lee, David

    2011-01-01

    The problem of random number generation dates back to von Neumann's work in 1951. Since then, many algorithms have been developed for generating unbiased bits from complex correlated sources as well as for generating arbitrary distributions from unbiased bits. An equally interesting, but less studied aspect is the structural component of random number generation as opposed to the algorithmic aspect. That is, given a network structure imposed by nature or physical devices, how can we build networks that generate arbitrary probability distributions in an optimal way? In this paper, we study the generation of arbitrary probability distributions in multivalued relay circuits, a generalization in which relays can take on any of N states and the logical 'and' and 'or' are replaced with 'min' and 'max' respectively. Previous work was done on two-state relays. We generalize these results, describing a duality property and networks that generate arbitrary rational probability distributions. We prove that these network...

  15. GENERALIZED FATIGUE CONSTANT LIFE CURVE AND TWO-DIMENSIONAL PROBABILITY DISTRIBUTION OF FATIGUE LIMIT

    Institute of Scientific and Technical Information of China (English)

    熊峻江; 武哲; 高镇同

    2002-01-01

    According to the traditional fatigue constant life curve, the concept and the universal expression of the generalized fatigue constant life curve were proposed. Then, on the basis of the optimization method of the correlation coefficient, the parameter estimation formulas were derived and the generalized fatigue constant life curve with the reliability level p was given. From the P-Sa-Sm curve, the two-dimensional probability distribution of the fatigue limit was derived. Three sets of tests of LY11 CZ corresponding to different average stresses were then carried out in terms of the two-dimensional up-down method. Finally, the methods are used to analyze the test results, and it is found that analyzed results with high precision may be obtained.

  16. Isomorphism and Generation of Montgomery-Form Elliptic Curves Suitable for Cryptosystems

    Institute of Scientific and Technical Information of China (English)

    LIU Duo; SONG Tao; DAI Yiqi

    2005-01-01

    Many efficient algorithms of Montgomery-form elliptic curve cryptology have been investigated recently. At present, there are no reported studies of the isomorphic class of the Montgomery-form elliptic curve over a finite field. This paper investigates the isomorphism of Montgomery-form elliptic curves via the isomorphism of Weierstrass-form elliptic curves and gives a table of (nearly) all the forms of Montgomery-form elliptic curves suitable for cryptographic usage. Then, an algorithm for generating a secure elliptic curve in Montgomery form is presented. The most important advantages of the new algorithm are that it avoids the transformation from an elliptic curve's Weierstrass form to its Montgomery form, and that it decreases the probability of collision. Thus, the proposed algorithm is quicker, simpler, and more efficient than the old ones.

  17. Prediction and extension of curves of distillation of vacuum residue using probability functions

    Science.gov (United States)

    León, A. Y.; Riaño, P. A.; Laverde, D.

    2016-02-01

    The use of probability functions for the prediction of crude distillation curves has been implemented in different characterization studies for refining processes. Four probability functions (Weibull extreme, Weibull, Kumaraswamy and Riazi) were analyzed in this work for fitting the distillation curves of vacuum residues. After analysing the experimental data, the Weibull extreme function was selected as the best prediction function; the fitting capability of the best function was validated using the AIC (Akaike Information Criterion), the BIC (Bayesian Information Criterion), and the correlation coefficient R2 as estimation criteria. To cover a wide range of compositions, fifty-five (55) vacuum residues derived from different hydrocarbon mixtures were selected. The parameters of the Weibull extreme probability function were adjusted from simple measured properties such as Conradson Carbon Residue (CCR) and compositional analysis SARA (saturates, aromatics, resins and asphaltenes). The proposed method is an appropriate tool to describe the tendency of distillation curves and offers a practical approach in terms of classification of vacuum residues.
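
    A hedged illustration of the curve-fitting step described here: a Weibull-type cumulative distribution is fitted to a distillation curve and scored with AIC. The temperature/fraction values and the shifted two-parameter Weibull form are assumptions for the example, not the paper's data or its exact "Weibull extreme" expression.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative data: cumulative volume fraction distilled vs. temperature (degC).
T = np.array([350., 400., 450., 500., 550., 600., 650.])
frac = np.array([0.05, 0.15, 0.30, 0.48, 0.65, 0.80, 0.90])

def weibull_cdf(T, T0, eta, beta):
    """Shifted Weibull CDF used as a stand-in distillation-curve model."""
    z = np.clip((T - T0) / eta, 0.0, None)
    return 1.0 - np.exp(-z**beta)

popt, _ = curve_fit(weibull_cdf, T, frac, p0=[300.0, 200.0, 1.5],
                    bounds=([200.0, 50.0, 0.5], [349.0, 500.0, 5.0]))

# Akaike information criterion under a Gaussian residual assumption.
resid = frac - weibull_cdf(T, *popt)
n, k = len(T), len(popt)
aic = n * np.log(np.mean(resid**2)) + 2 * k
```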

  18. Sharp Bounds by Probability-Generating Functions and Variable Drift

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fouz, Mahmoud; Witt, Carsten

    2011-01-01

    We introduce to the runtime analysis of evolutionary algorithms two powerful techniques: probability-generating functions and variable drift analysis. They are shown to provide a clean framework for proving sharp upper and lower bounds. As an application, we improve the results by Doerr et al. (G...

  19. Probability matching involves rule-generating ability: a neuropsychological mechanism dealing with probabilities.

    Science.gov (United States)

    Unturbe, Jesús; Corominas, Josep

    2007-09-01

    Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically disperse manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.

  20. On a framework for generating PoD curves assisted by numerical simulations

    Science.gov (United States)

    Subair, S. Mohamed; Agrawal, Shweta; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Kumar, Anish; Rao, Purnachandra B.; Tamanna, Jayakumar

    2015-03-01

    The Probability of Detection (PoD) curve method has emerged as an important tool for the assessment of the performance of NDE techniques, a topic of particular interest to the nuclear industry where inspection qualification is very important. The conventional experimental means of generating PoD curves, though, can be expensive, requiring large data sets (covering defects and test conditions) as well as equipment and operator time. Several methods of achieving faster estimates for PoD curves using physics-based modelling have been developed to address this problem. Numerical modelling techniques are also attractive, especially given the ever-increasing computational power available to scientists today. Here we develop procedures for obtaining PoD curves, assisted by numerical simulation and based on Bayesian statistics. Numerical simulations are performed using Finite Element analysis for factors that are assumed to be independent, random and normally distributed. PoD curves so generated are compared with experiments on austenitic stainless steel (SS) plates with artificially created notches. We examine issues affecting the PoD curve generation process, including codes, standards, the distribution of defect parameters and the choice of the noise threshold. We also study the assumption of a normal distribution for signal response parameters and consider strategies for dealing with data that may be more complex or sparse to justify this. These topics are addressed and illustrated through the example case of generation of PoD curves for pulse-echo ultrasonic inspection of vertical surface-breaking cracks in SS plates.
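
    The sketch below is a conventional hit/miss PoD fit (a logistic model in log defect size), included only to make the notion of a PoD curve concrete; it is not the Bayesian, simulation-assisted procedure of the paper, and the crack sizes and outcomes are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative hit/miss data: crack depth a (mm) and detection outcome (1 = hit).
a   = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0])
hit = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ])

def neg_log_lik(theta):
    b0, b1 = theta
    eta = b0 + b1 * np.log(a)              # log-size link, a common PoD choice
    p = 1.0 / (1.0 + np.exp(-eta))         # logistic PoD(a)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(hit * np.log(p) + (1 - hit) * np.log(1 - p))

res = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
b0, b1 = res.x
pod = lambda size: 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(size))))
a90 = np.exp((np.log(9.0) - b0) / b1)      # size detected with 90% probability
```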

  1. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.

    Directory of Open Access Journals (Sweden)

    Amber M Sprenger

    2011-06-01

    We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that divided attention at both encoding and judgment should affect the degree to which participants' probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgment made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.

  2. Optimized lower leg injury probability curves from postmortem human subject tests under axial impacts.

    Science.gov (United States)

    Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko

    2014-01-01

    Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subjects (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was selected on the basis of the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other 2 distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted at the 5, 25, and 50% risk levels for the 25-, 45- and 65-year-old age groups for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 k...
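
    A simplified sketch of a parametric Weibull survival fit of this kind is given below, assuming invented force/age/outcome data, treating injury tests as exact failures and noninjury tests as right-censored; the paper's full censoring scheme, dfbetas screening and confidence bounds are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: peak axial force (kN), age (years), and outcome
# (1 = injury observed at this force, 0 = no injury, i.e. right-censored).
force  = np.array([5.2, 6.8, 7.5, 8.3, 9.1, 10.2, 6.1, 7.9, 9.5, 11.0])
age    = np.array([45,  62,  55,  38,  70,  29,   58,  66,  41,  33 ])
injury = np.array([1,   1,   0,   1,   1,   0,    1,   0,   1,   0  ])

def neg_log_lik(theta):
    """Weibull survival model with age acting on the scale parameter."""
    log_shape, b0, b1 = theta
    k = np.exp(log_shape)                  # Weibull shape > 0
    lam = np.exp(b0 + b1 * age)            # Weibull scale per specimen
    z = force / lam
    log_pdf = np.log(k / lam) + (k - 1) * np.log(z) - z**k   # observed injury
    log_surv = -z**k                                          # censored test
    return -np.sum(np.where(injury == 1, log_pdf, log_surv))

res = minimize(neg_log_lik, x0=[1.0, 2.0, 0.0], method="Nelder-Mead")
log_shape, b0, b1 = res.x

def force_at_risk(risk, age_years):
    """Peak force giving the requested injury probability at a given age."""
    k, lam = np.exp(log_shape), np.exp(b0 + b1 * age_years)
    return lam * (-np.log(1.0 - risk)) ** (1.0 / k)
```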

  3. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  4. PLANAR MECHANISMS USED FOR GENERATING CURVE LINE TRANSLATION MOTION

    Directory of Open Access Journals (Sweden)

    Ovidiu ANTONESCU

    2015-05-01

    The curve line translation motion can be generated in the particular form of the circular translation, through mono-mobile mechanisms with articulated links of simple parallelogram type (with a fixed side) or through a transmission with a toothed belt and a fixed wheel. Also, the circular translation can be generated through planar mechanisms with two cylindrical gears and a fixed central wheel. It is mentioned that the two cylindrical gearings of the Fergusson mechanisms are both exterior and interior.

  5. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably; perhaps"; it expresses a high degree of likelihood and usually indicates a positive inference or judgment made on the basis of the present situation.

  6. Demand and choice probability generating functions for perturbed consumers

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2011-01-01

    generating function to be consistent with utility maximization. Within a budget, the convex hull of the demand correspondence is the subdifferential of the demand generating function. The additive random utility discrete choice model (ARUM) is a special case with finite budget sets where utility...

  7. Finite-order universal portfolios generated by probability mass functions

    Science.gov (United States)

    Tan, Choon Peng; Chu, Sin Yen; Pan, Wei Yeing

    2015-05-01

    It is shown that the finite-order universal portfolios generated by independent discrete random variables are constant rebalanced portfolios. The case where the universal portfolios are generated by the moments of the joint Dirichlet distribution is studied. The performance of the low-order Dirichlet universal portfolios on some stock-price data sets is analyzed. It is demonstrated that the performance is comparable to, and in some cases outperforms, the moving-order Cover-Ordentlich universal portfolios, with faster implementation time and higher wealth achieved.

  8. Generation of Closed Timelike Curves with Rotating Superconductors

    CERN Document Server

    De Matos, C J

    2006-01-01

    The spacetime metric around a rotating SuperConductive Ring (SCR) is deduced from the gravitomagnetic London moment in rotating superconductors. It is shown that Closed Timelike Curves (CTC) are present inside the superconductive ring's hole. The possibility of using these CTCs to travel in time, as initially idealized by Gödel, is investigated.

  9. Collisions in Fast Generation of Ideal Classes and Points on Hyperelliptic and Elliptic Curves

    DEFF Research Database (Denmark)

    Lange, Tanja; Shparlinski, Igor

    2005-01-01

    Koblitz curves have been proposed to quickly generate random ideal classes and points on hyperelliptic and elliptic curves. To obtain a further speed-up a different way of generating these random elements has recently been proposed. In this paper we give an upper bound on the number of collisions...

  10. Banking on a bad bet. Probability matching in risky choice is linked to expectation generation.

    Science.gov (United States)

    James, Greta; Koehler, Derek J

    2011-06-01

    Probability matching is the tendency to match choice probabilities to outcome probabilities in a binary prediction task. This tendency is a long-standing puzzle in the study of decision making under risk and uncertainty, because always predicting the more probable outcome across a series of trials (maximizing) would yield greater predictive accuracy and payoffs. In three experiments, we tied the predominance of probability matching over maximizing to a generally adaptive cognitive operation that generates expectations regarding the aggregate outcomes of an upcoming sequence of events. Under conditions designed to diminish the generation or perceived applicability of such expectations, we found that the frequency of probability-matching behavior dropped substantially and maximizing became the norm.
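
    A one-line calculation makes the cost of matching explicit; the outcome probability p = 0.70 is an arbitrary illustration, not a value from the experiments.

```python
# Expected accuracy in a binary prediction task where one outcome occurs
# with probability p: "maximizing" always predicts the likelier outcome,
# "probability matching" predicts each outcome at its own rate.
p = 0.70
acc_maximizing = max(p, 1 - p)            # 0.70
acc_matching   = p**2 + (1 - p)**2        # 0.58
```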

  11. A fast direct point-by-point generating algorithm for B Spline curves and surfaces

    Institute of Scientific and Technical Information of China (English)

    LI Zhong; HAN Dan-fu

    2005-01-01

    Traditional generating algorithms for B-spline curves and surfaces require approximation methods in which how to increment the parameter to get the best approximation is problematic, or they take a pixel-based method needing a matrix transformation from the B-spline representation to Bezier form. Here, a fast, direct point-by-point generating algorithm for B-spline curves and surfaces is presented. The algorithm does not need matrix transformation, can be used for uniform or nonuniform B-spline curves and surfaces of any degree, and has high generating speed and good rendering accuracy.
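
    The paper's own point-by-point scheme is not reproduced here; as a baseline for comparison, the standard de Boor recursion below evaluates a B-spline curve directly from its control points and knot vector (a clamped cubic example is assumed).

```python
import numpy as np

def de_boor(t, degree, knots, ctrl):
    """Evaluate a B-spline curve at parameter t (standard de Boor recursion)."""
    knots = np.asarray(knots, dtype=float)
    ctrl = np.asarray(ctrl, dtype=float)
    # Find the knot span k with knots[k] <= t < knots[k+1].
    k = np.searchsorted(knots, t, side="right") - 1
    k = min(max(k, degree), len(ctrl) - 1)
    d = [ctrl[j] for j in range(k - degree, k + 1)]
    for r in range(1, degree + 1):
        for j in range(degree, r - 1, -1):
            i = j + k - degree
            denom = knots[i + degree - r + 1] - knots[i]
            alpha = 0.0 if denom == 0.0 else (t - knots[i]) / denom
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[degree]

# Cubic curve with a clamped (open uniform) knot vector.
ctrl  = np.array([[0, 0], [1, 2], [3, 3], [4, 0], [6, 1]])
knots = np.array([0, 0, 0, 0, 1, 2, 2, 2, 2], dtype=float)
pts = np.array([de_boor(u, 3, knots, ctrl) for u in np.linspace(0, 2, 50)])
```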

  12. Generation of Discrete Bicubic G1 B-Spline Ship Hullform Surfaces from a Given Curve Network Using Virtual Iso-Parametric Curves

    Institute of Scientific and Technical Information of China (English)

    Joong-Hyun Rhim; Doo-Yeoun Cho; Kyu-Yeul Lee; Tae-Wan Kim

    2006-01-01

    We propose a method that automatically generates discrete bicubic G1 continuous B-spline surfaces that interpolate the curve network of a ship hullform. First, the curves in the network are classified into two types: boundary curves and "reference curves". The boundary curves correspond to a set of rectangular (or triangular) topological type that can be represented with tensor-product (or degenerate) B-spline surface patches. Next, in the interior of the patches, surface fitting points and cross boundary derivatives are estimated from the reference curves by constructing "virtual" iso-parametric curves. Finally, a discrete G1 continuous B-spline surface is generated by a surface fitting algorithm. Several smooth ship hullform surfaces generated from curve networks corresponding to actual ship hullforms demonstrate the quality of the method.

  13. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Jim W.; Lawry, Jonathan

    2004-09-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM.

  14. Log-cubic method for generation of soil particle size distribution curve.

    Science.gov (United States)

    Shang, Songhao

    2013-01-01

    Particle size distribution (PSD) is a fundamental physical property of soils. Traditionally, the PSD curve was generated by hand from limited data of particle size analysis, which is subjective and may lead to significant uncertainty in the freehand PSD curve and graphically estimated cumulative particle percentages. To overcome these problems, a log-cubic method was proposed for the generation of the PSD curve based on a monotone piecewise cubic interpolation method. The log-cubic method and the commonly used log-linear and log-spline methods were evaluated by the leave-one-out cross-validation method for 394 soil samples extracted from the UNSODA database. The mean error and root mean square error of the cross-validation show that the log-cubic method outperforms the two other methods. More importantly, the PSD curve generated by the log-cubic method meets the essential requirements of a PSD curve, that is, passing through all measured data and being both smooth and monotone. The proposed log-cubic method provides an objective and reliable way to generate a PSD curve from limited soil particle analysis data. This method and the generated PSD curve can be used in the conversion of different soil texture schemes, assessment of grading pattern, and estimation of soil hydraulic parameters and erodibility factor.
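
    The core of the log-cubic idea, monotone piecewise cubic interpolation applied in the logarithm of particle size, can be sketched with SciPy's PCHIP interpolator; the sieve data below are illustrative, not from the UNSODA samples used in the paper.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Illustrative data: particle diameter (mm) and cumulative percent finer.
diameter  = np.array([0.002, 0.05, 0.1, 0.25, 0.5, 1.0, 2.0])
pct_finer = np.array([12.0, 35.0, 48.0, 66.0, 80.0, 92.0, 100.0])

# Monotone piecewise-cubic (PCHIP) interpolation in log(diameter): the curve
# passes through every measured point and stays smooth and monotone, which is
# the behaviour the log-cubic method requires of a PSD curve.
curve = PchipInterpolator(np.log10(diameter), pct_finer)

d_query = np.logspace(np.log10(0.002), np.log10(2.0), 200)
psd_curve = curve(np.log10(d_query))

# Example use: percent finer at an (illustrative) 0.075 mm texture boundary.
p075 = float(curve(np.log10(0.075)))
```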

  15. Developing intensity duration frequency curves based on scaling theory using linear probability weighted moments: A case study from India

    Science.gov (United States)

    Bairwa, Arvind Kumar; Khosa, Rakesh; Maheswaran, R.

    2016-11-01

    In this study, the presence of multi-scale behaviour in the rainfall IDF relationship has been established using Linear Probability Weighted Moments (LPWMs) for some selected stations in India. Simple, non-central moments (SMs) have seen widespread use in similar scaling studies, but these statistical attributes are known to mask the 'true' scaling pattern and, consequently, lead to inappropriate inferences. There is a general agreement amongst researchers that conventional higher-order moments do indeed amplify the extreme observations and drastically affect scaling exponents. An additional advantage of LPWMs over SMs is that they exist even when the standard moments do not. As an alternative, this study presents a comparison with results based on the use of the robust LPWMs, which have revealed, in sharp contrast with the conventional moments, a definitive multi-scaling behaviour at all four rainfall observation stations that were selected from different climatic zones. The multi-scale IDF curves derived using LPWMs show a good agreement with observations, and it is accordingly concluded that LPWMs provide a more reliable tool for investigating scaling in sequences of observed rainfall corresponding to various durations.

  16. The S-curve for forecasting waste generation in construction projects.

    Science.gov (United States)

    Lu, Weisheng; Peng, Yi; Chen, Xi; Skitmore, Martin; Zhang, Xiaoling

    2016-10-01

    Forecasting construction waste generation is the yardstick of any effort by policy-makers, researchers, practitioners and the like to manage construction and demolition (C&D) waste. This paper develops and tests an S-curve model to indicate cumulative waste generation as a project progresses. Using 37,148 disposal records generated from 138 building projects in Hong Kong in four consecutive years from January 2011 to June 2015, a wide range of potential S-curve models are examined, and as a result, the formula that best fits the historical data set is found. The S-curve model is then further linked to project characteristics using artificial neural networks (ANNs) so that it can be used to forecast waste generation in future construction projects. It was found that, among the S-curve models, the cumulative logistic distribution is the best formula to fit the historical data. Meanwhile, contract sum, location, public-private nature, and duration can be used to forecast construction waste generation. The study provides contractors with not only an S-curve model to forecast overall waste generation before a project commences, but also a detailed baseline to benchmark and manage waste during the course of construction. The major contribution of this paper is to the body of knowledge in the field of construction waste generation forecasting. By examining it with an S-curve model, the study elevates construction waste management to a level equivalent to project cost management, where the model has already been readily accepted as a standard tool.
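
    A minimal sketch of fitting a cumulative-logistic S-curve to cumulative waste records is shown below; the progress/waste values are invented, and the ANN link to project characteristics described in the abstract is not included.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative records: project progress (fraction of duration elapsed) and
# cumulative waste disposed, as a fraction of the project's total waste.
progress  = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
cum_waste = np.array([0.02, 0.06, 0.13, 0.25, 0.42, 0.60, 0.75, 0.87, 0.95, 1.0])

def logistic_s_curve(x, mu, s):
    """Cumulative logistic distribution: the S-curve form reported as best fit."""
    return 1.0 / (1.0 + np.exp(-(x - mu) / s))

(mu, s), _ = curve_fit(logistic_s_curve, progress, cum_waste, p0=[0.5, 0.1])

# Forecast-style use: expected share of total waste generated by mid-project.
half_way = logistic_s_curve(0.5, mu, s)
```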

  17. Dissimilar Metal Weld Probability of Detection Curve Fits from Performance Demonstration Initiative Data: A Comparison with Other Round-Robin Results

    Energy Technology Data Exchange (ETDEWEB)

    Heasler, Patrick G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Anderson, Michael T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Doctor, Steven R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-20

    The NRC, in cooperation with industry, is developing a computerized simulation and analytical tool within the Extremely Low Probability of Rupture (xLPR) Project to provide insights for determining whether certain types of service degradation would be expected to challenge safety-related systems at operating nuclear power plants. One input for this tool is the probability of detection (POD) for the nondestructive examinations conducted during inservice inspections at these plants. EPRI produced a series of POD curves for ultrasonic testing with data from the industry's Performance Demonstration Initiative. This report compares the POD curves developed from the EPRI data to other relevant attempts to quantify POD on similar component configurations. The objectives of this report are 1) to determine the reasonableness of the EPRI curves and 2) to attempt to explain discrepancies noted with other recent POD studies.

  18. The Qualitative Analysis of Theoretic Curves Generated by Linear Viscoelasticity Constitutive Equation

    Directory of Open Access Journals (Sweden)

    A. V. Khohlov

    2016-01-01

    The article analyses a one-dimensional linear integral constitutive equation of viscoelasticity with an arbitrary creep compliance function in order to reveal its abilities to describe the set of basic rheological phenomena pertaining to viscoelastoplastic materials at a constant temperature. General equations and basic properties of its quasi-static theoretic curves (i.e. stress-strain curves at constant strain or stress rates, creep, creep recovery, creep curves at piecewise-constant stress, and ramp relaxation curves) generated by the linear constitutive equation are derived and studied analytically. Their dependences on the creep function and relaxation modulus and on the loading program parameters are examined. The qualitative properties of the theoretic curves are compared to the typical properties of viscoelastoplastic material test curves in order to reveal the mechanical effects which the linear viscoelasticity theory cannot simulate, and to find convenient experimental indicators marking the field of its applicability or non-applicability. The minimal set of general restrictions that should be imposed on the creep and relaxation functions to provide an adequate description of typical test curves of viscoelastoplastic materials is formulated. It is proved, in particular, that an adequate simulation of typical experimental creep recovery curves requires that the derivative of the creep function should not increase at any point. This restriction implies that the linear viscoelasticity theory yields theoretical creep curves with non-increasing creep rate only, and it cannot simulate materials demonstrating an accelerated creep stage. It is also proved that linear viscoelasticity cannot simulate materials with experimental stress-strain curves possessing a maximum point or a concave-up segment, or materials exhibiting equilibrium modulus dependence on the strain rate or negative rate sensitivity. Similar qualitative analysis seems to be an important...

  19. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life...

  20. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life...

  1. Automated generation of curved planar reformations from MR images of the spine

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia); Ourselin, Sebastien [CSIRO ICT Centre, Autonomous Systems Laboratory, BioMedIA Lab, Locked Bag 17, North Ryde, NSW 2113 (Australia); Gomes, Lavier [Department of Radiology, Westmead Hospital, University of Sydney, Hawkesbury Road, Westmead NSW 2145 (Australia); Likar, Bostjan [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia); Pernus, Franjo [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2007-05-21

    A novel method for automated curved planar reformation (CPR) of magnetic resonance (MR) images of the spine is presented. The CPR images, generated by a transformation from image-based to spine-based coordinate system, follow the structural shape of the spine and allow the whole course of the curved anatomy to be viewed in individual cross-sections. The three-dimensional (3D) spine curve and the axial vertebral rotation, which determine the transformation, are described by polynomial functions. The 3D spine curve passes through the centres of vertebral bodies, while the axial vertebral rotation determines the rotation of vertebrae around the axis of the spinal column. The optimal polynomial parameters are obtained by a robust refinement of the initial estimates of the centres of vertebral bodies and axial vertebral rotation. The optimization framework is based on the automatic image analysis of MR spine images that exploits some basic anatomical properties of the spine. The method was evaluated on 21 MR images from 12 patients and the results provided a good description of spine anatomy, with mean errors of 2.5 mm and 1.7° for the position of the 3D spine curve and axial rotation of vertebrae, respectively. The generated CPR images are independent of the position of the patient in the scanner while comprising both anatomical and geometrical properties of the spine.

  2. Automated generation of curved planar reformations from MR images of the spine

    Science.gov (United States)

    Vrtovec, Tomaz; Ourselin, Sébastien; Gomes, Lavier; Likar, Boštjan; Pernuš, Franjo

    2007-05-01

    A novel method for automated curved planar reformation (CPR) of magnetic resonance (MR) images of the spine is presented. The CPR images, generated by a transformation from image-based to spine-based coordinate system, follow the structural shape of the spine and allow the whole course of the curved anatomy to be viewed in individual cross-sections. The three-dimensional (3D) spine curve and the axial vertebral rotation, which determine the transformation, are described by polynomial functions. The 3D spine curve passes through the centres of vertebral bodies, while the axial vertebral rotation determines the rotation of vertebrae around the axis of the spinal column. The optimal polynomial parameters are obtained by a robust refinement of the initial estimates of the centres of vertebral bodies and axial vertebral rotation. The optimization framework is based on the automatic image analysis of MR spine images that exploits some basic anatomical properties of the spine. The method was evaluated on 21 MR images from 12 patients and the results provided a good description of spine anatomy, with mean errors of 2.5 mm and 1.7° for the position of the 3D spine curve and axial rotation of vertebrae, respectively. The generated CPR images are independent of the position of the patient in the scanner while comprising both anatomical and geometrical properties of the spine.

  3. The relative impact of sizing errors on steam generator tube failure probability

    Energy Technology Data Exchange (ETDEWEB)

    Cizelj, L.; Dvorsek, T. [Jozef Stefan Inst., Ljubljana (Slovenia)

    1998-07-01

    The Outside Diameter Stress Corrosion Cracking (ODSCC) at tube support plates is currently the major degradation mechanism affecting steam generator tubes made of Inconel 600. This caused the development and licensing of degradation-specific maintenance approaches, which addressed two main failure modes of the degraded piping: tube rupture and excessive leakage through degraded tubes. A methodology aiming at assessing the efficiency of a given set of possible maintenance approaches has already been proposed by the authors. It pointed out the better performance of the degradation-specific over generic approaches in (1) lower probability of single and multiple steam generator tube rupture (SGTR), (2) lower estimated accidental leak rates and (3) fewer tubes plugged. A sensitivity analysis was also performed, pointing out the relative contributions of uncertain input parameters to the tube rupture probabilities. The dominant contribution was assigned to the uncertainties inherent to the regression models used to correlate the defect size and tube burst pressure. The uncertainties, which can be estimated from the in-service inspections, are further analysed in this paper. The defect growth was found to have a significant and to some extent unrealistic impact on the probability of single tube rupture. Since the defect growth estimates were based on past inspection records, they strongly depend on the sizing errors. Therefore, an attempt was made to filter out the sizing errors and to arrive at more realistic estimates of the defect growth. The impact of different assumptions regarding sizing errors on the tube rupture probability was studied using a realistic numerical example. The data used are obtained from a series of inspection results from Krsko NPP with 2 Westinghouse D-4 steam generators. The results obtained are considered useful in the safety assessment and maintenance of affected steam generators. (author)

  4. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas Maziero

    2016-03-01

    The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then, and several ongoing projects targeting its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.

  5. Surfaces and Curves Corresponding to the Solutions Generated from Periodic “Seed” of NLS Equation

    Institute of Scientific and Technical Information of China (English)

    Ling ZHANG; Jing Song HE; Yi CHENG; Yi Shen LI

    2012-01-01

    The solutions q[n] generated from a periodic "seed" q = c e^{i(as+bt)} of the nonlinear Schrödinger (NLS) equation by the n-fold Darboux transformation are represented by determinants. Furthermore, the s-periodic solution and the t-periodic solution are given explicitly by using q[1]. The curves and surfaces (F1, F2, F3) associated with q[n] are given by means of the Sym formula. Meanwhile, we show periodic and asymptotic properties of these curves.

  6. Generators for the l-torsion subgroup of Jacobians of Genus Two Curves

    DEFF Research Database (Denmark)

    Ravnshøj, Christian Robenhagen

    2008-01-01

    We give an explicit description of the matrix representation of the Frobenius endomorphism on the Jacobian of a genus two curve on the subgroup of l-torsion points. By using this description, we can describe the matrix representation of the Weil-pairing on the subgroup of l-torsion points explicitly. Finally, the explicit description of the Weil-pairing provides us with an efficient, probabilistic algorithm to find generators of the subgroup of l-torsion points on the Jacobian of a genus two curve....

  7. Generation of R-Curve from 4ENF Specimens: An Experimental Study

    Directory of Open Access Journals (Sweden)

    V. Alfred Franklin

    2014-01-01

    The experimental determination of the resistance to delamination is very important in aerospace applications, as composite materials have superior properties only in the fiber direction. To measure the interlaminar fracture toughness of composite materials, different kinds of specimens and experimental methods are available. This article examines the fracture energy of four-point end-notched flexure (4ENF) composite specimens made of carbon/epoxy and glass/epoxy. Experiments were conducted on these laminates, and the mode II fracture energy, GIIC, was evaluated using the compliance method and compared with the beam theory solution. The crack growth resistance curve (R-curve) for these specimens was generated, and glass/epoxy was found to show higher toughness values than the carbon/epoxy composite. From this study, it was observed that the R-curve effect in 4ENF specimens is quite mild, which means that the measured delamination toughness, GIIC, is more accurate.

  8. KGmax curves associated with second generation intact stability criteria for different types of ships

    Science.gov (United States)

    Grinnaert, Francois; Billard, Jean-Yves; Laurens, Jean-Marc

    2016-09-01

    Currently, second generation intact stability criteria are being developed and evaluated by the International Maritime Organization (IMO). In this paper, we briefly present levels 1 and 2 assessment methods for the criteria of pure loss of stability and parametric roll failure modes. Subsequently, we show the KGmax curves associated with these criteria. We compute these curves for five different types of ships and compare them with the curves embodied in the current regulations. The results show that the safety margin ensured by the first level-1 method of calculation for both pure loss of stability and parametric roll seems to be excessive in many cases. They also show that the KGmax given by the second level-1 method and by the level-2 method may be very similar. In some cases, the level-2 method can be more conservative than the second level-1 method, which is unanticipated by the future regulation. The KGmax curves associated with parametric roll confirm that the C11 container ship is vulnerable to this failure mode. The computation of the second check coefficient of parametric roll level 2 (C2) for all possible values of KG reveals the existence of both authorized and restricted areas on the surface formed by both the draft and KG, which may replace the classical KGmax curves. In consequence, it is not sufficient to check that C2 is lower than the maximum authorized value (RPR0) for a fixed ship's loading condition.

  9. KGmax Curves Associated With Second Generation Intact Stability Criteria for Different Types of Ships

    Institute of Scientific and Technical Information of China (English)

    Francois Grinnaert; Jean-Yves Billard; Jean-Marc Laurens

    2016-01-01

    Currently, second generation intact stability criteria are being developed and evaluated by the International Maritime Organization (IMO). In this paper, we briefly present levels 1 and 2 assessment methods for the criteria of pure loss of stability and parametric roll failure modes. Subsequently, we show the KGmax curves associated with these criteria. We compute these curves for five different types of ships and compare them with the curves embodied in the current regulations. The results show that the safety margin ensured by the first level-1 method of calculation for both pure loss of stability and parametric roll seems to be excessive in many cases. They also show that the KGmax given by the second level-1 method and by the level-2 method may be very similar. In some cases, the level-2 method can be more conservative than the second level-1 method, which is unanticipated by the future regulation. The KGmax curves associated with parametric roll confirm that the C11 container ship is vulnerable to this failure mode. The computation of the second check coefficient of parametric roll level 2 (C2) for all possible values of KG reveals the existence of both authorized and restricted areas on the surface formed by both the draft and KG, which may replace the classical KGmax curves. In consequence, it is not sufficient to check that C2 is lower than the maximum authorized value (RPR0) for a fixed ship’s loading condition.

  10. Application of remote sensing and geographical information system for generation of runoff curve number

    Science.gov (United States)

    Meshram, S. Gajbhiye; Sharma, S. K.; Tignath, S.

    2017-07-01

    Watershed is an ideal unit for planning and management of land and water resources (Gajbhiye et al., IEEE international conference on advances in technology and engineering (ICATE), Bombay, vol 1, issue 9, pp 23-25, 2013a; Gajbhiye et al., Appl Water Sci 4(1):51-61, 2014a; Gajbhiye et al., J Geol Soc India (SCI-IF 0.596) 84(2):192-196, 2014b). This study aims to generate the curve number using remote sensing and a geographical information system (GIS) and to assess the effect of slope on curve number values. The study was carried out in the Kanhaiya Nala watershed located in the Satna district of Madhya Pradesh. A soil map, land use/land cover map and slope map were generated in the GIS environment. The CN parameter values corresponding to various soil, land cover, and land management conditions were selected from the Natural Resource Conservation Service (NRCS) standard table. The curve number (CN) is an index developed by the NRCS to represent the potential for storm water runoff within a drainage area. The CN for a drainage basin is estimated using a combination of land use, soil, and antecedent soil moisture condition (AMC). In the present study, the effect of slope on CN values was determined. The results showed that the unadjusted CN values are higher than the CN values adjusted for slope. Remote sensing and GIS are very reliable techniques for the preparation of most of the input data required by the SCS curve number model.
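
    For reference, the NRCS (SCS) curve number runoff relation that such a CN map feeds into is sketched below; the storm depth and the slope-adjusted CN value in the example are hypothetical, and the specific slope-adjustment formula used in the paper is not reproduced.

```python
# NRCS curve number runoff (SI form), as used once a CN has been assigned
# from land use, hydrologic soil group and antecedent moisture condition.
def scs_runoff_mm(rain_mm, cn, ia_ratio=0.2):
    """Direct runoff depth Q from storm rainfall P for a given curve number."""
    s = 25400.0 / cn - 254.0          # potential maximum retention S (mm)
    ia = ia_ratio * s                 # initial abstraction, commonly 0.2*S
    if rain_mm <= ia:
        return 0.0
    return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

# Example: a 60 mm storm on the same land unit with an unadjusted CN of 75
# versus a (hypothetical) slope-adjusted CN of 72.
q_unadjusted = scs_runoff_mm(60.0, 75.0)
q_adjusted   = scs_runoff_mm(60.0, 72.0)
```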

  11. Minimization of Handoff Failure Probability for Next-Generation Wireless Systems

    CERN Document Server

    Sarddar, Debabrata; Saha, Souvik Kumar; Banerjee, Joydeep; Biswas, Utpal; Naskar, M K; 10.5121/ijngn.2010.2204

    2010-01-01

    During the past few years, advances in mobile communication theory have enabled the development and deployment of different wireless technologies, complementary to each other. Hence, their integration can realize a unified wireless system that has the best features of the individual networks. Next-Generation Wireless Systems (NGWS) integrate different wireless systems, each of which is optimized for some specific services and coverage area to provide ubiquitous communications to the mobile users. In this paper, we propose to enhance the handoff performance of mobile IP in wireless IP networks by reducing the false handoff probability in the NGWS handoff management protocol. Based on the derived false handoff probability, we analyze its dependence on mobile speed and handoff signaling delay.

  12. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    Science.gov (United States)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-action market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based
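
    As a hedged illustration of how threshold-style market prices could be aggregated into a PDF (the thresholds and prices below are hypothetical, not actual contract data): each binary "anomaly greater than t" price is read as an exceedance probability, and differencing the implied CDF gives a discrete consensus distribution.

        import numpy as np

        thresholds = np.array([0.45, 0.55, 0.65, 0.75, 0.85])   # deg C anomaly (assumed)
        prices     = np.array([0.95, 0.80, 0.55, 0.25, 0.05])   # traded prices in [0, 1] (assumed)

        # Implied CDF P(X <= t), padded with 0 and 1 at the extremes.
        cdf = np.concatenate(([0.0], 1.0 - prices, [1.0]))
        pmf = np.diff(cdf)            # probability mass between adjacent thresholds

        edges = np.concatenate(([-np.inf], thresholds, [np.inf]))
        for lo, hi, p in zip(edges[:-1], edges[1:], pmf):
            print(f"P({lo} < anomaly <= {hi}) = {p:.2f}")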

  13. Tourism and solid waste generation in Europe: A panel data assessment of the Environmental Kuznets Curve.

    Science.gov (United States)

    Arbulú, Italo; Lozano, Javier; Rey-Maquieira, Javier

    2015-12-01

    The relationship between tourism growth and municipal solid waste (MSW) generation has been, until now, the subject of little research. This is puzzling since the tourism sector is an important MSW generator and, at the same time, is willing to avoid negative impacts from MSW mismanagement. This paper aims to provide tools for tourism and MSW management by assessing the effects of tourism volume, tourism quality and tourism specialization on MSW generation in the EU. This is done using the Environmental Kuznets Curve (EKC) framework. The study considers panel data for 32 European economies over the 1997-2010 period. Empirical results support the EKC hypothesis for MSW and show that northern countries tend to have lower income elasticity than less developed countries; furthermore, results confirm a non-linear and significant effect of tourism arrivals, expenditure per tourist and tourism specialization on MSW generation.
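
    A reduced-form EKC regression of the kind described can be sketched as MSW regressed on income, squared income and tourism covariates. The snippet below uses pooled OLS on synthetic data with illustrative variable names; the paper itself uses panel estimators on real EU data, so this is only a hedged outline of the specification.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 32 * 14                                    # 32 economies x 14 years
        ln_gdp   = rng.normal(10.0, 0.5, n)            # log income per capita (synthetic)
        arrivals = rng.normal(1.0, 0.3, n)             # tourism arrivals, scaled (synthetic)
        # Synthetic inverted-U data generating process for log MSW per capita.
        ln_msw = 2.0 + 1.5 * ln_gdp - 0.07 * ln_gdp**2 + 0.10 * arrivals \
                 + rng.normal(0.0, 0.05, n)

        X = np.column_stack([np.ones(n), ln_gdp, ln_gdp**2, arrivals])
        beta, *_ = np.linalg.lstsq(X, ln_msw, rcond=None)
        print("coefficients [const, lnGDP, lnGDP^2, arrivals]:", np.round(beta, 3))

        # An inverted-U (EKC) shape corresponds to beta[1] > 0 and beta[2] < 0;
        # the turning point is at lnGDP* = -beta[1] / (2 * beta[2]).
        print("turning point (lnGDP):", round(-beta[1] / (2 * beta[2]), 2))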

  14. Fortran code for generating random probability vectors, unitaries, and quantum states

    CERN Document Server

    Maziero, Jonas

    2015-01-01

    The usefulness of generating random configurations is recognized in a variety of contexts, for instance in the simulation of physical systems, in the verification of bounds and/or ansatz solutions for optimization problems, and in secure communications. Fortran was born for scientific computing and has been one of the main programming languages in this area since then. Several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
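
    The paper's code is in Fortran; the NumPy sketch below only illustrates, under standard constructions, three of the random objects discussed: a uniform random probability vector, a Haar-random unitary, and a random pure state.

        import numpy as np

        rng = np.random.default_rng()

        def random_probability_vector(d):
            """Uniform sample from the probability simplex (normalized Exp(1) draws)."""
            w = rng.exponential(1.0, d)
            return w / w.sum()

        def random_unitary(d):
            """Haar-distributed unitary via QR of a complex Gaussian matrix."""
            z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
            q, r = np.linalg.qr(z)
            # Fix the phases of R's diagonal so the distribution is exactly Haar.
            return q * (np.diag(r) / np.abs(np.diag(r)))

        def random_pure_state(d):
            """Random state vector: first column of a Haar-random unitary."""
            return random_unitary(d)[:, 0]

        p = random_probability_vector(4)
        u = random_unitary(3)
        psi = random_pure_state(3)
        print("sum p =", p.sum())                                        # ~1.0
        print("||U U^dag - I|| =", np.linalg.norm(u @ u.conj().T - np.eye(3)))
        print("<psi|psi> =", np.vdot(psi, psi).real)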

  15. Practical Constraint K-Segment Principal Curve Algorithms for Generating Railway GPS Digital Map

    Directory of Open Access Journals (Sweden)

    Dewang Chen

    2013-01-01

    Full Text Available In order to obtain a decent trade-off between the low-cost, low-accuracy Global Positioning System (GPS) receivers and the requirements of high-precision digital maps for modern railways, using the concept of constraint K-segment principal curves (CKPCS) and the expert knowledge on railways, we propose three practical CKPCS generation algorithms with reduced computational complexity, and therefore more suitable for engineering applications. The three algorithms are named ALLopt, MPMopt, and DCopt, in which ALLopt exploits global optimization and MPMopt and DCopt apply local optimization with different initial solutions. We compare the three practical algorithms according to their performance on average projection error, stability, and the fitness for simple and complex simulated trajectories with noise data. It is found that ALLopt only works well for simple curves and small data sets. The other two algorithms can work better for complex curves and large data sets. Moreover, MPMopt runs faster than DCopt, but DCopt can work better for some curves with cross points. The three algorithms are also applied in generating GPS digital maps for two railway GPS data sets measured on the Qinghai-Tibet Railway (QTR). Results similar to those for the synthetic data are obtained. Because the trajectory of a railway is relatively simple and straight, we conclude that MPMopt works best according to comprehensive consideration of the computation speed and the quality of the generated CKPCS. MPMopt can be used to obtain some key points to represent a large amount of GPS data. Hence, it can greatly reduce the data storage requirements and increase the positioning speed for real-time digital map applications.

  16. Magnetization curves and probability angular distribution of the magnetization vector in Er2Fe14Si3

    Science.gov (United States)

    Sobh, Hala A.; Aly, Samy H.; Shabara, Reham M.; Yehia, Sherif

    2016-01-01

    Specific magnetic and magneto-thermal properties of Er2Fe14Si3, in the temperature range of 80-300 K, have been investigated using basic laws of classical statistical mechanics in a simple model. In this model, the constructed partition function was used to derive, and therefore calculate the temperature and/or field dependence of a host of physical properties. Examples of these properties are: the magnetization, magnetic heat capacity, magnetic susceptibility, probability angular distribution of the magnetization vector, and the associated angular dependence of energy. We highlight a correlation between the energy of the system, its magnetization behavior and the angular location of the magnetization vector. Our results show that Er2Fe14Si3 is an easy-axis system in the temperature range 80-114 K, but switches to an easy-plane system at T≥114 K. This transition is also supported by both of the temperature dependence of the magnetic heat capacity, which develops a peak at a temperature ~114 K, and the probability landscape which shows, in zero magnetic field, a prominent peak in the basal plane at T=113.5 K.

  17. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    Science.gov (United States)

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  18. A rapid, fully non-contact, hybrid system for generating Lamb wave dispersion curves.

    Science.gov (United States)

    Harb, M S; Yuan, F G

    2015-08-01

    A rapid, fully non-contact, hybrid system which encompasses an air-coupled transducer (ACT) and a laser Doppler vibrometer (LDV) is presented for profiling A0 Lamb wave dispersion of an isotropic aluminum plate. The ACT generates ultrasonic pressure incident upon the surface of the plate. The pressure waves are partially refracted into the plate. The LDV is employed to measure the out-of-plane velocity of the excited Lamb wave mode at some distances where the Lamb waves are formed in the plate. The influence of the ACT angle of incidence on Lamb wave excitation is investigated and Snell's law is used to directly compute Lamb wave dispersion curves, including phase and group velocity dispersion curves, in aluminum plates from the incident angles found to generate the optimal A0 Lamb wave mode. The measured curves are compared to results obtained from a two-dimensional (2-D) Fast Fourier transform (FFT), Morlet wavelet transform (MWT) and theoretical predictions. It was concluded that the experimental results obtained using the Snell's law concept agree well with the theoretical solutions. The close agreement of the measured data with the theoretical results demonstrates the high sensitivity of air-coupled and laser ultrasound in characterizing Lamb wave dispersion in plate-like structures. The proposed non-contact hybrid system can effectively characterize the dispersive relation without knowledge of either the material's characteristics or the mathematical model.
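
    The Snell's-law step described amounts to reading the A0 phase velocity from the optimal incident angle at each frequency, c_p = c_air / sin(theta), with the group velocity following from c_g = c_p^2 / (c_p - f dc_p/df). The angles below are hypothetical placeholders, not the measured data.

        import numpy as np

        c_air = 343.0                                     # speed of sound in air, m/s
        freq_hz = np.array([100e3, 150e3, 200e3, 250e3, 300e3])
        theta_deg = np.array([17.0, 14.5, 13.0, 12.0, 11.3])   # assumed optimal angles

        c_p = c_air / np.sin(np.radians(theta_deg))       # phase velocity, m/s
        dcp_df = np.gradient(c_p, freq_hz)                # numerical derivative
        c_g = c_p**2 / (c_p - freq_hz * dcp_df)           # group velocity, m/s

        for f, cp, cg in zip(freq_hz, c_p, c_g):
            print(f"{f/1e3:6.0f} kHz   c_p = {cp:7.1f} m/s   c_g = {cg:7.1f} m/s")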

  19. A neural network driving curve generation method for the heavy-haul train

    Directory of Open Access Journals (Sweden)

    Youneng Huang

    2016-05-01

    Full Text Available The heavy-haul train has a series of characteristics, such as the locomotive traction properties, the longer length of train, and the nonlinear train pipe pressure during train braking. When the train is running on a continuous long and steep downgrade railway line, the safety of the train is ensured by cycle braking, which puts high demands on the driving skills of the driver. In this article, a driving curve generation method for the heavy-haul train based on a neural network is proposed. First, in order to describe the nonlinear characteristics of train braking, the neural network model is constructed and trained by practical driving data. In the neural network model, various nonlinear neurons are interconnected to work for information processing and transmission. The target value of train braking pressure reduction and release time is achieved by modeling the braking process. The equation of train motion is computed to obtain the driving curve. Finally, in four typical operation scenarios, comparing the curve data generated by the method with corresponding practical data of the Shuohuang heavy-haul railway line, the results show that the method is effective.

  20. Adaptive interpretation of gas well deliverability tests with generating data of the IPR curve

    Science.gov (United States)

    Sergeev, V. L.; Phuong, Nguyen T. H.; Krainov, A. I.

    2017-01-01

    The paper considers the topical issues of improving the accuracy of parameters estimated from gas well deliverability test data, decreasing test time, and reducing gas emissions into the atmosphere. The aim of the research is to develop a method of adaptive interpretation of gas well deliverability tests that yields the IPR curve and uses a data-generation technique, which allows additional a priori information to be taken into account, improves the accuracy of the determined formation pressure and flow coefficients, and reduces test time. The present research is based on previous theoretical and practical findings in the fields of gas well deliverability tests, systems analysis, system identification, function optimization and linear algebra. To test the method, the authors used field data from deliverability tests of two wells run in the Urengoy gas and condensate field, Tyumen Oblast. The authors propose a method of adaptive interpretation of gas well deliverability tests that produces the IPR curve and can generate bottomhole pressure and flow rate data at different test stages. The suggested method allows estimates of the formation pressure and flow coefficients that are optimal in terms of preassigned quality measures to be obtained, and the adequate number of test stages to be set in the course of well testing. The case study of IPR curve data processing has indicated that adaptive interpretation provides more accurate estimates of the formation pressure and flow coefficients, as well as reduces the number of test stages.
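
    A classical way to obtain a gas-well IPR curve from multi-rate test data is the back-pressure (Rawlins-Schellhardt) equation q = C (pR^2 - pwf^2)^n fitted in log space; whether this is the exact deliverability model used in the paper is an assumption, and the test data below are illustrative only.

        import numpy as np

        p_res = 250.0                                    # formation pressure, bar (assumed known)
        p_wf  = np.array([235.0, 220.0, 205.0, 190.0])   # bottomhole flowing pressures, bar
        q     = np.array([210.0, 390.0, 540.0, 660.0])   # measured rates, 10^3 m^3/day

        dp2 = p_res**2 - p_wf**2
        # log q = log C + n log(dp2)  ->  linear least squares in the logs.
        A = np.column_stack([np.ones_like(dp2), np.log(dp2)])
        (logC, n), *_ = np.linalg.lstsq(A, np.log(q), rcond=None)
        C = np.exp(logC)
        print(f"C = {C:.4f}, n = {n:.3f}")

        # Absolute open flow potential (p_wf -> 0), a common summary of the IPR curve.
        aof = C * (p_res**2) ** n
        print(f"AOF = {aof:.0f} (same units as q)")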

  1. H2: entanglement, probability density function, confined Kratzer oscillator, universal potential and (Mexican hat- or bell-type) potential energy curves

    CERN Document Server

    Van Hooydonk, G

    2011-01-01

    We review harmonic oscillator theory for closed, stable quantum systems. The H2 potential energy curve (PEC) of Mexican hat-type, calculated with a confined Kratzer oscillator, is better than the Rydberg-Klein-Rees (RKR) H2 PEC. Compared with QM, the theory of chemical bonding is simplified, since a confined Kratzer oscillator gives the long sought for universal function, once called the Holy Grail of Molecular Spectroscopy. This is validated with HF, I2, N2 and O2 PECs. We quantify the entanglement of spatially separated H2 quantum states, which gives a braid view. The equal probability for H2, originating either from HA+HB or HB+HA, is quantified with a Gauss probability density function. At the Bohr scale, confined harmonic oscillators behave properly at all extremes of bound two-nucleon quantum systems and are likely to be useful also at the nuclear scale.

  2. Next generation seismic fragility curves for California bridges incorporating the evolution in seismic design philosophy

    Science.gov (United States)

    Ramanathan, Karthik Narayan

    Quantitative and qualitative assessment of the seismic risk to highway bridges is crucial in pre-earthquake planning and post-earthquake response of transportation systems. Such assessments provide valuable knowledge about a number of principal effects of earthquakes such as traffic disruption of the overall highway system, impact on the region's economy and post-earthquake response and recovery, and more recently serve as measures to quantify resilience. Unlike previous work, this study captures unique bridge design attributes specific to California bridge classes along with their evolution over three significant design eras, separated by the historic 1971 San Fernando and 1989 Loma Prieta earthquakes (these events effected changes in bridge seismic design philosophy). This research developed next-generation fragility curves for four multispan concrete bridge classes by synthesizing new knowledge and emerging modeling capabilities, and by closely coordinating new and ongoing national research initiatives with expertise from bridge designers. A multi-phase framework was developed for generating fragility curves, which provides decision makers with essential tools for emergency response, design, planning, policy support, and maximizing investments in bridge retrofit. This framework encompasses generational changes in bridge design and construction details. Parameterized high-fidelity three-dimensional nonlinear analytical models are developed for the portfolios of bridge classes within different design eras. These models incorporate a wide range of geometric and material uncertainties, and their responses are characterized under seismic loadings. Fragility curves were then developed considering the vulnerability of multiple components and thereby help to quantify the performance of highway bridge networks and to study the impact of seismic design principles on the performance within a bridge class. This not only leads to the development of fragility relations
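
    Fragility curves in this literature are commonly expressed as a lognormal CDF of the intensity measure, P(damage state exceeded | IM) = Phi(ln(IM/theta)/beta), where theta is the median capacity and beta the dispersion. The sketch below is generic; the parameter values are illustrative, not the study's estimates.

        import math

        def fragility(im, theta, beta):
            """Probability of exceeding a damage state at intensity measure `im`."""
            z = math.log(im / theta) / beta
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        theta, beta = 0.45, 0.55          # median capacity (g) and dispersion, assumed values
        for im in (0.1, 0.2, 0.45, 0.8, 1.2):   # spectral acceleration, g
            print(f"Sa = {im:.2f} g  ->  P(exceed) = {fragility(im, theta, beta):.3f}")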

  3. On the probability of exceeding allowable leak rates through degraded steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Cizelj, L.; Sorsek, I. [Jozef Stefan Institute, Ljubljana (Slovenia); Riesch-Oppermann, H. [Forschungszentrum Karlsruhe (Germany)

    1997-02-01

    This paper discusses some possible ways of predicting the behavior of the total leak rate through the damaged steam generator tubes. This failure mode is of special concern in cases where most through-wall defects may remain in operation. A particular example is the application of the alternate (bobbin coil voltage) plugging criterion to Outside Diameter Stress Corrosion Cracking at the tube support plate intersections. It is the authors' aim to discuss some possible modeling options that could be applied to solve the problem formulated as: Estimate the probability that the sum of all individual leak rates through degraded tubes exceeds the predefined acceptable value. The probabilistic approach is of course aiming at a reliable and computationally tractable estimate of the failure probability. A closed form solution is given for a special case of exponentially distributed individual leak rates. Also, some possibilities for the use of computationally efficient First and Second Order Reliability Methods (FORM and SORM) are discussed. The first numerical example compares the results of approximate methods with closed form results. SORM in particular shows acceptable agreement. The second numerical example considers a realistic case of the NPP in Krsko, Slovenia.
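
    For the special case mentioned, if the individual leak rates of n defective tubes are i.i.d. exponential, their sum is Erlang (Gamma) distributed, so the exceedance probability has a closed form that can be cross-checked by Monte Carlo. The parameter values below are illustrative, not the plant data.

        import numpy as np
        from scipy import stats

        n, mu, L = 40, 0.05, 3.0          # number of leaking tubes, mean leak rate, allowable total

        # Closed form: the sum of n i.i.d. Exp(mean=mu) rates is Gamma(shape=n, scale=mu).
        p_exact = stats.gamma.sf(L, a=n, scale=mu)

        # Monte Carlo cross-check.
        rng = np.random.default_rng(1)
        totals = rng.exponential(mu, size=(200_000, n)).sum(axis=1)
        p_mc = np.mean(totals > L)

        print(f"P(total leak > {L}) closed form = {p_exact:.4f}, Monte Carlo = {p_mc:.4f}")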

  4. Probability of detection model for the non-destructive inspection of steam generator tubes of PWRs

    Science.gov (United States)

    Yusa, N.

    2017-06-01

    This study proposes a probability of detection (POD) model to discuss the capability of non-destructive testing methods for the detection of stress corrosion cracks appearing in the steam generator tubes of pressurized water reactors. Three-dimensional finite element simulations were conducted to evaluate eddy current signals due to stress corrosion cracks. The simulations consider an absolute-type pancake probe and model a stress corrosion crack as a region with a certain electrical conductivity inside to account for eddy currents flowing across a flaw. The probabilistic nature of a non-destructive test is simulated by varying the electrical conductivity of the modelled stress corrosion crack. A two-dimensional POD model, which provides the POD as a function of the depth and length of a flaw, is presented together with a conventional POD model characterizing a flaw using a single parameter. The effect of the number of samples on the PODs is also discussed.

  5. Wind Turbine Power Curve Design for Optimal Power Generation in Wind Farms Considering Wake Effect

    Directory of Open Access Journals (Sweden)

    Jie Tian

    2017-03-01

    Full Text Available In modern wind farms, maximum power point tracking (MPPT) is widely implemented. Using the MPPT method, each individual wind turbine is controlled by its pitch angle and tip speed ratio to generate the maximum active power. In a wind farm, the upstream wind turbine may cause power loss to its downstream wind turbines due to the wake effect. According to the wake model, downstream power loss is also determined by the pitch angle and tip speed ratio of the upstream wind turbine. By optimizing the pitch angle and tip speed ratio of each wind turbine, the total active power of the wind farm can be increased. In this paper, the optimal pitch angle and tip speed ratio are selected for each wind turbine by exhaustive search. Considering the estimation error of the wake model, a solution to implement the optimized pitch angle and tip speed ratio is proposed, which is to generate the optimal control curves for each individual wind turbine off-line. In typical wind farms with regular layout, based on the detailed analysis of the influence of pitch angle and tip speed ratio on the total active power of the wind farm by exhaustive search, the optimization is simplified with reduced computational complexity. By using the optimized control curves, the annual energy production (AEP) is increased by 1.03% compared to using the MPPT method in a case study of a typical eighty-turbine wind farm.
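
    A commonly used wake description in such studies is the Jensen (Park) model, in which the velocity deficit behind a turbine decays with downstream distance and depends on the thrust coefficient Ct set by the control point. Whether this is the exact wake model of the paper is an assumption; the numbers below are illustrative.

        import numpy as np

        def jensen_wake_speed(u0, ct, x, rotor_radius, k=0.05):
            """Wind speed at distance x downstream of a turbine with thrust coefficient ct."""
            a = 0.5 * (1.0 - np.sqrt(1.0 - ct))             # axial induction factor
            deficit = 2.0 * a / (1.0 + k * x / rotor_radius) ** 2
            return u0 * (1.0 - deficit)

        u0, r = 10.0, 40.0                                   # free wind speed (m/s), rotor radius (m)
        for ct in (0.6, 0.75, 0.88):                         # Ct varies with pitch / tip speed ratio
            u_down = jensen_wake_speed(u0, ct, x=560.0, rotor_radius=r)
            print(f"Ct = {ct:.2f} -> downstream speed at 7 diameters: {u_down:.2f} m/s")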

  6. A generative probability model of joint label fusion for multi-atlas based brain segmentation.

    Science.gov (United States)

    Wu, Guorong; Wang, Qian; Zhang, Daoqiang; Nie, Feiping; Huang, Heng; Shen, Dinggang

    2014-08-01

    Automated labeling of anatomical structures in medical images is very important in many neuroscience studies. Recently, patch-based labeling has been widely investigated to alleviate the possible mis-alignment when registering atlases to the target image. However, the weights used for label fusion from the registered atlases are generally computed independently and thus lack the capability of preventing the ambiguous atlas patches from contributing to the label fusion. More critically, these weights are often calculated based only on the simple patch similarity, thus not necessarily providing optimal solution for label fusion. To address these limitations, we propose a generative probability model to describe the procedure of label fusion in a multi-atlas scenario, for the goal of labeling each point in the target image by the best representative atlas patches that also have the largest labeling unanimity in labeling the underlying point correctly. Specifically, sparsity constraint is imposed upon label fusion weights, in order to select a small number of atlas patches that best represent the underlying target patch, thus reducing the risks of including the misleading atlas patches. The labeling unanimity among atlas patches is achieved by exploring their dependencies, where we model these dependencies as the joint probability of each pair of atlas patches in correctly predicting the labels, by analyzing the correlation of their morphological error patterns and also the labeling consensus among atlases. The patch dependencies will be further recursively updated based on the latest labeling results to correct the possible labeling errors, which falls to the Expectation Maximization (EM) framework. To demonstrate the labeling performance, we have comprehensively evaluated our patch-based labeling method on the whole brain parcellation and hippocampus segmentation. Promising labeling results have been achieved with comparison to the conventional patch-based labeling

  7. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    Science.gov (United States)

    Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
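
    The core summary that such a tool produces can be sketched generically: for each intensity threshold, the absolute exceedance probability is computed for every sample of the epistemically uncertain model, and percentiles across samples give the Bayesian hazard curves. The synthetic tephra-load distributions below are illustrative only, not PyBetVH output.

        import numpy as np

        rng = np.random.default_rng(2)
        n_epistemic, n_aleatory = 200, 2000

        p_eruption = 0.05                                   # assumed eruption probability in the time frame
        medians = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n_epistemic)   # uncertain median load, kPa
        thresholds = np.array([0.1, 1.0, 5.0, 10.0, 50.0])                       # tephra load thresholds, kPa

        curves = np.empty((n_epistemic, thresholds.size))
        for i, m in enumerate(medians):
            loads = rng.lognormal(mean=np.log(m), sigma=1.0, size=n_aleatory)
            exceed_given_eruption = (loads[:, None] > thresholds).mean(axis=0)
            curves[i] = p_eruption * exceed_given_eruption   # absolute hazard curve for this sample

        for p in (10, 50, 90):
            print(f"{p}th percentile:", np.round(np.percentile(curves, p, axis=0), 4))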

  8. Unique heating curves generated by radiofrequency electric-field interactions with semi-aqueous solutions.

    Science.gov (United States)

    Lara, Nadia C; Haider, Asad A; Wilson, Lon J; Curley, Steven A; Corr, Stuart J

    2017-01-02

    Aqueous and nanoparticle-based solutions have been reported to heat when exposed to an alternating radiofrequency (RF) electric field. Although theoretical models have been developed to accurately describe this behavior given the solution composition as well as the geometrical constraints of the sample holder, these models have not been investigated across a wide range of solutions where the dielectric properties differ, especially with regard to the real permittivity. In this work, we investigate the RF heating properties of non-aqueous solutions composed of ethanol, propylene glycol, and glycine betaine with and without varying amounts of NaCl and LiCl. This allowed us to modulate the real permittivity across the range 25-132, as well as the imaginary permittivity across the range 37-177. Our results are in excellent agreement with the previously developed theoretical models. We have shown that different materials generate unique RF heating curves that differ from the standard aqueous heating curves. The theoretical model previously described is robust and accounts for the RF heating behavior of materials with a variety of dielectric properties, which may provide applications in non-invasive RF cancer hyperthermia.

  9. Design Curve Generation for 3D SiC Fiber Architecture

    Science.gov (United States)

    Lang, Jerry; Dicarlo, James A.

    2014-01-01

    The design tool provides design curves that offer a simple and quick way to examine multiple factors influencing the processing and key properties of the preforms and their final SiC-reinforced ceramic composites, without overcommitting financial capital to the fabrication of materials. Tool predictions for process and fiber-fraction properties have been validated for an HNS 3D preform. The virtualization aspect of the tool will be used to quickly generate solid models with actual fiber paths for finite element evaluation, to predict mechanical and thermal properties of proposed composites as well as mechanical displacement behavior due to creep and stress relaxation, and to study load-sharing characteristics between constituents for better performance. Tool predictions for the fiber-controlled properties of the SiC/SiC CMC fabricated from the HNS preforms will be evaluated and upgraded from measurements on these CMCs.

  10. Pseudorandom Bit Sequence Generator for Stream Cipher Based on Elliptic Curves

    Directory of Open Access Journals (Sweden)

    Jilna Payingat

    2015-01-01

    Full Text Available This paper proposes a pseudorandom sequence generator for stream ciphers based on elliptic curves (EC. A detailed analysis of various EC based random number generators available in the literature is done and a new method is proposed such that it addresses the drawbacks of these schemes. Statistical analysis of the proposed method is carried out using the NIST (National Institute of Standards and Technology test suite and it is seen that the sequence exhibits good randomness properties. The linear complexity analysis shows that the system has a linear complexity equal to the period of the sequence which is highly desirable. The statistical complexity and security against known plain text attack are also analysed. A comparison of the proposed method with other EC based schemes is done in terms of throughput, periodicity, and security, and the proposed method outperforms the methods in the literature. For resource constrained applications where a highly secure key exchange is essential, the proposed method provides a good option for encryption by time sharing the point multiplication unit for EC based key exchange. The algorithm and architecture for implementation are developed in such a way that the hardware consumed in addition to point multiplication unit is much less.

  11. Quality control procedures for dose-response curve generation using nanoliter dispense technologies.

    Science.gov (United States)

    Quintero, Catherine; Rosenstein, Craig; Hughes, Bethany; Middleton, Richard; Kariv, Ilona

    2007-09-01

    With the advancement of high-throughput biomolecular screening techniques to the lead optimization stage, there is a critical need to quality control (QC) dose-response curves generated by robotic liquid handlers to ensure accurate affinity determinations. One challenge in evaluating the performance of liquid handlers is identifying and validating a robust method for testing dispense volumes across different instruments. Although traditional automated liquid handlers are still considered the standard platform in many laboratories, nanoliter dispensers are becoming more common and pose new challenges for routine quality control procedures. For example, standard gravimetric measurements are unreliable for testing the accuracy of nanoliter liquid dispenses. However, nanoliter dispensing technology allows for the conservation of compound, reduces compound carryover from well to well through discrete dispenses, and eliminates the need for intermediate compound dilution steps to achieve a low final DMSO assay concentration. Moreover, an intermediate dilution step in aqueous solution might result in compound precipitation at high concentrations. This study compared representative automation procedures done on a variety of liquid dispensers, including manual, traditional, and nanodispense volumes. The data confirmed the importance of establishing robust QC procedures for dose-response generation in addition to accuracy and precision determinations for each instrument, and they validated the use of nanoliter pipettors for dose-response testing. The results of this study also support the requirement for thorough mixing during serial compound dilutions prepared for high-throughput lead optimization strategies using traditional liquid handlers.
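
    The curves these QC procedures protect are typically four-parameter logistic (Hill) dose-response fits; dispense errors shift the effective concentrations and therefore bias the fitted IC50. The sketch below is a generic fit on synthetic data, not the authors' workflow.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, log_ic50, hill):
            """Four-parameter logistic dose-response curve (IC50 parameterized in log10)."""
            return bottom + (top - bottom) / (1.0 + (conc / 10.0 ** log_ic50) ** hill)

        conc = np.logspace(-3, 2, 10)                        # uM, 10-point dilution series
        rng = np.random.default_rng(3)
        resp = four_pl(conc, 5.0, 100.0, np.log10(0.8), 1.2) + rng.normal(0, 2.0, conc.size)

        popt, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 100.0, 0.0, 1.0])
        bottom, top, log_ic50, hill = popt
        print(f"bottom={bottom:.1f}, top={top:.1f}, IC50={10**log_ic50:.3f} uM, Hill={hill:.2f}")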

  12. Cyclic Fatigue Resistance and Force Generated by OneShape Instruments during Curved Canal Preparation

    Science.gov (United States)

    Zhang, Xiaolei

    2016-01-01

    Objectives To evaluate the cyclic fatigue resistance and the force generated by OneShape files during preparation of simulated curved canals. Methods Six OneShape files (the test) and six ProTaper F2 files (the control) were subject to the bending ability test. Another thirty files of each type were used to prepare artificial canals (n = 60), which were divided into 3 groups according to respective curvatures of the canals (30°, 60°, and 90°). The numbers of cycles to fatigue (NCF) as well as the positive and negative forces that were generated by files during canal preparation were recorded. The scanning electron microscopy was applied to detect the fracture surfaces. Results Compared with ProTaper F2 files, the bending loads of OneShape files were significantly lower at deflections of 45°(P ProTaper files in 30° canals. During the preparation of 30° canals by both files, the negative forces were dominant. With the increase of the curvature, more positive forces were observed. When the OneShape Files were compared with the control, significant different forces were found at D3 and D2 (P ProTaper F2 files. PMID:27513666

  13. Impact of distributed generation in the probability density of voltage sags; Impacto da geracao distribuida na densidade de probabilidade de afundamentos de tensao

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System' s Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br

    2009-07-01

    This article presents the impact of distributed generation (DG) on studies of voltage sags caused by faults in the electrical system. We simulated short circuits to ground on 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage of the 380 V bus of an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations, with the 380 V bus monitored, were performed again. A study using stochastic Monte Carlo simulation (SMC) was performed to obtain, at each level of DG, the sag probability curves and the probability density of sags by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Program for Analysis of Simultaneous Faults (ANAFAS). In order to overcome the intrinsic limitations of the simulation methods of this program and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.

  14. Investigating Theoretical PV Energy Generation Patterns with Their Relation to the Power Load Curve in Poland

    Directory of Open Access Journals (Sweden)

    Jakub Jurasz

    2016-01-01

    Full Text Available The Polish energy sector is (almost from its origin) dominated by fossil-fuel-fired power. This situation results from an abundance of relatively cheap coal (hard coal and lignite). Brown coal is by its nature the cheapest energy source in Poland; however, hard coal, which fuels 60% of Polish power plants, is rising in price and is exposed to competition from coal imported from neighboring countries. Forced by the European Union (EU) regulations, Poland is struggling to achieve its goal of reaching a 15% share of energy consumption from renewable energy sources (RES) by 2020. Over the year 2015, RES covered 11.3% of gross energy consumption, but this generation was dominated by solid biomass (over 80%). The aim of this paper was to answer the following research questions: What is the relation of irradiation values to the power load on a yearly and daily basis? And how should photovoltaics (PV) be integrated into the Polish power system? The conducted analysis allowed us to conclude that there exists a negative correlation between power demand and irradiation values on a yearly basis, but this is likely to change in the future. Secondly, on average, daily values of irradiation tend to follow the power load curve over the first hours of the day.

  15. Making Heads or Tails of Probability: An Experiment with Random Generators

    Science.gov (United States)

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  16. Generation of Transition Probability Data: Can quantity and quality be balanced?

    Science.gov (United States)

    Curry, J. J.; Froese Fisher, C.

    2008-10-01

    The possibility of truly predictive plasma modeling rests on the availability of large quantities of accurate atomic and molecular data. These include a variety of collision cross-sections and radiative transition data. An example of current interest concerns radiative transition probabilities for neutral Ce, an additive in highly-efficient metal-halide lamps. Transition probabilities have been measured for several hundred lines (Bisson et al., JOSA B 12, 193, 1995 and Lawler et al., unpublished), but the number of observed and classified transitions in the range of 340 nm to 1 μm is in excess of 21,000 (Martin, unpublished). Since the prospect for measuring more than a thousand or so of these transitions is rather low, an important question is whether calculation can adequately fill the void. In this case, we are interested only in electric dipole transitions. Furthermore, we require only that the transition probabilities have an average accuracy of ~20%. We will discuss our efforts to calculate a comprehensive set of transition probabilities for neutral Ce using the Cowan (The Theory of Atomic Structure and Spectra, 1981) and GRASP (Jönsson et al. Comput. Phys. Commun. 176, 559-579, 2007) codes. We will also discuss our efforts to quantify the accuracy of the results.

  17. Unconventional construction of the brushless homopolar generator probably reveals a predictive weakness in Maxwell's equations

    CERN Document Server

    Ivana, Pavol; Ivanova, Marika

    2016-01-01

    The dynamic Maxwell equation expressing Faraday's law erroneously predicts that, in a brushless homopolar generator, the relative movement of the wire is equivalent to the relative motion of the conductor in a Faraday homopolar generator, and that an electric field must therefore be generated in both devices. Research has shown that it is possible to construct an experimental brushless homopolar generator, which demonstrates that the movement of an electrically neutral conductor along the radii of a homogeneous magnetic field does not induce any voltage. A new description of the operation of the Faraday (brushed) homopolar generator is presented here, as a device that satisfies both the necessary and the sufficient condition for induction, whereas the brushless homopolar generator satisfies only the necessary condition. This article includes a mathematical analysis arguing that the current differential concept of the creation of the rotational intensity vector is an incorrect theoretical notion with minimal impact on the design of kno...

  18. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    Science.gov (United States)

    Hu, Yaogang; Li, Hui; Liao, Xinglin; Song, Erbing; Liu, Haitao; Chen, Z.

    2016-08-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of wind fluctuation and intermittency, early deterioration condition evaluation poses a challenge to traditional vibration-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as deterioration indicators are not unduly disturbed by uncontrollable noise and the uncertain nature of wind. This paper provides a probability evaluation method of the early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method of the early deterioration condition was presented. Finally, two cases showed the validity of the proposed probability evaluation method in detecting the early deterioration condition and in tracking its further development for the critical components.

  19. Scale-wise evolution of rainfall probability density functions fingerprints the rainfall generation mechanism

    Science.gov (United States)

    Molini, Annalisa; Katul, Gabriel; Porporato, Amilcare

    2010-05-01

    Possible linkages between climatic fluctuations in rainfall at low frequencies and local intensity fluctuations within single storms are now receiving significant attention in climate change research. To progress on a narrower scope of this problem, the cross-scale probabilistic structure of rainfall intensity records collected over time scales ranging from hours to decades at sites dominated by either convective or frontal systems is investigated. Across these sites, intermittency buildup from slow to fast time-scales is analyzed in terms of its heavy-tailed and asymmetric signatures in the scale-wise evolution of rainfall probability density functions (pdfs). The analysis demonstrates that rainfall records dominated by convective storms develop heavier-tailed power law pdfs across finer scales when compared with their frontal systems counterpart. A concomitant marked asymmetry buildup also emerges across finer time scales, necessitating skewed probability laws for quantifying the scale-wise evolution of rainfall pdfs. A scale-dependent probabilistic description of such fat tails, peakedness and asymmetry appearance is proposed and tested by using a modified q-Gaussian model, able to describe the scale-wise evolution of rainfall pdfs in terms of the nonextensivity parameter q, a lacunarity (intermittency) correction γ and a tail asymmetry coefficient c, also functions of q.

  20. Estimating Route Choice Models from Stochastically Generated Choice Sets on Large-Scale Networks Correcting for Unequal Sampling Probability

    DEFF Research Database (Denmark)

    Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo

    2015-01-01

    is the dependency of the parameter estimates from the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes...... extracted with stochastic route generation. The term is easily applicable to large-scale networks and various environments, given its dependence only on a random number generator and the Dijkstra shortest path algorithm. The implementation for revealed preferences data, which consist of actual route choices...... collected in Cagliari, Italy, shows the feasibility of generating routes stochastically in a high-resolution network and calculating the correction factor. The model estimation with and without correction illustrates how the correction not only improves the goodness of fit but also turns illogical signs...

  1. PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Bovy, Jo; Hogg, David W.; Weaver, Benjamin A. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Myers, Adam D. [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States); Hennawi, Joseph F. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); McMahon, Richard G. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Schiminovich, David [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Sheldon, Erin S. [Brookhaven National Laboratory, Upton, NY 11973 (United States); Brinkmann, Jon [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States); Schneider, Donald P., E-mail: jo.bovy@nyu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States)

    2012-04-10

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques-which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data-and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
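
    A much-simplified sketch of the classification step only (the paper's extreme-deconvolution machinery additionally deconvolves per-object flux errors, which is not reproduced here): class-conditional densities in color space, here plain Gaussian mixtures on synthetic data, are combined with an assumed number-count prior via Bayes' rule to give a quasar probability.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        stars   = rng.normal([1.2, 0.5], 0.15, size=(3000, 2))    # synthetic (u-g, g-r) colors
        quasars = rng.normal([0.3, 0.2], 0.20, size=(1000, 2))

        gmm_star   = GaussianMixture(n_components=3, random_state=0).fit(stars)
        gmm_quasar = GaussianMixture(n_components=3, random_state=0).fit(quasars)

        prior_quasar = 0.05                                        # assumed number-count prior
        def p_quasar(colors):
            lq = np.exp(gmm_quasar.score_samples(colors)) * prior_quasar
            ls = np.exp(gmm_star.score_samples(colors)) * (1.0 - prior_quasar)
            return lq / (lq + ls)

        test = np.array([[0.3, 0.2], [1.2, 0.5]])
        print("P(quasar):", np.round(p_quasar(test), 3))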

  2. Photometric redshifts and quasar probabilities from a single, data-driven generative model

    Energy Technology Data Exchange (ETDEWEB)

    Bovy, Jo [New York Univ. (NYU), NY (United States); Myers, Adam D. [Univ. of Wyoming, Laramie, WY (United States); Max Planck Inst. for Medical Research, Heidelberg (Germany); Hennawi, Joseph F. [Max Planck Inst. for Medical Research, Heidelberg (Germany); Hogg, David W. [Max Planck Inst. for Medical Research, Heidelberg (Germany); New York Univ. (NYU), NY (United States); McMahon, Richard G. [Univ. of Cambridge (United Kingdom); Schiminovich, David [Columbia Univ., New York, NY (United States); Sheldon, Erin S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brinkmann, Jon [Apache Point Observatory and New Mexico State Univ., Sunspot, NM (United States); Schneider, Donald P. [Pennsylvania State Univ., University Park, PA (United States); Weaver, Benjamin A. [New York Univ. (NYU), NY (United States)

    2012-03-20

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.

  3. Fast dose algorithm for generation of dose coverage probability for robustness analysis of fractionated radiotherapy

    Science.gov (United States)

    Tilly, David; Ahnesjö, Anders

    2015-07-01

    A fast algorithm is constructed to facilitate dose calculation for a large number of randomly sampled treatment scenarios, each representing a possible realisation of a full treatment with geometric, fraction specific displacements for an arbitrary number of fractions. The algorithm is applied to construct a dose volume coverage probability map (DVCM) based on dose calculated for several hundred treatment scenarios to enable the probabilistic evaluation of a treatment plan. For each treatment scenario, the algorithm calculates the total dose by perturbing a pre-calculated dose, separately for the primary and scatter dose components, for the nominal conditions. The ratio of the scenario specific accumulated fluence, and the average fluence for an infinite number of fractions is used to perturb the pre-calculated dose. Irregularities in the accumulated fluence may cause numerical instabilities in the ratio, which is mitigated by regularisation through convolution with a dose pencil kernel. Compared to full dose calculations the algorithm demonstrates a speedup factor of ~1000. The comparisons to full calculations show a 99% gamma index (2%/2 mm) pass rate for a single highly modulated beam in a virtual water phantom subject to setup errors during five fractions. The gamma comparison shows a 100% pass rate in a moving tumour irradiated by a single beam in a lung-like virtual phantom. DVCM iso-probability lines computed with the fast algorithm, and with full dose calculation for each of the fractions, for a hypo-fractionated prostate case treated with rotational arc therapy were almost indistinguishable.

  4. Using Probability of Exceedance to Compare the Resource Risk of Renewable and Gas-Fired Generation

    Energy Technology Data Exchange (ETDEWEB)

    Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-08-01

    Of the myriad risks surrounding long-term investments in power plants, resource risk is one of the most difficult to mitigate, and is also perhaps the risk that most clearly distinguishes renewable generation from natural gas-fired generation. For renewable generators like wind and solar projects, resource risk manifests as a quantity risk—i.e., the risk that the quantity of wind and insolation will be less than expected. For gas-fired generators (i.e., a combined-cycle gas turbine or “CCGT”), resource risk manifests primarily as a price risk—i.e., the risk that natural gas will cost more than expected. Most often, resource risk—and natural gas price risk in particular—falls disproportionately on utility ratepayers, who are typically not well-equipped to manage this risk. As such, it is incumbent upon utilities, regulators, and policymakers to ensure that resource risk is taken into consideration when making or approving resource decisions, or enacting policies that influence the development of the electricity sector more broadly.
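
    The probability-of-exceedance convention referenced in the title works as follows: P50 and P90 are the generation levels exceeded with 50% and 90% probability. The sketch below assumes a normally distributed annual energy production with an illustrative 10% uncertainty; the numbers are not from the report.

        import numpy as np

        p50_gwh = 350.0            # central estimate of annual energy production (assumed)
        sigma   = 0.10 * p50_gwh   # combined uncertainty, assumed 10% of P50

        rng = np.random.default_rng(5)
        aep = rng.normal(p50_gwh, sigma, size=100_000)

        for exceedance in (50, 75, 90, 99):
            level = np.percentile(aep, 100 - exceedance)   # value exceeded `exceedance`% of the time
            print(f"P{exceedance}: {level:.1f} GWh/yr")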

  5. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm-based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
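
    A hedged toy sketch of the idea: evolve input signal probabilities that maximize, here, the worst-case detection probability of two hypothetical stuck-at faults in a tiny circuit y = (x1&x2) | (x3&x4), computed under COP-style independence assumptions. This is an illustration only, not the paper's cost function or GA settings.

        import numpy as np

        rng = np.random.default_rng(6)

        def fitness(p):
            """Worst-case detection probability of two toy faults (higher is better)."""
            det_a = p[0] * p[1] * (1.0 - p[2] * p[3])   # detect stuck-at-0 at AND(x1, x2)
            det_b = p[2] * p[3] * (1.0 - p[0] * p[1])   # detect stuck-at-0 at AND(x3, x4)
            return min(det_a, det_b)

        pop_size, n_genes, n_gen = 40, 4, 100
        pop = rng.random((pop_size, n_genes))           # initial population of probability vectors

        for _ in range(n_gen):
            scores = np.array([fitness(ind) for ind in pop])
            best = pop[np.argmax(scores)].copy()
            # Tournament selection: keep the better of two random individuals.
            idx = rng.integers(0, pop_size, (pop_size, 2))
            winners = np.where(scores[idx[:, 0]] >= scores[idx[:, 1]], idx[:, 0], idx[:, 1])
            parents = pop[winners]
            # Uniform crossover between each parent and the next one in the array.
            mask = rng.random((pop_size, n_genes)) < 0.5
            children = np.where(mask, parents, np.roll(parents, 1, axis=0))
            # Gaussian mutation on ~20% of genes, clipped back to valid probabilities.
            mutate = rng.random(children.shape) < 0.2
            children = np.clip(children + mutate * rng.normal(0.0, 0.05, children.shape), 0.0, 1.0)
            children[0] = best                          # elitism
            pop = children

        scores = np.array([fitness(ind) for ind in pop])
        print("best probabilities:", np.round(pop[np.argmax(scores)], 3))
        print("best fitness:", round(scores.max(), 4))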

  6. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    DEFF Research Database (Denmark)

    Hu, Y.; Li, H.; Liao, X;

    2016-01-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of wind fluctuation and intermittency, early deterioration condition evaluation poses a challenge to the traditional vibration......-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as deterioration indicators are not unduly disturbed by uncontrollable noise and the uncertain nature of wind. This paper provides a probability evaluation...... method of early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method...

  7. Experimental Generation of the Intersection Curve of Two Cylinders: An Algorithm Based on a New Paradigm

    Directory of Open Access Journals (Sweden)

    Ioan Halalae

    2015-07-01

    Full Text Available Computational Geometry is currently using approximation techniques based on convex polygons, with good results in some topics, but with severe limitation of applicability. In this paper we present the first step of a larger project, aiming to lead us to the plane projection of a 3D surface obtained by intersecting two cylinders (a very frequent problem in obtaining the stencils for welded ensembles. As a first step, we present an algorithm for experimentally obtaining the intersection curve of two cylinders. The new paradigm and also the fundamentally new aspect of our project is that we construct our algorithm by simulation of the analytical relations describing the curve, and not by approximation using convex polygons.

  8. Influence of disorder on generation and probability of extreme events in Salerno lattices

    Science.gov (United States)

    Mančić, A.; Maluckov, A.; Hadžievski, Lj.

    2017-03-01

    Extreme events (EEs) in nonlinear and/or disordered one-dimensional photonic lattice systems described by the Salerno model with on-site disorder are studied. The goal is to explain particular properties of these phenomena, essentially related to localization of light in the presence of nonlinear and/or nonlocal couplings in the considered systems. Combining statistical and nonlinear dynamical methods and measures developed in the framework of the theory of localization phenomena in disordered and nonlinear systems, particularities of EEs are qualitatively clarified. Findings presented here indicate that the best environment for the creation of EEs is provided by disordered near-integrable Salerno lattices. In addition, it has been shown that the leading role in the generation and dynamical properties of EEs in the considered model is played by modulation instability, i.e., by nonlinearities in the system, although EEs can be induced in linear lattices with on-site disorder too.

  9. Generation of a Sediment Rating and Load Curve Demonstrated at the Mackinaw River Confluence

    Science.gov (United States)

    2016-12-01

    Demonstrated at the Mackinaw River Confluence, by Jeremy A. Sharp and Ronald E. Heath. PURPOSE: This Coastal and Hydraulics Engineering Technical Note ... the gage), flood flow frequency analysis, and normal depth computations. These data are calculated using the collected data. Once gathered, the ... have a long (at least 20 years) period of record. This provides the means to calculate the flood flow frequency curve. Additional collected data are ...

  10. Phase-transient hierarchical turbulence as an energy correlation generator of blazar light curves

    CERN Document Server

    Honda, Mitsuru

    2008-01-01

    Hierarchical turbulent structure constituting a jet is considered to reproduce energy-dependent variability in blazars, particularly, the correlation between X- and gamma-ray light curves measured in the TeV blazar Markarian 421. The scale-invariant filaments are featured by the ordered magnetic fields that involve hydromagnetic fluctuations serving as electron scatterers for diffusive shock acceleration, and the spatial size scales are identified with the local maximum electron energies, which are reflected in the synchrotron spectral energy distribution (SED) above the near-infrared/optical break. The structural transition of filaments is found to be responsible for the observed change of spectral hysteresis.

  11. Influence of design and mode parameters on pump performance curve of heat generating aggregate

    Science.gov (United States)

    Barykin, O.; Kovalyov, S.; Ovcharenko, M.; Papchenko, A.

    2017-08-01

    A classification of multi-functional heat generating aggregates according to their function is considered in this article. A mathematical model of the operating process is analysed and methods for its refinement are proposed. Results of a physical investigation of the influence of the design and mode parameters of a heat generating aggregate on its power and head are presented.

  12. Relay protection coordination with generator capability curve, excitation system limiters and power system relay protections settings

    Directory of Open Access Journals (Sweden)

    Buha Danilo

    2016-01-01

    Full Text Available The relay protection settings applied in the largest thermal power plant (TE "Nikola Tesla B") are presented and explained in this paper. The first calculation step concerns the coordination of the maximum stator current limiter settings, the settings of the overcurrent protection with inverse characteristics and the permitted overload of the generator stator B1. In the second calculation step the settings of the generator impedance protection are determined, and the methods and criteria according to which the calculations are done are described. The criteria used to ensure that this protection fulfils the backup protection role in the event of malfunction of the main protection of the transmission system are clarified. The calculation of all protection functions (32 functions) of generator B1 was performed in the project "Coordination of relay protection of blocks B1 and B2 with the system of excitation and power system protections - TENT B".

  13. The application of numerical debris flow modelling for the generation of physical vulnerability curves

    Directory of Open Access Journals (Sweden)

    B. Quan Luna

    2011-07-01

    Full Text Available For a quantitative assessment of debris flow risk, it is essential to consider not only the hazardous process itself but also to perform an analysis of its consequences. This should include the estimation of the expected monetary losses as the product of the hazard with a given magnitude and the vulnerability of the elements exposed. A quantifiable integrated approach of both hazard and vulnerability is becoming a required practice in risk reduction management. This study aims at developing physical vulnerability curves for debris flows through the use of a dynamic run-out model. Dynamic run-out models for debris flows are able to calculate physical outputs (extension, depths, velocities, impact pressures) and to determine the zones where the elements at risk could suffer an impact. These results can then be applied to consequence analyses and risk calculations. On 13 July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the Valtellina Valley (Lombardy Region, Northern Italy). One of the largest debris flow events occurred in a village called Selvetta. The debris flow event was reconstructed after extensive field work and interviews with local inhabitants and civil protection teams. The Selvetta event was modelled with the FLO-2D program, an Eulerian formulation with a finite differences numerical scheme that requires the specification of an input hydrograph. The internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The behaviour and run-out of the flow was reconstructed. The significance of the calculated values of the flow depth, velocity, and pressure was investigated in terms of the resulting damage to the affected buildings. The physical damage was quantified for each affected structure within the context of physical vulnerability, which was calculated as the ratio between the monetary loss and the reconstruction value. Three

  14. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  15. Wind Turbine Power Curve Design for Optimal Power Generation in Wind Farms Considering Wake Effect

    DEFF Research Database (Denmark)

    Tian, Jie; Zhou, Dao; Su, Chi

    2017-01-01

    In modern wind farms, maximum power point tracking (MPPT) is widely implemented. Using the MPPT method, each individual wind turbine is controlled by its pitch angle and tip speed ratio to generate the maximum active power. In a wind farm, the upstream wind turbine may cause power loss to its downstream wind turbines due to the wake effect. According to the wake model, downstream power loss is also determined by the pitch angle and tip speed ratio of the upstream wind turbine. By optimizing the pitch angle and tip speed ratio of each wind turbine, the total active power of the wind farm can be increased. In this paper, the optimal pitch angle and tip speed ratio are selected for each wind turbine by exhaustive search. Considering the estimation error of the wake model, a solution to implement the optimized pitch angle and tip speed ratio is proposed, which is to generate the optimal control...

  16. A unified approach for a posteriori high-order curved mesh generation using solid mechanics

    Science.gov (United States)

    Poya, Roman; Sevilla, Ruben; Gil, Antonio J.

    2016-09-01

    The paper presents a unified approach for the a posteriori generation of arbitrary high-order curvilinear meshes via a solid mechanics analogy. The approach encompasses a variety of methodologies, ranging from the popular incremental linear elastic approach to very sophisticated non-linear elasticity. In addition, an intermediate consistent incrementally linearised approach is also presented and applied for the first time in this context. Utilising a consistent derivation from energy principles, a theoretical comparison of the various approaches is presented which enables a detailed discussion regarding the material characterisation (calibration) employed for the different solid mechanics formulations. Five independent quality measures are proposed and their relations with existing quality indicators, used in the context of a posteriori mesh generation, are discussed. Finally, a comprehensive range of numerical examples, both in two and three dimensions, including challenging geometries of interest to the solids, fluids and electromagnetics communities, are shown in order to illustrate and thoroughly compare the performance of the different methodologies. This comparison considers the influence of material parameters and number of load increments on the quality of the generated high-order mesh, overall computational cost and, crucially, the approximation properties of the resulting mesh when considering an isoparametric finite element formulation.

  17. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.

  18. Generating a learning curve for pediatric caudal epidural blocks: an empirical evaluation of technical skills in novice and experienced anesthetists.

    Science.gov (United States)

    Schuepfer, G; Konrad, C; Schmeck, J; Poortmans, G; Staffelbach, B; Jöhr, M

    2000-01-01

    Learning curves for anesthesia procedures in adult patients have been determined, but no data are available on procedures in pediatric anesthesia. The aim of this study was to assess the number of caudal blocks needed to guarantee a high success rate in performing caudal epidural analgesia in children. At a teaching hospital, the technical skills of 7 residents in anesthesiology who performed caudal blocks were evaluated during 4 months using a standardized self-evaluation questionnaire. At the start of the study period, the residents had no prior experience in pediatric anesthesia or in performing caudal epidural blocks. All residents entered the pediatric rotation after a minimum of 1 year of training in adult general and regional anesthesia. The blocks were rated using a binary score. For comparison, the success rates of 8 experienced staff anesthesiologists were collected during the same period using the same self-evaluation questionnaire. Statistical analyses were performed by generating individual and institutional learning curves using the pooled data. The learning curves were calculated with the aid of a least-square fit model and 95% confidence intervals were estimated by a Monte Carlo procedure with a bootstrap technique. The success rate of residents was 80% after 32 procedures (95% confidence interval of 0.59 to 1.00). The pooled success rate of the staff anesthesiologists was 0.73 (mean) with a standard deviation of 0.45, which was not statistically different from the success rate of the residents. High success rates in performing caudal anesthesia in pediatric patients can be acquired after a limited number of cases. Success rates of residents learning this procedure are comparable to the results of staff anesthesiologists.
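
    A small sketch in the spirit of the analysis described above, assuming a saturating-exponential learning-curve model (the functional form is an assumption, not stated in the record) fitted by least squares to binary success data, with a bootstrap loop mirroring the reported Monte Carlo confidence-interval estimation:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def learning_curve(n, p_max, rate):
        """Assumed model: success probability rises towards p_max with experience n."""
        return p_max * (1.0 - np.exp(-rate * n))

    def fit_learning_curve(attempt, success, n_boot=1000, seed=0):
        """Least-squares fit to binary outcomes plus a bootstrap 95% confidence band."""
        rng = np.random.default_rng(seed)
        p0, bounds = [0.8, 0.1], ([0.0, 0.0], [1.0, 5.0])
        popt, _ = curve_fit(learning_curve, attempt, success, p0=p0, bounds=bounds)
        boot = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(success), len(success))   # resample with replacement
            try:
                b, _ = curve_fit(learning_curve, attempt[idx], success[idx], p0=popt, bounds=bounds)
                boot.append(b)
            except RuntimeError:
                continue
        lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
        return popt, lo, hi

    # Simulated outcomes for one trainee: 40 blocks, success more likely with experience.
    rng = np.random.default_rng(1)
    n = np.arange(1, 41)
    outcome = (rng.random(40) < learning_curve(n, 0.9, 0.12)).astype(float)
    params, lower, upper = fit_learning_curve(n, outcome)
    ```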

  19. Fundamental investigations of natural and laboratory generated SAR dose response curves for quartz OSL in the high dose range

    DEFF Research Database (Denmark)

    Timar-Gabor, Alida; Constantin, Daniela; Buylaert, Jan-Pieter

    2015-01-01

    SAR-OSL investigations on quartz from Romanian loess resulted in non-concordant fine and coarse-grain ages for equivalent doses higher than ~100 Gy. The laboratory dose response for both grain sizes is well represented by a sum of two saturating exponential functions, fine and coarse grains being characterised by D01 and D02 values of ~140 and ~1400 Gy and ~65 and ~650 Gy respectively. Pulsed OSL experiments confirmed that this behaviour is almost certainly inherent to quartz and not caused by contamination with another mineral. Natural dose-response curves do not follow the same pattern and enter saturation much earlier. Analysis of time resolved spectra indicated similar luminescence lifetimes for both fine and coarse quartz grains, and natural and laboratory generated OSL signals seem to use the same non-dose-dependent recombination pathways. The natural signals of a sample with an expected equivalent dose of 2000-2500 Gy were found to be below the saturation level of the laboratory dose response curve for both grain sizes; this also applied to the luminescence signals measured after >5000 Gy given on top of natural doses. © 2015 Elsevier Ltd. All rights reserved.

  20. Aspectos genéticos de curvas de probabilidade de postura em codornas Genetic aspects of laying probability curves in quails

    Directory of Open Access Journals (Sweden)

    Robson Marcelo Rossi

    2010-08-01

    Full Text Available This work evaluated the covariance components and the heritabilities of the parameters of the egg production curve in quails using a two-stage model: the first stage consists of fitting a nonlinear curve, and the second of evaluating the genetic parameters obtained with an animal model. Daily individual laying records up to 90 days, counted from the first egg in the batch, of 308, 374 and 378 birds from three strains were used. Within each strain, there were no differences between the covariance components or the heritabilities for diets containing different energy levels (2,900 or 2,500 kcal/kg ME), indicating no heterogeneity of variance. However, the genetic correlations were in the range of 0.53 to 0.65 for parameter α and 0.28 to 0.30 for β, indicating a genotype × environment interaction. The heritability estimates for parameters α and β in the three strains were, respectively, 0.21 and 0.50, 0.12 and 0.49, and 0.12 and 0.48 on the high-energy diet, and 0.23 and 0.50, 0.13 and 0.50, and 0.10 and 0.47 on the low-energy diet. The estimates of the covariance components and heritabilities differed between the strains for both parameters of the curve. One strain showed higher heritability for parameter α, indicating greater potential for change through selection for production in the early laying phase. Considering parameter β, the three strains have the same potential.

  1. Expected utility versus expected regret theory versions of decision curve analysis do generate different results when treatment effects are taken into account.

    Science.gov (United States)

    Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin

    2016-12-15

    Decision curve analysis (DCA) is a widely used method for evaluating diagnostic tests and predictive models. It was developed based on expected utility theory (EUT) and has been reformulated using expected regret theory (ERG). Under certain circumstances, these 2 formulations yield different results. Here we describe these situations and explain the variation. We compare the derivations of the EUT- and ERG-based formulations of DCA for a typical medical decision problem: "treat none," "treat all," or "use model" to guide treatment. We illustrate the differences between the 2 formulations when applied to the following clinical question: at which probability of death we should refer a terminally ill patient to hospice? Both DCA formulations yielded identical but mirrored results when treatment effects are ignored; they generated significantly different results otherwise. Treatment effect has a significant effect on the results derived by EUT DCA and less so on ERG DCA. The elicitation of specific values for disutilities affected the results even more significantly in the context of EUT DCA, whereas no such elicitation was required within the ERG framework. EUT and ERG DCA generate different results when treatment effects are taken into account. The magnitude of the difference depends on the effect of treatment and the disutilities associated with disease and treatment effects. This is important to realize as the current practice guidelines are uniformly based on EUT; the same recommendations can significantly differ if they are derived based on ERG framework. © 2016 The Authors. Journal of Evaluation in Clinical Practice Published by John Wiley & Sons Ltd.

  2. Heat transfer and pressure drop characteristics of the tube bank fin heat exchanger with fin punched with flow redistributors and curved triangular vortex generators

    Science.gov (United States)

    Liu, Song; Jin, Hua; Song, KeWei; Wang, LiangChen; Wu, Xiang; Wang, LiangBi

    2017-10-01

    The heat transfer performance of the tube bank fin heat exchanger is limited by the air-side thermal resistance. Thus, enhancing the air-side heat transfer is an effective method to improve the performance of the heat exchanger. A new fin pattern with flow redistributors and curved triangular vortex generators is experimentally studied in this paper. The effects of the flow redistributors located in front of the tube stagnation point and the curved vortex generators located around the tube on the characteristics of heat transfer and pressure drop are discussed in detail. A performance comparison is also carried out between the fins with and without flow redistributors. The experimental results show that the flow redistributors stamped out from the fin in front of the tube stagnation points can decrease the friction factor at the cost of decreasing the heat transfer performance. Whether the combination of the flow redistributors and the curved vortex generators will present a better heat transfer performance depends on the size of the curved vortex generators. As for the studied two sizes of vortex generators, the heat transfer performance is promoted by the flow redistributors for the fin with larger size of vortex generators and the performance is suppressed by the flow redistributors for the fin with smaller vortex generators.

  3. Doubly-Fed Induction Generator Drive System Based on Maximum Power Curve Searching using Fuzzy Logic Controller

    Directory of Open Access Journals (Sweden)

    Abdelhak Dida

    2015-02-01

    Full Text Available This paper proposes a novel variable speed control algorithm for a grid connected doubly-fed induction generator (DFIG) system. The main objective is to track the maximum power curve characteristic by using an adaptive fuzzy logic controller, and to compare it with the conventional optimal torque control method for large inertia wind turbines. The role of the FLC is to adapt the transfer function of the harvested mechanical power controller according to the operating point in variable wind speed. The control system has two sub-systems for the rotor side and the grid side converters (RSC, GSC). Active and reactive power control of the back-to-back converters has been achieved indirectly by controlling q-axis and d-axis current components. The main function of the RSC controllers is to track the maximum power through controlling the electromagnetic torque of the wind turbine. The GSC controls the DC-link voltage, and guarantees unity power factor between the GSC and the grid. The proposed system is developed and tested in the MATLAB/SimPowerSystem (SPS) environment.

  4. Automatable on-line generation of calibration curves and standard additions in solution-cathode glow discharge optical emission spectrometry

    Science.gov (United States)

    Schwartz, Andrew J.; Ray, Steven J.; Hieftje, Gary M.

    2015-03-01

    Two methods are described that enable on-line generation of calibration standards and standard additions in solution-cathode glow discharge optical emission spectrometry (SCGD-OES). The first method employs a gradient high-performance liquid chromatography pump to perform on-line mixing and delivery of a stock standard, sample solution, and diluent to achieve a desired solution composition. The second method makes use of a simpler system of three peristaltic pumps to perform the same function of on-line solution mixing. Both methods can be computer-controlled and automated, and thereby enable both simple and standard-addition calibrations to be rapidly performed on-line. Performance of the on-line approaches is shown to be comparable to that of traditional methods of sample preparation, in terms of calibration curves, signal stability, accuracy, and limits of detection. Potential drawbacks to the on-line procedures include signal lag between changes in solution composition and pump-induced multiplicative noise. Though the new on-line methods were applied here to SCGD-OES to improve sample throughput, they are not limited in application to only SCGD-OES: any instrument that samples from flowing solution streams (flame atomic absorption spectrometry, ICP-OES, ICP-mass spectrometry, etc.) could benefit from them.

  5. J-resistance curves for Inconel 690 and Incoloy 800 nuclear steam generators tubes at room temperature and at 300 °C

    Science.gov (United States)

    Bergant, Marcos A.; Yawny, Alejandro A.; Perez Ipiña, Juan E.

    2017-04-01

    The structural integrity of steam generator tubes is a relevant issue concerning nuclear plant safety. In the present work, J-resistance curves of Inconel 690 and Incoloy 800 nuclear steam generator tubes with circumferential and longitudinal through wall cracks were obtained at room temperature and 300 °C using recently developed non-standard specimen geometries. It was found that Incoloy 800 tubes exhibited higher J-resistance curves than Inconel 690 for both crack orientations. For both materials, circumferential cracks resulted in higher fracture resistance than longitudinal cracks, indicating a certain degree of texture anisotropy introduced by the tube fabrication process. From a practical point of view, temperature effects have been found to be negligible in all cases. The results obtained in the present work provide a general framework for further application to structural integrity assessments of cracked tubes in a variety of nuclear steam generator designs.

  6. Determination of PV Generator I-V/P-V Characteristic Curves Using a DC-DC Converter Controlled by a Virtual Instrument

    Directory of Open Access Journals (Sweden)

    E. Durán

    2012-01-01

    Full Text Available A versatile measurement system for the systematic testing and measurement of the evolution of the I-V characteristic curves of photovoltaic panels or arrays (PV generators) is proposed in this paper. The measurement system uses a circuit solution based on DC-DC converters that offers several advantages over traditional methods: simple structure, scalability, fast response, and low cost. The desired characteristics of PV generators are measured with a high speed of response and high fidelity. The prototype system built is governed by a microcontroller, and experimental results prove the proposed measurement system useful. A virtual instrument (VI) was developed for full system control from a computer. The developed system enables monitoring the suitable operation of a PV generator in real time, since it allows comparing its actual curves with those provided by the manufacturer.

  7. Application of dissociation curve analysis to radiation hybrid panel marker scoring: generation of a map of river buffalo (B. bubalis chromosome 20

    Directory of Open Access Journals (Sweden)

    Schäffer Alejandro A

    2008-11-01

    Full Text Available Abstract Background Fluorescence of dyes bound to double-stranded PCR products has been utilized extensively in various real-time quantitative PCR applications, including post-amplification dissociation curve analysis, or differentiation of amplicon length or sequence composition. Despite the current era of whole-genome sequencing, mapping tools such as radiation hybrid DNA panels remain useful aids for sequence assembly, focused resequencing efforts, and for building physical maps of species that have not yet been sequenced. For placement of specific, individual genes or markers on a map, low-throughput methods remain commonplace. Typically, PCR amplification of DNA from each panel cell line is followed by gel electrophoresis and scoring of each clone for the presence or absence of PCR product. To improve sensitivity and efficiency of radiation hybrid panel analysis in comparison to gel-based methods, we adapted fluorescence-based real-time PCR and dissociation curve analysis for use as a novel scoring method. Results As proof of principle for this dissociation curve method, we generated new maps of river buffalo (Bubalus bubalis chromosome 20 by both dissociation curve analysis and conventional marker scoring. We also obtained sequence data to augment dissociation curve results. Few genes have been previously mapped to buffalo chromosome 20, and sequence detail is limited, so 65 markers were screened from the orthologous chromosome of domestic cattle. Thirty bovine markers (46% were suitable as cross-species markers for dissociation curve analysis in the buffalo radiation hybrid panel under a standard protocol, compared to 25 markers suitable for conventional typing. Computational analysis placed 27 markers on a chromosome map generated by the new method, while the gel-based approach produced only 20 mapped markers. Among 19 markers common to both maps, the marker order on the map was maintained perfectly. Conclusion Dissociation curve

  8. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi-LAT data

    CERN Document Server

    Lott, B; Larsson, S; Ballet, J

    2012-01-01

    We present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi-Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than with the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any sources. This method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The reported mean flux and spectral index (assuming the spectrum is a power-law distribution) in the interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established by means of Monte-Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
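
    A highly simplified sketch of the adaptive-binning idea, assuming plain Poisson photon counts in place of the full LAT likelihood analysis: each bin is extended until its estimated relative flux uncertainty falls below a target.

    ```python
    import numpy as np

    def adaptive_bin_edges(event_times, target_rel_err=0.15):
        """Grow each time bin until the Poisson relative uncertainty sqrt(N)/N of its
        photon count reaches the target, i.e. until N >= 1 / target_rel_err**2."""
        n_min = int(np.ceil(1.0 / target_rel_err ** 2))
        edges, count = [event_times[0]], 0
        for t in event_times:
            count += 1
            if count >= n_min:
                edges.append(t)
                count = 0
        if edges[-1] < event_times[-1]:
            edges.append(event_times[-1])        # close the last (under-filled) bin
        return np.asarray(edges)

    # Synthetic arrival times from a source that brightens halfway through.
    rng = np.random.default_rng(2)
    times = np.sort(np.concatenate([rng.uniform(0, 50, 800), rng.uniform(50, 100, 2400)]))
    edges = adaptive_bin_edges(times, target_rel_err=0.10)
    flux = np.histogram(times, bins=edges)[0] / np.diff(edges)   # counts per unit time
    ```

    Bright intervals then receive many short bins and faint intervals a few long ones, so every flux point carries roughly the same relative uncertainty.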

  9. Simulating Supernova Light Curves

    Energy Technology Data Exchange (ETDEWEB)

    Even, Wesley Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dolence, Joshua C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-05

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth’s atmosphere.

  10. An algorithm based on elliptic interpolation for generating random curves%一种基于椭圆插值的随机曲线生成算法

    Institute of Scientific and Technical Information of China (English)

    李玲; 魏玮

    2012-01-01

    In computer graphics, building models of complex structures and shapes is a core problem. Random curves also play an important role in domains such as computer games, movies, architectural models, urban planning and virtual reality. This paper studies an algorithm for randomly generating curves in binary images. The first step is to generate the starting point and ending point randomly; the second step is to generate an interpolation point by random interpolation inside an ellipse; the third step is to generate the next interpolation point, taking the newly generated point and its adjacent point as the new starting and ending points, and so on, until the whole curve, composed of the interpolation points, is obtained. To ensure the convergence of the curve, it is assumed that the interpolation points generated earlier have a larger influence on the shape of the curve, while the points generated later have a smaller influence; moreover, the next interpolation step is performed only between adjacent interpolation points. The random curve generation algorithm based on interpolation inside ellipses was implemented in Visual C++, the results were analysed in detail, and corresponding explanations and conclusions are given to improve the user interaction system.
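
    A brief re-implementation sketch of the procedure described above (in Python rather than Visual C++), with hypothetical function names and a shrinking-ellipse rule standing in for the stated convergence assumption that earlier points influence the curve more than later ones:

    ```python
    import math, random

    def random_point_in_ellipse(p, q, flatness=0.5):
        """Sample a point uniformly inside the ellipse whose major axis is segment pq."""
        cx, cy = (p[0] + q[0]) / 2, (p[1] + q[1]) / 2
        a = math.hypot(q[0] - p[0], q[1] - p[1]) / 2          # semi-major axis
        b = flatness * a                                      # semi-minor axis
        theta = math.atan2(q[1] - p[1], q[0] - p[0])
        r, phi = math.sqrt(random.random()), random.uniform(0, 2 * math.pi)
        x, y = a * r * math.cos(phi), b * r * math.sin(phi)
        return (cx + x * math.cos(theta) - y * math.sin(theta),
                cy + x * math.sin(theta) + y * math.cos(theta))

    def random_curve(start, end, depth=6, flatness=0.5, decay=0.6):
        """Recursively insert an interpolation point between each adjacent pair; the
        ellipse flattens at each level (decay < 1) so early points shape the curve most."""
        pts = [start, end]
        for level in range(depth):
            f = flatness * decay ** level
            refined = [pts[0]]
            for p, q in zip(pts, pts[1:]):
                refined += [random_point_in_ellipse(p, q, f), q]
            pts = refined
        return pts

    curve = random_curve((0.0, 0.0), (100.0, 40.0))
    ```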

  11. Dynamical Simulation of Probabilities

    Science.gov (United States)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.

  12. Development of computational models for the simulation of isodose curves on dosimetry films generated by iodine-125 brachytherapy seeds

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Adriano M.; Meira-Belo, Luiz C.; Reis, Sergio C.; Grynberg, Suely E., E-mail: amsantos@cdtn.b [Center for Development of Nuclear Technology (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-07-01

    Interstitial brachytherapy is a modality of radiotherapy in which radioactive sources are placed directly in the region to be treated or close to it. The seeds that are used in the treatment of prostate cancer are generally cylindrical radioactive sources, consisting of a ceramic or metal matrix, which acts as the carrier of the radionuclide and as the X-ray marker, encapsulated in a sealed titanium tube. This study aimed to develop a computational model to reproduce the film-seed geometry, in order to obtain the spatial regions of the isodose curves produced by the seed when it is placed over the film surface. The seed modeled in this work was the OncoSeed 6711, a sealed source of iodine-125, whose isodose curves were obtained experimentally in previous work with the use of dosimetric films. For the film modeling, the compositions and densities of two types of dosimetric films were used: the Agfa Personal Monitoring photographic film 2/10, manufactured by Agfa-Gevaert; and the model EBT radiochromic film, by International Specialty Products. The film-seed models were coupled to the Monte Carlo code MCNP5. The simulation results were in good agreement with the experimental results obtained in a previous work. This indicates that the computational model can be used in future studies for other seed models. (author)

  13. Evaluation of the Plug-in Hybrid Electric Vehicle Considering Learning Curve on Battery and Power Generation Best Mix

    Science.gov (United States)

    Shinoda, Yukio; Tanaka, Hideo; Akisawa, Atsushi; Kashiwagi, Takao

    The Plug-in Hybrid Electric Vehicle (PHEV) is one of the technologies available to reduce CO2 emissions in the transport sector. This paper presents a scenario of how widely PHEVs may be adopted in the future, how much CO2 would be reduced by their introduction, and whether such scenarios would have any serious effects on the power supply system. A PHEV can run on both gasoline and electricity, so CO2 emissions are evaluated for both gasoline and electricity consumption. Considering the distribution of daily trip distances is important for evaluating the economic merit and the CO2 emissions of PHEV introduction, and the future battery cost is critical for constructing a PHEV growth scenario: growth in the number of PHEVs lowers the battery cost. We therefore formulate an integrated model that combines the passenger car sector and the power supply sector, taking into account the distribution of daily trip distances and a learning curve for battery costs. Because the learning curve is non-linear, an iterative method is used; the battery cost is specified only for the first year of the simulation, and costs in later years are calculated within the model. We focus on a 25-year time frame from 2010 in Japan, divided into five terms (1st-5th), and the model selects the most economical composition of car types and power sources.
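
    A toy sketch of the iteration described above, assuming a Wright's-law learning curve (cost falls by a fixed fraction per doubling of cumulative battery production) and a hypothetical demand response; parameter values are illustrative, not the paper's:

    ```python
    import math

    def converge_battery_cost(initial_cost, demand_at, learning_rate=0.15,
                              base_cum_gwh=10.0, tol=1e-6, max_iter=100):
        """Fixed-point iteration coupling battery cost and PHEV uptake.
        Assumed Wright's-law curve: cost falls by `learning_rate` per doubling of
        cumulative production. `demand_at(cost)` is the cumulative battery demand
        (GWh) the rest of the model would produce at a given cost."""
        cost = initial_cost
        for _ in range(max_iter):
            cumulative = base_cum_gwh + demand_at(cost)
            new_cost = initial_cost * (cumulative / base_cum_gwh) ** math.log2(1.0 - learning_rate)
            if abs(new_cost - cost) < tol:
                return new_cost
            cost = new_cost
        return cost

    # Toy demand response (illustrative): cheaper batteries -> more PHEVs -> more GWh.
    print(converge_battery_cost(300.0, demand_at=lambda c: 200.0 * (300.0 / c)))
    ```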

  14. Changes in Sexual Behavior and Attitudes Across Generations and Gender Among a Population-Based Probability Sample From an Urbanizing Province in Thailand.

    Science.gov (United States)

    Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro

    2016-02-01

    Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies.

  15. Quantifying the climate change-induced variations in Saskatoon's Intensity-Duration-Frequency curves using stochastic rainfall generators and K-nearest neighbors

    Science.gov (United States)

    Shahabul Alam, Md.; Nazemi, Alireza; Elshorbagy, Amin

    2014-05-01

    Intensity-Duration-Frequency (IDF) curves are among the standard design criteria for various engineering applications, such as storm water management systems. A warming climate, however, changes the extreme rainfall quantiles represented by the IDF curves. This study attempts to construct future IDF curves under possible climate change scenarios. For this purpose, a stochastic rainfall generator is used to spatially downscale the daily projections of Global Climate Models (GCMs) from coarse grid resolution to the point scale. The stochastically downscaled daily rainfall realizations can be further disaggregated to hourly and sub-hourly rainfall series using a deterministic disaggregation scheme developed based on the K-Nearest Neighbor (K-NN) method. We applied this framework to construct the future IDF curves in the city of Saskatoon, Canada. As a model development step, the sensitivity of the K-NN disaggregation model to the number of nearest neighbors (i.e. window size) is evaluated during the baseline periods. The optimum window size is assigned based on the performance in reproducing the historical IDF curves. The optimum windows identified for 1-hour and 5-min temporal resolutions are then used to produce the future hourly and, consequently, 5-min resolution rainfall based on the K-NN simulations. By using the simulated hourly and sub-hourly rainfall series and the Generalized Extreme Value (GEV) distribution, future changes in IDF curves and associated uncertainties are quantified using a large ensemble of projections obtained for CGCM3.1 and HadCM3, based on the A1B, A2 and B1 emission scenarios in the case of CMIP3 and on RCP2.6, RCP4.5, and RCP8.5 in the case of CMIP5. The constructed IDF curves for the city of Saskatoon are then compared with corresponding historical relationships at various durations and/or return periods and are discussed based on different models, emission scenarios and/or simulation release (i.e. CMIP3 vs. CMIP5).
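
    A minimal sketch of the final step described above, fitting a GEV distribution to annual maximum intensities for a single duration and reading off return levels; the data and parameter values here are synthetic assumptions:

    ```python
    import numpy as np
    from scipy.stats import genextreme

    def idf_points(annual_max_intensity, return_periods=(2, 5, 10, 25, 50, 100)):
        """Fit a GEV to annual maxima for one duration and return the intensity for
        each return period T (the level exceeded on average once every T years)."""
        shape, loc, scale = genextreme.fit(annual_max_intensity)
        probs = 1.0 - 1.0 / np.asarray(return_periods, dtype=float)
        return dict(zip(return_periods, genextreme.ppf(probs, shape, loc=loc, scale=scale)))

    # Synthetic 1-hour annual maxima (mm/h) standing in for the disaggregated simulations.
    rng = np.random.default_rng(3)
    amax = genextreme.rvs(-0.1, loc=20, scale=6, size=500, random_state=rng)
    print(idf_points(amax))
    ```

    Repeating the fit for each duration (5 min, 1 h, ...) and each ensemble member yields the family of curves and their spread.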

  16. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intra-subject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that the presented framework outperforms existing approaches.

  17. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  18. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...

  19. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  20. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

    the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in ... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U

  1. A proposal of the diagnosis-dynamic characteristic (DDC) model describing the relation between search time and confidence levels for a dichotomous judgment, and its application to ROC curve generation

    Science.gov (United States)

    Matsumoto, Toru; Fukuda, Nobuo; Furukawa, Akira; Suwa, Koji; Wada, Shinichi; Matsumoto, Mitsuomi; Sone, Shusuke

    2006-03-01

    When physicians inspect an image, they form a certain degree of confidence that the image is abnormal, p(t), or normal, n(t) [n(t) = 1 - p(t)]. After an infinite inspection time, they reach the equilibrium confidence levels p* = p(∞) and n* = n(∞). There is a psychological conflict between the decisions "normal" and "abnormal". We assume that the decision "normal" is distracted by the decision "abnormal" by a factor k(1 + ap), and in the inverse direction by a factor k(1 + bn), where k (> 0) is a parameter related to image quality and the skill of the physician, and a and b are unknown constants. After an infinite inspection time the conflict reaches equilibrium, which satisfies the equation k(1 + ap*)n* = k(1 + bn*)p*. Here we define a parameter c = (2p* - 1)/[p*(1 - p*)]. We assume that the change in the confidence level with time (dp/dt) is proportional to [k(1 + ap)n - k(1 + bn)p], i.e. k[-cp^2 + (c - 2)p + 1]. Solving the differential equation, we derived t(p) and p(t) as functions of the parameters k, c and S. S (0-1) is an arbitrarily selected value related to the probability of "abnormal" before the image inspection (S = p(0)). Image reading studies were executed for CT images. ROC curves were generated both by the traditional 4-step score-based method and from the confidence level p estimated from the equation t(p) of the DDC model using the observed judgment time. It was concluded that ROC curves can be generated by measuring the time for a dichotomous judgment, without subjective scores of diagnostic confidence, by applying the DDC model.
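
    A small numerical sketch of the model as summarised above: integrate dp/dt = k[-cp^2 + (c - 2)p + 1] from p(0) = S and invert the resulting t(p) to estimate a confidence level from an observed judgment time (parameter values are illustrative):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def ddc_p_of_t(t_grid, k=1.0, p_star=0.85, S=0.5):
        """Integrate dp/dt = k(-c p^2 + (c - 2) p + 1) with p(0) = S, where
        c = (2 p* - 1) / (p* (1 - p*)) places the equilibrium confidence at p*."""
        c = (2.0 * p_star - 1.0) / (p_star * (1.0 - p_star))
        rhs = lambda t, p: k * (-c * p ** 2 + (c - 2.0) * p + 1.0)
        sol = solve_ivp(rhs, (t_grid[0], t_grid[-1]), [S], t_eval=t_grid, rtol=1e-8)
        return sol.y[0]

    def confidence_from_time(t_obs, **model):
        """Invert t(p) numerically: the confidence level implied by an observed judgment time."""
        grid = np.linspace(0.0, float(np.max(t_obs)), 2000)
        return np.interp(t_obs, grid, ddc_p_of_t(grid, **model))

    judgment_times = np.array([0.5, 1.0, 2.0, 5.0])
    print(confidence_from_time(judgment_times, k=1.0, p_star=0.85, S=0.5))
    ```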

  2. Kill curve analysis and response of first generation Capsicum annuum L. B12 cultivar to ethyl methane sulfonate.

    Science.gov (United States)

    Arisha, M H; Liang, B-K; Muhammad Shah, S N; Gong, Z-H; Li, D-W

    2014-11-28

    Pepper seeds (Capsicum annuum L.) var. B12 were mutagenized by four presoaking treatments in ten concentrations of ethyl methane sulfonate (EMS) to determine the sensitivity of the first generation (M1) to mutagens. The spectrum of mutations and the induced variability for various quantitative traits, including germination percentage, plant height, injury occurrence, survival ratio, weight of the first three fruits, and number of seeds per first fruit, were observed in the M1 generation. Our results indicated that all of the test parameters decreased with increasing EMS concentration, except for seedling injury. There were significant differences in germination ratio, LD50, plant height, percent injury, and survival ratio among the tested presoaking treatments. The LD50 was 1% EMS in seeds that were not presoaked (T1) and in seeds presoaked for 12 h before treating with EMS (T3). In contrast, the LD50 was 0.5% EMS in seeds presoaked for 6 h (T2) and in seeds presoaked in water for 6 h then incubated at 28°C for 12 h before EMS treatment (T4). Five dwarf plants were observed in mutagenized seeds without presoaking as compared to control seeds (at the maturity stage of the control plant).

  3. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...

  4. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  5. Generating a learning curve for penile block in neonates, infants and children: an empirical evaluation of technical skills in novice and experienced anaesthetists.

    Science.gov (United States)

    Schuepfer, Guido; Jöhr, Martin

    2004-07-01

    Literature concerning learning curves for anaesthesiological procedures in paediatric anaesthesia is rare. The aim of this study was to assess the number of penile blocks needed to guarantee a high success rate in children. At a teaching hospital, the technical skills of 29 residents in anaesthesiology who performed penile blocks under the supervision of two staff anaesthesiologists were evaluated during a 12-month period using a standardized self-evaluation questionnaire. At the start of the study period, the residents had no prior experience in paediatric anaesthesia or in performing penile block. All residents entered the paediatric rotation after a minimum of 1-year training in adult general and regional anaesthesia. The blocks were rated using a binary score. For comparison, the success rates of the two supervising staff anaesthesiologists were collected during the same period using the same self-evaluation questionnaire. Statistical analyses were performed by generating individual and institutional learning curves using the pooled data. The learning curves were calculated with the aid of a least square fit model. 95% CIs were estimated by a Monte Carlo procedure with a bootstrap technique. In a total of 392 blocks performed, the overall success rate was 92.1%. There was no statistical difference between the success rate of the two staff members (success rate: 96.3%) and the overall success rate of the 29 residents performing a total of 339 blocks. The total success rate for this group was 91.5%. The failure rate for the first 10 blocks performed by the residents was 8.82% (95% CI: 5.0-14.14%), it was 4.12% (95% CI: 1.13-10.22%) for the next 10 blocks and from blocks 21 to 40 it was 6.5% (95% CI: 2.65-12.9%). For blocks 41-60, the failure rate was 4.4% (95% CI 0.54-15.15%). Penile block in children is easily learned by residents. A steep learning curve was found. The success rate was over 93.5% after more than 40 blocks. Copyright 2004 Blackwell

  6. Buckling of a Longitudinally Jointed Curved Composite Panel Arc Segment for Next Generation of Composite Heavy Lift Launch Vehicles: Verification Testing Analysis

    Science.gov (United States)

    Farrokh, Babak; Segal, Kenneth N.; Akkerman, Michael; Glenn, Ronald L.; Rodini, Benjamin T.; Fan, Wei-Ming; Kellas, Sortiris; Pineda, Evan J.

    2014-01-01

    In this work, an all-bonded out-of-autoclave (OoA) curved longitudinal composite joint concept, intended for use in the next generation of composite heavy lift launch vehicles, was evaluated and verified through finite element (FE) analysis, fabrication, testing, and post-test inspection. The joint was used to connect two curved, segmented, honeycomb sandwich panels representative of a Space Launch System (SLS) fairing design. The overall size of the resultant panel was 1.37 m by 0.74 m (54 in by 29 in), of which the joint comprised a 10.2 cm (4 in) wide longitudinal strip at the center. NASTRAN and ABAQUS were used to perform linear and non-linear analyses of the buckling and strength performance of the jointed panel. Geometric non-uniformities (i.e., surface contour imperfections) were measured and incorporated into the FE model and analysis. In addition, a sensitivity study of the specimen's end condition showed that bonding face-sheet doublers to the panel's end, coupled with some stress relief features at the corner-edges, can significantly reduce the stress concentrations near the load application points. Ultimately, the jointed panel was subjected to a compressive load. Load application was interrupted at the onset of buckling (at 356 kN; 80 kips). A post-test non-destructive evaluation (NDE) showed that, as designed, buckling occurred without introducing any damage into the panel or the joint. The jointed panel was further capable of tolerating impact damage at the same buckling load with no evidence of damage propagation. The OoA cured all-composite joint shows promise as a low mass factory joint for segmented barrels.

  7. Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation

    Science.gov (United States)

    Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin

    2016-12-01

    If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km2 and 15,280 km2) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10-7 and 10-6, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.

  8. Numerical computation of fragility curves for NPP equipment

    Energy Technology Data Exchange (ETDEWEB)

    Zentner, I., E-mail: irmela.zentner@edf.f [LaMSID, Laboratory for the Mechanics of Aging Industrial Structures, UMR EDF/CNRS, 1, av. du General de Gaulle, 92141 Clamart (France)

    2010-06-15

    The seismic probabilistic risk assessment (PRA) methodology is a popular approach for evaluating the risk of failure of engineering structures due to earthquakes. In this framework, fragility curves express the conditional probability of failure of a structure or component for a given seismic input motion parameter A, such as peak ground acceleration (PGA) or spectral acceleration. The failure probability due to a seismic event is obtained by convolution of fragility curves with seismic hazard curves. In general, a log-normal model is used to estimate fragilities. In nuclear engineering practice, these fragilities are determined using safety factors with respect to the design earthquake. This approach allows fragility curves to be determined from design studies, but it largely draws on expert judgement and simplifying assumptions. When a more realistic assessment of seismic fragility is needed, simulation-based statistical estimation of fragility curves is more appropriate. In this paper, we discuss the statistical estimation of the parameters of fragility curves and present results obtained for a reactor coolant system of a nuclear power plant. We have performed non-linear dynamic response analyses using artificially generated strong-motion time histories. Uncertainties due to seismic loads as well as model uncertainties are taken into account and propagated using Monte Carlo simulation.
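
    The standard log-normal fragility model referred to in the abstract can be written down in a few lines. The sketch below uses illustrative parameter values; the median capacity and log-standard deviation are not taken from the paper, and scipy is assumed to be available.

```python
import numpy as np
from scipy.stats import norm

def lognormal_fragility(a, median_capacity, beta):
    """Conditional failure probability at ground-motion level a (e.g. PGA in g)
    under the log-normal fragility model: Phi(ln(a / A_m) / beta)."""
    return norm.cdf(np.log(a / median_capacity) / beta)

# Illustrative parameters only (not the paper's values): A_m = 0.9 g, beta = 0.4.
pga = np.array([0.2, 0.5, 0.9, 1.5])
print(lognormal_fragility(pga, median_capacity=0.9, beta=0.4))
```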

  9. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  10. A Novel Numerical Algorithm for Optimal Sizing of a Photovoltaic/Wind/Diesel Generator/Battery Microgrid Using Loss of Load Probability Index

    Directory of Open Access Journals (Sweden)

    Hussein A. Kazem

    2013-01-01

    This paper presents a method for determining the optimal sizes of the PV array, wind turbine, diesel generator, and storage battery installed in a building-integrated system. The objective of the proposed optimization is to design a system that can supply the building load demand at minimum cost and maximum availability. Mathematical models for the system components as well as meteorological variables such as solar energy, temperature, and wind speed are employed for this purpose. The results show that the optimum sizing ratios (the ratio of daily energy generated by the source to daily energy demand) for the PV array, wind turbine, diesel generator, and battery, for a system located in Sohar, Oman, are 0.737, 0.46, 0.22, and 0.17, respectively. A case study of a system consisting of a 30 kWp PV array (36%), an 18 kWp wind farm (55%), and a 5 kVA diesel generator (9%), supplying a 200 kWh/day load demand, is presented. The generated energy shares of the PV array, wind farm, and diesel generator are found to be 36%, 55%, and 9%, respectively, while the cost of energy is 0.17 USD/kWh.
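
    The loss of load probability index named in the title is commonly computed as the ratio of unmet demand to total demand over the simulation period; the sketch below shows that calculation on a toy daily series. The numbers are illustrative and are not the paper's data.

```python
import numpy as np

def loss_of_load_probability(generation_kwh, demand_kwh):
    """LLP = total unmet demand / total demand over the simulated period."""
    deficit = np.clip(demand_kwh - generation_kwh, 0.0, None)
    return deficit.sum() / demand_kwh.sum()

# Toy daily series (kWh/day) standing in for the hybrid system output and the
# 200 kWh/day load mentioned in the abstract; values are illustrative only.
rng = np.random.default_rng(1)
generation = 200.0 + 40.0 * rng.standard_normal(365)
demand = np.full(365, 200.0)
print(f"LLP = {loss_of_load_probability(generation, demand):.3f}")
```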

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  12. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Directory of Open Access Journals (Sweden)

    Gian Paolo Beretta

    2008-08-01

    A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.

  13. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Science.gov (United States)

    Beretta, Gian P.

    2008-09-01

    A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
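
    For reference, the constrained maximum-entropy distribution toward which such a rate equation relaxes has the standard exponential form below. This is the textbook Lagrange-multiplier result, quoted here only as background and not as the paper's specific equation.

```latex
% Maximize S = -\sum_i p_i \ln p_i subject to \sum_i p_i = 1 and
% \sum_i a_{ki}\, p_i = \langle A_k\rangle,\; k = 1,\dots,m:
p_i^{\mathrm{ME}} \;=\;
  \frac{\exp\!\big(-\sum_{k=1}^{m} \lambda_k\, a_{ki}\big)}
       {\sum_{j} \exp\!\big(-\sum_{k=1}^{m} \lambda_k\, a_{kj}\big)}
```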

  14. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  15. A Novel Numerical Algorithm for Optimal Sizing of a Photovoltaic/Wind/Diesel Generator/Battery Microgrid Using Loss of Load Probability Index

    OpenAIRE

    Hussein A. Kazem; Tamer Khatib

    2013-01-01

    This paper presents a method for determining optimal sizes of PV array, wind turbine, diesel generator, and storage battery installed in a building integrated system. The objective of the proposed optimization is to design the system that can supply a building load demand at minimum cost and maximum availability. The mathematical models for the system components as well as meteorological variables such as solar energy, temperature, and wind speed are employed for this purpose. Moreover, the r...

  16. Detonation probabilities of high explosives

    Energy Technology Data Exchange (ETDEWEB)

    Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.

    1995-07-01

    The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.

  17. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  18. Generation of a Parabolic Trough Collector Efficiency Curve from Separate Measurements of Outdoor Optical Efficiency and Indoor Receiver Heat Loss: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Kutscher, C.; Burkholder, F.; Stynes, K.

    2010-10-01

    The overall efficiency of a parabolic trough collector is a function of both the fraction of direct normal radiation absorbed by the receiver (the optical efficiency) and the heat lost to the environment when the receiver is at operating temperature. The overall efficiency can be determined by testing the collector under actual operating conditions or by separately measuring these two components. This paper describes how outdoor measurement of the optical efficiency is combined with laboratory measurements of receiver heat loss to obtain an overall efficiency curve. Further, it presents a new way to plot efficiency that is more robust over a range of receiver operating temperatures.
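
    One common way to combine the two separate measurements into an overall efficiency is to subtract the temperature-dependent receiver heat loss, normalized by the direct normal irradiance on the aperture, from the optical efficiency. The sketch below uses that simple form with invented numbers and does not reproduce the paper's fitted correlations or its new plotting method.

```python
def overall_efficiency(optical_efficiency, heat_loss_w_per_m2, dni_w_per_m2):
    """Overall collector efficiency: optical efficiency minus receiver heat loss
    (expressed per unit aperture area at operating temperature) divided by the
    direct normal irradiance on the aperture."""
    return optical_efficiency - heat_loss_w_per_m2 / dni_w_per_m2

# Illustrative numbers only: 75% optical efficiency, 60 W/m^2 (aperture-area)
# heat loss at operating temperature, 950 W/m^2 DNI.
print(f"overall efficiency = {overall_efficiency(0.75, 60.0, 950.0):.3f}")
```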

  19. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  20. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  1. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  2. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  3. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  4. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, none of the proofs the author encountered is valid

  5. Automated Blazar Light Curves Using Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Spencer James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-27

    This presentation describes a problem and methodology pertaining to automated blazar light curves. Namely, optical variability patterns for blazars require the construction of light curves and in order to generate the light curves, data must be filtered before processing to ensure quality.

  6. Wavelet subdivision methods gems for rendering curves and surfaces

    CERN Document Server

    Chui, Charles

    2010-01-01

    OVERVIEW: Curve representation and drawing; Free-form parametric curves; From subdivision to basis functions; Wavelet subdivision and editing; Surface subdivision. BASIS FUNCTIONS FOR CURVE REPRESENTATION: Refinability and scaling functions; Generation of smooth basis functions; Cardinal B-splines; Stable bases for integer-shift spaces; Splines and polynomial reproduction. CURVE SUBDIVISION SCHEMES: Subdivision matrices and stencils; B-spline subdivision schemes; Closed curve rendering; Open curve rendering. BASIS FUNCTIONS GENERATED BY SUBDIVISION MATRICES: Subdivision operators; The up-sampling convolution ope

  7. Koch Curves: Rewriting System, Geometry and Application

    Directory of Open Access Journals (Sweden)

    Mamta Rani

    2011-01-01

    Problem statement: Recently, new Koch curves have been generated by dividing the initiator into three unequal parts, and there is no formal rewriting system to generate this kind of curve. Approach: Generalized rewriting systems for the new Koch curves were developed, and their changed geometrical properties were measured. Results: New formulas are given to measure their geometrical properties. Conclusion/Recommendations: The geometrical properties of the new Koch curves make them more suitable as antennas in wireless communication than the conventional Koch curve.
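
    For orientation, the conventional Koch curve is itself produced by a very short rewriting (L-system) rule. The sketch below shows only that classical equal-thirds rule, as a stand-in for the generalized unequal-part systems developed in the paper.

```python
def koch_rewrite(iterations, axiom="F", production="F+F--F+F"):
    """Classic Koch-curve rewriting (Lindenmayer) system: 'F' means draw forward,
    '+' and '-' mean turn by 60 degrees.  Every 'F' is replaced by the production
    string at each iteration; '+' and '-' are left unchanged."""
    s = axiom
    for _ in range(iterations):
        s = "".join(production if ch == "F" else ch for ch in s)
    return s

print(koch_rewrite(1))  # F+F--F+F
print(koch_rewrite(2))  # F+F--F+F+F+F--F+F--F+F--F+F+F+F--F+F
```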

  8. A Computer Model to Predicating a Recession Curve Discharges for the Inflow Hydrograph to Dokan and Derbendikan Lakes

    Directory of Open Access Journals (Sweden)

    Anas M. Mohamod

    2013-05-01

    The aim of this research is to develop a probabilistic approach to analyzing the recession curve of the mean monthly inflows to the lakes of the Dokan dam on the Lesser Zab river and the Derbendikan dam on the Diyalah river, by dividing the recession curve into several class intervals, finding the mathematical equations that govern each class interval using finite mathematics, and using a Markov chain to calculate the transition probability matrix for the class intervals. A computer program in the Visual Basic language, with a visual application interface in Excel, was developed to generate the recession curve. The results indicate that the recession curve consists of five class intervals, each governed by a power regression equation. The statistical analysis indicates good confidence in using the computer program to generate the recession curve, because the average relative percentage error was no greater than 8% for the Dokan dam and 9% for the Derbendikan dam.
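
    The Markov-chain step described in the abstract amounts to counting transitions between successive class intervals and normalizing each row of the count matrix. A minimal sketch, using an invented class sequence rather than the paper's data, is given below.

```python
import numpy as np

def transition_matrix(class_sequence, n_classes):
    """Estimate a Markov transition probability matrix from a sequence of
    class-interval indices (0 .. n_classes-1) of successive monthly flows."""
    counts = np.zeros((n_classes, n_classes))
    for i, j in zip(class_sequence[:-1], class_sequence[1:]):
        counts[i, j] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Invented sequence of recession-curve class intervals (five classes, as in the paper).
sequence = [0, 0, 1, 2, 2, 3, 3, 4, 4, 4, 3, 2]
print(transition_matrix(sequence, 5).round(2))
```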

  9. Curved PVDF airborne transducer.

    Science.gov (United States)

    Wang, H; Toda, M

    1999-01-01

    In the application of airborne ultrasonic ranging measurement, a partially cylindrical (curved) PVDF transducer can effectively couple ultrasound into the air and generate strong sound pressure. Because of its geometrical features, the ultrasound beam angles of a curved PVDF transducer can be unsymmetrical (i.e., broad horizontally and narrow vertically). This feature is desired in some applications. In this work, a curved PVDF air transducer is investigated both theoretically and experimentally. Two resonances were observed in this transducer. They are length extensional mode and flexural bending mode. Surface vibration profiles of these two modes were measured by a laser vibrometer. It was found from the experiment that the surface vibration was not uniform along the curvature direction for both vibration modes. Theoretical calculations based on a model developed in this work confirmed the experimental results. Two displacement peaks were found in the piezoelectric active direction of PVDF film for the length extensional mode; three peaks were found for the flexural bending mode. The observed peak positions were in good agreement with the calculation results. Transient surface displacement measurements revealed that vibration peaks were in phase for the length extensional mode and out of phase for the flexural bending mode. Therefore, the length extensional mode can generate a stronger ultrasound wave than the flexural bending mode. The resonance frequencies and vibration amplitudes of the two modes strongly depend on the structure parameters as well as the material properties. For the transducer design, the theoretical model developed in this work can be used to optimize the ultrasound performance.

  10. Probability and Relative Frequency

    Science.gov (United States)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.

  11. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  12. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.

  13. Tautological Integrals on Symmetric Products of Curves

    Institute of Scientific and Technical Information of China (English)

    Zhi Lan WANG

    2016-01-01

    We propose a conjecture on the generating series of Chern numbers of tautological bundles on symmetric products of curves and establish the rank 1 and rank −1 cases of this conjecture. Thus we compute explicitly the generating series of integrals of Segre classes of tautological bundles of line bundles on curves, which has a structure similar to that of Lehn's conjecture for surfaces.

  14. Generations.

    Science.gov (United States)

    Chambers, David W

    2005-01-01

    Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession.

  15. Linear Systems on Tropical Curves

    CERN Document Server

    Haase, Christian; Yu, Josephine

    2009-01-01

    A tropical curve \\Gamma is a metric graph with possibly unbounded edges, and tropical rational functions are continuous piecewise linear functions with integer slopes. We define the complete linear system |D| of a divisor D on a tropical curve \\Gamma analogously to the classical counterpart. We investigate the structure of |D| as a cell complex and show that linear systems are quotients of tropical modules, finitely generated by vertices of the cell complex. Using a finite set of generators, |D| defines a map from \\Gamma to a tropical projective space, and the image can be extended to a tropical curve of degree equal to \\deg(D). The tropical convex hull of the image realizes the linear system |D| as a polyhedral complex. We show that curves for which the canonical divisor is not very ample are hyperelliptic. We also show that the Picard group of a \\Q-tropical curve is a direct limit of critical groups of finite graphs converging to the curve.

  16. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  17. Normal origamis of Mumford curves

    CERN Document Server

    Kremer, Karsten

    2010-01-01

    An origami (also known as square-tiled surface) is a Riemann surface covering a torus with at most one branch point. Lifting two generators of the fundamental group of the punctured torus decomposes the surface into finitely many unit squares. By varying the complex structure of the torus one obtains easily accessible examples of Teichmüller curves in the moduli space of Riemann surfaces. The p-adic analogues of Riemann surfaces are Mumford curves. A p-adic origami is defined as a covering of Mumford curves with at most one branch point, where the bottom curve has genus one. A classification of all normal non-trivial p-adic origamis is presented and used to calculate some invariants. These can be used to describe p-adic origamis in terms of glueing squares.

  18. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  19. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  20. Algorithm for automatically generating a locally convex envelope curve of a point set

    Institute of Scientific and Technical Information of China (English)

    李世森; 李春阳

    2012-01-01

    In coastal engineering mathematical models, the original terrain data are a series of planar points with elevations, and the boundary of the model region usually has to be specified manually from this point set. An algorithm can obtain an envelope curve of the point set based on a pre-set number of search points, and different numbers of search points yield different envelope curves. As the number of search points increases, the area bounded by the envelope curve also increases, until the convex hull of the region is obtained (in general, the convex hull is not what is wanted). The convexity degree of an envelope curve is defined as the ratio of the area bounded by the envelope curve to the area bounded by the convex hull. To reduce the amount of manual work, an algorithm that automatically generates an appropriate envelope curve of a given point set is presented in this paper, together with an adjustment algorithm that makes the envelope curve agree more closely with the actual situation. Finally, the algorithm is used to generate the boundary of the Bohai Sea region. The computed boundary agrees reasonably well with the actual boundary.

  1. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  2. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  3. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  4. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob

  5. Multiphasic growth curve analysis.

    NARCIS (Netherlands)

    Koops, W.J.

    1986-01-01

    Application of a multiphasic growth curve is demonstrated with 4 data sets, adopted from literature. The growth curve used is a summation of n logistic growth functions. Human height growth curves of this type are known as "double logistic" (n = 2) and "triple logistic" (n = 3) growth curves (Bock

  6. Transient stability probability evaluation of power system incorporating with wind farm and SMES

    DEFF Research Database (Denmark)

    Fang, Jiakun; Miao, Lu; Wen, Jinyu

    2013-01-01

    Large-scale renewable power generation brings great challenges to power system operation and stabilization, and energy storage is one of the most important technologies for facing these challenges. This paper proposes a method for transient stability probability evaluation of a power system incorporating a wind farm and SMES, based on probability indices. With the proposed method, which combines Monte-Carlo simulation and the bisection method, system stability is "measured", and a quantitative relationship between penetration level, SMES coil size, and system stability is established. Considering the stability versus coil size to be the production curve, together with the cost function, the coil size is optimized economically.

  7. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  8. Analysis of Cost Development Trend of Photovoltaic Power Generation in China Based on Learning Curve

    Institute of Scientific and Technical Information of China (English)

    隋礼辉

    2012-01-01

    This paper presents the development trend of photovoltaic power generation in China. Based on the learning curve model, both constant learning rates and stage-wise learning rates are adopted to analyze the trend of the unit cost of photovoltaic power generation over the next ten years, and the learning cost required for PV power generation to reach commercial operation is estimated. It is predicted that around 2020 the cost of PV power generation in China is expected to match that of traditional fossil fuels, after which PV power will enter the stage of large-scale commercial operation.
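
    A learning curve of the kind used in the paper expresses unit cost as a power law of cumulative installed capacity, with the exponent set by the learning rate (the fractional cost reduction per doubling of capacity). The sketch below uses invented numbers purely to illustrate the functional form, not the paper's fitted values.

```python
import math

def learning_curve_cost(c0, cumulative_capacity, reference_capacity, learning_rate):
    """One-factor learning curve: each doubling of cumulative installed capacity
    cuts unit cost by the learning rate; exponent b = -log2(1 - learning_rate)."""
    b = -math.log2(1.0 - learning_rate)
    return c0 * (cumulative_capacity / reference_capacity) ** (-b)

# Illustrative only: unit cost 1.0 at a reference cumulative capacity of 1 GW,
# 20% learning rate, projected at 10x and 100x that capacity.
for capacity in (1, 10, 100):
    print(capacity, round(learning_curve_cost(1.0, capacity, 1, 0.20), 3))
```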

  9. Measurement of growth curve in F1 generation of Rongshui miniature pig

    Institute of Scientific and Technical Information of China (English)

    施赫赫; 陈淦; 刘运忠; 刘科; 邝少松; 任海涛; 余细勇; 唐小江

    2015-01-01

    Objective: To measure the body weight and body size of the F1 generation of the Rongshui miniature pig (RMP). Methods: 83 F1-generation RMPs (48 females and 35 males) were selected, and 9 growth traits (body weight, body length, body height, chest circumference, chest breadth, chest depth, circumference of pastern, girth of leg, and rictus length) were measured from birth to 12 months of age and analyzed with SPSS statistical software and a Logistic nonlinear growth model. Results: In the F1 generation of RMP, the female and male body weights were 0.61 ± 0.14 kg and 0.55 ± 0.13 kg at birth, 17.21 ± 5.20 kg and 16.35 ± 5.23 kg at 6 months, and 26.97 ± 6.49 kg and 26.53 ± 5.65 kg at 12 months, respectively. The two sexes gave similar values for the 9 traits; apart from birth weight, birth body length, and chest breadth at 10 months (P < 0.05), there were no significant differences between females and males of the same age. According to the Logistic model, the inflection point of body-weight growth lies between 5 and 6 months of age, that of body length and leg girth between 2 and 3 months, and that of body height, chest circumference, chest breadth, chest depth, circumference of pastern, and rictus length between 1 and 2 months. Conclusion: The adult body weight of the F1 generation of the Rongshui miniature pig is low and its temperament is docile, so it meets the basic requirements for development into a laboratory miniature pig.

  10. Probability-Based Method of Generating Conflict Trajectories for ATC Systems

    Institute of Scientific and Technical Information of China (English)

    苏志刚; 眭聪聪; 吴仁彪

    2011-01-01

    Two methods are usually used to test the short-term conflict alerting capability of an air traffic control (ATC) system. The first is to raise the alerting threshold and use real data to check whether the system alerts when the distance between two flights falls below the threshold; this method is not reliable. The second is to simulate flights that will conflict, compute their trajectories, and feed these data to the ATC system to observe its reaction; this method is usually too simple to test whether the system can effectively pre-detect a conflict. To address these problems, a probabilistic approach is used in this paper to simulate aircraft with a given probability of conflict. First, the conflict probability of turning flights is derived from Prandaini's method of conflict probability estimation for linear flight. Then, by reverse derivation, the motion parameters of two targets with a preset conflict probability are obtained. Finally, the tracks of this pair of targets are simulated and their conflict probability is analysed. The simulation results show that the targets' probability of conflict is in line with the prior assumption. The trajectories generated by this algorithm are more realistic, so a more effective assessment of the ATC system's capability for short-term conflict alerting and pre-detection can be provided.

  11. AHT Bézier Curves and NUAHT B-Spline Curves

    Institute of Scientific and Technical Information of China (English)

    Gang Xu; Guo-Zhao Wang

    2007-01-01

    In this paper, we present two new unified mathematical models of conics and polynomial curves, called algebraic hyperbolic trigonometric (AHT) Bézier curves and non-uniform algebraic hyperbolic trigonometric (NUAHT) B-spline curves of order n, which are generated over the space span{sin t, cos t, sinh t, cosh t, 1, t,..., t^{n-5}}, n ≥ 5. The two kinds of curves share most of the properties of the Bézier curves and B-spline curves in polynomial space. In particular, they can represent exactly some remarkable transcendental curves such as the helix, the cycloid and the catenary. The subdivision formulae of these new kinds of curves are also given. The generation of the tensor product surfaces is straightforward. Using the new mathematical models, we present the control mesh representations of two classes of minimal surfaces.

  12. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
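
    Step (v) of the procedure reduces to a one-line combination rule per pixel. The hedged sketch below applies it to a few invented pixel values to make the max/product structure explicit; the numbers are not from the study area.

```python
import numpy as np

def integrated_probability(p_release, p_impact, p_zonal_release):
    """Step (v): per pixel, the integrated spatial landslide probability is the
    maximum of the pixel's release probability and the product of its impact
    probability with the zonal release probability of the zone it lies in."""
    return np.maximum(p_release, p_impact * p_zonal_release)

# Invented per-pixel values for three pixels.
p_rel = np.array([0.02, 0.00, 0.10])
p_imp = np.array([0.60, 0.30, 0.05])
p_zon = np.array([0.25, 0.25, 0.40])
print(integrated_probability(p_rel, p_imp, p_zon))  # [0.15  0.075 0.1  ]
```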

  13. Empirical and Computational Tsunami Probability

    Science.gov (United States)

    Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.

    2008-12-01

    sources of epistemic uncertainty in the computational analysis are the overall rate of occurrence and the inter-event distribution for landslide sources. From both empirical and computational analyses, tsunami probability as a function of runup (i.e., the tsunami hazard curve) in seismically active ocean basins such as the Pacific can be described by a modified power-law, which is similar to size distributions for other natural hazards (earthquakes, landslides, etc.). At present, it is unclear whether this form of the tsunami hazard curve is of the same form in ocean basins that have a much lower rate of tsunami occurrence (e.g, the Atlantic).

  14. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  15. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...

  16. PROBABILITY AND STATISTICS.

    Science.gov (United States)

    (*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  17. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  18. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  19. Feasible Trajectory Generation for Autonomous Vehicles Based on Quartic Bézier Curves

    Institute of Scientific and Technical Information of China (English)

    陈成; 何玉庆; 卜春光; 韩建达

    2015-01-01

    For practical autonomous vehicles, the generated trajectories should ensure the feasibility imposed by kinematics, dynamics, and actuation. To generate a locally feasible trajectory from the initial state to the target state, a trajectory generation algorithm based on quartic Bézier curves is proposed. The original problem is first decomposed into shaping the trajectory and executing the shape. To satisfy the kinematic constraints, the initial and target state constraints, and the continuous-curvature constraint, a quartic Bézier curve defined by 3 parameters is adopted to shape the trajectory. To further ensure the feasibility of steering, optimization is used to resolve a set of parameters that yields the trajectory with the smallest curvature variation. For velocity planning along the shaped trajectory, to satisfy the continuous-velocity, continuous-acceleration, bounded-acceleration, and target-state side-slip constraints, the feasible interval of trajectory execution time is first computed; within this interval, an execution time is then found that guarantees the side-slip constraint at every trajectory point, and the velocity at each point is finally planned from this time. Simulation experiments on trajectory search-space generation, road-driving simulation, and path tracking are carried out for a practical autonomous vehicle, and trajectory generation experiments based on real environment data are also performed.
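
    As background, a quartic Bézier curve (degree 4, five control points) is evaluated from the Bernstein basis. The sketch below uses hypothetical control points for a lane-change-like segment; it is not the paper's three-parameter construction.

```python
import numpy as np

def quartic_bezier(control_points, t):
    """Evaluate a quartic (degree-4, five-control-point) Bezier curve at parameter
    t in [0, 1] using the Bernstein basis."""
    p = np.asarray(control_points, dtype=float)
    k = np.arange(5)
    basis = np.array([1, 4, 6, 4, 1]) * (1.0 - t) ** (4 - k) * t ** k
    return basis @ p

# Hypothetical control points for a lane-change-like planar segment.
cps = [(0, 0), (2, 0), (4, 1), (6, 2), (8, 2)]
for t in (0.0, 0.5, 1.0):
    print(t, quartic_bezier(cps, t))
```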

  20. Research on Variational Curve Model and Its Application in Track Database Generation

    Institute of Scientific and Technical Information of China (English)

    上官伟; 蔡伯根; 王剑; 陈德旺

    2011-01-01

    The variational curve model under finite constraint conditions and its application to the railway train-control track database are studied, and a method, model, and algorithm for automatically generating a digital track map from GPS track data are proposed. The algorithm and method are verified and improved with a large amount of measured data. A composite data reduction model is established on the basis of large quantities of track data, the finite constraint conditions present in actual train operation are analyzed, and curve evolution theory is studied. Data reduction and error elimination are accomplished with an algorithm based on the evolution of the variational curve model, so that, within a given error tolerance, a small amount of key data describing the track path can be obtained quickly and automatically. This improves the automatic generation and optimization of the train-control track database and supports real-time, precise positioning for GPS-based train operation control systems.

  1. Spinal curves (image)

    Science.gov (United States)

    There are four natural curves in the spinal column. The cervical, thoracic, lumbar, and sacral curvature. The curves, along with the intervertebral disks, help to absorb and distribute stresses that occur from everyday activities such as walking or from ...

  2. Contractibility of curves

    Directory of Open Access Journals (Sweden)

    Janusz Charatonik

    1991-11-01

    Results concerning the contractibility of curves (equivalently: of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.

  3. Parametrizing Algebraic Curves

    OpenAIRE

    Lemmermeyer, Franz

    2011-01-01

    We present the technique of parametrization of plane algebraic curves from a number theorist's point of view and present Kapferer's simple and beautiful (but little known) proof that nonsingular curves of degree > 2 cannot be parametrized by rational functions.

  4. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.

  5. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  6. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  7. Directed Design of Experiments for Validating Probability of Detection Capability of Nde Systems (doepod)

    Science.gov (United States)

    Generazio, E. R.

    2008-02-01

    The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit/miss and signal-amplitude testing. Specifically, DOEPOD relies on the direct observation of detection occurrences. It does not assume prescribed logarithmic or similar POD functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches are not required to generate a POD curve.
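
    The 90/95 POD criterion mentioned above can be checked directly from hit/miss counts with an exact binomial (Clopper-Pearson) lower bound; the sketch below reproduces the familiar result that 29 hits in 29 trials just clears the criterion. This is generic binomial statistics, not the DOEPOD procedure itself, and scipy is assumed to be available.

```python
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """One-sided lower confidence bound on POD from hit/miss data at a single
    flaw size, using the exact (Clopper-Pearson) binomial interval."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# The familiar "29 of 29" demonstration: 29 hits in 29 trials gives a 95%-confidence
# lower bound on POD of about 0.902, i.e. the 90/95 criterion is just met.
print(pod_lower_bound(29, 29))
```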

  8. Application of the New Generation Tetrazolium Salt (XTT) for the Enumeration of Hydrocarbon-Degrading Microorganisms Using the Most Probable Number Method

    Directory of Open Access Journals (Sweden)

    VICTORIA EUGENIA VALLEJO

    The objective of this study was to evaluate the performance of two tetrazolium indicators, a traditional one (INT) and a new-generation one (XTT), for estimating the density of hydrocarbon (HC) degrading microorganisms in soils using the Most Probable Number (MPN) technique. Ninety-six composite soil samples from the Ecorregión Cafetera of Colombia were analyzed. Degrading microorganisms were recovered on minimal salt medium under a saturated HC atmosphere, and their HC-degrading capacity was confirmed by successive subcultures in the same medium using diesel as the only carbon source. Counts obtained with the two salts were not significantly different (Student's t test, p < 0.05), but XTT allowed easier visualization of positive wells owing to the solubility of the reduced product. A greater percentage of isolates was obtained using XTT (67%), which suggests that the type of salt is relevant for the recovery of these microorganisms. Additionally, the cell detection limit and the optimal XTT concentration and incubation times for detection of activity were evaluated in microplate format for hydrocarbon-degrading microorganisms using Acinetobacter sp. An inhibitory effect was observed in the recovery of cultivable cells when XTT

  9. ECM using Edwards curves

    DEFF Research Database (Denmark)

    Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja;

    2013-01-01

    This paper introduces EECM-MPFQ, a fast implementation of the elliptic-curve method of factoring integers. EECM-MPFQ uses fewer modular multiplications than the well-known GMP-ECM software, takes less time than GMP-ECM, and finds more primes than GMP-ECM. The main improvements above the modular-arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large
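
    For readers unfamiliar with item (1), the unified addition law on an Edwards curve x² + y² = 1 + d·x²·y² is short enough to write out. The sketch below works over a toy prime field; the parameters are illustrative only and are not the curves selected by EECM-MPFQ (Python 3.8+ is assumed for modular inverses via pow).

```python
def edwards_add(P, Q, d, p):
    """Unified addition law on the Edwards curve x^2 + y^2 = 1 + d*x^2*y^2 over F_p.
    The neutral element is (0, 1); the same formula also doubles a point."""
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + y1 * x2) * pow((1 + t) % p, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow((1 - t) % p, -1, p) % p
    return x3, y3

# Toy example over F_13 with d = 2 (a non-square mod 13); the point (4, 4) lies on
# the curve since 4^2 + 4^2 = 32 ≡ 6 and 1 + 2*4^2*4^2 = 513 ≡ 6 (mod 13).
p, d = 13, 2
P = (4, 4)
print(edwards_add(P, P, d, p))       # doubling P gives (1, 0)
print(edwards_add(P, (0, 1), d, p))  # adding the neutral element returns (4, 4)
```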

  10. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  11. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  12. Remote sensing used for power curves

    Science.gov (United States)

    Wagner, R.; Jørgensen, H. E.; Paulsen, U. S.; Larsen, T. J.; Antoniou, I.; Thesbjerg, L.

    2008-05-01

    Power curve measurement for large wind turbines requires taking into account more parameters than only the wind speed at hub height. Based on results from aerodynamic simulations, an equivalent wind speed taking the wind shear into account was defined and found to reduce the power standard deviation in the power curve significantly. Two LiDARs and a SoDAR are used to measure the wind profile in front of a wind turbine. These profiles are used to calculate the equivalent wind speed. The comparison of the power curves obtained with the three instruments to the traditional power curve, obtained using a cup anemometer measurement, confirms the results obtained from the simulations. Using LiDAR profiles reduces the error in power curve measurement when these are used as a relative instrument together with a cup anemometer. Results from the SoDAR do not show such promising results, probably because of noisy measurements resulting in distorted profiles.

  13. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  14. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  15. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  16. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  17. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  18. THE NUCLEAR ENCOUNTER PROBABILITY

    NARCIS (Netherlands)

    SMULDERS, PJM

    1994-01-01

    This Letter discusses the nuclear encounter probability as used in ion channeling analysis. A formulation is given, incorporating effects of large beam angles and beam divergence. A critical examination of previous definitions is made.

  19. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  20. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  1. The arctic curve of the domain-wall six-vertex model

    CERN Document Server

    Colomo, F

    2009-01-01

    The problem of the form of the `arctic' curve of the six-vertex model with domain wall boundary conditions in its disordered regime is addressed. It is well-known that in the scaling limit the model exhibits phase-separation, with regions of order and disorder sharply separated by a smooth curve, called the arctic curve. To find this curve, we study a multiple integral representation for the emptiness formation probability, a correlation function devised to detect spatial transition from order to disorder. We conjecture that the arctic curve, for arbitrary choice of the vertex weights, can be characterized by the condition of condensation of almost all roots of the corresponding saddle-point equations at the same, known, value. In explicit calculations we restrict to the disordered regime for which we have been able to compute the scaling limit of certain generating function entering the saddle-point equations. The arctic curve is obtained in parametric form and appears to be a non-algebraic curve in general;...

  2. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  3. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  4. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  5. Analysis of Growth Curve and Generation Time of Oral Candida albicans from HIV-infected Individuals

    Institute of Scientific and Technical Information of China (English)

    王晓丽; 王聪; 王涛; 刘奇; 郭利军; 白丽

    2014-01-01

    Objective: To understand the population growth pattern of oral Candida albicans isolated from HIV-infected individuals, providing a basis for research on the pathogenesis, diagnosis, prevention and treatment of oral candidiasis. Methods: 108 twice-activated strains of Candida albicans isolated from the oral cavities of HIV-infected individuals and healthy people were inoculated in YPD liquid medium and cultured at 37 °C for 15 h; samples were taken every hour to count viable cells with a haemocytometer and to measure OD600 with a microplate reader, from which growth curves were plotted and generation times calculated. Results: The growth curves of isolates from both groups showed a lag phase at 0-3 h, an exponential phase at 4-10 h, and a stationary phase beginning at 10 h; the two groups' curves were essentially similar. The generation time of the 108 strains was 1.568 h overall: 1.354 h for strains from HIV-infected individuals and 1.782 h for strains from healthy people, so strains from HIV-infected individuals grew 0.428 h faster per generation, but the difference was not statistically significant. Conclusion: The growth curves of oral Candida albicans from HIV-infected individuals and healthy people are essentially consistent in every phase; strains from HIV-infected individuals grow slightly faster, and their role in the pathogenesis of oral candidiasis requires further study.
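
    A minimal sketch of the generation-time calculation described above, using two viable counts taken during the exponential phase; the counts in the example are hypothetical, not the study's data.

        import math

        def generation_time(t1, n1, t2, n2):
            """Doubling time (h): g = (t2 - t1) * ln 2 / ln(N2 / N1)."""
            return (t2 - t1) * math.log(2.0) / math.log(n2 / n1)

        # hypothetical viable counts at 4 h and 10 h of the exponential phase
        print(round(generation_time(4, 2.0e6, 10, 2.8e7), 3))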

  6. Study on the Cost of Solar Photovoltaic Power Generation Using Double-factors Learning Curve Model

    Institute of Scientific and Technical Information of China (English)

    曾鸣; 鹿伟; 段金辉; 李娜

    2012-01-01

    With the development of solar energy as a new energy source, reducing the cost of solar photovoltaic power generation to improve its availability is particularly important in a low-carbon economy. A double-factor learning curve model of solar photovoltaic power generation is built on the basis of Wright's basic learning curve model, and the influence of the cumulative production volume and cumulative R&D investment of solar photovoltaic modules on the cost of solar photovoltaic power generation is studied. Ten years of data, from 2001 to 2010, are used with the least squares method to test the significance of the parameters and confirm the feasibility of the model. The model is then used to predict the cost of solar photovoltaic power generation in China over the next ten years, showing that the cost falls to different degrees under different development scenarios for cumulative production and cumulative R&D. The results indicate that the solar photovoltaic industry should be developed vigorously, balancing cumulative production and cumulative R&D so as to reduce the cost of solar photovoltaic power generation. Suggestions on how to reduce the cost are also given, providing a reference for policy makers.
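
    A minimal sketch of fitting a two-factor learning curve of the form C = C0 · Q^(-a) · R^(-b) by least squares on log-transformed data, in the spirit of the model described above; the series below are made-up placeholders, not the 2001-2010 data used in the paper.

        import numpy as np

        # hypothetical yearly series: cumulative production Q, cumulative R&D R, unit cost C
        Q = np.array([1.0, 1.8, 3.1, 5.0, 8.2, 13.0, 20.5, 31.0, 47.0, 70.0])
        R = np.array([1.0, 1.5, 2.2, 3.0, 4.1, 5.5, 7.2, 9.3, 12.0, 15.0])
        C = np.array([5.0, 4.6, 4.1, 3.7, 3.3, 2.9, 2.6, 2.3, 2.1, 1.9])

        # log-linearise: ln C = ln C0 - a ln Q - b ln R, then solve by ordinary least squares
        X = np.column_stack([np.ones_like(Q), np.log(Q), np.log(R)])
        coef, *_ = np.linalg.lstsq(X, np.log(C), rcond=None)
        c0, a, b = np.exp(coef[0]), -coef[1], -coef[2]
        print(f"C0={c0:.2f}, learning exponents a={a:.3f}, b={b:.3f}")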

  7. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  8. Pencils on real curves

    CERN Document Server

    Coppens, Marc

    2011-01-01

    We consider coverings of real algebraic curves to real rational algebraic curves. We show the existence of such coverings having prescribed topological degree on the real locus. From those existence results we prove some results on Brill-Noether Theory for pencils on real curves. For coverings having topological degree 0 we introduce the covering number k and we prove the existence of coverings of degree 4 with prescribed covering number.

  9. An investigation of the ignition probability and data analysis for the detection of relevant parameters of mechanically generated steel sparks in explosive gas/air-mixtures; Untersuchungen zur Zuendwahrscheinlichkeit und Datenanalyse zur Erfassung der Einflussgroessen mechanisch erzeugter Stahl-Schlagfunktion in explosionsfaehigen Brenngas/Luft-Gemischen

    Energy Technology Data Exchange (ETDEWEB)

    Grunewald, Thomas; Finke, Robert; Graetz, Rainer

    2010-07-01

    Mechanically generated sparks are a potential source of ignition in highly combustible areas. A multiplicity of mechanical and reaction-kinetic influences causes a complex interaction of parameters, and little is known about their effect on the ignition probability. The ignition probability of mechanically generated sparks for the material combination unalloyed steel/unalloyed steel, with kinetic impact energies between 3 and 277 Nm, could be determined with statistically tolerable uncertainty. In addition, the explosiveness of non-oxidized particles at increased temperatures in over-stoichiometric mixtures was proven. A unique correlation between impact energy and ignition probability, as well as a correlation between impact energy and the number of separated particles, could be determined. Also, a principal component analysis considering the interaction of individual particles could not find a specific combination of measurable particle characteristics that correlates with a distinct increase of the ignition probability.

  10. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  11. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  12. Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G A

    2004-09-21

    implementations to compute the decision threshold r0* that will provide the desired Probability of False Alarm P_FA for the matched filter. The goal is to use prior knowledge of the background data to generate an estimate of the probability density function (pdf) [13] of the matched filter threshold r for the case in which the data measurement contains only background data (we call this case the null hypothesis, or H0) [10, 11]. We call the pdf estimate f̂(r|H0). In this report, we use histograms and Parzen pdf estimators [14, 15, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27]. Once the estimate is obtained, it can be integrated to compute an estimate of P_FA as a function of the matched filter detection threshold r. We can then interpolate r vs. P_FA to obtain a curve that gives the threshold r0* that will provide the desired Probability of False Alarm P_FA for the matched filter. Processing results have been computed using both simulated and real LASI data sets. The algorithms and codes have been validated, and the results using LASI data are presented here. Future work includes applying the pdf estimation and CFAR threshold calculation algorithms to the LASI matched filter based upon global background statistics, and developing a new adaptive matched filter algorithm based upon local background statistics. Another goal is to implement the 4-Gamma pdf modeling method proposed by Stocker et al. [4] and to compare results using histograms and the Parzen pdf estimators.
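
    A minimal sketch of the pdf-to-threshold chain described above: a Parzen-style (Gaussian kernel) estimate of f̂(r|H0), numerical integration to obtain P_FA as a function of the threshold, and interpolation to recover the threshold for a desired false-alarm rate. The simulated background and all names are illustrative, not the LASI data or code.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        background = rng.normal(0.0, 1.0, 5000)       # stand-in for matched-filter outputs under H0

        pdf = gaussian_kde(background)                # Parzen-style estimate of f(r | H0)
        r = np.linspace(background.min(), background.max() + 3.0, 2000)
        cdf = np.cumsum(pdf(r)) * (r[1] - r[0])       # numerical integration of the pdf estimate
        p_fa = 1.0 - cdf                              # P_FA as a function of the threshold r

        desired = 1e-2
        r0 = np.interp(desired, p_fa[::-1], r[::-1])  # interpolate r vs. P_FA (P_FA decreases in r)
        print(f"threshold for P_FA={desired}: r0* = {r0:.3f}")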

  13. Determination of the 121Te gamma emission probabilities associated with the production process of radiopharmaceutical NaI[123I

    Science.gov (United States)

    de Araújo, M. T. F.; Poledna, R.; Delgado, J. U.; de Almeida, M. C. M.; Lopes, R. T.; Silva, R. L.; Cagido, A. C. F.

    2016-07-01

    123I is widely used in radiodiagnostic procedures in nuclear medicine. According to the Pharmacopoeia, care should be taken during its production process, since radionuclidic impurities may be generated. 121Te is an impurity that arises during 123I production, and determining its gamma emission probabilities (Pγ) is important in order to obtain more information about its decay. Activities were also obtained by absolute standardization using the sum-peak method, and these values were compared with those from the efficiency curve method.
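
    For orientation, a minimal sketch of a commonly used relation linking an emission probability to an efficiency curve, P_gamma = S / (A · ε · t); whether the paper applies exactly this form, and which corrections it includes, is an assumption here, and the numbers are hypothetical.

        def gamma_emission_probability(net_counts, live_time_s, efficiency, activity_bq):
            """Illustrative P_gamma = S / (A * eps * t); decay, summing and dead-time
            corrections are omitted for brevity."""
            return net_counts / (activity_bq * efficiency * live_time_s)

        # hypothetical full-energy-peak counts, efficiency from a calibration curve, and activity
        print(gamma_emission_probability(net_counts=1.2e5, live_time_s=3600,
                                         efficiency=2.4e-3, activity_bq=8.0e4))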

  14. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  15. Critical Factors for Inducing Curved Somatosensory Saccades

    Directory of Open Access Journals (Sweden)

    Tamami Nakano

    2011-10-01

    Full Text Available We are able to make a saccade toward a tactile stimulus delivered to one hand, but the trajectories of many saccades curve markedly when the arms are crossed (Groh & Sparks, 2006). However, it remains unknown why some saccades curve and others do not. We therefore examined the critical factors for inducing curved somatosensory saccades. Participants made a saccade as soon as possible from a central fixation point toward a tactile stimulus delivered to one of the two hands, and switched between arms-crossed and arms-uncrossed postures every 6 trials. Trajectories were generally straight when the arms were uncrossed, but all participants made curved saccades when the arms were crossed (12–64%). We found that the probability of curved saccades depended critically on the onset latency: the probability was less than 5% when the latency was larger than 250 ms, but it increased up to 70–80% when the onset latency was 160 ms. This relationship was shared across participants. The results suggest that a touch in the arms-crossed posture was always mapped to the wrong hand in the initial phase up to 160 ms, and then remapped to the correct hand during the next 100 ms by some fundamental neural mechanism shared across participants.

  16. JUMPING THE CURVE

    Directory of Open Access Journals (Sweden)

    René Pellissier

    2012-01-01

    Full Text Available This paper explores the notion of jumping the curve, following from Handy's S-curve onto a new curve with new rules, policies and procedures. It claims that the curve does not generally lie in wait but has to be invented by leadership. The focus of this paper is the identification (mathematically and inferentially) of that point in time, known as the cusp in catastrophe theory, when it is time to change - pro-actively, pre-actively or reactively. These three scenarios are addressed separately and discussed in terms of the relevance of each.

  17. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  18. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

    I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, p_k ∝ |ψ_k|². Probabilities derived in this manner are an objective reflection of the underlying state of the system - they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...

  19. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands.The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step.The report documents the first step in a probabilistic collision damage analysis. Future work will inlcude calculation of energy released for crushing of structures giving...

  20. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  1. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  2. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.

  3. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  4. Research on automatically generating curved surface cards in MCNP input files

    Institute of Scientific and Technical Information of China (English)

    黄少华; 杨平利; 袁媛; 林成地

    2013-01-01

    Since the geometry module in a hand-written MCNP input file is prone to errors, three development interfaces provided by Spatial's ACIS (API functions, C++ classes and direct interface functions) were adopted to obtain the surface equation of every surface in a given CAD model. In particular, when a curved surface is not parallel to the coordinate axes, a simplified surface equation in an auxiliary coordinate system is used, and the curved surface card is then generated automatically in MCNP format. Validation against different models shows that the method generates surface cards correctly and improves the efficiency of preparing MCNP input files.
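
    A minimal sketch of the last step, writing a general quadric obtained from the CAD model as an MCNP GQ surface card; the card follows the standard GQ coefficient order, while the surface number and coefficients in the example are made up.

        def gq_card(surface_number, coeffs):
            """Format a general quadric as an MCNP GQ surface card.
            coeffs = (A, B, C, D, E, F, G, H, J, K) for
            A x^2 + B y^2 + C z^2 + D xy + E yz + F zx + G x + H y + J z + K = 0."""
            body = " ".join(f"{c:.6g}" for c in coeffs)
            return f"{surface_number} GQ {body}"

        # a sphere of radius 5 centred at the origin, written as a general quadric
        print(gq_card(10, (1, 1, 1, 0, 0, 0, 0, 0, 0, -25)))
        # -> 10 GQ 1 1 1 0 0 0 0 0 0 -25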

  5. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  6. Probabilities of Natural Events Occurring at Savannah River Plant

    Energy Technology Data Exchange (ETDEWEB)

    Huang, J.C.

    2001-07-17

    This report documents the comprehensive evaluation of probability models of natural events which are applicable to Savannah River Plant. The probability curves selected for these natural events are recommended to be used by all SRP/SRL safety analysts. This will ensure a consistency in analysis methodology for postulated SAR incidents involving natural phenomena.

  7. Principal Curves on Riemannian Manifolds.

    Science.gov (United States)

    Hauberg, Soren

    2016-09-01

    Euclidean statistics are often generalized to Riemannian manifolds by replacing straight-line interpolations with geodesic ones. While these Riemannian models are familiar-looking, they are restricted by the inflexibility of geodesics, and they rely on constructions which are optimal only in Euclidean domains. We consider extensions of Principal Component Analysis (PCA) to Riemannian manifolds. Classic Riemannian approaches seek a geodesic curve passing through the mean that optimizes a criterion of interest. The requirements that the solution both be geodesic and pass through the mean tend to imply that the methods only work well when the manifold is mostly flat within the support of the generating distribution. We argue that instead of generalizing linear Euclidean models, it is more fruitful to generalize non-linear Euclidean models. Specifically, we extend the classic Principal Curves from Hastie & Stuetzle to data residing on a complete Riemannian manifold. We show that for elliptical distributions in the tangent space of spaces of constant curvature, the standard principal geodesic is a principal curve. The proposed model is simple to compute and avoids many of the pitfalls of traditional geodesic approaches. We empirically demonstrate the effectiveness of the Riemannian principal curves on several manifolds and datasets.

  8. Tempo curves considered harmful

    NARCIS (Netherlands)

    Desain, P.; Honing, H.

    1993-01-01

    In the literature of musicology, computer music research and the psychology of music, timing or tempo measurements are mostly presented in the form of continuous curves. The notion of these tempo curves is dangerous, despite its widespread use, because it lulls its users into the false impression th

  9. Pairings on hyperelliptic curves

    CERN Document Server

    Balakrishnan, Jennifer; Chisholm, Sarah; Eisentraeger, Kirsten; Stange, Katherine; Teske, Edlyn

    2009-01-01

    We assemble and reorganize the recent work in the area of hyperelliptic pairings: We survey the research on constructing hyperelliptic curves suitable for pairing-based cryptography. We also showcase the hyperelliptic pairings proposed to date, and develop a unifying framework. We discuss the techniques used to optimize the pairing computation on hyperelliptic curves, and present many directions for further research.

  10. Retrospectives: Engel Curves

    National Research Council Canada - National Science Library

    Andreas Chai; Alessio Moneta

    2010-01-01

    Engel curves describe how household expenditure on particular goods or services depends on household income. The name comes from the German st...

  11. Tornado-Shaped Curves

    Science.gov (United States)

    Martínez, Sol Sáez; de la Rosa, Félix Martínez; Rojas, Sergio

    2017-01-01

    In Advanced Calculus, our students wonder if it is possible to graphically represent a tornado by means of a three-dimensional curve. In this paper, we show it is possible by providing the parametric equations of such tornado-shaped curves.

  12. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  13. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  14. Varga: On Probability.

    Science.gov (United States)

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  15. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step.The report documents the first step in a probabilistic collision damage analysis. Future work will inlcude calculation of energy released for crushing of structures giving...

  16. The curve shortening problem

    CERN Document Server

    Chou, Kai-Seng

    2001-01-01

    Although research in curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results. The authors present a complete treatment of the Gage-Hamilton theorem, a clear, detailed exposition of Grayson's convexity theorem, a systematic discussion of invariant solutions, applications to the existence of simple closed geodesics on a surface, and a new, almost convexity theorem for the generalized curve shortening problem. Many questions regarding curve shortening remain outstanding. With its careful exposition and complete guide to the literature, The Curve Shortening Problem provides not only an outstanding starting point for graduate students and new investigations, but a superb reference that presents intriguing new results for those already active in the field.

  17. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards - the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ₁, ν₁) ≤ (μ₂, ν₂) whenever μ₁ ≤ μ₂ and ν₂ ≤ ν₁) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I=[0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x₁, x₂, …, x_n) ∈ I^n; Σ_{i=1}^n x_i ≤ 1} carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  18. Parallel Parking Path Generation Based on Bezier Curve Fitting

    Institute of Scientific and Technical Information of China (English)

    刘钰; 马艳丽; 李涛

    2011-01-01

    The autonomous parking system is an intelligent technology for parking a car in a small space. Based on Ackermann steering geometry and practical parking conditions, the minimum parking space and the bounds of the start point and the collision-free space are obtained. Then, based on a Bezier curve and considering the dynamic constraint conditions, a parametric equation establishes the dynamic relationship between x and y. For each time point t, a specific steering angle and turning radius give a new position, so the car can be controlled to move along the Bezier curve and a continuous-curvature path is generated. Finally, a Matlab simulation shows that the car not only avoids the obstacles effectively but also moves smoothly through the turning points, so continuous automatic parking can be realized in as small a space as possible.
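
    A minimal sketch of sampling a cubic Bezier path of the kind used for such a parking trajectory; the control points are hypothetical, and the dynamic and collision constraints discussed above are not enforced here.

        import numpy as np

        def cubic_bezier(p0, p1, p2, p3, n=50):
            """Sample B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3, t in [0, 1]."""
            t = np.linspace(0.0, 1.0, n)[:, None]
            return ((1 - t)**3 * p0 + 3 * (1 - t)**2 * t * p1
                    + 3 * (1 - t) * t**2 * p2 + t**3 * p3)

        # hypothetical control points (m) for a reverse parallel-parking manoeuvre
        path = cubic_bezier(np.array([0.0, 0.0]), np.array([-2.0, 0.0]),
                            np.array([-4.0, -1.5]), np.array([-6.0, -2.5]))
        print(path[0], path[-1])   # the path starts at P0 and ends at P3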

  19. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  20. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  1. Learning Curve? Which One?

    Directory of Open Access Journals (Sweden)

    Paulo Prochno

    2004-07-01

    Full Text Available Learning curves have been studied for a long time. These studies provided strong support to the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999, for a comprehensive review of learning curve studies). But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000), but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is then on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several individual ongoing learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.

  2. Supersymmetric Spacetimes from Curved Superspace

    CERN Document Server

    Kuzenko, Sergei M

    2015-01-01

    We review the superspace technique to determine supersymmetric spacetimes in the framework of off-shell formulations for supergravity in diverse dimensions using the case of 3D N=2 supergravity theories as an illustrative example. This geometric formalism has several advantages over other approaches advocated in the last four years. Firstly, the infinitesimal isometry transformations of a given curved superspace form, by construction, a finite-dimensional Lie superalgebra, with its odd part corresponding to the rigid supersymmetry transformations. Secondly, the generalised Killing spinor equation, which must be obeyed by the supersymmetry parameters, is a consequence of the more fundamental superfield Killing equation. Thirdly, general rigid supersymmetric theories on a curved spacetime are readily constructed in superspace by making use of the known off-shell supergravity-matter couplings and restricting them to the background chosen. It is the superspace techniques which make it possible to generate arbitra...

  3. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  4. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ2 play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  5. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  6. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  7. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  8. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  9. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  10. Searching with Probabilities

    Science.gov (United States)

    1983-07-26

    DeGroot, Morris H. Probability and Statistics. Addison-Wesley Publishing Company, Reading, Massachusetts, 1975. [Gillogly 78] Gillogly, J.J. Performance...distribution [DeGroot 75] has just begun. The beta distribution has several features that might make it a more reasonable choice. As with the normal-based...1982. [Cooley 65] Cooley, J.M. and Tukey, J.W. An algorithm for the machine calculation of complex Fourier series. Math. Comp. 19, 1965. [DeGroot 75

  11. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  12. Geometric and Meshing Properties of Conjugate Curves for Gear Transmission

    Directory of Open Access Journals (Sweden)

    Dong Liang

    2014-01-01

    Full Text Available Conjugate curves have been put forward previously by the authors for gear transmission. Compared with traditional conjugate surfaces, conjugate curves have more flexibility and diversity in gear design and generation. To further extend their application in power transmission, the geometric and meshing properties of conjugate curves are discussed in this paper. Firstly, general principle descriptions of conjugate curves for arbitrary axial position are introduced. Secondly, geometric analysis of conjugate curves is carried out based on differential geometry, including the tangent and normal in an arbitrary contact direction, characteristic points, and curvature relationships. Then, meshing properties of conjugate curves are further revealed. According to a given plane or spatial curve, the uniqueness of the conjugated curve under different contact angle conditions is discussed. Meshing commonality of conjugate curves is also demonstrated in terms of a class of spiral curves contacting in the given direction for various gear axes. Finally, a conclusive summary of this study is given.

  13. Simulation modeling of the probability of magmatic disruption of the potential Yucca Mountain Site

    Energy Technology Data Exchange (ETDEWEB)

    Crowe, B.M.; Perry, F.V.; Valentine, G.A. [Los Alamos National Lab., NM (United States); Wallmann, P.C.; Kossik, R. [Golder Associates, Inc., Redmond, WA (United States)

    1993-11-01

    The first phase of risk simulation modeling was completed for the probability of magmatic disruption of a potential repository at Yucca Mountain. E1, the recurrence rate of volcanic events, is modeled using bounds from active basaltic volcanic fields and midpoint estimates of E1. The cumulative probability curves for E1 are generated by simulation modeling using a form of a triangular distribution. The 50% estimates are about 5 to 8 × 10⁻⁶ events yr⁻¹. The simulation modeling shows that the cumulative probability distribution for E1 is more sensitive to the probability bounds than to the midpoint estimates. E2 (the disruption probability) is modeled through risk simulation using a normal distribution and midpoint estimates from multiple alternative stochastic and structural models. The 50% estimate of E2 is 4.3 × 10⁻³. The probability of magmatic disruption of the potential Yucca Mountain site is 2.5 × 10⁻⁸ yr⁻¹. This median estimate decreases to 9.6 × 10⁻⁹ yr⁻¹ if E1 is modified for the structural models used to define E2. The Repository Integration Program was tested to compare releases of a simulated repository (without volcanic events) to releases from time histories which may include volcanic disruptive events. Results show that the performance modeling can be used for sensitivity studies of volcanic effects.
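
    A minimal sketch of this kind of risk simulation: sample the recurrence rate E1 from a triangular distribution, the disruption probability E2 from a normal distribution, and take their product as the annual disruption probability. The bounds, midpoints and distribution details below are placeholders, not the study's inputs.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # E1: recurrence rate of volcanic events (events/yr), triangular(low, mode, high)
        e1 = rng.triangular(1e-6, 6e-6, 1e-5, n)
        # E2: probability of repository disruption given an event, normal truncated at zero
        e2 = np.clip(rng.normal(4.3e-3, 1.0e-3, n), 0.0, None)

        annual_probability = e1 * e2
        print("median:", np.median(annual_probability))
        print("5th-95th percentiles:", np.percentile(annual_probability, [5, 95]))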

  14. SRHA calibration curve

    Data.gov (United States)

    U.S. Environmental Protection Agency — a UV calibration curve for SRHA quantitation. This dataset is associated with the following publication: Chang, X., and D. Bouchard. Surfactant-Wrapped Multiwalled...

  15. ROBUST DECLINE CURVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sutawanir Darwis

    2012-05-01

    Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic-type curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations caused by treatments applied to the well in order to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. As a case study, we use the oil production data at TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
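
    A minimal sketch of a robust fit of an Arps hyperbolic decline curve q(t) = qi / (1 + b·Di·t)^(1/b). The paper uses the lmRobMM robust regression routine; here SciPy's least_squares with a soft_l1 loss stands in for it, and the production data are synthetic.

        import numpy as np
        from scipy.optimize import least_squares

        def arps_hyperbolic(t, qi, di, b):
            """Arps hyperbolic decline: q(t) = qi / (1 + b * di * t)**(1 / b)."""
            return qi / (1.0 + b * di * t) ** (1.0 / b)

        # synthetic daily production with spikes standing in for well treatments
        t = np.arange(0, 365, dtype=float)
        q_obs = arps_hyperbolic(t, 1000.0, 0.004, 0.6)
        q_obs = q_obs + np.random.default_rng(2).normal(0.0, 15.0, t.size)
        q_obs[[60, 61, 200]] += 400.0                 # unusual observations

        def residuals(p):
            qi, di, b = p
            return arps_hyperbolic(t, qi, di, b) - q_obs

        # the robust loss down-weights the unusual observations (stand-in for lmRobMM)
        fit = least_squares(residuals, x0=[900.0, 0.01, 0.5],
                            bounds=([1.0, 1e-5, 0.01], [5000.0, 1.0, 2.0]),
                            loss="soft_l1", f_scale=30.0)
        print("qi, Di, b =", fit.x)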

  16. Large Curved Surface Measurement

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The measurement principle for large curved surfaces using a theodolite industrial survey system is introduced. Two methods are suggested with respect to the distribution range of the curved surface error. The experiments show that the measurement precision can reach 0.15 mm, with a relative precision of 3×10⁻⁵. Finally, points requiring attention and application aspects of the theodolite industrial survey system are given.

  17. Counting curves on surfaces

    OpenAIRE

    2015-01-01

    In this paper we consider an elementary, and largely unexplored, combinatorial problem in low-dimensional topology. Consider a real 2-dimensional compact surface $S$, and fix a number of points $F$ on its boundary. We ask: how many configurations of disjoint arcs are there on $S$ whose boundary is $F$? We find that this enumerative problem, counting curves on surfaces, has a rich structure. For instance, we show that the curve counts obey an effective recursion, in the general framework of to...

  18. Arithmetic of Shimura curves

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This is the note for a series of lectures that the author gave at the Centre de Recerca Matemàtica (CRM), Bellaterra, Barcelona, Spain on October 19–24, 2009. The aim is to give a comprehensive description of some recent work of the author and his students on generalisations of the Gross-Zagier formula, Euler systems on Shimura curves, and rational points on elliptic curves.

  19. Highly curved microchannel plates

    Science.gov (United States)

    Siegmund, O. H. W.; Cully, S.; Warren, J.; Gaines, G. A.; Priedhorsky, W.; Bloch, J.

    1990-01-01

    Several spherically curved microchannel plate (MCP) stack configurations were studied as part of an ongoing astrophysical detector development program, and as part of the development of the ALEXIS satellite payload. MCP pairs with surface radii of curvature as small as 7 cm, and diameters up to 46 mm, have been evaluated. The experiments show that the gain (greater than 1.5 × 10^7) and background characteristics (about 0.5 events/cm^2/s) of highly curved MCP stacks are in general equivalent to the performance achieved with flat MCP stacks of similar configuration. However, gain variations across the curved MCPs due to variations in the channel length-to-diameter ratio are observed. The overall pulse height distribution of a highly curved MCP stack (greater than 50 percent FWHM) is thus broader than that of its flat counterpart (less than 30 percent). Preconditioning of curved MCP stacks gives comparable results to flat MCP stacks, and also decreases the overall gain variations. Flat fields of curved MCP stacks have the same general characteristics as flat MCP stacks.

  20. Improving Ranking Using Quantum Probability

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability, given the same probability of false alarm and the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability within domains other than data management, we conjecture that a system that can implement subspace-based detectors will be more effective than a system that implements set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.

  1. Probability of detection models for eddy current NDE methods

    Energy Technology Data Exchange (ETDEWEB)

    Rajesh, S.N.

    1993-04-30

    The development of probability of detection (POD) models for a variety of nondestructive evaluation (NDE) methods is motivated by a desire to quantify the variability introduced during the process of testing. Sources of variability involved in eddy current methods of NDE include those caused by variations in liftoff, material properties, probe canting angle, scan format, surface roughness and measurement noise. This thesis presents a comprehensive POD model for eddy current NDE. Eddy current methods of nondestructive testing are used widely in industry to inspect a variety of nonferromagnetic and ferromagnetic materials. The development of a comprehensive POD model is therefore of significant importance. The model incorporates several sources of variability characterized by a multivariate Gaussian distribution and employs finite element analysis to predict the signal distribution. The method of mixtures is then used for estimating optimal threshold values. The research demonstrates the use of a finite element model within a probabilistic framework to predict the spread in the measured signal for eddy current nondestructive methods. Using the signal distributions for various flaw sizes, the POD curves for varying defect parameters have been computed. In contrast to experimental POD models, the cost of generating such curves is very low, and complex defect shapes can be handled easily. The results are also operator independent.
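
    To make the link between a signal distribution and a POD curve concrete, the sketch below (not from the thesis) assumes a Gaussian signal response whose mean grows linearly with flaw size and computes POD as the probability of exceeding a decision threshold; the response parameters and threshold are invented for illustration.

        import numpy as np
        from scipy.stats import norm

        def pod(a, threshold, intercept=0.1, slope=2.0, sigma=0.3):
            """POD(a) = P(signal > threshold) for a Gaussian signal response model."""
            mean_signal = intercept + slope * np.asarray(a, dtype=float)
            return norm.sf(threshold, loc=mean_signal, scale=sigma)

        threshold = 1.0   # decision threshold, e.g. chosen from the noise distribution
        for a in (0.2, 0.4, 0.6, 0.8):
            print(f"flaw size {a:.1f}: POD = {pod(a, threshold):.3f}")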

  2. Adding a visual linear scale probability to the PIOPED probability of pulmonary embolism.

    Science.gov (United States)

    Christiansen, F; Nilsson, T; Måre, K; Carlsson, A

    1997-05-01

    Reporting a lung scintigraphy diagnosis as a PIOPED categorical probability of pulmonary embolism offers the clinician a wide range of interpretation. Therefore the purpose of this study was to analyze the impact on lung scintigraphy reporting of adding a visual linear scale (VLS) probability assessment to the ordinary PIOPED categorical probability. The study material was a re-evaluation of lung scintigrams from a prospective study of 170 patients. All patients had been examined by lung scintigraphy and pulmonary angiography. The scintigrams were re-evaluated by 3 raters, and the probability of pulmonary embolism was estimated by the PIOPED categorization and by a VLS probability. The test was repeated after 6 months. There was no significant difference (p > 0.05) in the area under the ROC curve between the PIOPED categorization and the VLS for any of the 3 raters. Analysis of agreement among raters and for repeatability demonstrated low agreement in the mid-range of probabilities. A VLS probability estimate did not significantly improve the overall accuracy of the diagnosis compared to the categorical PIOPED probability assessment alone. From the data of our present study we cannot recommend the addition of a VLS score to the PIOPED categorization.

  3. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  4. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.

  5. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetric hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
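
    One common one-parameter form of such a generalized logarithm and its inverse exponential is sketched below; this is an illustration only and may not match the paper's exact parameterization.

        import numpy as np

        def gen_log(x, d):
            """One-parameter generalized logarithm: (x**d - 1)/d, recovering ln(x) as d -> 0."""
            x = np.asarray(x, dtype=float)
            if abs(d) < 1e-12:
                return np.log(x)
            return (x ** d - 1.0) / d

        def gen_exp(y, d):
            """Inverse of gen_log: (1 + d*y)**(1/d), recovering exp(y) as d -> 0."""
            y = np.asarray(y, dtype=float)
            if abs(d) < 1e-12:
                return np.exp(y)
            return (1.0 + d * y) ** (1.0 / d)

        x = np.array([0.5, 1.0, 2.0, 10.0])
        for d in (-0.5, 0.0, 0.5):
            assert np.allclose(gen_exp(gen_log(x, d), d), x)   # round-trip check
            print(d, np.round(gen_log(x, d), 4))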

  6. Strange Curves, Counting Rabbits, & Other Mathematical Explorations

    CERN Document Server

    Ball, Keith

    2011-01-01

    How does mathematics enable us to send pictures from space back to Earth? Where does the bell-shaped curve come from? Why do you need only 23 people in a room for a 50/50 chance of two of them sharing the same birthday? In Strange Curves, Counting Rabbits, and Other Mathematical Explorations, Keith Ball highlights how ideas, mostly from pure math, can answer these questions and many more. Drawing on areas of mathematics from probability theory, number theory, and geometry, he explores a wide range of concepts, some more light-hearted, others central to the development of the field and used dai
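
    The birthday figure quoted above is easy to verify directly; a short Python check, assuming 365 equally likely and independent birthdays:

        import numpy as np

        def p_shared_birthday(n, days=365):
            # probability that at least two of n people share a birthday,
            # assuming independent, uniformly distributed birthdays
            p_all_distinct = np.prod(1.0 - np.arange(n) / days)
            return 1.0 - p_all_distinct

        print(round(p_shared_birthday(23), 4))   # ~0.5073, just over 50/50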

  7. Investigation of learning and experience curves

    Energy Technology Data Exchange (ETDEWEB)

    Krawiec, F.; Thornton, J.; Edesess, M.

    1980-04-01

    The applicability of learning and experience curves for predicting future costs of solar technologies is assessed, and the major test case is the production economics of heliostats. Alternative methods for estimating cost reductions in systems manufacture are discussed, and procedures for using learning and experience curves to predict costs are outlined. Because adequate production data often do not exist, production histories of analogous products/processes are analyzed and learning and aggregated cost curves for these surrogates estimated. If the surrogate learning curves apply, they can be used to estimate solar technology costs. The steps involved in generating these cost estimates are given. Second-generation glass-steel and inflated-bubble heliostat design concepts, developed by MDAC and GE, respectively, are described; a costing scenario for 25,000 units/yr is detailed; surrogates for cost analysis are chosen; learning and aggregate cost curves are estimated; and aggregate cost curves for the GE and MDAC designs are estimated. However, an approach that combines a neoclassical production function with a learning-by-doing hypothesis is needed to yield a cost relation compatible with the historical learning curve and the traditional cost function of economic theory.
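
    As a minimal illustration of the learning-curve arithmetic that underlies such cost projections (not the report's actual estimating procedure), the classical model multiplies unit cost by a fixed progress ratio at each doubling of cumulative production; the cost and ratio below are assumptions for illustration only.

        import numpy as np

        def unit_cost(n, first_unit_cost, progress_ratio):
            """Classical learning-curve model: cost of the n-th unit produced.

            Each doubling of cumulative production multiplies unit cost by the
            progress ratio (e.g. 0.85 for an '85% learning curve')."""
            b = np.log(progress_ratio) / np.log(2.0)
            return first_unit_cost * np.asarray(n, dtype=float) ** b

        # Assumed first-unit cost and progress ratio, purely for illustration
        for n in (1, 100, 1000, 25000):
            print(f"unit {n:>6}: cost {unit_cost(n, 100.0, 0.85):7.2f}")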

  8. Capacity theory on algebraic curves

    CERN Document Server

    Rumely, Robert S

    1989-01-01

    Capacity is a measure of size for sets, with diverse applications in potential theory, probability and number theory. This book lays foundations for a theory of capacity for adelic sets on algebraic curves. Its main result is an arithmetic one, a generalization of a theorem of Fekete and Szegö which gives a sharp existence/finiteness criterion for algebraic points whose conjugates lie near a specified set on a curve. The book brings out a deep connection between the classical Green's functions of analysis and Néron's local height pairings; it also points to an interpretation of capacity as a kind of intersection index in the framework of Arakelov Theory. It is a research monograph and will primarily be of interest to number theorists and algebraic geometers; because of applications of the theory, it may also be of interest to logicians. The theory presented generalizes one due to David Cantor for the projective line. As with most adelic theories, it has a local and a global part. Let /K be a smooth, complet...

  9. Generation of material stress-strain curves for the parametric study of pipeline buckling[Includes the CSCE forum on professional practice and career development : 1. international engineering mechanics and materials specialty conference : 1. international/3. coastal, estuarine and offshore engineering specialty conference : 2. international/8. construction specialty conference

    Energy Technology Data Exchange (ETDEWEB)

    Tan, Z.; Khoo, H. [Carleton Univ., Ottawa, ON (Canada). Dept. of Civil and Environmental Engineering

    2009-07-01

    Failures in steel pipelines are typically preceded by inelastic local buckling which depends on the shape of the material stress-strain curve. Non-dimensional equations can be used to quantify and predict the buckling limits for a range of stress-strain curves found in pipes through parametric finite element analyses. This paper evaluated the feasibility and ease of using an equation to generate the stress-strain curve for a pipe buckling parametric study. A power-law based equation for generating the true stress-true plastic strain curve was adopted. The equation made it possible to quantify the local buckling response to various material stress-strain curves using only a few parameters, such as ultimate to yield strength (proportional limit) ratio, and strain at ultimate stress measured from the end of yield plateau. The same parameters can therefore be used for different material yield strength and length of yield plateau, and enable the development of a more compact material property dependent non-dimensional buckling limit equation. 6 refs., 6 tabs., 3 figs.
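
    A common power-law (Hollomon-type) form for a true stress-true plastic strain curve is sketched below; this illustrates the general approach rather than the exact equation or parameterization adopted in the paper, and the material constants are assumed.

        import numpy as np

        def true_stress(eps_p, K, n):
            """Power-law (Hollomon-type) hardening: true stress vs. true plastic strain."""
            return K * np.asarray(eps_p, dtype=float) ** n

        # Illustrative constants; in the parametric-study setting these would be derived
        # from the yield/ultimate strength ratio and the strain at ultimate stress.
        K, n = 800.0, 0.12      # strength coefficient (MPa) and hardening exponent
        eps_p = np.linspace(1e-4, 0.10, 6)
        print(np.round(true_stress(eps_p, K, n), 1))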

  10. Approximation by planar elastic curves

    DEFF Research Database (Denmark)

    Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge

    2016-01-01

    We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven optimization is then used to find the approximating elastic curve.

  11. Surface growth kinematics via local curve evolution

    KAUST Repository

    Moulton, Derek E.

    2012-11-18

    A mathematical framework is developed to model the kinematics of surface growth for objects that can be generated by evolving a curve in space, such as seashells and horns. Growth is dictated by a growth velocity vector field defined at every point on a generating curve. A local orthonormal basis is attached to each point of the generating curve and the velocity field is given in terms of the local coordinate directions, leading to a fully local and elegant mathematical structure. Several examples of increasing complexity are provided, and we demonstrate how biologically relevant structures such as logarithmic shells and horns emerge as analytical solutions of the kinematics equations with a small number of parameters that can be linked to the underlying growth process. Direct access to cell tracks and local orientation enables connections to be made to the underlying growth process. © 2012 Springer-Verlag Berlin Heidelberg.

  12. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  13. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  14. Probabilities for Solar Siblings

    CERN Document Server

    Valtonen, M; Bobylev, V V; Myllari, A

    2015-01-01

    We have shown previously (Bobylev et al 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  15. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  16. Learning unbelievable marginal probabilities

    CERN Document Server

    Pitkow, Xaq; Miller, Ken D

    2011-01-01

    Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...

  17. Moduli of Trigonal Curves

    CERN Document Server

    Stankova-Frenkel, Z E

    1997-01-01

    We study the moduli of trigonal curves. We establish the exact upper bound of ${36(g+1)}/(5g+1)$ for the slope of trigonal fibrations. Here, the slope of any fibration $X\to B$ of stable curves with smooth general member is the ratio $\delta_B/\lambda_B$ of the restrictions to the base $B$ of the boundary class $\delta$ and the Hodge class $\lambda$ on the moduli space $\bar{\mathfrak{M}}_g$. We associate to a trigonal family $X$ a canonical rank two vector bundle $V$, and show that for Bogomolov-semistable $V$ the slope satisfies the stronger inequality $\delta_B/\lambda_B \leq 7 + 6/g$. We further describe the rational Picard group of the trigonal locus $\bar{\mathfrak{T}}_g$ in the moduli space $\bar{\mathfrak{M}}_g$ of genus $g$ curves. In the even genus case, we interpret the above Bogomolov semistability condition in terms of the so-called Maroni divisor in $\bar{\mathfrak{T}}_g$.

  18. Power Curve Measurements REWS

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

    This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed.2 [1], with some deviations mostly regarding uncertainty calculation. Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast. The measurements have been performed using DTU’s measurement equipment, the analysis...

  19. The sales learning curve.

    Science.gov (United States)

    Leslie, Mark; Holloway, Charles A

    2006-01-01

    When a company launches a new product into a new market, the temptation is to immediately ramp up sales force capacity to gain customers as quickly as possible. But hiring a full sales force too early just causes the firm to burn through cash and fail to meet revenue expectations. Before it can sell an innovative product efficiently, the entire organization needs to learn how customers will acquire and use it, a process the authors call the sales learning curve. The concept of a learning curve is well understood in manufacturing. Employees transfer knowledge and experience back and forth between the production line and purchasing, manufacturing, engineering, planning, and operations. The sales learning curve unfolds similarly through the give-and-take between the company--marketing, sales, product support, and product development--and its customers. As customers adopt the product, the firm modifies both the offering and the processes associated with making and selling it. Progress along the manufacturing curve is measured by tracking cost per unit: The more a firm learns about the manufacturing process, the more efficient it becomes, and the lower the unit cost goes. Progress along the sales learning curve is measured in an analogous way: The more a company learns about the sales process, the more efficient it becomes at selling, and the higher the sales yield. As the sales yield increases, the sales learning process unfolds in three distinct phases--initiation, transition, and execution. Each phase requires a different size--and kind--of sales force and represents a different stage in a company's production, marketing, and sales strategies. Adjusting those strategies as the firm progresses along the sales learning curve allows managers to plan resource allocation more accurately, set appropriate expectations, avoid disastrous cash shortfalls, and reduce both the time and money required to turn a profit.

  20. Algebraic curves and cryptography

    CERN Document Server

    Murty, V Kumar

    2010-01-01

    It is by now a well-known paradigm that public-key cryptosystems can be built using finite Abelian groups and that algebraic geometry provides a supply of such groups through Abelian varieties over finite fields. Of special interest are the Abelian varieties that are Jacobians of algebraic curves. All of the articles in this volume are centered on the theme of point counting and explicit arithmetic on the Jacobians of curves over finite fields. The topics covered include Schoof's \\ell-adic point counting algorithm, the p-adic algorithms of Kedlaya and Denef-Vercauteren, explicit arithmetic on

  1. Power Curve Measurements REWS

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Villanueva, Héctor

    This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed.2 [1], with some deviations mostly regarding uncertainty calculation. Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast. The measurements have been performed using DTU’s measurement equipment, the analysis...

  2. Power curve investigation

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Vesth, Allan

    This report describes the analysis carried out with data from a given turbine in a wind farm and a chosen period. The purpose of the analysis is to correlate the power output of the wind turbine to the wind speed measured by a nacelle-mounted anemometer. The measurements and analysis are not performed according to IEC 61400-12-1 [1]. Therefore, the results presented in this report cannot be considered a power curve according to the reference standard, and are referred to as “power curve investigation” instead. The measurements have been performed by a customer and the data analysis has been...

  3. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.

  4. Savage s Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting with personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  5. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  6. Paths of algebraic hyperbolic curves

    Institute of Scientific and Technical Information of China (English)

    Ya-juan LI; Li-zheng LU; Guo-zhao WANG

    2008-01-01

    Cubic algebraic hyperbolic (AH) Bezier curves and AH spline curves are defined with a positive parameter α in the space spanned by {1, t, sinht, cosht}. Modifying the value of α yields a family of AH Bezier or spline curves with the family parameter α. For a fixed point on the original curve, it will move on a defined curve called "path of AH curve" (AH Bezier and AH spline curves) when α changes. We describe the geometric effects of the paths and give a method to specify a curve passing through a given point.

  7. Nacelle lidar power curve

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Wagner, Rozenn

    This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance to the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...

  8. Graphs, Curves and Dynamics

    NARCIS (Netherlands)

    Kool, J.

    2013-01-01

    This thesis has three main subjects. The first subject is Measure-theoretic rigidity of Mumford Curves. One can describe isomorphism of two compact hyperbolic Riemann surfaces of the same genus by a measure-theoretic property: a chosen isomorphism of their fundamental groups corresponds to a homeomo

  9. Power Curve Measurements

    DEFF Research Database (Denmark)

    Vesth, Allan; Kock, Carsten Weber

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present anal...

  10. Power Curve Measurements

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Gómez Arranz, Paula

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present...

  11. Power Curve Measurements FGW

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Villanueva, Héctor

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present...

  12. Fitting a Gompertz curve

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1994-01-01

    In this paper, a simple Gompertz curve-fitting procedure is proposed. Its advantages include the facts that the stability of the saturation level over the sample period can be checked, and that no knowledge of its value is necessary for forecasting. An application to forecasting the stoc
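
    For readers unfamiliar with the functional form, a minimal Python sketch (not the paper's procedure) fits the three-parameter Gompertz curve y(t) = a*exp(-b*exp(-c*t)), where a is the saturation level, to synthetic data by nonlinear least squares; the data and starting values are invented for illustration.

        import numpy as np
        from scipy.optimize import curve_fit

        def gompertz(t, a, b, c):
            # a: saturation level, b: displacement, c: growth-rate parameter
            return a * np.exp(-b * np.exp(-c * t))

        # Synthetic diffusion-type data (illustrative), then a nonlinear least-squares fit
        rng = np.random.default_rng(2)
        t = np.arange(0.0, 30.0)
        y = gompertz(t, 100.0, 5.0, 0.25) + rng.normal(0.0, 2.0, t.size)

        popt, _ = curve_fit(gompertz, t, y, p0=[y.max(), 3.0, 0.1])
        print(f"estimated saturation level: {popt[0]:.1f}")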

  13. Gompertz curves with seasonality

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1994-01-01

    This paper considers an extension of the usual Gompertz curve by allowing the parameters to vary over the seasons. This means that, for example, saturation levels can be different over the year. An estimation and testing method is proposed and illustrated with an example.

  14. Power Curve Measurements

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present...

  15. Power Curve Measurements, FGW

    DEFF Research Database (Denmark)

    Vesth, Allan; Yordanova, Ginka

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present...

  16. Graphing Polar Curves

    Science.gov (United States)

    Lawes, Jonathan F.

    2013-01-01

    Graphing polar curves typically involves a combination of three traditional techniques, all of which can be time-consuming and tedious. However, an alternative method--graphing the polar function on a rectangular plane--simplifies graphing, increases student understanding of the polar coordinate system, and reinforces graphing techniques learned…

  17. Power Curve Measurements, REWS

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Gómez Arranz, Paula

    This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed.2 [1], with some deviations mostly regarding uncertainty calculation. Here, the refere...

  18. Power Curve Measurements

    DEFF Research Database (Denmark)

    Federici, Paolo; Kock, Carsten Weber

    This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance to the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...

  19. Straightening Out Learning Curves

    Science.gov (United States)

    Corlett, E. N.; Morecombe, V. J.

    1970-01-01

    The basic mathematical theory behind learning curves is explained, together with implications for clerical and industrial training, evaluation of skill development, and prediction of future performance. Brief studies of textile worker and typist training are presented to illustrate such concepts as the reduction fraction (a consistent decrease in…

  20. Carbon Lorenz Curves

    NARCIS (Netherlands)

    Groot, L.F.M.

    The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across

  1. Power Curve Measurements

    DEFF Research Database (Denmark)

    Kock, Carsten Weber; Federici, Paolo

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.

  2. Power Curve Measurements, FGW

    DEFF Research Database (Denmark)

    Kock, Carsten Weber; Vesth, Allan

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.

  3. Detection of Periodic Variability in Simulated QSO Light Curves

    CERN Document Server

    Westman, David B; Ivezic, Zeljko

    2010-01-01

    Periodic light curve behavior predicted for some binary black hole systems might be detected in large samples, such as the multi-million quasar sample expected from the Large Synoptic Survey Telescope (LSST). We investigate the false-alarm probability for the discovery of a periodic signal in light curves simulated using the damped random walk (DRW) model. This model provides a good description of observed light curves, and does not include periodic behavior. We used the Lomb-Scargle periodogram to search for a periodic signal in a million simulated light curves that properly sample the DRW parameter space and the LSST cadence space. We find that even a very conservative threshold for the false-alarm probability still yields thousands of "good" binary black hole candidates. We conclude that future claims for binary black holes based on Lomb-Scargle analysis of LSST light curves will have to be interpreted with caution.
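
    A compact Python sketch of the same pipeline applied to a single light curve, assuming the astropy package is available: simulate a damped-random-walk (Ornstein-Uhlenbeck) light curve with no periodic signal on an irregular cadence, then compute the Lomb-Scargle periodogram and the false-alarm probability of its strongest peak. The cadence and DRW parameters are invented for illustration.

        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(3)

        # Irregular sampling over ~10 years (a stand-in for a survey cadence)
        t = np.sort(rng.uniform(0.0, 3650.0, 200))          # days

        # Damped random walk (Ornstein-Uhlenbeck) light curve: no periodic signal present
        tau, amp, mean_mag = 200.0, 0.2, 20.0               # assumed DRW parameters
        mag = np.empty_like(t)
        mag[0] = mean_mag
        for i in range(1, t.size):
            r = np.exp(-(t[i] - t[i - 1]) / tau)
            mag[i] = mean_mag + r * (mag[i - 1] - mean_mag) \
                     + amp * np.sqrt(1.0 - r * r) * rng.standard_normal()

        # Lomb-Scargle periodogram and the false-alarm probability of its highest peak
        ls = LombScargle(t, mag)
        freq, power = ls.autopower()
        fap = ls.false_alarm_probability(power.max())
        print(f"best period: {1.0 / freq[np.argmax(power)]:.1f} d, FAP = {fap:.3g}")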

  4. Probability state modeling theory.

    Science.gov (United States)

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  5. Probability distributions for magnetotellurics

    Energy Technology Data Exchange (ETDEWEB)

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.

  6. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    Mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp event and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, the feasibility condition for a probability fuzzy number set was given. Going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP, were put forward. The fuzzy probability resolution theorem with the closing operation of fuzzy probability was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. All mathematical descriptions of the RVFP have the closing operation for fuzzy probability; as a result, the foundation for perfecting the fuzzy probability operation method is laid.

  7. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolving test issues. DOEPOD relies on the observance of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
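
    The 90/95 criterion quoted above can be illustrated with a standard binomial (Clopper-Pearson) lower confidence bound on POD computed from hit/miss counts; the sketch below shows that generic statistic, not DOEPOD's internal algorithm.

        from scipy.stats import beta

        def pod_lower_bound(hits, trials, confidence=0.95):
            """One-sided Clopper-Pearson lower confidence bound on POD from hit/miss data."""
            if hits == 0:
                return 0.0
            return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

        # 29 hits in 29 trials at a given flaw size gives a 95% lower bound just above 0.90,
        # the familiar binomial demonstration of 90/95 POD.
        print(f"{pod_lower_bound(29, 29):.4f}")
        print(f"{pod_lower_bound(45, 46):.4f}")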

  8. Probability Density Function Characterization for Aggregated Large-Scale Wind Power Based on Weibull Mixtures

    Directory of Open Access Journals (Sweden)

    Emilio Gómez-Lázaro

    2016-02-01

    The Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. With this aim, the present paper focuses on Weibull mixtures to characterize the probability density function (PDF) of aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values, and wind power curtailment.
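
    A minimal Python sketch of the model-selection step, assuming scipy is available and using direct numerical maximum likelihood rather than the EM algorithm more commonly used in practice: one- and two-component Weibull mixtures are fitted to a synthetic sample and compared by AIC and BIC; all data and starting values are invented for illustration.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import weibull_min

        rng = np.random.default_rng(4)
        # Synthetic "aggregated wind power" sample drawn from a two-component Weibull mixture
        x = np.concatenate([weibull_min.rvs(1.8, scale=0.25, size=600, random_state=rng),
                            weibull_min.rvs(3.0, scale=0.70, size=400, random_state=rng)])

        def nll_mixture(params, x, k):
            """Negative log-likelihood of a k-component Weibull mixture."""
            w = np.append(params[:k - 1], 1.0 - params[:k - 1].sum())   # component weights
            shapes = params[k - 1:2 * k - 1]
            scales = params[2 * k - 1:]
            if np.any(w <= 0) or np.any(shapes <= 0) or np.any(scales <= 0):
                return np.inf
            pdf = sum(wi * weibull_min.pdf(x, ci, scale=si)
                      for wi, ci, si in zip(w, shapes, scales))
            return -np.sum(np.log(pdf + 1e-300))

        for k, p0 in ((1, [2.0, 0.4]), (2, [0.5, 1.5, 3.5, 0.2, 0.8])):
            res = minimize(nll_mixture, np.array(p0), args=(x, k), method="Nelder-Mead",
                           options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
            n_par = 3 * k - 1
            aic = 2 * n_par + 2 * res.fun
            bic = n_par * np.log(x.size) + 2 * res.fun
            print(f"k={k}: AIC={aic:.1f}  BIC={bic:.1f}")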

  9. Probability landscapes for integrative genomics

    Directory of Open Access Journals (Sweden)

    Benecke Arndt

    2008-05-01

    Background: The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches as well as the prediction tools are only starting to become available, and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decade(s)-long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results: We introduce here a concept for such a framework, based entirely on systematic annotation in terms of probability profiles of genomic sequence using any type of relevant experimental and theoretical information, and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion: Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a

  10. Probability density function modeling for sub-powered interconnects

    Science.gov (United States)

    Pater, Flavius; Amaricǎi, Alexandru

    2016-06-01

    This paper proposes three mathematical models for the reliability probability density function of interconnects supplied at sub-threshold voltages: spline curve approximations, Gaussian models, and sine interpolation. The proposed analysis aims at determining the most appropriate fit for the switching delay versus probability of correct switching for sub-powered interconnects. We compare the three mathematical models with Monte-Carlo simulations of interconnects for 45 nm CMOS technology supplied at 0.25 V.

  11. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  12. Carbon Lorenz Curves

    Energy Technology Data Exchange (ETDEWEB)

    Groot, L. [Utrecht University, Utrecht School of Economics, Janskerkhof 12, 3512 BL Utrecht (Netherlands)

    2008-11-15

    The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries. These tools allow policy-makers and the general public to grasp at a single glance the impact of conventional distribution rules such as equal caps or grandfathering, or more sophisticated ones, on the distribution of greenhouse gas emissions. Second, using the Samuelson rule for the optimal provision of a public good, the Pareto-optimal distribution of carbon emissions is compared with the distribution that follows if countries follow Nash-Cournot abatement strategies. It is shown that the Pareto-optimal distribution under the Samuelson rule can be approximated by the equal cap division, represented by the diagonal in the Lorenz curve diagram.
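
    A small Python sketch of the basic tools named above, applied to invented per-country emission figures with each country weighted equally (the paper's analysis may weight countries differently, e.g. by population):

        import numpy as np

        def lorenz_and_gini(emissions):
            """Lorenz curve points and Gini index for a set of country emission totals
            (unweighted, i.e. each country counts equally)."""
            x = np.sort(np.asarray(emissions, dtype=float))
            lorenz = np.insert(np.cumsum(x) / x.sum(), 0, 0.0)   # cumulative emission share
            share = np.linspace(0.0, 1.0, lorenz.size)           # cumulative country share
            area = np.sum((share[1:] - share[:-1]) * (lorenz[1:] + lorenz[:-1]) / 2.0)
            return share, lorenz, 1.0 - 2.0 * area               # Gini = 1 - 2 * area

        # Illustrative emission figures for eight equally weighted "countries"
        _, _, gini = lorenz_and_gini([0.3, 0.9, 1.2, 2.5, 4.0, 6.1, 9.8, 16.0])
        print(f"Gini index: {gini:.3f}")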

  13. Managing curved canals

    Directory of Open Access Journals (Sweden)

    Iram Ansari

    2012-01-01

    Dilaceration is the result of a developmental anomaly in which there has been an abrupt change in the axial inclination between the crown and the root of a tooth. Dilaceration can be seen in both the permanent and deciduous dentitions, and is more commonly found in posterior teeth and in the maxilla. Periapical radiographs are the most appropriate way to diagnose the presence of root dilacerations. The controlled, regularly tapered preparation of curved canals is the ultimate challenge in endodontics. Careful and meticulous technique will yield safe and sufficient enlargement of curved canals. This article gives a review of the literature and three interesting case reports of root dilacerations.

  14. LCC: Light Curves Classifier

    Science.gov (United States)

    Vo, Martin

    2017-08-01

    Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished by attributes of light curves or any time series, including shapes, histograms, or variograms, or by other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering to visualize the natural separation of the sample. The package can also be used for automatic tuning of the parameters of the methods used (for example, the number of hidden neurons or the binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curve Classifier can also be used for simple downloading of light curves and all available information about queried stars. It can natively connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and a command line UI, the program can be used through a web interface. Users can create jobs for "training" methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifiers and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.

  15. Dynamics of curved fronts

    CERN Document Server

    Pelce, Pierre

    1989-01-01

    In recent years, much progress has been made in the understanding of interface dynamics of various systems: hydrodynamics, crystal growth, chemical reactions, and combustion. Dynamics of Curved Fronts is an important contribution to this field and will be an indispensable reference work for researchers and graduate students in physics, applied mathematics, and chemical engineering. The book consists of a 100-page introduction by the editor and 33 seminal articles from various disciplines.

  16. Estimating Corporate Yield Curves

    OpenAIRE

    Antionio Diaz; Frank Skinner

    2001-01-01

    This paper represents the first study of retail deposit spreads of UK financial institutions using stochastic interest rate modelling and the market comparable approach. By replicating quoted fixed deposit rates using the Black Derman and Toy (1990) stochastic interest rate model, we find that the spread between fixed and variable rates of interest can be modeled (and priced) using an interest rate swap analogy. We also find that we can estimate an individual bank deposit yield curve as a spr...

  17. Scenario Probability-based Siting and Sizing of Wind Turbine Generators%基于场景概率的风电机组的选址和定容

    Institute of Scientific and Technical Information of China (English)

    刘苏云; 王笛; 蒋丹; 周竞; 史静; 丁晓群

    2014-01-01

    Based on the characteristics of wind velocity, this paper proposes scenario probability-based siting and sizing of wind turbine generators. The objective function is established with the aim of minimizing total investment cost and annual energy loss cost, subject to various constraints. The applicability of a planning scheme in a random environment is assessed from the point of view of scenario probability, and an improved particle swarm optimization (PSO) algorithm is used to solve the problem. Finally, the IEEE 33-node system is taken as an example to verify the effectiveness and feasibility of the model and the algorithm.

  18. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD; PREFACE; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  19. Atypical Light Curves

    CERN Document Server

    Steenwyk, Steven D; Molnar, Lawrence A

    2013-01-01

    We have identified some two hundred new variable stars in a systematic study of a data archive obtained with the Calvin-Rehoboth observatory. Of these, we present five close binaries showing behaviors presumably due to star spots or other magnetic activity. For context, we first present two new RS CVn systems whose behavior can be readily attributed to star spots. Then we present three new close binary systems that are rather atypical, with light curves that are changing over time in ways not easily understood in terms of the star spot activity generally associated with magnetically active binary systems called RS CVn systems. Two of these three are contact binaries that exhibit gradual changes in average brightness without noticeable changes in light curve shape. A third system has shown such large changes in light curve morphology that we speculate this may be a rare instance of a system that transitions back and forth between contact and noncontact configurations, perhaps driven by magnetic cycles in at least o...

  20. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  1. Conditionals, probability, and belief revision

    NARCIS (Netherlands)

    Voorbraak, F.

    1989-01-01

    A famous result obtained in the mid-seventies by David Lewis shows that a straightforward interpretation of probabilities of conditionals as conditional probabilities runs into serious trouble. In this paper we try to circumvent this trouble by defining extensions of probability functions, called

  2. Tempo curves considered harmful (part 2)

    NARCIS (Netherlands)

    Desain, P.; Honing, H.

    1991-01-01

    A column (the second of a series of three) constitutes an abridged and adapted version of 'Tempo curves considered harmful'. M (an amateur mathematician) and P (a would-be psychologist) incorporated some generative models for expressive timing in their sequencer program. This proved partially successful

  3. Flow characteristics of curved ducts

    Directory of Open Access Journals (Sweden)

    Rudolf P.

    2007-10-01

    Full Text Available Curved channels are very often present in real hydraulic systems, e.g. curved diffusers of hydraulic turbines, S-shaped bulb turbines, fittings, etc. Curvature brings a change of the velocity profile, generation of vortices and production of hydraulic losses. Flow simulations using CFD techniques were performed to understand these phenomena. Cases ranging from a single elbow to coupled elbows in U, S and spatial right-angle configurations with circular cross-section were modeled for Re = 60000. Spatial development of the flow was studied and consequently it was deduced that minor losses are connected with the transformation of pressure energy into kinetic energy and vice versa. This transformation is a dissipative process and is reflected in the amount of energy irreversibly lost. The smallest loss coefficient is associated with flow in U-shaped elbows, the largest with flow in S-shaped elbows. Finally, the extent of the flow domain influenced by the presence of curvature was examined. This is important for proper placement of manometers and flowmeters during experimental tests. Simulations were verified against experimental results presented in the literature.

  4. Tsunami probability in the Caribbean region

    Science.gov (United States)

    Parsons, T.; Geist, E. L.

    2008-12-01

    We calculated tsunami runup probability at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. A remarkable ~500-year empirical record was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c=0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerically modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack back-arc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability of 0-30 percent regionally.
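
    A minimal sketch of the Poissonian runup-probability calculation described above: with a mean runup rate lam (events per year) in a coastal cell, the probability of at least one runup in an exposure window of T years is 1 - exp(-lam*T). The cell rates below are illustrative assumptions, not values from the study.

```python
# Poisson exceedance probability for assumed per-cell runup rates.
import math

exposure_years = 30.0
cell_rates_per_year = {"cell_A": 1.0 / 250.0, "cell_B": 1.0 / 1000.0}  # assumed rates

for cell, lam in cell_rates_per_year.items():
    p = 1.0 - math.exp(-lam * exposure_years)   # P(at least one runup in 30 years)
    print(f"{cell}: 30-yr runup probability = {p:.1%}")
```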

  5. Drawable Region of the Generalized Log Aesthetic Curves

    Directory of Open Access Journals (Sweden)

    R. U. Gobithaasan

    2013-01-01

    Full Text Available The main characteristic of visually pleasing curves used for product design is a monotonic curvature profile. Recently, a planar curve called the Generalized Log Aesthetic Curve (GLAC) has been extended from the Log Aesthetic Curve (LAC), and it has an additional shape parameter, ν. This curve preserves the monotonicity of curvature and is said to produce visually pleasing curves. This paper delves into the drawable region of the GLAC segment, which indicates the feasible solutions for the shape parameters given interpolating points and the directions of travel at those points. The first section reviews the formulation of the GLAC and its related bounds. The second section describes the algorithm for identifying the drawable region. It is followed by a section describing how small changes of ν widen the drawable boundaries. The final section discusses the superiority of the GLAC compared to the LAC for use in industrial product design.

  6. CBS - A program for close binary system light curve analysis

    Science.gov (United States)

    Solmi, L.; Galli, M.

    CBS is a new program for binary system light curve analysis; it generates synthetic light curves for a binary system, accounting for eclipses, tidal distortion, limb darkening, gravity darkening and reflection. It is also possible to compute the light contribution and eclipses of an accretion disk. The bolometric light curve is generated, as well as curves for the U, B, V, R, I colour bands. In the following we give a brief description of the first version of the program and show some preliminary results.

  7. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow*Water-quality criterion) at each flow interval.
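
    A hedged sketch of the exceedance-probability construction described above, using synthetic daily flows in place of WATER/TOPMODEL output and an assumed water-quality criterion; it builds a flow-duration curve and the corresponding load-duration curve (unit conversions are omitted for brevity).

```python
# Illustrative flow- and load-duration curves from synthetic daily streamflow.
import numpy as np

rng = np.random.default_rng(1)
daily_flow = rng.lognormal(mean=3.0, sigma=1.0, size=365 * 60)   # stand-in for modeled flows

# Flow-duration curve: percent of time each flow is equaled or exceeded
# (0 % corresponds to the highest discharge in the record).
flows_sorted = np.sort(daily_flow)[::-1]
n = flows_sorted.size
exceedance_pct = 100.0 * np.arange(1, n + 1) / (n + 1)

# Load-duration curve: allowable load at each flow interval,
# Load = Flow * water-quality criterion.
criterion = 0.3                              # hypothetical concentration criterion
allowable_load = flows_sorted * criterion

for pct in (10, 50, 90):
    i = int(pct / 100.0 * n)
    print(f"{pct}% exceedance: flow={flows_sorted[i]:.1f}, allowable load={allowable_load[i]:.1f}")
```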

  8. The Art of Probability Assignment

    CERN Document Server

    Dimitrov, Vesselin I

    2012-01-01

    The problem of assigning probabilities when little is known is analyzed for the case where the quantities of interest are physical observables, i.e. can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data is available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that the probabilities be assigned such that they are the least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Renyi distance between the original and the shifted distribution.
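
    For the continuous case described above, the assignment rule can be written as a constrained variational problem. The notation below is a generic sketch (the f_k and F_k stand for whatever expectation constraints encode the available information), not necessarily the author's exact formulation.

```latex
% Sketch of the continuous-case rule: choose the density p(x) that minimizes
% the Fisher information subject to normalization and expectation constraints.
\[
\min_{p}\; I[p] = \int \frac{\bigl(p'(x)\bigr)^{2}}{p(x)}\,dx
\quad\text{subject to}\quad
\int p(x)\,dx = 1,\qquad \int f_{k}(x)\,p(x)\,dx = F_{k},\; k = 1,\dots,m .
\]
```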

  9. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level, which means that a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that motivated students who gained from the probability workshop showed a positive improvement in their performance compared with before the workshop. In addition, there exists a significant difference in students' performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  10. Probing exoplanet clouds with optical phase curves.

    Science.gov (United States)

    Muñoz, Antonio García; Isaak, Kate G

    2015-11-01

    Kepler-7b is to date the only exoplanet for which clouds have been inferred from the optical phase curve--from visible-wavelength whole-disk brightness measurements as a function of orbital phase. Added to this, the fact that the phase curve appears dominated by reflected starlight makes this close-in giant planet a unique study case. Here we investigate the information on coverage and optical properties of the planet clouds contained in the measured phase curve. We generate cloud maps of Kepler-7b and use a multiple-scattering approach to create synthetic phase curves, thus connecting postulated clouds with measurements. We show that optical phase curves can help constrain the composition and size of the cloud particles. Indeed, model fitting for Kepler-7b requires poorly absorbing particles that scatter with low-to-moderate anisotropic efficiency, conclusions consistent with condensates of silicates, perovskite, and silica of submicron radii. We also show that we are limited in our ability to pin down the extent and location of the clouds. These considerations are relevant to the interpretation of optical phase curves with general circulation models. Finally, we estimate that the spherical albedo of Kepler-7b over the Kepler passband is in the range 0.4-0.5.
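
    As a much simpler, hedged companion to the multiple-scattering model used in the study, the sketch below evaluates a Lambertian reflected-light phase curve; the geometric albedo A_g and the planet-to-orbit size ratio Rp_over_a are order-of-magnitude stand-ins for a Kepler-7b-like planet, not fitted values.

```python
# Lambertian reflected-light phase curve (illustration only, not the study's cloud model).
import numpy as np

A_g = 0.35          # assumed geometric albedo
Rp_over_a = 0.012   # assumed planet radius / orbital distance

phase = np.linspace(0.0, 1.0, 201)                 # orbital phase (0 = transit)
alpha = np.arccos(-np.cos(2.0 * np.pi * phase))    # phase angle, circular edge-on orbit

# Lambert phase function
phi_L = (np.sin(alpha) + (np.pi - alpha) * np.cos(alpha)) / np.pi

flux_ratio_ppm = 1e6 * A_g * Rp_over_a**2 * phi_L
print(f"peak planet/star contrast ~ {flux_ratio_ppm.max():.0f} ppm near occultation")
```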

  11. Magnetism in curved geometries

    Science.gov (United States)

    Streubel, Robert; Fischer, Peter; Kronast, Florian; Kravchuk, Volodymyr P.; Sheka, Denis D.; Gaididei, Yuri; Schmidt, Oliver G.; Makarov, Denys

    2016-09-01

    Extending planar two-dimensional structures into the three-dimensional space has become a general trend in multiple disciplines, including electronics, photonics, plasmonics and magnetics. This approach provides means to modify conventional or to launch novel functionalities by tailoring the geometry of an object, e.g. its local curvature. In a generic electronic system, curvature results in the appearance of scalar and vector geometric potentials inducing anisotropic and chiral effects. In the specific case of magnetism, even in the simplest case of a curved anisotropic Heisenberg magnet, the curvilinear geometry manifests two exchange-driven interactions, namely effective anisotropy and antisymmetric exchange, i.e. Dzyaloshinskii-Moriya-like interaction. As a consequence, a family of novel curvature-driven effects emerges, which includes magnetochiral effects and topologically induced magnetization patterning, resulting in theoretically predicted unlimited domain wall velocities, chirality symmetry breaking and Cherenkov-like effects for magnons. The broad range of altered physical properties makes these curved architectures appealing in view of fundamental research on e.g. skyrmionic systems, magnonic crystals or exotic spin configurations. In addition to these rich physics, the application potential of three-dimensionally shaped objects is currently being explored as magnetic field sensorics for magnetofluidic applications, spin-wave filters, advanced magneto-encephalography devices for diagnosis of epilepsy or for energy-efficient racetrack memory devices. These recent developments ranging from theoretical predictions over fabrication of three-dimensionally curved magnetic thin films, hollow cylinders or wires, to their characterization using integral means as well as the development of advanced tomography approaches are in the focus of this review.

  12. Simulations of Closed Timelike Curves

    Science.gov (United States)

    Brun, Todd A.; Wilde, Mark M.

    2017-03-01

    Proposed models of closed timelike curves (CTCs) have been shown to enable powerful information-processing protocols. We examine the simulation of models of CTCs both by other models of CTCs and by physical systems without access to CTCs. We prove that the recently proposed transition probability CTCs (T-CTCs) are physically equivalent to postselection CTCs (P-CTCs), in the sense that one model can simulate the other with reasonable overhead. As a consequence, their information-processing capabilities are equivalent. We also describe a method for quantum computers to simulate Deutschian CTCs (but with a reasonable overhead only in some cases). In cases for which the overhead is reasonable, it might be possible to perform the simulation in a table-top experiment. This approach has the benefit of resolving some ambiguities associated with the equivalent circuit model of Ralph et al. Furthermore, we provide an explicit form for the state of the CTC system such that it is a maximum-entropy state, as prescribed by Deutsch.

  13. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  14. Hidden Variables or Positive Probabilities?

    CERN Document Server

    Rothman, T; Rothman, Tony

    2001-01-01

    Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...

  15. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  16. Predicting the probability of outbreeding depression.

    Science.gov (United States)

    Frankham, Richard; Ballou, Jonathan D; Eldridge, Mark D B; Lacy, Robert C; Ralls, Katherine; Dudash, Michele R; Fenster, Charles B

    2011-06-01

    Fragmentation of animal and plant populations typically leads to genetic erosion and increased probability of extirpation. Although these effects can usually be reversed by re-establishing gene flow between population fragments, managers sometimes fail to do so due to fears of outbreeding depression (OD). Rapid development of OD is due primarily to adaptive differentiation from selection or fixation of chromosomal variants. Fixed chromosomal variants can be detected empirically. We used an extended form of the breeders' equation to predict the probability of OD due to adaptive differentiation between recently isolated population fragments as a function of intensity of selection, genetic diversity, effective population sizes, and generations of isolation. Empirical data indicated that populations in similar environments had not developed OD even after thousands of generations of isolation. To predict the probability of OD, we developed a decision tree that was based on the four variables from the breeders' equation, taxonomic status, and gene flow within the last 500 years. The predicted probability of OD in crosses between two populations is elevated when the populations have at least one of the following characteristics: are distinct species, have fixed chromosomal differences, exchanged no genes in the last 500 years, or inhabit different environments. Conversely, the predicted probability of OD in crosses between two populations of the same species is low for populations with the same karyotype, isolated for <500 years, and that occupy similar environments. In the former case, we recommend crossing be avoided or tried on a limited, experimental basis. In the latter case, crossing can be carried out with low probability of OD. We used crosses with known results to test the decision tree and found that it correctly identified cases where OD occurred. Current concerns about OD in recently fragmented populations are almost certainly excessive. ©2011 Society for
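
    The qualitative criteria stated in this abstract can be encoded as a small decision function; the sketch below reflects only those stated rules with illustrative field names, and is not the authors' published decision tree.

```python
# Sketch of the stated criteria for the probability of outbreeding depression (OD).
def od_risk(distinct_species, fixed_chromosomal_diffs, gene_flow_last_500yr,
            same_environment):
    """Return a qualitative OD risk category for a proposed cross between two populations."""
    if (distinct_species or fixed_chromosomal_diffs
            or not gene_flow_last_500yr or not same_environment):
        return "elevated: avoid crossing or trial on a limited, experimental basis"
    return "low: crossing can proceed with low probability of OD"

print(od_risk(distinct_species=False, fixed_chromosomal_diffs=False,
              gene_flow_last_500yr=True, same_environment=True))
```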

  17. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  19. Synchrotron radiation from a curved plasma channel laser wakefield accelerator

    CERN Document Server

    Palastro, J P; Hafizi, B; Chen, Y -H; Johnson, L A; Penano, J R; Helle, M H; Mamonau, A A

    2016-01-01

    A laser pulse guided in a curved plasma channel can excite wakefields that steer electrons along an arched trajectory. As the electrons are accelerated along the curved channel, they emit synchrotron radiation. We present simple analytical models and simulations examining laser pulse guiding, wakefield generation, electron steering, and synchrotron emission in curved plasma channels. For experimentally realizable parameters, a ~2 GeV electron emits 0.1 photons per cm with an average photon energy of multiple keV.

  20. Superfluids in Curved Spacetime

    CERN Document Server

    Villegas, Kristian Hauser A

    2015-01-01

    Superfluids under an intense gravitational field are typically found in neutron star and quark star cores. Most treatments of these superfluids, however, are done in a flat spacetime background. In this paper, the effect of spacetime curvature on superfluidity is investigated. An effective four-fermion interaction is derived by integrating out the mediating scalar field. The case of fermions interacting via mediating gauge vector bosons is also discussed. Two possible cases are considered in the mean-field treatment: antifermion-fermion and fermion-fermion pairings. An effective action, quadratic in the fermion field, and a self-consistent equation are derived for both cases. The effective Euclidean action and the matrix elements of the heat kernel operator, which are very useful in curved-spacetime QFT calculations, are derived for the fermion-fermion pairing. Finally, explicit numerical calculation of the gravitational correction to the pairing order parameter is performed for the scalar superfluid case. It is foun...

  1. Polymers in Curved Boxes

    CERN Document Server

    Yaman, K; Solis, F J; Witten, T A

    1996-01-01

    We apply results derived in other contexts for the spectrum of the Laplace operator in curved geometries to the study of an ideal polymer chain confined to a spherical annulus in arbitrary space dimension D and conclude that the free energy, compared to its value for an uncurved box of the same thickness and volume, is lower when $D < 3$, stays the same when $D = 3$, and is higher when $D > 3$. Curvature of the confining walls thus lowers the effective bending elasticity of the walls, and might induce spontaneous symmetry breaking, i.e. bending. (Actually, the above mentioned results show that any shell in $D = 3$ induces this effect, except for a spherical shell.) We compute the contribution of this effect to the bending rigidities in the Helfrich free energy expression.

  2. Evolutes of Hyperbolic Plane Curves

    Institute of Scientific and Technical Information of China (English)

    Shyuichi IZUMIYA; Dong He PEI; Takashi SANO; Erika TORII

    2004-01-01

    We define the notion of evolutes of curves in a hyperbolic plane and establish the relationships between singularities of these subjects and geometric invariants of curves under the action of the Lorentz group. We also describe how we can draw the picture of an evolute of a hyperbolic plane curve in the Poincaré disk.

  3. The Arithmetic of Elliptic Curves

    CERN Document Server

    Silverman, Joseph H

    2009-01-01

    Treats the arithmetic theory of elliptic curves in its modern formulation, through the use of basic algebraic number theory and algebraic geometry. This book discusses the necessary algebro-geometric results, and offers an exposition of the geometry of elliptic curves, and the formal group of an elliptic curve.

  4. Curved-Duct

    Directory of Open Access Journals (Sweden)

    Je Hyun Baekt

    2000-01-01

    Full Text Available A numerical study is conducted on the fully-developed laminar flow of an incompressible viscous fluid in a square duct rotating about an axis perpendicular to the axial direction of the duct. In a straight duct, the rotation produces vortices due to the Coriolis force. Generally two vortex cells are formed and the axial velocity distribution is distorted by the effect of this Coriolis force. When the convective force is weak, two counter-rotating vortices appear with a quasi-parabolic axial velocity profile for weak rotation rates. As the rotation rate increases, the axial velocity on the vertical centreline of the duct begins to flatten and the vorticity centers move toward the wall under the effect of the Coriolis force. When the convective inertia force is strong, a double-vortex secondary flow appears in the transverse planes of the duct for weak rotation rates, but as the speed of rotation increases the secondary flow splits into an asymmetric configuration of four counter-rotating vortices. If the rotation rate is increased further, the secondary flow restabilizes to a slightly asymmetric double-vortex configuration. A numerical study is also conducted on the laminar flow of an incompressible viscous fluid in a 90°-bend square duct that rotates about an axis parallel to the axial direction of the inlet. In the 90°-bend duct, the flow is shaped by both the centrifugal and the Coriolis force: a secondary flow is driven by the centrifugal force in the curved region and by the Coriolis force in the downstream region, since these forces are dominant in the respective regions.

  5. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  6. Energy dissipation in flows through curved spaces

    Science.gov (United States)

    Debus, J.-D.; Mendoza, M.; Succi, S.; Herrmann, H. J.

    2017-01-01

    Fluid dynamics in intrinsically curved geometries is encountered in many physical systems in nature, ranging from microscopic bio-membranes all the way up to general relativity at cosmological scales. Despite the diversity of applications, all of these systems share a common feature: the free motion of particles is affected by inertial forces originating from the curvature of the embedding space. Here we reveal a fundamental process underlying fluid dynamics in curved spaces: the free motion of fluids, in the complete absence of solid walls or obstacles, exhibits loss of energy due exclusively to the intrinsic curvature of space. We find that local sources of curvature generate viscous stresses as a result of the inertial forces. The curvature-induced viscous forces are shown to cause hitherto unnoticed and yet appreciable energy dissipation, which might play a significant role for a variety of physical systems involving fluid dynamics in curved spaces. PMID:28195148

  7. A Radiosity Solution for Curved Surface Environments

    Institute of Scientific and Technical Information of China (English)

    孙济洲; RichardL.Grimsdale

    1997-01-01

    Radiosity has been a popular method for photorealistic image generation. However, the determination of form factors between curved patches is the most difficult and time-consuming procedure, and the errors caused by approximating the source patch's radiosity with average values are significant. In this paper, a radiosity algorithm for rendering curved surfaces represented parametrically is described. The contributed radiosity from differential areas at the four vertices of the source patch to a receiving point is calculated first, then the contribution from the inner area of the source patch is evaluated by interpolating the values at the four corners. Both the difficult problem of determining form factors between curved surfaces and the errors mentioned above are thereby avoided. The experimental results of the new algorithm are compared with those obtained by the traditional method. Some associated techniques such as the visibility test and adaptive subdivision are also described.
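
    The interpolation step described above, evaluating contributions at the four patch corners and interpolating over the interior, can be sketched as a bilinear blend; the corner values and parametric coordinates below are arbitrary examples, not output of the paper's renderer.

```python
# Hedged sketch: bilinear interpolation of corner contributions over a patch.
def bilinear(corner_vals, u, v):
    """corner_vals = (c00, c10, c01, c11) at (u, v) = (0,0), (1,0), (0,1), (1,1)."""
    c00, c10, c01, c11 = corner_vals
    return (c00 * (1 - u) * (1 - v) + c10 * u * (1 - v)
            + c01 * (1 - u) * v + c11 * u * v)

# Contribution at the patch interior point (u, v) = (0.25, 0.5)
print(bilinear((0.8, 0.6, 0.5, 0.3), u=0.25, v=0.5))
```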

  8. Intensity-Duration-Frequency (IDF) rainfall curves, for data series and climate projection in African cities.

    Science.gov (United States)

    De Paola, Francesco; Giugni, Maurizio; Topa, Maria Elena; Bucchignani, Edoardo

    2014-01-01

    Changes in the hydrologic cycle due to the increase in greenhouse gases cause variations in intensity, duration, and frequency of precipitation events. Quantifying the potential effects of climate change and adapting to them is one way to reduce urban vulnerability. Since rainfall characteristics are often used to design water structures, reviewing and updating rainfall characteristics (i.e., Intensity-Duration-Frequency (IDF) curves) for future climate scenarios is necessary (Reg Environ Change 13(1 Supplement):25-33, 2013). The present study concerns the evaluation of the IDF curves for three case studies: Addis Ababa (Ethiopia), Dar Es Salaam (Tanzania) and Douala (Cameroon). Starting from observed daily rainfall data, disaggregation techniques were applied to the collected data in order to define the IDF curves and the extreme values in smaller time windows (10', 30', 1 h, 3 h, 6 h, 12 h), generating a synthetic sequence of rainfall with statistical properties similar to the recorded data. Then, the rainfall patterns of the three test cities were analyzed and IDF curves were evaluated. In order to estimate the contingent influence of climate change on the IDF curves, the described procedure was applied to the climate (rainfall) simulations over the time period 2010-2050, provided by CMCC (Centro Euro-Mediterraneo sui Cambiamenti Climatici). The evaluation of the IDF curves made it possible to characterize the rainfall evolution of the three case studies, considering initially only historical data and then the climate projections, in order to verify the changes in rainfall patterns. The same set of data and projections was also used for evaluating the Probable Maximum Precipitation (PMP).
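
    A hedged sketch of the cascade-style disaggregation idea mentioned above: a daily rainfall total is split recursively into sub-daily amounts with random multiplicative weights. The symmetric Beta weights and the number of cascade levels are arbitrary illustrations; the actual models are calibrated to observed sub-daily statistics.

```python
# Toy multiplicative-cascade disaggregation of a daily rainfall total.
import numpy as np

rng = np.random.default_rng(2)

def cascade_disaggregate(daily_total_mm, levels=4, beta_param=1.5):
    """Split a daily total into 2**levels equal sub-intervals (levels=4 -> 1.5 h steps)."""
    amounts = np.array([daily_total_mm])
    for _ in range(levels):
        w = rng.beta(beta_param, beta_param, size=amounts.size)   # weight of the left child
        amounts = np.column_stack((amounts * w, amounts * (1.0 - w))).ravel()
    return amounts

sub_daily = cascade_disaggregate(42.0)      # 42 mm over one day, illustrative
print(len(sub_daily), "intervals, max amount per interval (mm):", round(sub_daily.max(), 1))
print("mass conserved:", np.isclose(sub_daily.sum(), 42.0))
```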

  9. Understanding Students' Beliefs about Probability.

    Science.gov (United States)

    Konold, Clifford

    The concept of probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes the students' alternative frameworks from the viewpoint of constructivism. There are various interpretations of probability through mathematical history: classical, frequentist, and subjectivist interpretation.…

  10. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  11. Varieties of Belief and Probability

    NARCIS (Netherlands)

    D.J.N. van Eijck (Jan); S. Ghosh; J. Szymanik

    2015-01-01

    For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for

  12. Landau-Zener Probability Reviewed

    CERN Document Server

    Valencia, C

    2008-01-01

    We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant for the interpretation of the neutrino flux from a supernova explosion.

  13. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...

  14. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  15. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  16. Linear Positivity and Virtual Probability

    CERN Document Server

    Hartle, J B

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...

  17. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.

  18. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  19. Probing exoplanet clouds with optical phase curves

    CERN Document Server

    Munoz, A Garcia

    2015-01-01

    Kepler-7b is to date the only exoplanet for which clouds have been inferred from the optical phase curve -- from visible-wavelength whole-disk brightness measurements as a function of orbital phase. Added to this, the fact that the phase curve appears dominated by reflected starlight makes this close-in giant planet a unique study case. Here we investigate the information on coverage and optical properties of the planet clouds contained in the measured phase curve. We generate cloud maps of Kepler-7b and use a multiple-scattering approach to create synthetic phase curves, thus connecting postulated clouds with measurements. We show that optical phase curves can help constrain the composition and size of the cloud particles. Indeed, model fitting for Kepler-7b requires poorly absorbing particles that scatter with low-to-moderate anisotropic efficiency, conclusions consistent with condensates of silicates, perovskite, and silica of submicron radii. We also show that we are limited in our ability to pin down the...

  20. Asteroid taxonomic signatures from photometric phase curves

    CERN Document Server

    Oszkiewicz, D A; Wasserman, L H; Muinonen, K; Penttilä, A; Pieniluoma, T; Trilling, D E; Thomas, C A

    2012-01-01

    We explore the correlation between an asteroid's taxonomy and photometric phase curve using the H, G12 photometric phase function, with the shape of the phase function described by the single parameter G12. We explore the usability of G12 in taxonomic classification for individual objects, asteroid families, and dynamical groups. We conclude that the mean values of G12 for the considered taxonomic complexes are statistically different, and also discuss the overall shape of the G12 distribution for each taxonomic complex. Based on the values of G12 for about half a million asteroids, we compute the probabilities of C, S, and X complex membership for each asteroid. For an individual asteroid, these probabilities are rather evenly distributed over all of the complexes, thus preventing meaningful classification. We then present and discuss the G12 distributions for asteroid families, and predict the taxonomic complex preponderance for asteroid families given the distribution of G12 in each family. For certain ast...
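
    One hedged way to turn per-complex G12 distributions into complex-membership probabilities is Bayes' rule; the Gaussian densities and priors below are placeholders chosen for illustration, not the distributions derived from the half-million-asteroid sample.

```python
# Illustrative Bayes-rule membership probabilities from a single G12 value.
from scipy import stats

complexes = {            # (prior, mean G12, std of G12) -- illustrative numbers only
    "C": (0.35, 0.60, 0.15),
    "S": (0.45, 0.40, 0.15),
    "X": (0.20, 0.50, 0.20),
}

def membership_probabilities(g12):
    weights = {name: prior * stats.norm.pdf(g12, mu, sd)
               for name, (prior, mu, sd) in complexes.items()}
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

print({k: round(v, 2) for k, v in membership_probabilities(0.55).items()})
```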

  1. Caloric curves of atomic nuclei and other small systems

    CERN Document Server

    Schiller, A; Hjorth-Jensen, M; Rekstad, J; Siem, S

    2003-01-01

    Caloric curves have traditionally been derived within the microcanonical ensemble via dS/dE=1/T or within the canonical ensemble via E=T^2*d(ln Z)/dT. In the thermodynamical limit, i.e., for large systems, both caloric curves give the same result. For small systems like nuclei, the two caloric curves are in general different from each other and neither one is reasonable. Using dS/dE=1/T, spurious structures like negative temperatures and negative heat capacities can occur and have indeed been discussed in the literature. Using E=T^2*d(ln Z)/dT a very featureless caloric curve is obtained which generally smoothes too much over structural changes in the system. A new approach for caloric curves based on the two-dimensional probability distribution P(E,T) will be discussed.
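
    The two traditional definitions quoted above can be compared numerically for a toy system. The sketch below assumes a Fermi-gas-like level density rho(E) = exp(2*sqrt(a*E)) with an arbitrary parameter a; it is unrelated to the authors' P(E,T)-based construction.

```python
# Microcanonical vs canonical caloric curves for a toy level density.
import numpy as np

a = 10.0                                   # toy level-density parameter (1/MeV)
E = np.linspace(0.5, 30.0, 600)            # excitation energy grid (MeV)
S = 2.0 * np.sqrt(a * E)                   # microcanonical entropy S(E) = ln rho(E)

# Microcanonical caloric curve: 1/T = dS/dE (finite differences)
T_micro = 1.0 / np.gradient(S, E)

# Canonical caloric curve: E(T) = T^2 d(ln Z)/dT with Z(T) = integral rho(E) exp(-E/T) dE
T_grid = np.linspace(0.3, 2.0, 50)
dE = E[1] - E[0]
lnZ = np.array([np.log(np.sum(np.exp(S - E / T)) * dE) for T in T_grid])
E_canon = T_grid**2 * np.gradient(lnZ, T_grid)

print("microcanonical T at E=10 MeV:", round(np.interp(10.0, E, T_micro), 2), "MeV")
print("canonical E at T=1 MeV:", round(np.interp(1.0, T_grid, E_canon), 1), "MeV")
```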

  2. Soil Water Retention Curve

    Science.gov (United States)

    Johnson, L. E.; Kim, J.; Cifelli, R.; Chandra, C. V.

    2016-12-01

    Potential water retention, S, is one of the parameters commonly used in hydrologic modeling for soil moisture accounting. Physically, S indicates the total amount of water that can be stored in the soil and is expressed in units of depth. S can be represented as a change in soil moisture content and in this context is commonly used to estimate direct runoff, especially in the Soil Conservation Service (SCS) curve number (CN) method. Generally, both lumped and distributed hydrologic models can easily use the SCS-CN method to estimate direct runoff. Changes in potential water retention have been used in previous SCS-CN studies; however, these studies have focused on long-term hydrologic simulations where S is allowed to vary at the daily time scale. While useful for hydrologic events that span multiple days, this resolution is too coarse for short-term applications such as flash flood events where S may not recover its full potential. In this study, a new method for estimating a time-variable potential water retention at hourly time scales is presented. The methodology is applied to the Napa River basin, California. The streamflow gage at St Helena, located in the upper reaches of the basin, is used as the control gage site to evaluate model performance, as it has minimal influence from reservoirs and diversions. Rainfall events from 2011 to 2012 are used for estimating the event-based SCS CN, which is then converted to S. As a result, we have derived the potential water retention curve and classified it into three sections depending on the relative change in S. The first is a negative-slope section arising from differences in the rate at which water moves through the soil column, the second is a zero-change section representing the initial recovery of the potential water retention, and the third is a positive-change section representing the full recovery of the potential water retention. Also, we found that soil water movement experiences a 'traffic jam' within 24 hours after the first
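
    For reference, the SCS curve number relation that links the curve number, the potential water retention S, and direct runoff can be sketched as below (U.S. customary units); the curve number and rainfall depth are illustrative values, not results from the Napa River study.

```python
# SCS-CN rainfall-runoff relation (illustrative inputs).
def scs_runoff_inches(P_in, CN):
    """Direct runoff Q (in) from event rainfall P (in) using the SCS-CN method."""
    S = 1000.0 / CN - 10.0            # potential water retention (in)
    Ia = 0.2 * S                      # initial abstraction
    if P_in <= Ia:
        return 0.0
    return (P_in - Ia) ** 2 / (P_in + 0.8 * S)

print(round(scs_runoff_inches(P_in=3.0, CN=75), 2), "in of direct runoff")
```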

  3. Intensity-Duration-Frequency (IDF) rainfall curves, for data series and climate projection in African cities

    Science.gov (United States)

    De Paola, Francesco; Giugni, Maurizio; Topa, Maria Elena; Coly, Adrien; Yeshitela, Kumelachew; Kombe, Wilbard; Tonye, Emmanuel; Touré, Hamidou

    2013-04-01

    The intensity-duration-frequency (IDF) curves are used in hydrology to express, in a synthetic way, the link between the maximum rainfall depth h and a generic duration d of a rainfall event, for a given return period T. Generally, IDF curves can be characterized by a two-parameter power law: h(d,T) = a(T)·d^n, where a(T) and n are parameters that have to be estimated through a probabilistic approach. An intensity-duration-frequency analysis starts by gathering time series records of different durations and extracting annual extremes for each duration. The annual extreme data are then fitted by a probability distribution. The present study, carried out within the FP7-ENV-2010 CLUVA project (CLimate change and Urban Vulnerability in Africa), regards the evaluation of the IDF curves for five case studies: Addis Ababa (Ethiopia), Dar Es Salaam (Tanzania), Douala (Cameroon), Ouagadougou (Burkina Faso) and Saint Louis (Senegal). The probability distribution chosen to fit the annual extreme data is the classic Gumbel distribution. However, for the case studies, only the maximum annual daily rainfall depths are available. Therefore, to define the IDF curves and the extreme values in smaller time windows (10', 30', 1 h, 3 h, 6 h, 12 h), it is necessary to apply disaggregation techniques to the collected data, in order to generate a synthetic sequence of rainfall with statistical properties similar to the recorded data. The daily rainfalls were disaggregated using two models: a short-time intensity disaggregation model (10', 30', 1 h) and a cascade-based disaggregation model (3 h, 6 h, 12 h). On the basis of the disaggregation models and the Gumbel distribution, the parameters of the IDF curves for the five test cities were evaluated. In order to estimate the contingent influence of climate change on the IDF curves, the illustrated procedure was applied to the climate (rainfall) simulations over the time period 2010-2050 provided by the CMCC (Centro Euro-Mediterraneo sui Cambiamenti Climatici).
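
    A hedged sketch of the fitting chain described above: fit a Gumbel distribution to annual maxima for each duration, take the T-year quantile h(d,T), and regress log h against log d to recover a(T) and n. The annual maxima below are synthetic stand-ins, not CLUVA data.

```python
# Gumbel fit per duration, then power-law fit h(d, T) = a(T) * d**n across durations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
durations_h = np.array([1.0, 3.0, 6.0, 12.0, 24.0])
T_return = 50.0                                   # return period (years)
F = 1.0 - 1.0 / T_return                          # non-exceedance probability

quantiles = []
for d in durations_h:
    annual_max = stats.gumbel_r.rvs(loc=12.0 * d**0.3, scale=4.0 * d**0.3,
                                    size=40, random_state=rng)   # fake 40-year record
    loc, scale = stats.gumbel_r.fit(annual_max)
    quantiles.append(stats.gumbel_r.ppf(F, loc, scale))          # h(d, T)

# Fit h = a * d**n by linear regression in log-log space
n_exp, log_a = np.polyfit(np.log(durations_h), np.log(quantiles), 1)
print(f"a(T={T_return:.0f}) = {np.exp(log_a):.1f} mm, n = {n_exp:.2f}")
```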

  4. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
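
    A small Monte Carlo illustration of the effect described above, assuming a lognormal risk factor (so the log-losses are normal): a threshold set at the 99% quantile estimated from a finite sample is exceeded noticeably more often than the nominal 1%. Sample size and parameters are illustrative.

```python
# Parameter uncertainty raises the realized failure frequency above the nominal level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
mu_true, sigma_true = 0.0, 1.0        # true (unknown) parameters of the log-loss
nominal_p = 0.01                      # target failure probability
n_data, n_trials = 30, 20000
z_q = stats.norm.ppf(1.0 - nominal_p) # standard normal 99% quantile

failures = 0
for _ in range(n_trials):
    sample = rng.normal(mu_true, sigma_true, n_data)         # observed log-losses
    mu_hat, sigma_hat = sample.mean(), sample.std(ddof=1)    # estimated parameters
    threshold = mu_hat + sigma_hat * z_q                     # estimated 99% quantile (log scale)
    failures += rng.normal(mu_true, sigma_true) > threshold  # next period's log-loss

print(f"nominal failure probability: {nominal_p:.3f}, realized: {failures / n_trials:.3f}")
```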

  5. Potential energy curves for neutral and multiply charged carbon monoxide

    Indian Academy of Sciences (India)

    Pradeep Kumar; N Sathyamurthy

    2010-01-01

    Potential energy curves of the various electronic states of CO^q+ (0 ≤ q ≤ 6) are generated at the MRCI/CASSCF level using the cc-pVQZ basis set and the results are compared with available experimental and theoretical data.

  6. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  7. Probability Ranking in Vector Spaces

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.

  8. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  9. Local Causality, Probability and Explanation

    CERN Document Server

    Healey, Richard A

    2016-01-01

    In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.

  10. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  11. Cubic B-spline curve approximation by curve unclamping

    OpenAIRE

    Chen, Xiao-Diao; Ma, Weiyin; Paul, Jean-Claude

    2010-01-01

    A new approach for cubic B-spline curve approximation is presented. The method produces an approximation cubic B-spline curve tangent to a given curve at a set of selected positions, called tangent points, in a piecewise manner starting from a seed segment. A heuristic method is provided to select the tangent points. The first segment of the approximation cubic B-spline curve can be obtained using an inner point interpolation method, least-squares method or geometric H

  12. Parametrization of $\epsilon$-rational curves: error analysis

    CERN Document Server

    Rueda, Sonia L

    2010-01-01

    In [Computer Aided Geometric Design 27 (2010), 212-231] the authors present an algorithm to parametrize approximately $\epsilon$-rational curves, and they show in 2 examples that the Hausdorff distance, w.r.t. the Euclidean distance, between the input and output curves is small. In this paper, we analyze this distance for a whole family of randomly generated curves and we automate the strategy used in [Computer Aided Geometric Design 27 (2010), 212-231]. We find a reasonable upper bound of the Hausdorff distance between each input and output curve of the family.

  13. Refined curve counting on complex surfaces

    OpenAIRE

    Göttsche, Lothar; Shende, Vivek

    2012-01-01

    We define refined invariants which "count" nodal curves in sufficiently ample linear systems on surfaces, conjecture that their generating function is multiplicative, and conjecture explicit formulas in the case of K3 and abelian surfaces. We also give a refinement of the Caporaso-Harris recursion, and conjecture that it produces the same invariants in the sufficiently ample setting. The refined recursion specializes at y = -1 to the Itenberg-Kharlamov-Shustin recursion for Welschinger invari...

  14. On the design of stochastic simulation experiments in probability and statistics, with a discussion of the eviews random number generator

    Institute of Scientific and Technical Information of China (English)

    莫达隆

    2012-01-01

    Stochastic simulation experiments are a powerful tool for teaching probability and statistics. Based on the needs of probability and statistics teaching, this paper highlights the use of the random number generator in the statistical software eviews, gives corresponding algorithms and procedures, and extends the use of statistical software in mathematical experiments. This instructional design helps deepen students' understanding of probability and statistics concepts and improves their practical skills.
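
    A language-neutral sketch of the kind of classroom simulation the paper discusses, written in Python rather than eviews syntax: uniform pseudo-random numbers are converted to exponential variates by inverse-transform sampling and the empirical mean is checked against theory.

```python
# Inverse-transform sampling demonstration (not eviews code).
import numpy as np

rng = np.random.default_rng(5)
lam = 2.0                                   # rate of the target exponential distribution
u = rng.random(100_000)                     # uniform(0, 1) pseudo-random numbers
x = -np.log(1.0 - u) / lam                  # inverse CDF of Exp(lam)

print("empirical mean:", round(x.mean(), 3), " theoretical mean:", 1.0 / lam)
```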

  15. Diurnal distribution of sunshine probability

    Energy Technology Data Exchange (ETDEWEB)

    Aydinli, S.

    1982-01-01

    The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a 'sidescene effect' of the clouds, can be calculated. The asymmetric components of the sunshine probability, which depend on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.

  16. Probability representation of classical states

    NARCIS (Netherlands)

    Man'ko, OV; Man'ko, [No Value; Pilyavets, OV

    2005-01-01

    Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.

  17. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  18. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  19. The probabilities of unique events.

    Directory of Open Access Journals (Sweden)

    Sangeet S Khemlani

    Full Text Available Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
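
    A minimal numerical illustration of the theory's "split the difference" prediction, using hypothetical estimates for the two conjuncts:

        # Hypothetical intuitive estimates of the two conjuncts A and B
        p_a, p_b = 0.9, 0.4

        # The theory predicts conjunction estimates near the average of the conjuncts
        split_difference = (p_a + p_b) / 2        # 0.65

        # Any coherent joint probability must satisfy P(A and B) <= min(P(A), P(B)),
        # so the split-the-difference estimate violates the probability calculus.
        print(split_difference, min(p_a, p_b), split_difference > min(p_a, p_b))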

  20. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  1. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  2. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  3. Three lectures on free probability

    OpenAIRE

    2012-01-01

    These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.

  4. Mapping of iso exposure curves generated by conventional mobile radiodiagnostic equipment and dose in hospitalized patients; Mapeamento de curvas de isoexposicao geradas por equipamentos de radiodiagnostico moveis convencionais e dose em pacientes hospitalizados

    Energy Technology Data Exchange (ETDEWEB)

    Hoff, Gabriela; Fischer, Andreia Caroline Fischer da Silveira; Accurso, Andre, E-mail: andreia.silveira.001@acad.pucrs.b, E-mail: andre.accurso@acad.pucrs.b [Pontificia Universidade Catolica do Rio Grande do Sul (PUC/RS), Porto Alegre, RS (Brazil). Grupo de Experimentacao e Simulacacao Computacional em Fisica Medica; Andrade, Jose Rodrigo Mendes, E-mail: jose.andrade@santacasa.tche.b [Irmandade da Santa Casa de Misericordia de Porto Alegre, RS (Brazil). Servico de Atencao a Saude e Qualidade de Vida; Bacelar, Alexandre, E-mail: abacelar@hcpa.ufrgs.b [Hospital de Clinicas de Porto Alegre, RS (Brazil). Setor de Fisica Medica e Radioprotecao

    2011-10-26

    This paper aimed to map iso-exposure curves in areas where mobile radiographic equipment is used. A Shimadzu mobile unit and two Siemens units were selected, and a non-anthropomorphic scatterer was used. The exposure measurements were taken on a mesh of 4.20 x 4.20 cubic centimeters, at the half-height of the simulator and in steps of 30 cm, using the radiographic techniques 100 kVp and 63 mAs (Shimadzu) and 96 kVp and 40 mAs (Siemens). For the estimation of the environmental equivalent dose over 12 months, the following were considered: 3.55 mAs/examination and 44.5 procedures/month (adults); and 3.16 mAs/examination and 20.1 procedures/month (pediatrics). It was observed that only the values at the distance of 60 cm exceeded the maximum limit of environmental equivalent dose defined for a Free Area (0.5 mSv/year). The points collected at 2.1 m from the primary beam center remained always at 12% of the referred limit, showing this to be a safe distance for hospitalized patients.

  5. Estimation of the ignition probability due to mechanically generated impact sparks in explosive gas/air-mixtures. Examinations of the materials combination: steel/steel; Ermittlung der Zuendwahrscheinlichkeit mechanisch erzeugter Schlagfunken in explosionsfaehigen Brenngas/Luft-Gemischen. Untersuchung der Werkstoffkombination Stahl/Stahl

    Energy Technology Data Exchange (ETDEWEB)

    Grunewald, T.; Graetz, R.

    2007-09-29

    Equipment intended for use in potentially explosive atmospheres must meet the requirements of the European directive 94/9/EC. The manufacturer's declaration of conformity testifies that these requirements are met. The conformity assessment is based on the risk (ignition) assessment, which identifies and evaluates the ignition sources. The European standards in the scope of directive 94/9/EC (such as EN 1127-1 and EN 13463-1) describe 13 possible ignition sources, mechanically generated sparks being one of them. Statements about the ignition effectiveness, and especially the ignition probability, of mechanically generated sparks for a given kinetic impact energy and a given explosive gas/air mixture are not possible; an extensive literature survey confirms this state of affairs. This was and is a problem in drafting and revising standards. Simple ferritic steel is a common construction material, also for non-electrical equipment intended for use in potentially explosive atmospheres in chemical and mechanical engineering and in manufacturing technology. The objective of this study was therefore to obtain statistical ignition probabilities as a function of the kinetic impact energy and of the minimum ignition energy of the explosive gas/air mixture. The study was carried out with impact testing machines of BAM (Federal Institute of Materials Research and Testing) at three kinetic impact energies. The following results were obtained for all reference gas/air mixtures of the IEC explosion groups (I methane, IIA propane, IIB ethylene, IIC acetylene, hydrogen): 1. Under the test conditions of this study, i.e. the impact kinetics and impact geometry of the impact machines, it was not possible to generate incendive mechanical sparks at kinetic impact energies below 3 Nm. 2. At kinetic impact energies of 10 Nm, single mechanically generated particles could act as a dangerous ignition source through an oxidation process. Furthermore the tests have shown that the

  6. Probability distributions with summary graph structure

    CERN Document Server

    Wermuth, Nanny

    2010-01-01

    A set of independence statements may define the independence structure of interest in a family of joint probability distributions. This structure is often captured by a graph that consists of nodes representing the random variables and of edges that couple node pairs. One important class is that of multivariate regression chain graphs. They describe the independences of stepwise processes, in which at each step single or joint responses are generated given the relevant explanatory variables in their past. For joint densities that then result after possible marginalising or conditioning, we use summary graphs. These graphs reflect the independence structure implied by the generating process for the reduced set of variables and they preserve the implied independences after additional marginalising and conditioning. They can identify generating dependences which remain unchanged and alert to possibly severe distortions due to direct and indirect confounding. Operators for matrix representations of graphs are used to de...

  7. A Unified Algorithm for Finding the Intersection Curve of Surfaces

    Institute of Scientific and Technical Information of China (English)

    谭建荣; 郑建民; 等

    1994-01-01

    In this paper, an INTEGRAL CURVE ALGORITHM is presented, which turns the intersection curve of surfaces into the form of an integral curve and then uses a “PREDICTOR-CORRECTOR” technique to evaluate the intersection of the surfaces. No matter how the surfaces are defined, the method always deals with the intersection curves in the same way. To find a point on the curve one need only calculate the JACOBI determinants of the “PREDICTOR point” and the “CORRECTOR point”, while second order precision is guaranteed. Thus, not only is the problem of finding the intersection of surfaces resolved, but also the algorithms for generating both plane curves and space curves are unified.
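
    A generic predictor-corrector tracer for the intersection of two implicit surfaces, sketched in Python as an illustration of the general technique; this is not the paper's specific integral-curve formulation, and the sphere/plane example, step size and tolerances are assumptions:

        import numpy as np

        def trace_intersection(f, g, grad_f, grad_g, x0, step=0.05, n_steps=100):
            """Trace the curve {f = 0} intersected with {g = 0} starting from x0."""
            pts = [np.asarray(x0, dtype=float)]
            for _ in range(n_steps):
                x = pts[-1]
                # PREDICTOR: step along the tangent, the cross product of the two normals
                t = np.cross(grad_f(x), grad_g(x))
                x_new = x + step * t / np.linalg.norm(t)
                # CORRECTOR: Newton steps back onto both surfaces (minimum-norm update)
                for _ in range(10):
                    F = np.array([f(x_new), g(x_new)])
                    if np.abs(F).max() < 1e-10:
                        break
                    J = np.vstack([grad_f(x_new), grad_g(x_new)])   # 2 x 3 Jacobian
                    x_new = x_new + np.linalg.lstsq(J, -F, rcond=None)[0]
                pts.append(x_new)
            return np.array(pts)

        # Example: unit sphere cut by the plane z = 0.2 (the intersection is a circle)
        f = lambda x: x[0]**2 + x[1]**2 + x[2]**2 - 1.0
        g = lambda x: x[2] - 0.2
        grad_f = lambda x: np.array([2.0*x[0], 2.0*x[1], 2.0*x[2]])
        grad_g = lambda x: np.array([0.0, 0.0, 1.0])
        curve = trace_intersection(f, g, grad_f, grad_g, x0=[np.sqrt(0.96), 0.0, 0.2])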

  8. Reflection of curved shock waves

    Science.gov (United States)

    Mölder, S.

    2017-03-01

    Shock curvatures are related to pressure gradients, streamline curvatures and vorticity in flows with planar and axial symmetry. Explicit expressions, in an influence coefficient format, are used to relate post-shock pressure gradient, streamline curvature and vorticity to pre-shock gradients and shock curvature in steady flow. Using higher order, von Neumann-type, compatibility conditions, curved shock theory is applied to calculate the flow near singly and doubly curved shocks on curved surfaces, in regular shock reflection and in Mach reflection. Theoretical curved shock shapes are in good agreement with computational fluid dynamics calculations and experiment.

  9. Reflection of curved shock waves

    Science.gov (United States)

    Mölder, S.

    2017-09-01

    Shock curvatures are related to pressure gradients, streamline curvatures and vorticity in flows with planar and axial symmetry. Explicit expressions, in an influence coefficient format, are used to relate post-shock pressure gradient, streamline curvature and vorticity to pre-shock gradients and shock curvature in steady flow. Using higher order, von Neumann-type, compatibility conditions, curved shock theory is applied to calculate the flow near singly and doubly curved shocks on curved surfaces, in regular shock reflection and in Mach reflection. Theoretical curved shock shapes are in good agreement with computational fluid dynamics calculations and experiment.

  10. Heegner modules and elliptic curves

    CERN Document Server

    Brown, Martin L

    2004-01-01

    Heegner points on both modular curves and elliptic curves over global fields of any characteristic form the topic of this research monograph. The Heegner module of an elliptic curve is an original concept introduced in this text. The computation of the cohomology of the Heegner module is the main technical result and is applied to prove the Tate conjecture for a class of elliptic surfaces over finite fields; this conjecture is equivalent to the Birch and Swinnerton-Dyer conjecture for the corresponding elliptic curves over global fields.

  11. Closed planar curves without inflections

    CERN Document Server

    Ohno, Shuntaro; Umehara, Masaaki

    2011-01-01

    We define a computable topological invariant $\mu(\gamma)$ for generic closed planar regular curves $\gamma$, which gives an effective lower bound for the number of inflection points on a given generic closed planar curve. Using it, we classify the topological types of locally convex curves (i.e. closed planar regular curves without inflections) whose numbers of crossings are less than or equal to five. Moreover, we discuss the relationship between the number of double tangents and the invariant $\mu(\gamma)$ on a given $\gamma$.

  12. Optimal Reliability-Based Planning of Experiments for POD Curves

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, M. H.; Kroon, I. B.

    Optimal planning of the crack detection test is considered. The tests are used to update the information on the reliability of the inspection techniques modelled by probability of detection (P.O.D.) curves. It is shown how cost-optimal and reliability-based test plans can be obtained using First...

  13. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  14. Conventional flow curves of liquid cast iron put on spheroidization

    Directory of Open Access Journals (Sweden)

    B. Borowiecki

    2008-04-01

    Full Text Available The purpose of the investigation was to confirm the hypothesis that the conventional flow curves of liquid cast iron put on spheroidization, determined from the rod fluidity test, are comparable to the flow curves of liquids at ambient temperature. Moreover, it was found that the conventional flow curves for this liquid cast iron are similar to the curves of generalized non-Newtonian liquids. For rods with diameters of 3-8 mm there are three different curves: (1) the flow curve of liquid cast iron put on spheroidization and overheated by about 80 K resembles in shape the curve of a liquid densified by shearing; this phenomenon can be caused by strong overcooling and the creation of crystallization nuclei; (2) metal alloys overheated by about 180 K resemble in shape a Newtonian liquid; (3) metal alloys overheated by about 210 K resemble in shape the curve of a liquid dispersed by shearing; this phenomenon probably depends on the influence of gas which forms at the boundary between the metal and the sand mould.

  15. 关联及拟合:建筑功能生命周期曲线生成方法——以成都火车北站为例%Correlation and Fitting:Generating Method of Life Cycle Curve About Building Function: Taking Chengdu Northern Train Station as an Example

    Institute of Scientific and Technical Information of China (English)

    顾红男; 钱中源

    2013-01-01

    In research on architectural sustainable-development strategies based on life cycle theory, the life cycle curve is a basic tool with descriptive, predictive and guiding roles. Generating the life cycle curve is one of the important steps and key techniques of the whole research process. Based on life cycle theory, we propose the concept of the life cycle of a building function. Taking the Chengdu northern train station as a case, we carry out an empirical study of the method for generating the life cycle curve of a building function and draw three conclusions: the screening indicators are identified by correlation analysis; the screened indicator for the functional life cycle of a city train station is the per capita occupied area; and higher-order fitting with the polynomial and exponential algorithms of the Matlab software platform is an effective mathematical tool for generating the life cycle curve of a building function.
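
    A minimal sketch of the fitting step described above, written with Python/NumPy rather than Matlab and using synthetic indicator values in place of the case-study data:

        import numpy as np

        # Hypothetical yearly values of the screened indicator (per capita occupied
        # area); real values would come from the station's historical records.
        rng = np.random.default_rng(1)
        years = np.arange(60)
        indicator = np.exp(-((years - 35) / 18.0) ** 2) + 0.02 * rng.normal(size=years.size)

        # Higher-order polynomial fit, analogous to the polynomial algorithm the
        # authors apply on the Matlab platform, producing the life cycle curve.
        coeffs = np.polyfit(years, indicator, deg=5)
        life_cycle_curve = np.polyval(coeffs, years)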

  16. Statistics for traces of cyclic trigonal curves over finite fields

    CERN Document Server

    Bucur, Alina; Feigon, Brooke; Lalín, Matilde

    2009-01-01

    In this paper we study the variation of the trace of the Frobenius endomorphism associated to a cyclic trigonal curve of genus g over a field of q elements as the curve varies in an irreducible component of the moduli space. We show that for q fixed and g increasing, the limiting distribution of the trace of the Frobenius endomorphism is equal to the sum of q+1 independent random variables taking the value 0 with probability 2/(q+2) and taking the values 1, e^{(2pi i)/3}, e^{(4pi i)/3} with probability q/(3(q+2)). This extends the work of Kurlberg and Rudnick who considered the same limit for the case of hyperelliptic curves. We also show that when both the genus and q go to infinity, the normalized trace has a complex Gaussian distribution with mean 0 and variance 1.

  17. Method for estimating spin-spin interactions from magnetization curves

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2017-02-01

    We develop a method to estimate the spin-spin interactions in the Hamiltonian from the observed magnetization curve by machine learning based on Bayesian inference. In our method, plausible spin-spin interactions are determined by maximizing the posterior distribution, which is the conditional probability of the spin-spin interactions in the Hamiltonian for a given magnetization curve with observation noise. The conditional probability is obtained with the Markov chain Monte Carlo simulations combined with an exchange Monte Carlo method. The efficiency of our method is tested using synthetic magnetization curve data, and the results show that spin-spin interactions are estimated with a high accuracy. In particular, the relevant terms of the spin-spin interactions are successfully selected from the redundant interaction candidates by the l1 regularization in the prior distribution.
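
    A deliberately simplified, one-parameter sketch of the estimation idea (maximizing a posterior that combines a data-fit term with an l1 prior), using a mean-field toy model and a scalar optimizer in place of the exchange Monte Carlo machinery described in the abstract; all numerical values are assumptions:

        import numpy as np
        from scipy.optimize import minimize_scalar

        def mean_field_m(h, J, T=1.0, iters=300):
            """Self-consistent mean-field magnetization m = tanh((h + J*m)/T)."""
            m = np.zeros_like(h)
            for _ in range(iters):
                m = np.tanh((h + J * m) / T)
            return m

        rng = np.random.default_rng(0)
        h = np.linspace(-2.0, 2.0, 41)
        sigma = 0.02
        m_obs = mean_field_m(h, J=0.7) + sigma * rng.normal(size=h.size)  # synthetic data

        lam = 1.0   # strength of the l1 (Laplace) prior on the coupling

        def neg_log_posterior(J):
            resid = m_obs - mean_field_m(h, J)
            return np.sum(resid ** 2) / (2 * sigma ** 2) + lam * abs(J)

        # Plausible coupling = maximizer of the posterior (minimizer of its negative log)
        result = minimize_scalar(neg_log_posterior, bounds=(0.0, 0.95), method="bounded")
        print(result.x)   # close to the true value 0.7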

  18. Volcano shapes, entropies, and eruption probabilities

    Science.gov (United States)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
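
    For reference, the Gibbs-Shannon entropy of a discrete distribution and its differential (continuous) analogue mentioned above read, in standard notation,

        S = -\sum_{i=1}^{n} p_i \ln p_i ,
        \qquad
        h(f) = -\int f(x)\,\ln f(x)\, dx ,

    with the uniform discrete distribution (a flat volcanic field) attaining the maximum S = ln n, i.e. the greatest uncertainty about the time and place of the next eruption.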

  19. Cluster Membership Probability: Polarimetric Approach

    CERN Document Server

    Medhi, Biman J

    2013-01-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...

  20. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.

  1. GHG emissions, GDP growth and the Kyoto Protocol: A revisit of Environmental Kuznets Curve hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Wei Ming; Lee, Grace W.M. [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106 (China); Wu, Chih Cheng [Energy and Air Pollution Control Section, New Materials R and D Department, China Steel Corporation, 1, Chung-Kang Road, Siaogang District, Kaohsiung 81233 (China)

    2008-01-15

    The Kyoto Protocol attempts through political negotiations to guide participating industrialized countries' greenhouse gas (GHG) emissions from a positive growing trend, to reach a peak point (or turning point), and then be reduced to a negative growth. That means the relationship between decreasing GHG emissions and economic growth may be described by an inverted-U curve (or called a bell-shaped curve), which is consistent with the concept of the Environmental Kuznets Curve (EKC) hypothesis. This research observed that the economic development and GHG emissions in Economies in Transition (EITs) exhibit a hockey-stick curve trend (or called quasi-L-shape curve), that also generates a lot of 'hot air' which is significant to the implementation of the Kyoto Protocol. In addition, through the analysis of single-country time series data and GDP data, this research demonstrated that statistical data for most of the Annex II countries do not possess evidence that supports the EKC hypothesis for GHG emissions. The results from this study also indicated that the 38 industrialized countries are unable to meet their targets under the Kyoto Protocol within the specified time period, which are probably caused by the econometric method's inability to predict accurately the extents and development of innovative technologies and Clean Development Mechanism (CDM) projects. If the international community truly wants to reduce the GHG emissions, the effectiveness of the existing international framework for emissions reduction needs to be reconsidered seriously, and the global cooperation mechanism also needs to be greatly enhanced. (author)

  2. GHG emissions, GDP growth and the Kyoto Protocol: A revisit of Environmental Kuznets Curve hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Huang Weiming [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106, Taiwan (China); Lee, Grace W.M. [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106, Taiwan (China)], E-mail: gracelee@ntu.edu.tw; Wu Chihcheng [Energy and Air Pollution Control Section, New Materials R and D Department, China Steel Corporation, 1, Chung-Kang Road, Siaogang District, Kaohsiung 81233, Taiwan (China)

    2008-01-15

    The Kyoto Protocol attempts through political negotiations to guide participating industrialized countries' greenhouse gas (GHG) emissions from a positive growing trend, to reach a peak point (or turning point), and then be reduced to a negative growth. That means the relationship between decreasing GHG emissions and economic growth may be described by an inverted-U curve (or called a bell-shaped curve), which is consistent with the concept of the Environmental Kuznets Curve (EKC) hypothesis. This research observed that the economic development and GHG emissions in Economies in Transition (EITs) exhibit a hockey-stick curve trend (or called quasi-L-shape curve), that also generates a lot of 'hot air' which is significant to the implementation of the Kyoto Protocol. In addition, through the analysis of single-country time series data and GDP data, this research demonstrated that statistical data for most of the Annex II countries do not possess evidence that supports the EKC hypothesis for GHG emissions. The results from this study also indicated that the 38 industrialized countries are unable to meet their targets under the Kyoto Protocol within the specified time period, which are probably caused by the econometric method's inability to predict accurately the extents and development of innovative technologies and Clean Development Mechanism (CDM) projects. If the international community truly wants to reduce the GHG emissions, the effectiveness of the existing international framework for emissions reduction needs to be reconsidered seriously, and the global cooperation mechanism also needs to be greatly enhanced.

  3. Migration and the Wage Curve:

    DEFF Research Database (Denmark)

    Brücker, Herbert; Jahn, Elke J.

      Based on a wage curve approach we examine the labor market effects of migration in Germany. The wage curve relies on the assumption that wages respond to a change in the unemployment rate, albeit imperfectly. This allows one to derive the wage and employment effects of migration simultaneously...

  4. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  5. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass being still lower than the vibration criteria. This problem is being solved in the three-dimensional space, evolved by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are being developed, thus the possibility of exceeding vibration criteria VC-E and VC-D is assumed to be less than 0.04.

  6. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  7. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  8. NURBS curve blending using extension

    Institute of Scientific and Technical Information of China (English)

    Yong-jin LIU; Rong-qi QIU; Xiao-hui LIANG

    2009-01-01

    Curve and surface blending is an important operation in CAD systems, in which a non-uniform rational B-spline (NURBS) has been used as the de facto standard. In local corner blending, two curves intersecting at that corner are first made disjoint, and then a third blending curve is added in to smoothly join the two curves with G1- or G2-continuity. In this paper we present a study to solve the joint problem based on curve extension. The following nice properties of this extension algorithm are exploited in depth: (1) The parameterization of the original shapes does not change; (2) No additional fragments are created. Various examples are presented to demonstrate that our solution is simple and efficient.

  9. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  10. Innovation and social probable knowledge

    OpenAIRE

    Marco Crocco

    2000-01-01

    In this paper some elements of Keynes's theory of probability are used to understand the process of diffusion of an innovation. Based on work done elsewhere (Crocco 1999, 2000), we argue that this process can be viewed as a process of dealing with the collective uncertainty about how to sort out a technological problem. Expanding the concepts of weight of argument and probable knowledge to deal with this kind of uncertainty, we argue that the concepts of social weight of argument and social prob...

  11. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  12. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  13. The Influence of the Annual Number of Storms on the Derivation of the Flood Frequency Curve through Event-Based Simulation

    Directory of Open Access Journals (Sweden)

    Alvaro Sordo-Ward

    2016-08-01

    Full Text Available This study addresses the question of how to select the minimum set of storms that should be simulated each year in order to estimate an accurate flood frequency curve for return periods ranging between 1 and 1000 years. The Manzanares basin (Spain) was used as a study case. A continuous 100,000-year hourly rainfall series was generated using the stochastic spatial–temporal model RanSimV3. Individual storms were extracted from the series by applying the exponential method. For each year, the extracted storms were transformed into hydrographs by applying an hourly time-step semi-distributed event-based rainfall–runoff model, and the maximum peak flow per year was determined to generate the reference flood frequency curve. Then, different flood frequency curves were obtained considering the N storms with maximum rainfall depth per year, with 1 ≤ N ≤ total number of storms. Main results show that: (a) the degree of alignment between the calculated flood frequency curves and the reference flood frequency curve depends on the return period considered, increasing the accuracy for higher return periods; (b) for the analyzed case studies, the flood frequency curve for medium and high return period (50 ≤ return period ≤ 1000 years) can be estimated with a difference lower than 3% (compared to the reference flood frequency curve) by considering the three storms with the maximum total rainfall depth each year; (c) when considering only the greatest storm of the year, for return periods higher than 10 years, the difference for the estimation of the flood frequency curve is lower than 10%; and (d) when considering the three greatest storms each year, for return periods higher than 100 years, the probability of achieving simultaneously a hydrograph with the annual maximum peak flow and the maximum volume is 94%.
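
    A schematic Python sketch of the selection-and-simulation loop described above, with a synthetic storm series and a placeholder in place of the RanSimV3 generator and the event-based rainfall-runoff model; the distribution, the exponent and all sizes are assumptions:

        import numpy as np

        rng = np.random.default_rng(42)
        n_years, storms_per_year, N = 1000, 30, 3

        # Synthetic storm rainfall depths (mm) per year, standing in for the
        # storms extracted from the stochastic rainfall series.
        depths = rng.gamma(shape=2.0, scale=15.0, size=(n_years, storms_per_year))

        def peak_flow(depth):
            # Placeholder for the hourly event-based rainfall-runoff model
            return 0.8 * depth ** 1.2

        # Keep only the N storms with the largest depth each year and take the
        # annual maximum peak flow from the simulated hydrographs.
        top_n = np.sort(depths, axis=1)[:, -N:]
        annual_max = peak_flow(top_n).max(axis=1)

        # Empirical flood frequency curve via Weibull plotting positions.
        peaks = np.sort(annual_max)[::-1]
        return_periods = (n_years + 1) / np.arange(1, n_years + 1)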

  14. Galaxy rotation curves with log-normal density distribution

    CERN Document Server

    Marr, John H

    2015-01-01

    The log-normal distribution represents the probability of finding randomly distributed particles in a microcanonical ensemble with high entropy. To a first approximation, a modified form of this distribution with a truncated termination may represent an isolated galactic disk, and this disk density distribution model was therefore run to give the best fit to the observational rotation curves for 37 representative galaxies. The resultant curves closely matched the observational data for a wide range of velocity profiles and galaxy types with rising, flat or descending curves in agreement with Verheijen's classification of 'R', 'F' and 'D' type curves, and the corresponding theoretical total disk masses could be fitted to a baryonic Tully-Fisher relation (bTFR). Nine of the galaxies were matched to galaxies with previously published masses, suggesting a mean excess dynamic disk mass of dex 0.61 +/- 0.26 over the baryonic masses. Although questionable with regard to other measurements of the shape of disk galaxy g...

  15. Multivariate normally distributed biomarkers subject to limits of detection and receiver operating characteristic curve inference.

    Science.gov (United States)

    Perkins, Neil J; Schisterman, Enrique F; Vexler, Albert

    2013-07-01

    Biomarkers are of ever-increasing importance to clinical practice and epidemiologic research. Multiple biomarkers are often measured per patient. Measurement of true biomarker levels is limited by laboratory precision, specifically measuring relatively low, or high, biomarker levels resulting in undetectable levels below, or above, a limit of detection (LOD). Ignoring these missing observations or replacing them with a constant are methods commonly used although they have been shown to lead to biased estimates of several parameters of interest, including the area under the receiver operating characteristic (ROC) curve and regression coefficients. We developed asymptotically consistent, efficient estimators, via maximum likelihood techniques, for the mean vector and covariance matrix of multivariate normally distributed biomarkers affected by LOD. We also developed an approximation for the Fisher information and covariance matrix for our maximum likelihood estimations (MLEs). We apply these results to an ROC curve setting, generating an MLE for the area under the curve for the best linear combination of multiple biomarkers and accompanying confidence interval. Point and confidence interval estimates are scrutinized by simulation study, with bias and root mean square error and coverage probability, respectively, displaying behavior consistent with MLEs. An example using three polychlorinated biphenyls to classify women with and without endometriosis illustrates how the underlying distribution of multiple biomarkers with LOD can be assessed and display increased discriminatory ability over naïve methods. Properly addressing LODs can lead to optimal biomarker combinations with increased discriminatory ability that may have been ignored because of measurement obstacles. Published by Elsevier Inc.

  16. Improved neck injury risk curves for tension and extension moment measurements of crash dummies.

    Science.gov (United States)

    Mertz, H J; Prasad, P

    2000-11-01

    This paper describes improvements made to the injury risk curves for peak neck tension, peak neck extension moment and a linear combination of tension and extension moment that produce peak stress in the anterior-longitudinal ligament at the head-to-neck junction. Data from previously published experiments that correlated neck injuries to 10-week-old, anesthetized pigs and neck response measurements of a 3-year-old child dummy that were subjected to similar airbag deployments are updated and used to generate Normal probability curves for the risk of AIS ≥ 3 neck injury for the 3-year-old child. These curves are extended to other sizes and ages by normalizing for neck size. Factors for percent of muscle tone and ligamentous failure stress as a function of age are incorporated in the risk analysis. The most sensitive predictor of AIS ≥ 3 neck injury for this data set is peak neck tension. If two possible outliers are deleted from the data set, then the combined criterion of extension moment and axial force becomes the most sensitive predictor, which is consistent with expectations.

  17. 基于 NURBS模型的自由曲线加工轨迹自适应生成方法%An adaptive generation method for free curve trajectory based on NURBS

    Institute of Scientific and Technical Information of China (English)

    朱昊; 刘京南; 杨安康; 汪木兰

    2014-01-01

    To realize fast, high-precision, real-time interpolation of NURBS (non-uniform rational B-spline) curves, a kinetic model based on a modified sigmoid function is proposed, and the constraints among the maximum feed rate, the chord error, the curvature radius of the machined curve and the interpolation period are given. Under the precondition that jerk, acceleration and feed rate all remain continuous, this model reduces the 15 segments of the commonly used cubic-polynomial S-shape and trigonometric S-shape kinetic models to 3 segments. On this basis, an optimized Adams algorithm that replaces derivatives with difference quotients is presented to compute the interpolation-period parameters, avoiding the higher-order derivatives required by the usual Taylor expansion. Finally, the De Boor-Cox recursive algorithm is simplified by reducing the number of evaluations of low-degree zero-valued B-spline basis functions, cutting the amount of computation. Simulation analysis indicates that the proposed algorithms effectively control the feed rate according to the machining path, reduce the amount of computation while maintaining machining accuracy, and increase the computation speed. Experimental results show that the method computes the target parameters correctly and is suitable for practical machining systems.
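
    For orientation, a plain (unsimplified) Cox-de Boor recursion and NURBS curve evaluation, sketched in Python; the quadratic arc at the end is a hypothetical example, and the paper's streamlined variant avoids the zero-valued low-degree terms that this naive version still computes:

        import numpy as np

        def cox_de_boor(i, p, u, knots):
            """Standard Cox-de Boor recursion for the B-spline basis N_{i,p}(u)."""
            if p == 0:
                return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
            left = right = 0.0
            d1 = knots[i + p] - knots[i]
            if d1 > 0.0:
                left = (u - knots[i]) / d1 * cox_de_boor(i, p - 1, u, knots)
            d2 = knots[i + p + 1] - knots[i + 1]
            if d2 > 0.0:
                right = (knots[i + p + 1] - u) / d2 * cox_de_boor(i + 1, p - 1, u, knots)
            return left + right

        def nurbs_point(u, ctrl, weights, knots, p):
            """Evaluate C(u) = sum_i N_{i,p}(u) w_i P_i / sum_i N_{i,p}(u) w_i."""
            basis = np.array([cox_de_boor(i, p, u, knots) for i in range(len(ctrl))])
            wN = weights * basis
            return (wN[:, None] * ctrl).sum(axis=0) / wN.sum()

        # Hypothetical quadratic NURBS arc
        ctrl = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
        weights = np.array([1.0, 0.7, 1.0])
        knots = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
        print(nurbs_point(0.5, ctrl, weights, knots, p=2))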

  18. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  19. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.

  20. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  1. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  2. Fuzzy Markov chains: uncertain probabilities

    OpenAIRE

    2002-01-01

    We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.

  3. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  4. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha...

  5. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. the energy distribution in a gas) and in which the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
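
    The claim about the empty box can be checked directly: when every configuration (weak composition) of P indistinguishable balls over L distinguishable boxes is equally likely, the occupancy of a single box has probability mass P(k) = C(P-k+L-2, L-2) / C(P+L-1, L-1), which decreases with k. A few lines of Python verify this for assumed values of P and L:

        from math import comb

        P, L = 100, 10   # balls and boxes; P >> L corresponds to the dense case
        total = comb(P + L - 1, L - 1)
        pmf = [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]

        print(pmf[0])                  # probability of an empty box (the largest value)
        print(pmf[P // L])             # probability of the "average" occupancy P/L
        print(pmf[0] > max(pmf[1:]))   # True: k = 0 is the most probable occupancy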

  6. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.

  7. Statistical re-evaluation of the ASME K_IC and K_IR fracture toughness reference curves

    Energy Technology Data Exchange (ETDEWEB)

    Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)

    1998-11-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the Master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the Master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the K_IC reference curve. Similarly, the 1% lower bound Master curve corresponds to the K_IR reference curve. (orig.)
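
    For context, the Master curve concept referred to above describes the cumulative fracture probability at a measured toughness K_Jc by a three-parameter Weibull form (quoted here as general background in the form standardised in ASTM E1921, not from the cited report):

        P_f = 1 - \exp\!\left[-\left(\frac{K_{Jc} - K_{\min}}{K_{0} - K_{\min}}\right)^{4}\right],
        \qquad K_{\min} = 20\ \mathrm{MPa\sqrt{m}},

        K_{Jc,\mathrm{med}}(T) = 30 + 70\,\exp\!\left[0.019\,(T - T_{0})\right]\ \mathrm{MPa\sqrt{m}},

    so a lower-bound curve at a chosen cumulative failure probability (for instance the 5% or 1% bounds mentioned above) follows by solving P_f for K_Jc at each temperature.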

  8. Supervised detection of anomalous light curves in massive astronomical catalogs

    Energy Technology Data Exchange (ETDEWEB)

    Nun, Isadora; Pichara, Karim [Computer Science Department, Pontificia Universidad Católica de Chile, Santiago (Chile); Protopapas, Pavlos [Institute for Applied Computational Science, Harvard University, Cambridge, MA (United States); Kim, Dae-Won [Max-Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany)

    2014-09-20

    The development of synoptic sky surveys has led to a massive amount of data for which resources needed for analysis are beyond human capabilities. In order to process this information and to extract all possible knowledge, machine learning techniques become necessary. Here we present a new methodology to automatically discover unknown variable objects in large astronomical catalogs. With the aim of taking full advantage of all information we have about known objects, our method is based on a supervised algorithm. In particular, we train a random forest classifier using known variability classes of objects and obtain votes for each of the objects in the training set. We then model this voting distribution with a Bayesian network and obtain the joint voting distribution among the training objects. Consequently, an unknown object is considered as an outlier insofar it has a low joint probability. By leaving out one of the classes on the training set, we perform a validity test and show that when the random forest classifier attempts to classify unknown light curves (the class left out), it votes with an unusual distribution among the classes. This rare voting is detected by the Bayesian network and expressed as a low joint probability. Our method is suitable for exploring massive data sets given that the training process is performed offline. We tested our algorithm on 20 million light curves from the MACHO catalog and generated a list of anomalous candidates. After analysis, we divided the candidates into two main classes of outliers: artifacts and intrinsic outliers. Artifacts were principally due to air mass variation, seasonal variation, bad calibration, or instrumental errors and were consequently removed from our outlier list and added to the training set. After retraining, we selected about 4000 objects, which we passed to a post-analysis stage by performing a cross-match with all publicly available catalogs. Within these candidates we identified certain known
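
    A compact Python sketch of the voting-based outlier idea on synthetic features; a kernel density estimate stands in here for the Bayesian network used in the paper, and the class centres, feature dimensions and threshold are assumptions:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        # Toy "light-curve features" for three known variability classes
        X = np.vstack([rng.normal(loc=c, scale=0.5, size=(200, 4)) for c in (0.0, 3.0, 6.0)])
        y = np.repeat([0, 1, 2], 200)

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

        # Voting distribution of the forest for each training object (rows sum to 1);
        # dropping the last column avoids the singular covariance of the simplex.
        votes_train = clf.predict_proba(X)[:, :-1]
        density = gaussian_kde(votes_train.T)   # simplified stand-in for the Bayesian network

        # Objects resembling none of the training classes tend to receive an unusual
        # voting distribution; a low joint vote density marks them as outlier candidates.
        X_new = rng.normal(loc=1.5, scale=0.5, size=(5, 4))   # between two classes
        votes_new = clf.predict_proba(X_new)[:, :-1]
        threshold = np.percentile(density(votes_train.T), 1)
        print(density(votes_new.T) < threshold)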

  9. Degrees of Curves in Abelian Varieties

    CERN Document Server

    Debarre, O

    1992-01-01

    The degree of a curve $C$ in a polarized abelian variety $(X,\lambda)$ is the integer $d=C\cdot\lambda$. When $C$ generates $X$, we find a lower bound on $d$ which depends on $n$ and the degree of the polarization $\lambda$. The smallest possible degree is $d=n$ and is obtained only for a smooth curve in its Jacobian with its principal polarization (Ran, Collino). The cases $d=n+1$ and $d=n+2$ are studied. Moreover, when $X$ is simple, it is shown, using results of Smyth on the trace of totally positive algebraic integers, that if $d\le 1.7719\, n$, then $C$ is smooth and $X$ is isomorphic to its Jacobian. We also get an upper bound on the geometric genus of $C$ in terms of its degree.

  10. Equations of hyperelliptic Shimura curves

    CERN Document Server

    Molina, Santiago

    2010-01-01

    We describe an algorithm that computes explicit models of hyperelliptic Shimura curves attached to an indefinite quaternion algebra over Q and Atkin-Lehner quotients of them. It exploits Cerednik-Drinfeld's non-archimedean uniformisation of Shimura curves, a formula of Gross and Zagier for the endomorphism ring of Heegner points over Artinian rings, and the connection between Ribet's bimodules and the specialization of Heegner points. As an application, we provide a list of equations of Shimura curves and quotients of them obtained by our algorithm that had been conjectured by Kurihara.

  11. Poiseuille flow in curved spaces

    CERN Document Server

    Debus, J -D; Succi, S; Herrmann, H J

    2015-01-01

    We investigate Poiseuille channel flow through intrinsically curved (campylotic) media, equipped with localized metric perturbations (campylons). To this end, we study the flux of a fluid driven through the curved channel as a function of the spatial deformation, characterized by the campylon parameters (amplitude, range and density). We find that the flux depends only on a specific combination of campylon parameters, which we identify as the average campylon strength, and derive a universal flux law for the Poiseuille flow. For the purpose of this study, we have improved and validated our recently developed lattice Boltzmann model in curved space by considerably reducing discrete lattice effects.

  12. Learning Curves in Robotic Rectal Cancer Surgery: A literature Review

    Directory of Open Access Journals (Sweden)

    Nasir

    2016-10-01

    Full Text Available Background Laparoscopic rectal cancer surgery offers several advantages over open surgery, including quicker recovery, shorter hospital stay and improved cosmesis. However, laparoscopic rectal surgery is technically difficult and is associated with a long learning curve. The last decade has seen the emergence of robotic rectal cancer surgery. In contrast to laparoscopy, robotic surgery offers stable 3D views with advanced dexterity and ergonomics in narrow spaces such as the pelvis. Whether this translates into a shorter learning curve is still debated. The aim of this literature search is to ascertain the learning curve of robotic rectal cancer surgery. Methods This review analyses the literature investigating the learning curve of robotic rectal cancer surgery. Using the Medline database, a literature search of articles investigating the learning curve of robotic rectal surgery was performed. All relevant articles were included. Results Twelve original studies fulfilled the inclusion criteria. The current literature suggests that the learning curve of robotic rectal surgery varies between 15 and 44 cases and is probably shorter than that of laparoscopic rectal surgery. Conclusions There are only a few studies assessing the learning curve of robotic rectal surgery, and they possess several differences in methodology and outcome reporting. Nevertheless, current evidence suggests that robotic rectal surgery might be easier to learn than laparoscopy. Further well designed studies applying CUSUM analysis are required to validate this notion.

  13. Discrimination of Natural Fractures Using Well Logging Curve Unit

    Institute of Scientific and Technical Information of China (English)

    Liu Hongqi; Peng Shimi; Zhou Yongyi; Xue Yongchao

    2004-01-01

    It is very difficult to discriminate natural fractures using conventional well log data, especially for most of the matured oilfields in China, because the raw data were acquired with relatively obsolete tools. The raw data include only GR and SP curves, indicative of lithology, AC curves, used to calculate the porosity of the formation, and a set of laterolog resistivity curves recorded with various electrode lengths. On the other hand, these oilfields usually have a large amount of core data which directly display the characteristics of the formation, and enough information on injection and production. This paper describes an approach through which logging curves are calibrated in terms of the raw data, and then a prototype model of natural fractures is established based on the investigation of core data from 43 wells, totaling 4 000 m in length. A computer program has been developed according to this method. Through analysis and comparison of the features of logging curves, this paper proposes a new concept, the well logging curve unit. By strictly depicting its shape through mathematical methods, natural fractures can be discriminated. This work also suggests an equation to estimate the probability of fracture occurrence, and finally other fracture parameters are calculated using some experimental expressions. With this methodology, logging curves from 100 wells were interpreted, the results of which agree with core data and field information.

  14. Leptogenesis from loop effects in curved spacetime

    CERN Document Server

    McDonald, Jamie I

    2015-01-01

    We describe a new mechanism -- radiatively-induced gravitational leptogenesis -- for generating the matter-antimatter asymmetry of the Universe. We show how quantum loop effects in C and CP violating theories cause matter and antimatter to propagate differently in the presence of gravity, and prove this is forbidden in flat space by CPT and translation symmetry. This generates a curvature-dependent chemical potential for leptons, allowing a matter-antimatter asymmetry to be generated in thermal equilibrium in the early Universe. The time-dependent dynamics necessary for leptogenesis is provided by the interaction of the virtual self-energy cloud of the leptons with the expanding curved spacetime background, which violates the strong equivalence principle and allows a distinction between matter and antimatter. We show here how this mechanism is realised in a particular BSM theory, the see-saw model, where the quantum loops involve the heavy sterile neutrinos responsible for light neutrino masses. We demonstrat...

  15. More Unusual Light Curves from Kepler

    Science.gov (United States)

    Kohler, Susanna

    2017-03-01

    -main-sequence stars ever obtained. In these light curves, Stauffer and collaborators found a set of 23 very low-mass, mid-to-late-type M dwarfs with unusual variability in their light curves. The variability is consistent with the star's rotation period where measured, which suggests that whatever causes the dips in the light curve, it is orbiting at the same rate as the star spins. Causes of variability? Plots in the paper show how the properties of these 23 stars compare to those of the rest of the stars in their cluster. For all but the rotation rate, they are typical. But the stars with scallop-shaped light curves have among the shortest periods in Upper Sco, with some near the theoretical break-up period for stars of their age [Stauffer et al. 2017]. The authors categorize the 23 stars into two main groups. The first group consists of 19 stars with short periods; more than half of them rotate within a factor of two of their predicted breakup period! Many of these show sudden changes in their light-curve morphology, often after a stellar flare. The authors propose that the variability in these light curves might be caused by warm coronal gas clouds that are organized into a structured toroidal shape around the star. The second group consists of the remaining four stars, which have slightly longer periods. Their light curves show a single short-duration flux dip with highly variable depth and shape superposed on normal, spotted-star light curves. The authors' best guess for these four stars is that there are clouds of dusty debris circling the star, possibly orbiting a close-in planet or resulting from a recent collisional event. Stauffer and collaborators are currently developing more detailed models for these stars based on the possible variability scenarios. The next step, they state, is to determine whether the gas in these structures has the properties necessary to generate the light-curve features we see. Citation: John Stauffer et al 2017 AJ 153 152. doi:10.3847/1538-3881/aa5eb9

  16. Modular forms and special cycles on Shimura curves (AM-161)

    CERN Document Server

    Kudla, Stephen S; Yang, Tonghai

    2006-01-01

    Modular Forms and Special Cycles on Shimura Curves is a thorough study of the generating functions constructed from special cycles, both divisors and zero-cycles, on the arithmetic surface "M" attached to a Shimura curve "M" over the field of rational numbers. These generating functions are shown to be the q-expansions of modular forms and Siegel modular forms of genus two respectively, valued in the Gillet-Soulé arithmetic Chow groups of "M". The two types of generating functions are related via an arithmetic inner product formula. In addition, an analogue of the classical Siegel-Weil

  17. Understanding Y haplotype matching probability.

    Science.gov (United States)

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary, raised elsewhere, rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  18. Single universal curve for Alpha decay derived from semi-microscopic calculations

    OpenAIRE

    Ismail, M.; Seif, W. M.; Ellithi, A. Y.; Abdurrahman, A

    2015-01-01

    The universal curve is one of the simple ways to get preliminary information about the Alpha-decay half-lives of heavy nuclei. We try to find a parameterization for the universal curve of Alpha decay based on semi-microscopic calculations, starting from the realistic M3Y-Reid nucleon-nucleon interaction. Within the deformed density-dependent cluster model, the penetration probability and the assault frequency are calculated within the WKB approximation. The deformations of daughte...
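
    An illustrative WKB barrier-penetration calculation in the spirit of the record above, but simplified to a spherical, Coulomb-only barrier (the paper itself uses the M3Y-Reid interaction and deformed densities, which are not reproduced here); the assault frequency below is an assumed round number:

        import numpy as np
        from scipy.integrate import quad

        HBARC = 197.327   # MeV fm
        AMU = 931.494     # MeV

        def wkb_penetration(Z_d, A_d, Q):
            """Gamow penetration probability through a pure Coulomb barrier.
            Z_d, A_d: charge and mass number of the daughter nucleus; Q in MeV."""
            mu = 4.0 * A_d / (4.0 + A_d) * AMU                  # reduced mass (MeV/c^2)
            e2 = 1.43996                                        # e^2 in MeV fm
            r_in = 1.2 * (4.0 ** (1 / 3) + A_d ** (1 / 3))      # touching radius (fm)
            r_out = 2.0 * Z_d * e2 / Q                          # outer turning point (fm)
            integrand = lambda r: np.sqrt(max(2.0 * Z_d * e2 / r - Q, 0.0) * 2.0 * mu) / HBARC
            action, _ = quad(integrand, r_in, r_out, limit=200)
            return np.exp(-2.0 * action)

        # Example: 212Po -> 208Pb + alpha, Q ~ 8.95 MeV
        P = wkb_penetration(Z_d=82, A_d=208, Q=8.95)
        nu = 1e21                                # assumed assault frequency (1/s)
        print(f"P = {P:.3e}, T1/2 ~ {np.log(2) / (nu * P):.3e} s")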

  19. Description of dose response curve

    OpenAIRE

    Al-Samarai, Firas

    2011-01-01

    The book includes several methods to estimate LD50 and explains how to use several programs for this estimation. Moreover, the book describes dose-response curves.

  20. String networks as tropical curves

    CERN Document Server

    Ray, Koushik

    2008-01-01

    A prescription for obtaining supergravity solutions for planar (p,q)-string networks is presented, based on earlier results. It shows that networks may be looked upon as tropical curves emerging as the spine of the amoeba of a holomorphic curve in M-theory. The Kaehler potential of supergravity is identified with the corresponding Ronkin function. Implications of this identification for counting dyons are discussed.

  1. Growth curves for Laron syndrome.

    OpenAIRE

    Laron, Z; Lilos, P; Klinger, B.

    1993-01-01

    Growth curves for children with Laron syndrome were constructed on the basis of repeated measurements made throughout infancy, childhood, and puberty in 24 (10 boys, 14 girls) of the 41 patients with this syndrome investigated in our clinic. Growth retardation was already noted at birth, the birth length ranging from 42 to 46 cm in the 12/20 available measurements. The postnatal growth curves deviated sharply from the normal from infancy on. Both sexes showed no clear pubertal spurt. Girls co...

  2. Flow over riblet curved surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Loureiro, J B R; Freire, A P Silva, E-mail: atila@mecanica.ufrj.br [Mechanical Engineering Program, Federal University of Rio de Janeiro (COPPE/UFRJ), C.P. 68503, 21.941-972, Rio de Janeiro, RJ (Brazil)

    2011-12-22

    The present work studies the mechanics of turbulent drag reduction over curved surfaces by riblets. The effects of surface modification on flow separation over steep and smooth curved surfaces are investigated. Four types of two-dimensional surfaces are studied based on the morphometric parameters that describe the body of a blue whale. Local measurements of mean velocity and turbulence profiles are obtained through laser Doppler anemometry (LDA) and particle image velocimetry (PIV).

  3. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  4. Cluster pre-existence probability

    Energy Technology Data Exchange (ETDEWEB)

    Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)

    2011-10-15

    Pre-existence probabilities of the fragments for the complete binary spectrum of different systems such as {sup 56}Ni, {sup 116}Ba, {sup 226}Ra and {sup 256}Fm are calculated from the overlapping part of the interaction potential using the WKB approximation. The role of the reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, even for negative Q-value systems, the pre-existence probability is calculated. The calculations reveal rich structural information. The calculated results are compared with the values of the preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of the relative separation motion. (orig.)

  5. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  6. Sm Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Sneden, C; Cowan, J J

    2005-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).

  7. Knot probabilities in random diagrams

    Science.gov (United States)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
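
    The Zipf-like relation mentioned above can be checked with a few lines once knot-type frequencies are tabulated; the counts below are placeholders, not the paper's values:

        import numpy as np

        # Hypothetical knot-type counts among random diagrams (placeholders).
        counts = {"unknot": 78000, "3_1": 12000, "4_1": 4500, "5_2": 2100,
                  "5_1": 1600, "6_1": 700, "6_2": 550, "6_3": 300}

        freq = np.array(sorted(counts.values(), reverse=True), dtype=float)
        prob = freq / freq.sum()
        rank = np.arange(1, len(prob) + 1)

        # Zipf-style check: is log(probability) roughly linear in log(rank)?
        slope, intercept = np.polyfit(np.log(rank), np.log(prob), 1)
        print(f"log-log slope = {slope:.2f}, intercept = {intercept:.2f}")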

  8. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising the sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  9. Asbestos and Probable Microscopic Polyangiitis

    Directory of Open Access Journals (Sweden)

    George S Rashed Philteos

    2004-01-01

    Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.

  10. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    ...[3-6] and they underlie mathematics, science, and technology [7-10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... (P.N. Johnson-Laird, Sangeet S. Khemlani, and Geoffrey P. Goodwin; Princeton University, Princeton, NJ)

  11. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1

  12. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  13. Objective probability and quantum fuzziness

    CERN Document Server

    Mohrhoff, U

    2007-01-01

    This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In that paper the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...

  14. Parameter Deduction and Accuracy Analysis of Track Beam Curves in Straddle-type Monorail Systems

    Directory of Open Access Journals (Sweden)

    Xiaobo Zhao

    2015-12-01

    Full Text Available The accuracy of the bottom curve of a PC track beam is strongly related to the production quality of the entire beam. Many factors may affect the parameters of the bottom curve, such as the superelevation of the curve and the deformation of the PC track beam. At present, no effective method has been developed to determine the bottom curve of a PC track beam; therefore, a new technique is presented in this paper to deduce the parameters of such a curve and to control the accuracy of the computation results. First, the domain of the bottom curve of a PC track beam is assumed to be a spindle plane. Then, the corresponding supposed top-curve domain is determined based on a geometrical relationship that is the reverse of the one used in the conventional method. Second, several optimal points are selected from the supposed top-curve domain according to the dichotomy algorithm; the supposed top curve is then generated by connecting these points. Finally, a rigorous criterion based on the fractal dimension is established to assess the accuracy of the assumed top curve deduced in the previous step. If this supposed curve coincides completely with the known top curve, then the assumed bottom curve corresponding to it is taken to be the real bottom curve. This technique for determining the bottom curve of a PC track beam is thus shown to be efficient and accurate.
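
    The dichotomy (bisection) step described above can be pictured with a generic one-parameter search: a candidate bottom-curve parameter is pushed through an assumed forward relation to a predicted top curve and bisected until the prediction matches the measured top curve. Everything below (the forward map, the measured curve, the stopping criterion) is a placeholder, not the authors' actual geometry:

        import numpy as np

        s = np.linspace(0.0, 1.0, 200)                   # arc-length parameter
        measured_top = 0.35 * np.sin(np.pi * s)          # placeholder measured top curve

        def predicted_top(c, s):
            """Hypothetical forward map from a bottom-curve parameter c to a top
            curve; it stands in for the real geometric relationship."""
            return c * np.sin(np.pi * s)

        # Dichotomy search for the parameter whose predicted top curve matches.
        lo, hi = 0.0, 1.0
        for _ in range(50):
            mid = 0.5 * (lo + hi)
            if predicted_top(mid, s).max() < measured_top.max():
                lo = mid
            else:
                hi = mid

        c_hat = 0.5 * (lo + hi)
        mismatch = np.max(np.abs(predicted_top(c_hat, s) - measured_top))
        print(f"recovered parameter = {c_hat:.6f}, max mismatch = {mismatch:.2e}")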

  15. Automatic acquisition and computation of data from the radiometer dissociation curve analyzer

    NARCIS (Netherlands)

    Maas, A.H.J.; Hamelink, M.L.; Poelgeest, R. van; Zuidema, P.; Kraan, K.J.; Camp, R. van den; Leeuw, R.J.M. de

    1974-01-01

    The whole oxyhemoglobin dissociation curve of blood can be generated using the Radiometer Dissociation Curve Analyzer (model DCA-1), which measures oxygen pressure, oxygen content and pH simultaneously. Thereby the dissociation curve at standard physiological conditions of pH 7.4 and temperature 37°

  16. On the speciality of a curve

    Directory of Open Access Journals (Sweden)

    Rosario Strano

    1993-05-01

    Full Text Available Let $C \subset \mathbb{P}^r_k$, $k$ an algebraically closed field of characteristic 0, be a curve and let $e(C)=\max\{n \mid H^1(\mathcal{O}_C(n)) \neq 0\}$ be its speciality. Let $\Gamma$ be the generic hyperplane section and $\varepsilon = \max\{n \mid H^1(\mathcal{I}_\Gamma(n)) \neq 0\}$. We prove that, if $\Gamma$ is generated in degree $\le \varepsilon$, then $e(C)=\varepsilon - 1$. In the case $r=3$ we discuss some relations between $e(C)$ and the Hilbert function of $\Gamma$.

  17. A Degeneracy in DRW Modelling of AGN Light Curves

    CERN Document Server

    Kozlowski, Szymon

    2016-01-01

    Individual light curves of active galactic nuclei (AGNs) are nowadays successfully modelled with the damped random walk (DRW) stochastic process, characterized by the power exponential covariance matrix of the signal, with the power $\beta=1$. By Monte Carlo simulation means, we generate mock AGN light curves described by non-DRW stochastic processes ($0.5\leq\beta\leq 1.5$ and $\beta

  18. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    CERN Document Server

    Semkow, T M

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.
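
    A quick simulation of the effect (not the paper's analytical derivation): let the detection probability fluctuate from measurement to measurement and compare the count variance with the fixed-probability binomial case. The beta distribution used for the fluctuating probability is an assumption made here purely for convenience:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 10000            # atoms (or trials) per measurement
        p_mean = 0.05        # mean detection probability
        n_meas = 50000

        # Fixed probability: ordinary binomial counting statistics.
        fixed = rng.binomial(N, p_mean, size=n_meas)

        # Fluctuating (Lexis-type) probability: draw p from a beta distribution
        # with the same mean, then count.
        a = 20.0
        b = a * (1.0 - p_mean) / p_mean
        p = rng.beta(a, b, size=n_meas)
        fluct = rng.binomial(N, p)

        print("fixed p       mean/var:", fixed.mean(), fixed.var())
        print("fluctuating p mean/var:", fluct.mean(), fluct.var())
        # The second variance exceeds N*p*(1-p): overdispersion caused purely
        # by the fluctuation of the probability itself.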

  19. Shape optimization of self-avoiding curves

    Science.gov (United States)

    Walker, Shawn W.

    2016-04-01

    This paper presents a softened notion of proximity (or self-avoidance) for curves. We then derive a sensitivity result, based on shape differential calculus, for the proximity. This is combined with a gradient-based optimization approach to compute three-dimensional, parameterized curves that minimize the sum of an elastic (bending) energy and a proximity energy that maintains self-avoidance by a penalization technique. Minimizers are computed by a sequential-quadratic-programming (SQP) method where the bending energy and proximity energy are approximated by a finite element method. We then apply this method to two problems. First, we simulate adsorbed polymer strands that are constrained to be bound to a surface and be (locally) inextensible. This is a basic model of semi-flexible polymers adsorbed onto a surface (a current topic in material science). Several examples of minimizing curve shapes on a variety of surfaces are shown. An advantage of the method is that it can be much faster than using molecular dynamics for simulating polymer strands on surfaces. Second, we apply our proximity penalization to the computation of ideal knots. We present a heuristic scheme, utilizing the SQP method above, for minimizing rope-length and apply it in the case of the trefoil knot. Applications of this method could be for generating good initial guesses to a more accurate (but expensive) knot-tightening algorithm.

  20. Projection of curves on B-spline surfaces using quadratic reparameterization

    KAUST Repository

    Yang, Yijun

    2010-09-01

    Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low degree curves lying completely on the surfaces by using iso-parameter curves of the reparameterized surfaces. The Hausdorff distance between the projected curve and the original curve is controlled under the user-specified distance tolerance. The projected curve is T-G 1 continuous, where T is the user-specified angle tolerance. Examples are given to show the performance of our algorithm. © 2010 Elsevier Inc. All rights reserved.

  1. Curve Digitizer – A software for multiple curves digitizing

    Directory of Open Access Journals (Sweden)

    Florentin ŞPERLEA

    2010-06-01

    Full Text Available The Curve Digitizer is software that extracts data from an image file representing a graphic and returns them as pairs of numbers which can then be used for further analysis and applications. Numbers can be read on a computer screen, stored in files or copied on paper. The final result is a data set that can be used with other tools such as MS EXCEL. Curve Digitizer provides a useful tool for any researcher or engineer interested in quantifying the data displayed graphically. The image file can be obtained by scanning a document

  2. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  3. Hydrogeologic unit flow characterization using transition probability geostatistics.

    Science.gov (United States)

    Jones, Norman L; Walker, Justin R; Carle, Steven F

    2005-01-01

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.

  4. COMMENT: Probability, belief and success rate: comments on 'On the meaning of coverage probabilities'

    Science.gov (United States)

    Willink, R.

    2010-06-01

    The method of uncertainty evaluation discussed in Supplement 1 to the Guide to the Expression of Uncertainty in Measurement generates a coverage interval in which the measurand is said to have a certain probability (the coverage probability) of lying. This communication contains a response to the recent claim that 'when a coverage interval summarizes the resulting state of knowledge, the coverage probability should not be interpreted as a relative frequency of successful intervals in a large series of imagined or simulated intervals' (Lira 2009 Metrologia 46 616-8). First, Bernoulli's law of large numbers is used to prove that the long-run success rate of a methodology used to calculate 95% coverage intervals must be 95%. Second, the usual definition of subjective probability or 'degree of belief' is stated, and the weak law of large numbers is then used to show that this definition—and the corresponding definition of 'state of knowledge'—relies on the concept of long-run behaviour. This provides an alternative proof of the same result.
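
    The first point (the long-run success rate of a 95% coverage-interval procedure is 95%) is easy to check numerically; the sketch below assumes normally distributed observations with known standard deviation, which is an illustrative choice and not part of the original argument:

        import numpy as np

        rng = np.random.default_rng(1)
        mu_true, sigma, n = 10.0, 2.0, 25
        n_repeats = 100000
        z95 = 1.959964          # 97.5th percentile of the standard normal

        covered = 0
        for _ in range(n_repeats):
            sample = rng.normal(mu_true, sigma, size=n)
            half_width = z95 * sigma / np.sqrt(n)
            m = sample.mean()
            covered += (m - half_width <= mu_true <= m + half_width)

        print("long-run success rate:", covered / n_repeats)   # close to 0.95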

  5. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  6. Determination of the {sup 121}Te gamma emission probabilities associated with the production process of radiopharmaceutical NaI[{sup 123}I

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, M.T.F.; Lopes, R.T., E-mail: maraujo@con.ufrj.br, E-mail: miriamtaina@hotmail.com [Coordenacao dos Cursos de Pos-Graduacao em Engenharia (LIN/PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear. Lab. de Instrumentacao Nuclear; Poledna, R.; Delgado, J.U.; Almeida, M.C.M. de; Silva, R.L. [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ/LNMRI), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes

    2015-07-01

    The {sup 123}I is widely used in radiodiagnostic procedures in nuclear medicine. According to the Pharmacopoeia, care should be taken during its production process, since radionuclidic impurities may be generated. The {sup 121}Te is an impurity that arises during the {sup 123}I production, and determining its gamma emission probabilities (Pγ) is important in order to obtain more information about its decay. Activities were also obtained by absolute standardization using the sum-peak method, and these values were compared to those obtained with the efficiency curve method. (author)

  7. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  8. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  9. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  10. Using Gaussian Processes to Model Noise in Eclipsing Binary Light Curves

    Science.gov (United States)

    Prsa, Andrej; Hambleton, Kelly M.

    2017-01-01

    The most precise data we have at hand arguably comes from NASA's Kepler mission, for which there is no good flux calibration available since it was designed to measure relative flux changes down to the ~20 ppm level. Instrumental artifacts thus abound in the data, and they vary with the module, location on the CCD, target brightness, electronic cross-talk, etc. In addition, Kepler's near-uninterrupted mode of observation reveals astrophysical signals and transient phenomena (i.e. spots, flares, protuberances, pulsations, magnetic field features, etc.) that are not accounted for in the models. These "nuisance" signals, along with instrumental artifacts, are considered noise when modeling light curves; this noise is highly correlated and it cannot be considered Poissonian or Gaussian. Detrending non-white noise from light curve data has been an ongoing challenge in modeling eclipsing binary star and exoplanet transit light curves. Here we present an approach using Gaussian Processes (GP) to model noise as part of the overall likelihood function. The likelihood function consists of the eclipsing binary light curve generator PHOEBE, a correlated noise model using GP, and a Poissonian (shot) noise term attributed to the actual stochastic component of the entire noise model. We consider the GP parameters and the Poissonian noise amplitude as free parameters that are sampled within the likelihood function, so the end result is the posterior probability not only for the eclipsing binary model parameters, but for the noise parameters as well. We show that the posteriors of principal parameters are significantly more robust when noise is modeled rigorously compared to modeling detrended data with an eclipsing binary model alone. This work has been funded by NSF grant #1517460.
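
    A schematic of the likelihood structure described above, assuming a squared-exponential GP kernel and a placeholder model light curve (PHOEBE itself is not invoked here; the kernel choice and all numbers are illustrative):

        import numpy as np

        def gp_log_likelihood(t, flux, model_flux, amp, tau, shot_sigma):
            """Log-likelihood of the residuals under a GP with a squared-exponential
            kernel (an assumption for this sketch) plus white shot noise."""
            r = flux - model_flux
            dt = t[:, None] - t[None, :]
            K = amp ** 2 * np.exp(-0.5 * (dt / tau) ** 2)    # correlated "nuisance" noise
            K += shot_sigma ** 2 * np.eye(len(t))            # white (shot) noise
            _, logdet = np.linalg.slogdet(K)
            alpha = np.linalg.solve(K, r)
            return -0.5 * (r @ alpha + logdet + len(t) * np.log(2.0 * np.pi))

        # Toy usage: model_flux would come from an eclipsing-binary code such as
        # PHOEBE; here it is a flat placeholder light curve.
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 10.0, 200)
        flux = 1.0 + 0.002 * np.sin(2 * np.pi * t / 3.0) + 0.001 * rng.normal(size=200)
        print(gp_log_likelihood(t, flux, np.ones_like(t), amp=0.002, tau=0.5, shot_sigma=0.001))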

  11. Non-stationary random vibration analysis of a 3D train-bridge system using the probability density evolution method

    Science.gov (United States)

    Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei

    2016-03-01

    Rail irregularity is one of the main sources causing train-bridge random vibration. A new random vibration theory for coupled train-bridge systems is proposed in this paper. First, the number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of the rail irregularity power spectral density was adopted to determine the representative points of spatial frequencies and phases used to generate the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with a slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system by a program compiled on the MATLAB® software platform. Finally, the Newmark-β integration method and the double edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of the responses. A case study was presented in which the ICE-3 train travels on a three-span simply-supported high-speed railway bridge with excitation by random rail irregularity. The results showed that compared to the Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system were discussed.

  12. Estimate of the penetrance of BRCA mutation and the COS software for the assessment of BRCA mutation probability.

    Science.gov (United States)

    Berrino, Jacopo; Berrino, Franco; Francisci, Silvia; Peissel, Bernard; Azzollini, Jacopo; Pensotti, Valeria; Radice, Paolo; Pasanisi, Patrizia; Manoukian, Siranoush

    2015-03-01

    We have designed the user-friendly COS software with the intent to improve estimation of the probability of a family carrying a deleterious BRCA gene mutation. The COS software is similar to the widely-used Bayesian-based BRCAPRO software, but it incorporates improved assumptions on cancer incidence in women with and without a deleterious mutation, takes into account relatives up to the fourth degree and allows researchers to consider a hypothetical third gene or a polygenic model of inheritance. Since breast cancer incidence and penetrance increase over generations, we estimated birth-cohort-specific incidence and penetrance curves. We estimated breast and ovarian cancer penetrance in 384 BRCA1 and 229 BRCA2 mutated families. We tested the COS performance in 436 Italian breast/ovarian cancer families including 79 with BRCA1 and 27 with BRCA2 mutations. The area under the receiver operating characteristic curve (AUROC) was 84.4 %. The best probability threshold for offering the test was 22.9 %, with sensitivity 80.2 % and specificity 80.3 %. Notwithstanding very different assumptions, COS results were similar to BRCAPRO v6.0.
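
    The AUROC and threshold figures quoted above can be reproduced for any carrier-probability model with a few lines; the labels and probabilities below are synthetic stand-ins, not the COS validation data, and Youden's J is just one common way to choose an operating threshold:

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(0)

        # Synthetic stand-ins: 1 = family carries a deleterious mutation.
        y_true = rng.integers(0, 2, size=500)
        # Hypothetical carrier probabilities from a model such as COS or BRCAPRO.
        y_prob = np.clip(0.3 * y_true + rng.normal(0.25, 0.15, size=500), 0.0, 1.0)

        auroc = roc_auc_score(y_true, y_prob)
        fpr, tpr, thresholds = roc_curve(y_true, y_prob)

        best = np.argmax(tpr - fpr)                     # maximize Youden's J
        print(f"AUROC = {auroc:.3f}, threshold = {thresholds[best]:.3f}, "
              f"sensitivity = {tpr[best]:.3f}, specificity = {1 - fpr[best]:.3f}")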

  13. Hf Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I

    2006-01-01

    Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...

  14. Gd Transition Probabilities and Abundances

    CERN Document Server

    Den Hartog, E A; Sneden, C; Cowan, J J

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...

  15. Topological recursion and mirror curves

    CERN Document Server

    Bouchard, Vincent

    2012-01-01

    We study the constant contributions to the free energies obtained through the topological recursion applied to the complex curves mirror to toric Calabi-Yau threefolds. We show that the recursion reproduces precisely the corresponding Gromov-Witten invariants, which can be encoded in powers of the MacMahon function. As a result, we extend the scope of the "remodeling conjecture" to the full free energies, including the constant contributions. In the process we study how the pair of pants decomposition of the mirror curves plays an important role in the topological recursion. We also show that the free energies are not, strictly speaking, symplectic invariants, and that the recursive construction of the free energies does not commute with certain limits of mirror curves.

  16. Laffer Curves and Home Production

    Directory of Open Access Journals (Sweden)

    Kotamäki Mauri

    2017-06-01

    Full Text Available In the earlier related literature, the consumption tax rate Laffer curve is found to be strictly increasing (see Trabandt and Uhlig, 2011). In this paper, a general equilibrium macro model is augmented by introducing a substitute for private consumption in the form of home production. The introduction of home production brings about an additional margin of adjustment – an increase in the consumption tax rate not only decreases labor supply and reduces the consumption tax base but also allows a substitution of market goods with home-produced goods. The main objective of this paper is to show that, after the introduction of home production, the consumption tax Laffer curve exhibits an inverse U-shape. Also the income tax Laffer curves are significantly altered. The result shown in this paper casts doubt on some of the earlier results in the literature.

  17. Rational points on elliptic curves

    CERN Document Server

    Silverman, Joseph H

    2015-01-01

    The theory of elliptic curves involves a pleasing blend of algebra, geometry, analysis, and number theory. This book stresses this interplay as it develops the basic theory, thereby providing an opportunity for advanced undergraduates to appreciate the unity of modern mathematics. At the same time, every effort has been made to use only methods and results commonly included in the undergraduate curriculum. This accessibility, the informal writing style, and a wealth of exercises make Rational Points on Elliptic Curves an ideal introduction for students at all levels who are interested in learning about Diophantine equations and arithmetic geometry. Most concretely, an elliptic curve is the set of zeroes of a cubic polynomial in two variables. If the polynomial has rational coefficients, then one can ask for a description of those zeroes whose coordinates are either integers or rational numbers. It is this number theoretic question that is the main subject of this book. Topics covered include the geometry and ...

  18. Canonical curves with low apolarity

    CERN Document Server

    Ballico, Edoardo; Notari, Roberto

    2010-01-01

    Let $k$ be an algebraically closed field and let $C$ be a non--hyperelliptic smooth projective curve of genus $g$ defined over $k$. Since the canonical model of $C$ is arithmetically Gorenstein, Macaulay's theory of inverse systems allows one to associate to $C$ a cubic form $f$ in the divided power $k$--algebra $R$ in $g-2$ variables. The apolarity of $C$ is the minimal number $t$ of linear forms in $R$ needed to write $f$ as a sum of their divided power cubes. It is easy to see that the apolarity of $C$ is at least $g-2$, and P. De Poi and F. Zucconi classified curves with apolarity $g-2$ when $k$ is the complex field. In this paper, we give a complete, characteristic-free classification of curves $C$ with apolarity $g-1$ (and $g-2$).

  19. Curved spacetimes in the lab

    CERN Document Server

    Szpak, Nikodem

    2014-01-01

    We present some new ideas on how to design analogue models of quantum fields living in curved spacetimes using ultra-cold atoms in optical lattices. We discuss various types of static and dynamical curved spacetimes achievable by simple manipulations of the optical setup. Examples presented here contain two-dimensional spaces of positive and negative curvature as well as homogeneous cosmological models and metric waves. Most of them are extendable to three spatial dimensions. We mention some interesting phenomena of quantum field theory in curved spacetimes which might be simulated in such optical lattices loaded with bosonic or fermionic ultra-cold atoms. We also argue that methods of differential geometry can be used, as an alternative mathematical approach, for dealing with realistic inhomogeneous optical lattices.

  20. The New Keynesian Phillips Curve

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi

    This paper provides a survey of the recent literature on the new Keynesian Phillips curve: the controversies surrounding its microfoundation and estimation, the approaches that have been tried to improve its empirical fit and the challenges it faces adapting to the open-economy framework. The new Keynesian Phillips curve has been severely criticized for poor empirical dynamics. Suggested improvements involve making some adjustments to the standard sticky price framework, e.g. introducing backwardness and real rigidities, or abandoning the sticky price model and relying on models of inattentiveness, learning or state-dependent pricing. The introduction of open-economy factors into the new Keynesian Phillips curve complicates matters further as it must capture the nexus between price setting, inflation and the exchange rate. This is nevertheless a crucial feature for any model to be used for inflation

  1. Algebraic curves of maximal cyclicity

    Science.gov (United States)

    Caubergh, Magdalena; Dumortier, Freddy

    2006-01-01

    The paper deals with analytic families of planar vector fields, studying methods to detect the cyclicity of a non-isolated closed orbit, i.e. the maximum number of limit cycles that can locally bifurcate from it. It is known that this multi-parameter problem can be reduced to a single-parameter one, in the sense that there exist analytic curves in parameter space along which the maximal cyclicity can be attained. In that case one speaks about a maximal cyclicity curve (mcc) in case only the number is considered and of a maximal multiplicity curve (mmc) in case the multiplicity is also taken into account. In view of obtaining efficient algorithms for detecting the cyclicity, we investigate whether such mcc or mmc can be algebraic or even linear depending on certain general properties of the families or of their associated Bautin ideal. In any case by well chosen examples we show that prudence is appropriate.

  2. Probably good diagrams for learning: representational epistemic recodification of probability theory.

    Science.gov (United States)

    Cheng, Peter C-H

    2011-07-01

    The representational epistemic approach to the design of visual displays and notation systems advocates encoding the fundamental conceptual structure of a knowledge domain directly in the structure of a representational system. It is claimed that representations so designed will benefit from greater semantic transparency, which enhances comprehension and ease of learning, and plastic generativity, which makes the meaningful manipulation of the representation easier and less error prone. Epistemic principles for encoding fundamental conceptual structures directly in representational schemes are described. The diagrammatic recodification of probability theory is undertaken to demonstrate how the fundamental conceptual structure of a knowledge domain can be analyzed, how the identified conceptual structure may be encoded in a representational system, and the cognitive benefits that follow. An experiment shows the new probability space diagrams are superior to the conventional approach for learning this conceptually challenging topic.

  3. Optimal Reliability-Based Planning of Experiments for POD Curves

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, M. H.; Kroon, I. B.

    Optimal planning of crack detection tests is considered. The tests are used to update the information on the reliability of the inspection techniques, modelled by probability of detection (P.O.D.) curves. It is shown how cost-optimal and reliability-based test plans can be obtained using First Order Reliability Methods in combination with life-cycle cost-optimal inspection and maintenance planning. The methodology is based on preposterior analyses from Bayesian decision theory. An illustrative example is shown.
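
    A minimal sketch of fitting a P.O.D. curve from hit/miss inspection data, using the common log-logistic parameterization (the crack sizes, outcomes and model form below are assumptions for illustration, not the authors' data or method):

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical hit/miss inspection data: crack sizes (mm) and detections.
        a = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 4.0])
        hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

        def neg_log_lik(params):
            mu, sigma = params
            # Log-logistic P.O.D. model: POD(a) = 1 / (1 + exp(-(ln a - mu) / sigma))
            pod = 1.0 / (1.0 + np.exp(-(np.log(a) - mu) / sigma))
            pod = np.clip(pod, 1e-9, 1.0 - 1e-9)
            return -np.sum(hit * np.log(pod) + (1 - hit) * np.log(1.0 - pod))

        res = minimize(neg_log_lik, x0=[np.log(1.5), 0.3], method="Nelder-Mead")
        mu, sigma = res.x
        a90 = np.exp(mu + sigma * np.log(9.0))   # size detected with 90% probability
        print(f"mu = {mu:.3f}, sigma = {sigma:.3f}, a90 = {a90:.2f} mm")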

  4. Shock detachment from curved wedges

    Science.gov (United States)

    Mölder, S.

    2017-03-01

    Curved shock theory is used to show that the flow behind attached shocks on doubly curved wedges can have either positive or negative post-shock pressure gradients depending on the freestream Mach number, the wedge angle and the two wedge curvatures. Given enough wedge length, the flow near the leading edge can choke to force the shock to detach from the wedge. This local choking can preempt both the maximum deflection and the sonic criteria for shock detachment. Analytical predictions for detachment by local choking are supported by CFD results.

  6. Caloric Curves and Nuclear Expansion

    CERN Document Server

    Natowitz, J B; Ma, Y; Murray, M; Qin, L; Shlomo, S; Wada, R; Wang, J

    2002-01-01

    Nuclear caloric curves have been analyzed using an expanding Fermi gas hypothesis to extract average nuclear densities. In this approach the observed flattening of the caloric curves reflects progressively increasing expansion with increasing excitation energy. This expansion results in a corresponding decrease in the density and Fermi energy of the excited system. For nuclei of medium to heavy mass, apparent densities $\sim 0.3\rho_0$ are reached at the higher excitation energies. The average densities derived in this manner are in good agreement with those derived using other, more complicated, techniques.
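
    As a rough illustration only (with assumed numbers, not those of the paper), in a Fermi gas the inverse level-density parameter scales with the Fermi energy, and hence with density as rho**(2/3); the sketch below shows how an assumed density decrease with excitation energy flattens the resulting caloric curve T(E*/A).

        import numpy as np

        K0 = 10.0            # MeV, inverse level-density parameter at normal density (assumed)
        RHO_MIN_RATIO = 0.3  # hypothetical density ratio reached at high excitation

        def caloric_temperature(e_star, e_max=10.0):
            """Temperature (MeV) versus excitation energy per nucleon (MeV), assuming
            E*/A = T**2 / K(rho), K(rho) = K0 * (rho/rho0)**(2/3), and a hypothetical
            linear density drop from rho0 to 0.3*rho0 over [0, e_max]."""
            ratio = np.maximum(RHO_MIN_RATIO, 1.0 - (1.0 - RHO_MIN_RATIO) * e_star / e_max)
            return np.sqrt(K0 * ratio ** (2.0 / 3.0) * e_star)

        e_star = np.array([1.0, 3.0, 5.0, 8.0])
        print(np.round(caloric_temperature(e_star), 2))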

  7. Curved branes with regular support

    Energy Technology Data Exchange (ETDEWEB)

    Antoniadis, Ignatios [Sorbonne Universites, LPTHE, UMR CNRS 7589, Paris (France); University of Bern, Albert Einstein Center for Fundamental Physics, ITP, Bern (Switzerland); Cotsakis, Spiros; Klaoudatou, Ifigeneia [American University of the Middle East, Department of Mathematics, P. O. Box 220, Dasman (Kuwait)

    2016-09-15

    We study spacetime singularities in a general five-dimensional braneworld with curved branes satisfying four-dimensional maximal symmetry. The bulk is supported by an analog of perfect fluid with the time replaced by the extra coordinate. We show that contrary to the existence of finite-distance singularities from the brane location in any solution with flat (Minkowski) branes, in the case of curved branes there are singularity-free solutions for a range of equations of state compatible with the null energy condition. (orig.)

  8. THE RELATIONSHIP OF QUARTILE CHARACTERISTIC VALUES BETWEEN DENSITY CURVE AND DISTRIBUTION CURVE

    Institute of Scientific and Technical Information of China (English)

    樊民强; 刘丽俭; 张荣曾

    1998-01-01

    The characteristics of the density yield curve of coal and the distribution curve of products can be described with the median, the quartile deviation, the quartile measure of skewness and the kurtosis K. On the basis of 16 groups of coal density composition data and their jigging stratification data derived from the pilot jig, a regression analysis was carried out for the relationship between the characteristic values of the density curve and those of the distribution curve. The results show the following: (1) the larger the skewness of the density curve, the larger the probable error (Ep) and imperfection (I); (2) the larger the median of the density curve, the smaller the probable error or imperfection; (3) the characteristic values of the density curve have no influence on the kurtosis K of the distribution curve.
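
    For illustration only, the sketch below computes standard quartile-based characteristic values from a weighted empirical curve; the exact definitions used in the paper, in particular of the kurtosis K, may differ.

        import numpy as np

        def quartile_characteristics(values, weights):
            """Median, quartile deviation, quartile skewness and percentile kurtosis
            of a weighted empirical distribution (e.g. yield versus density)."""
            order = np.argsort(values)
            v = np.asarray(values, dtype=float)[order]
            w = np.asarray(weights, dtype=float)[order]
            cum = np.cumsum(w) / np.sum(w)
            pct = lambda p: np.interp(p, cum, v)
            q1, q2, q3 = pct(0.25), pct(0.50), pct(0.75)
            p10, p90 = pct(0.10), pct(0.90)
            return {
                "median": q2,
                "quartile_deviation": (q3 - q1) / 2.0,
                "quartile_skewness": (q3 + q1 - 2.0 * q2) / (q3 - q1),
                "percentile_kurtosis": (q3 - q1) / (2.0 * (p90 - p10)),
            }

        # Hypothetical density fractions (g/cm3) and their yields (%)
        densities = [1.3, 1.4, 1.5, 1.6, 1.8, 2.0]
        yields    = [8.0, 27.0, 25.0, 15.0, 13.0, 12.0]
        print(quartile_characteristics(densities, yields))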

  9. L curve for spherical triangle region quadtrees

    Institute of Scientific and Technical Information of China (English)

    YUAN; Wen; CHENG; Chengqi; MA; Ainai; GUAN; Xiaojing

    2004-01-01

    The sequence of facets and nodes has a direct influence on the efficiency of access to a spherical triangle region quadtree. Based on the labeling schema of Lee, spatial curves for both facets and nodes are proposed, and the main algorithms for coordinate translation, node L-sequence generation and node visiting are presented. In particular, constant-time algorithms for generating the node L sequence are developed using bit-manipulation operations, which can easily be implemented in hardware. In the L curve the distance between the three nodes of a facet is mostly limited to a small range, making fast access possible. Although the codes of sibling facets are consecutive, the difference between the codes of some cousins may occasionally be very large, which makes the distance of a few facets also very large and thus greatly increases the mean node distance and the total traversal distance. Therefore an m-cluster of nodes is proposed as a basic storage unit for the n-cluster, storing every shared node in each, so that the distance between the three nodes of a facet is limited to a controllable range.
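
    As an analogy only (not Lee's labeling schema or the L curve of the paper), linear indices for hierarchical cells are often generated in constant time per level by bit manipulation; the sketch below interleaves the bits of two child-path coordinates, Morton-code style, to produce such an index.

        def interleave_bits(x, y, levels):
            """Build a linear cell index by interleaving 'levels' bits of x and y.
            A stand-in for constant-time, bit-manipulation index generation; the
            actual L-curve coding of the paper differs."""
            code = 0
            for i in range(levels):
                code |= ((x >> i) & 1) << (2 * i)
                code |= ((y >> i) & 1) << (2 * i + 1)
            return code

        print(interleave_bits(3, 5, 3))  # x = 011, y = 101 -> interleaved index 39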

  10. Section curve reconstruction and mean-camber curve extraction of a point-sampled blade surface.

    Directory of Open Access Journals (Sweden)

    Wen-long Li

    Full Text Available The blade is one of the most critical parts of an aviation engine, and a small change in the blade geometry may significantly affect the dynamic performance of the engine. Rapid advancements in 3D scanning techniques have enabled inspection of the blade shape using a dense and accurate point cloud. This paper proposes a new method for achieving two common tasks in blade inspection: section curve reconstruction and mean-camber curve extraction from a point-cloud representation. Mathematical morphology is extended and applied to suppress the effect of measuring defects and to generate an ordered sequence of 2D measured points in the section plane. Then, energy and distance terms are minimized to iteratively smooth the measured points, approximate the section curve and extract the mean-camber curve. In addition, a turbine blade is machined and scanned to observe the curvature variation, energy variation and approximation error, which demonstrates the effectiveness of the proposed method. The proposed method is simple to implement and can be applied in aviation casting-blade finish inspection, large forging-blade allowance inspection and vision-guided robot grinding localization.
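
    For illustration only (the exact energy terms of the paper are not reproduced here), iterative smoothing of an ordered point sequence can combine a Laplacian smoothness update with a distance (fidelity) term pulling the points back toward the measurements; a minimal sketch:

        import numpy as np

        def smooth_section_points(points, n_iter=50, alpha=0.5, beta=0.3):
            """Iteratively smooth an ordered sequence of 2D section points.
            alpha weights the smoothness (Laplacian) term, beta the distance
            term toward the original measurements; both values are assumptions."""
            pts = np.asarray(points, dtype=float)
            original = pts.copy()
            for _ in range(n_iter):
                laplacian = np.zeros_like(pts)
                laplacian[1:-1] = 0.5 * (pts[:-2] + pts[2:]) - pts[1:-1]
                pts = pts + alpha * laplacian + beta * (original - pts)
            return pts

        t = np.linspace(0.0, np.pi, 50)
        noisy = np.column_stack([t, np.sin(t) + 0.02 * np.random.randn(50)])
        print(smooth_section_points(noisy)[:3])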

  11. Development and Application of Flow Duration Curves for Stream Restoration

    Science.gov (United States)

    2016-02-01

    Flow duration curves have traditionally been used for a variety of purposes, from hydropower engineering to instream flow quantification. Recoverable fragments of this record (ERDC TN-EMRRP-SR-49, February 2016) refer to flows that play disproportionately large roles in shaping channel morphology by doing the most "geomorphic work", and to Figure 6, "Effective discharge application of flow duration curves for the Etowah River".
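
    To make the flow duration curve itself concrete, a minimal sketch (not taken from the technical note): daily discharges are ranked in descending order and each is assigned its exceedance probability.

        import numpy as np

        def flow_duration_curve(flows):
            """Return (exceedance probability, discharge) pairs for a flow record,
            using the Weibull plotting position rank / (n + 1)."""
            q = np.sort(np.asarray(flows, dtype=float))[::-1]   # descending
            exceedance = np.arange(1, len(q) + 1) / (len(q) + 1.0)
            return exceedance, q

        # Hypothetical daily flows (m3/s)
        p, q = flow_duration_curve([12.0, 30.5, 8.2, 55.0, 20.1, 15.3, 9.9, 42.7])
        print(np.interp(0.5, p, q))  # discharge exceeded about half the time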

  12. Post-Classical Probability Theory

    CERN Document Server

    Barnum, Howard

    2012-01-01

    This paper offers a brief introduction to the framework of "general probabilistic theories", otherwise known as the "convex-operational" approach to the foundations of quantum mechanics. Broadly speaking, the goal of research in this vein is to locate quantum mechanics within a very much more general, but conceptually very straightforward, generalization of classical probability theory. The hope is that, by viewing quantum mechanics "from the outside", we may be able better to understand it. We illustrate several respects in which this has proved to be the case, reviewing work on cloning and broadcasting, teleportation and entanglement swapping, key distribution, and ensemble steering in this general framework. We also discuss a recent derivation of the Jordan-algebraic structure of finite-dimensional quantum theory from operationally reasonable postulates.

  13. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.

  14. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  15. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  16. Interpolation and Polynomial Curve Fitting

    Science.gov (United States)

    Yang, Yajun; Gordon, Sheldon P.

    2014-01-01

    Two points determine a line. Three noncollinear points determine a quadratic function. Four points that do not lie on a lower-degree polynomial curve determine a cubic function. In general, n + 1 points uniquely determine a polynomial of degree n, presuming that they do not fall onto a polynomial of lower degree. The process of finding such a…
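
    A minimal sketch of the statement that n + 1 points determine a degree-n polynomial (illustrative only, with made-up points): solve the Vandermonde system for the unique cubic through four points.

        import numpy as np

        # Four points that do not lie on any quadratic determine a unique cubic.
        x = np.array([0.0, 1.0, 2.0, 4.0])
        y = np.array([1.0, 2.0, 0.0, 5.0])

        # Vandermonde solve for the cubic's coefficients (highest degree first).
        coeffs = np.linalg.solve(np.vander(x, 4), y)
        p = np.poly1d(coeffs)

        print(p(x))    # reproduces y exactly, up to rounding
        print(p(3.0))  # value of the interpolating cubic at a new point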

  17. Variability among polysulphone calibration curves

    Energy Technology Data Exchange (ETDEWEB)

    Casale, G R [University of Rome ' La Sapienza' , Physics Department, P.le A. Moro 2, I-00185, Rome (Italy); Borra, M [ISPESL - Istituto Superiore per la Prevenzione E la Sicurezza del Lavoro, Occupational Hygiene Department, Via Fontana Candida 1, I-0040 Monteporzio Catone (RM) (Italy); Colosimo, A [University of Rome ' La Sapienza' , Department of Human Physiology and Pharmacology, P.le A. Moro 2, I-00185, Rome (Italy); Colucci, M [ISPESL - Istituto Superiore per la Prevenzione E la Sicurezza del Lavoro, Occupational Hygiene Department, Via Fontana Candida 1, I-0040 Monteporzio Catone (RM) (Italy); Militello, A [ISPESL - Istituto Superiore per la Prevenzione E la Sicurezza del Lavoro, Occupational Hygiene Department, Via Fontana Candida 1, I-0040 Monteporzio Catone (RM) (Italy); Siani, A M [University of Rome ' La Sapienza' , Physics Department, P.le A. Moro 2, I-00185, Rome (Italy); Sisto, R [ISPESL - Istituto Superiore per la Prevenzione E la Sicurezza del Lavoro, Occupational Hygiene Department, Via Fontana Candida 1, I-0040 Monteporzio Catone (RM) (Italy)

    2006-09-07

    Within an epidemiological study regarding the correlation between skin pathologies and personal ultraviolet (UV) exposure due to solar radiation, 14 field campaigns using polysulphone (PS) dosemeters were carried out at three different Italian sites (urban, semi-rural and rural) in every season of the year. A polysulphone calibration curve for each field experiment was obtained by measuring the ambient UV dose under almost clear sky conditions and the corresponding change in the PS film absorbance before and after exposure. Ambient UV doses were measured by well-calibrated broad-band radiometers and by electronic dosemeters. The dose-response relation was represented by the typical best fit to a third-degree polynomial, parameterized by a coefficient multiplying a cubic polynomial function. It was observed that the fit curves differed from each other in the coefficient only. It was assessed that the multiplying coefficient was affected by the solar UV spectrum at the Earth's surface whilst the polynomial factor depended on the photoinduced reaction of the polysulphone film. The mismatch between the polysulphone spectral curve and the CIE erythemal action spectrum was responsible for the variability among polysulphone calibration curves. The variability of the coefficient was related to the total ozone amount and the solar zenith angle. A mathematical explanation of such a parameterization was also discussed.
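
    For illustration only (with made-up numbers), the parameterization described above, a single coefficient multiplying a fixed cubic polynomial shape in the absorbance change, can be fitted by least squares as in the sketch below; both the cubic shape and the data are assumptions, not values from the study.

        import numpy as np

        def cubic_shape(delta_a):
            """Assumed fixed cubic polynomial shape in the absorbance change."""
            return delta_a + 1.5 * delta_a**2 + 4.0 * delta_a**3

        # Hypothetical calibration data: absorbance change vs ambient UV dose (J/m2)
        delta_a = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
        dose    = np.array([120.0, 260.0, 430.0, 640.0, 900.0, 1220.0])

        # Least-squares estimate of the multiplying coefficient k in dose = k * p(delta_a)
        shape = cubic_shape(delta_a)
        k = np.sum(shape * dose) / np.sum(shape**2)
        print("k =", round(float(k), 1))
        print("fitted doses:", np.round(k * shape, 1))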

  18. Space curves, anholonomy and nonlinearity

    Indian Academy of Sciences (India)

    Radha Balakrishnan

    2005-04-01

    Using classical differential geometry, we discuss the phenomenon of anholonomy that gets associated with a static and a moving curve. We obtain the expressions for the respective geometric phases in the two cases and interpret them. We show that there is a close connection between anholonomy and nonlinearity in a wide class of nonlinear systems.
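
    As a pointer to the kind of expression involved (a standard Frenet-frame result, not necessarily the exact formula derived in the paper), the anholonomy angle acquired by a frame transported without rotation in the normal plane of a closed space curve is governed by the integrated torsion:

        % Anholonomy (geometric phase) of a closed space curve with torsion tau(s),
        % parametrized by arc length s:
        \Delta\Phi \;=\; \oint \tau(s)\,\mathrm{d}s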

  19. The New Keynesian Phillips Curve

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi

    , learning or state-dependent pricing. The introduction of open-economy factors into the new Keynesian Phillips curve complicates matters further, as it must capture the nexus between price setting, inflation and the exchange rate. This is nevertheless a crucial feature for any model to be used for inflation forecasting in a small open economy like Iceland.
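
    For reference, a common hybrid specification of the new Keynesian Phillips curve (a standard textbook form, not necessarily the exact specification surveyed in the paper) is:

        % Hybrid new Keynesian Phillips curve: inflation depends on expected and
        % lagged inflation and on real marginal cost (or an output-gap proxy);
        % open-economy variants add import-price or exchange-rate terms.
        \pi_t \;=\; \gamma_f\,\mathrm{E}_t[\pi_{t+1}] \;+\; \gamma_b\,\pi_{t-1} \;+\; \kappa\,\widehat{mc}_t \;+\; \varepsilon_t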

  20. S-shaped learning curves

    NARCIS (Netherlands)

    Murre, J.M.J.

    2014-01-01

    In this article, learning curves for foreign vocabulary words are investigated, distinguishing between a subject-specific learning rate and a material-specific parameter that is related to the complexity of the items, such as the number of syllables. Two experiments are described, one with Turkish w
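
    As a purely hypothetical illustration of separating a subject-specific learning rate from an item-complexity parameter (not the model fitted in the article), an S-shaped learning curve can be written as a logistic in the number of learning trials:

        import numpy as np

        def recall_probability(trials, subject_rate, item_complexity):
            """Hypothetical S-shaped learning curve: probability of recalling an
            item after a number of trials, with a subject-specific rate and an
            item-specific complexity (e.g. growing with the number of syllables)."""
            return 1.0 / (1.0 + np.exp(-(subject_rate * trials - item_complexity)))

        trials = np.arange(0, 11)
        print(np.round(recall_probability(trials, subject_rate=0.8, item_complexity=4.0), 2))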