WorldWideScience

Sample records for generalized Pareto distribution

  1. The exponentiated generalized Pareto distribution

    African Journals Online (AJOL)

    Recently Gupta et al. (1998) introduced the exponentiated exponential distribution as a generalization of the standard exponential distribution. In this paper, we introduce a three-parameter generalized Pareto distribution, the exponentiated generalized Pareto distribution (EGP). We present a comprehensive treatment of the ...

  2. Estimation of the shape parameter of a generalized Pareto distribution based on a transformation to Pareto distributed variables

    OpenAIRE

    van Zyl, J. Martin

    2012-01-01

    Random variables following the generalized Pareto distribution can be transformed to Pareto distributed variables. Explicit expressions exist for the maximum likelihood estimators of the parameters of the Pareto distribution. The performance of estimating the shape parameter of the generalized Pareto distribution using transformed observations, based on the probability weighted moment method, is tested. It was found to improve the performance of the probability weighted estimator and performs well wit...
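
    A minimal sketch of the transformation idea in Python, assuming a positive shape parameter: starting from preliminary GPD estimates (scipy's maximum likelihood fit stands in here for the PWM-based initial estimates the paper uses), the sample is mapped to approximately unit-scale Pareto variables, and the explicit Pareto MLE then gives the shape. All names and parameter values are illustrative.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
xi_true, sigma_true = 0.3, 2.0          # GPD shape and scale (xi > 0 assumed)
x = genpareto.rvs(xi_true, scale=sigma_true, size=2000, random_state=rng)

# Preliminary GPD estimates (stand-in for the paper's PWM-based starting values)
xi0, _, sigma0 = genpareto.fit(x, floc=0.0)

# Transform to approximately unit-scale Pareto variables: Y = 1 + xi*X/sigma,
# since P(Y > y) = (1 + xi*x/sigma)**(-1/xi) = y**(-1/xi) for y >= 1
y = 1.0 + xi0 * x / sigma0

# Explicit Pareto MLE of the tail index alpha = 1/xi
alpha_hat = len(y) / np.sum(np.log(y))
print("shape estimate xi_hat =", 1.0 / alpha_hat)
```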

  3. Prediction in Partial Duration Series With Generalized Pareto-Distributed Exceedances

    DEFF Research Database (Denmark)

    Rosbjerg, Dan; Madsen, Henrik; Rasmussen, Peter Funder

    1992-01-01

    As a generalization of the common assumption of exponentially distributed exceedances in partial duration series, the generalized Pareto distribution has been adopted. Estimators for the parameters are presented using both the method of moments and probability-weighted moments. Maintaining the generalized Pareto distribution as the parent exceedance distribution, the T-year event is estimated assuming the exceedances to be exponentially distributed. For moderately long-tailed exceedance distributions and small to moderate sample sizes it is found, by comparing mean square errors of the T-year event estimators, that the exponential distribution is preferable to the correct generalized Pareto distribution despite the introduced model error and despite a possible rejection of the exponential hypothesis by a test of significance. For moderately short-tailed exceedance...
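
    The T-year event logic can be sketched as follows, assuming a Poisson exceedance rate of lam events per year and scipy's sign convention for the GPD shape. The method-of-moments formulas below are the standard ones for the GPD; the paper also uses probability-weighted moments, and the function names and data are illustrative.

```python
import numpy as np

def gpd_mom(exc):
    """Method-of-moments GPD estimates (xi in scipy's sign convention)."""
    m, v = exc.mean(), exc.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (1.0 + m * m / v)
    return xi, sigma

def t_year_event(u, exc, lam, T):
    """T-year event from threshold u, exceedances exc, lam exceedances/year."""
    xi, sigma = gpd_mom(exc)
    if abs(xi) < 1e-6:                       # exponential limit of the GPD
        return u + exc.mean() * np.log(lam * T)
    return u + sigma / xi * ((lam * T) ** xi - 1.0)

# Example: 100-year event from synthetic exponential exceedances over u = 10
rng = np.random.default_rng(0)
print(t_year_event(10.0, rng.exponential(2.0, 300), lam=3.0, T=100))
```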

  4. Robust Bayesian inference of generalized Pareto distribution ...

    African Journals Online (AJOL)

    Using an exhaustive Monte Carlo study, we show that, with a suitable generalized loss function, a robust Bayesian estimator of the model can be constructed. Key words: Bayesian estimation; Extreme value; Generalized Fisher information; Generalized Pareto distribution; Monte Carlo; ...

  5. Prediction in Partial Duration Series With Generalized Pareto-Distributed Exceedances

    DEFF Research Database (Denmark)

    Rosbjerg, Dan; Madsen, Henrik; Rasmussen, Peter Funder

    1992-01-01

    As a generalization of the common assumption of exponentially distributed exceedances in partial duration series, the generalized Pareto distribution has been adopted. Estimators for the parameters are presented using both the method of moments and probability-weighted moments. ... For moderately short-tailed exceedance distributions (with a physically justified upper limit) the correct exceedance distribution should be applied despite a possible acceptance of the exponential assumption by a test of significance.

  6. A New Generalization of the Pareto Distribution and Its Application to Insurance Data

    Directory of Open Access Journals (Sweden)

    Mohamed E. Ghitany

    2018-02-01

    Full Text Available The classical Pareto distribution is one of the most attractive in statistics, particularly in the scenario of actuarial statistics and finance. For example, it is widely used when calculating reinsurance premiums. In recent years, many alternative distributions have been proposed to obtain better fits, especially when the tail of the empirical distribution of the data is very long. In this work, an alternative generalization of the Pareto distribution is proposed and its properties are studied. Finally, an application of the proposed model to an earthquake insurance data set is presented.

  7. Improving Modeling of Extreme Events using Generalized Extreme Value Distribution or Generalized Pareto Distribution with Mixing Unconditional Disturbances

    OpenAIRE

    Suarez, R

    2001-01-01

    In this paper an alternative non-parametric historical simulation approach, the Mixing Unconditional Disturbances model with constant volatility, where price paths are generated by reshuffling disturbances for S&P 500 Index returns over the period 1950-1998, is used to estimate a Generalized Extreme Value Distribution and a Generalized Pareto Distribution. An ordinary back-testing for the period 1999-2008 was performed to verify this technique, providing higher accuracy return levels under upper ...

  8. Uncertainties of the 50-year wind from short time series using generalized extreme value distribution and generalized Pareto distribution

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Mann, Jakob; Rathmann, Ole

    2015-01-01

    This study examines the various sources of uncertainty in the application of two widely used extreme value distribution functions, the generalized extreme value distribution (GEVD) and the generalized Pareto distribution (GPD). The study is done through the analysis of measurements from ... as a guideline for applying GEVD and GPD to wind time series of limited length. The data analysis shows that, with reasonable choice of relevant parameters, GEVD and GPD give consistent estimates of the return winds. For GEVD, the base period should be chosen in accordance with the occurrence of the extreme wind...

  9. MATLAB implementation of satellite positioning error overbounding by generalized Pareto distribution

    Science.gov (United States)

    Ahmad, Khairol Amali; Ahmad, Shahril; Hashim, Fakroul Ridzuan

    2018-02-01

    In the satellite navigation community, error overbounding has been implemented in the process of integrity monitoring. In this work, MATLAB programming is used to implement the overbounding of the satellite positioning error CDF. Using a reference trajectory, the horizontal position errors (HPE) are computed and their non-parametric distribution function is given by the empirical Cumulative Distribution Function (ECDF). According to the results, these errors have a heavy-tailed distribution. Since the ECDF of the HPE in an urban environment is not Gaussian distributed, the ECDF is overbound with the CDF of the generalized Pareto distribution (GPD).
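
    A rough Python analogue of the described workflow, using a heavy-tailed stand-in for the horizontal position errors and a 90% tail threshold (both assumptions). The overbounding convention checked here, modeled CDF at or below the ECDF in the tail so tail probabilities are conservative, is one common reading, not necessarily the paper's exact criterion.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
hpe = np.abs(rng.standard_t(df=3, size=5000))   # heavy-tailed stand-in for HPE

xs = np.sort(hpe)
ecdf = np.arange(1, xs.size + 1) / xs.size      # empirical CDF at sorted points

u = np.quantile(hpe, 0.90)                      # tail threshold (assumption)
exc = hpe[hpe > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0.0)

# Piece together a tail CDF: F(x) = F_emp(u) + (1 - F_emp(u)) * GPD(x - u)
f_u = ecdf[xs <= u][-1]
tail = xs > u
gpd_cdf = f_u + (1.0 - f_u) * genpareto.cdf(xs[tail] - u, xi, scale=sigma)

# Overbound check: modeled CDF at or below the ECDF => conservative tail mass
print("overbounds tail ECDF:", np.all(gpd_cdf <= ecdf[tail] + 1e-12))
```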

  10. Pareto law and Pareto index in the income distribution of Japanese companies

    OpenAIRE

    Ishikawa, Atushi

    2004-01-01

    In order to study in detail the phenomenon that income distributions follow Pareto's law, we analyze a database of high-income companies in Japan. We find a quantitative relation between the average capital of the companies and the Pareto index: the larger the average capital becomes, the smaller the Pareto index becomes. From this relation, we can possibly explain why the Pareto index of company income distribution hardly changes, while the Pareto index of personal income distribution chang...

  11. Record Values of a Pareto Distribution.

    Science.gov (United States)

    Ahsanullah, M.

    The record values of the Pareto distribution, labelled Pareto (II) (alpha, beta, nu), are reviewed. The best linear unbiased estimates of the parameters in terms of the record values are provided. The prediction of the sth record value based on the first m (s>m) record values is obtained. A classical Pareto distribution provides reasonably…

  12. On the Truncated Pareto Distribution with applications

    OpenAIRE

    Zaninetti, Lorenzo; Ferraro, Mario

    2008-01-01

    The Pareto probability distribution is widely applied in different fields such as finance, physics, hydrology, geology and astronomy. This note deals with an application of the Pareto distribution to astrophysics, and more precisely to the statistical analysis of the masses of stars and of the diameters of asteroids. In particular a comparison between the usual Pareto distribution and its truncated version is presented. Finally a possible physical mechanism that produces Pareto tails for the distributio...

  13. Group Acceptance Sampling Plan for Lifetime Data Using Generalized Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam

    2010-02-01

    Full Text Available In this paper, a group acceptance sampling plan (GASP is introduced for the situations when lifetime of the items follows the generalized Pareto distribution. The design parameters such as minimum group size and acceptance number are determined when the consumer’s risk and the test termination time are specified. The proposed sampling plan is compared with the existing sampling plan. It is concluded that the proposed sampling plan performs better than the existing plan in terms of minimum sample size required to reach the same decision.

  14. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    Science.gov (United States)

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). For that, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods for their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful to describe the road accident blackspots datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. The Burr X Pareto Distribution: Properties, Applications and VaR Estimation

    Directory of Open Access Journals (Sweden)

    Mustafa Ç. Korkmaz

    2017-12-01

    Full Text Available In this paper, a new three-parameter Pareto distribution is introduced and studied. We discuss various mathematical and statistical properties of the new model. Some estimation methods of the model parameters are performed. Moreover, the peaks-over-threshold method is used to estimate Value-at-Risk (VaR by means of the proposed distribution. We compare the distribution with a few other models to show its versatility in modelling data with heavy tails. VaR estimation with the Burr X Pareto distribution is presented using time series data, and the new model could be considered as an alternative VaR model against the generalized Pareto model for financial institutions.

  16. The exponential age distribution and the Pareto firm size distribution

    OpenAIRE

    Coad, Alex

    2008-01-01

    Recent work drawing on data for large and small firms has shown a Pareto distribution of firm size. We mix a Gibrat-type growth process among incumbents with an exponential distribution of firms' ages to obtain the empirical Pareto distribution.

  17. An EM Algorithm for Double-Pareto-Lognormal Generalized Linear Model Applied to Heavy-Tailed Insurance Claims

    Directory of Open Access Journals (Sweden)

    Enrique Calderín-Ojeda

    2017-11-01

    Full Text Available Generalized linear models might not be appropriate when the probability of extreme events is higher than that implied by the normal distribution. Extending the method for estimating the parameters of a double Pareto lognormal distribution (DPLN) in Reed and Jorgensen (2004), we develop an EM algorithm for the heavy-tailed double-Pareto-lognormal generalized linear model. The DPLN distribution is obtained as a mixture of a lognormal distribution with a double Pareto distribution. In this paper the associated generalized linear model has the location parameter equal to a linear predictor, which is used to model insurance claim amounts for various data sets. The performance is compared with those of the generalized beta (of the second kind) and lognormal distributions.

  18. Higher moments method for generalized Pareto distribution in flood frequency analysis

    Science.gov (United States)

    Zhou, C. R.; Chen, Y. F.; Huang, Q.; Gu, S. H.

    2017-08-01

    The generalized Pareto distribution (GPD) has proven to be the ideal distribution for fitting peaks-over-threshold series in flood frequency analysis. Several moments-based estimators are applied to estimating the parameters of the GPD. Higher linear moments (LH moments) and higher probability weighted moments (HPWM) are linear combinations of probability weighted moments (PWM). In this study, the relationship between them is explored. A series of statistical experiments and a case study are used to compare their performances. The results show that if the same PWM are used in the LH moments and HPWM methods, the parameters estimated by these two methods are unbiased. In particular, when the same PWM are used, the PWM method (or the HPWM method when the order equals 0) gives identical parameter estimates to the linear moments (L-moments) method. This phenomenon is most pronounced when r ≥ 1 and the same-order PWM are used in the HPWM and LH moments methods.
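
    For readers unfamiliar with PWM-based estimation, a small sketch of the base (r = 0, 1) case: the standard unbiased sample PWMs and the Hosking-type L-moment solution for the GPD, written in scipy's sign convention. The higher-order LH/HPWM variants discussed in the paper generalize this; function names are illustrative.

```python
import numpy as np
from scipy.stats import genpareto

def sample_pwm(x, r):
    """Unbiased sample probability weighted moment b_r = E[X F(X)^r]."""
    x = np.sort(x)
    n = len(x)
    j = np.arange(1, n + 1)
    w = np.ones(n)
    for s in range(1, r + 1):
        w *= (j - s) / (n - s)
    return np.mean(w * x)

def gpd_pwm(x):
    """PWM/L-moment GPD estimates (xi in scipy's sign convention)."""
    b0, b1 = sample_pwm(x, 0), sample_pwm(x, 1)
    l1, l2 = b0, 2.0 * b1 - b0               # first two L-moments
    xi = 2.0 - l1 / l2
    sigma = l1 * (1.0 - xi)
    return xi, sigma

x = genpareto.rvs(0.2, scale=1.5, size=5000, random_state=1)
print(gpd_pwm(x))                            # should be close to (0.2, 1.5)
```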

  19. GENERALIZED DOUBLE PARETO SHRINKAGE.

    Science.gov (United States)

    Armagan, Artin; Dunson, David B; Lee, Jaeyong

    2013-01-01

    We propose a generalized double Pareto prior for Bayesian shrinkage estimation and inferences in linear models. The prior can be obtained via a scale mixture of Laplace or normal distributions, forming a bridge between the Laplace and Normal-Jeffreys' priors. While it has a spike at zero like the Laplace density, it also has a Student's t-like tail behavior. Bayesian computation is straightforward via a simple Gibbs sampling algorithm. We investigate the properties of the maximum a posteriori estimator, as sparse estimation plays an important role in many problems, reveal connections with some well-established regularization procedures, and show some asymptotic results. The performance of the prior is tested through simulations and an application.

  20. Word frequencies: A comparison of Pareto type distributions

    Science.gov (United States)

    Wiegand, Martin; Nadarajah, Saralees; Si, Yuancheng

    2018-03-01

    Mehri and Jamaati (2017) [18] used Zipf's law to model word frequencies in Holy Bible translations for one hundred live languages. We compare the fit of Zipf's law to a number of Pareto type distributions. The latter distributions are shown to provide the best fit, as judged by a number of comparative plots and error measures. The fit of Zipf's law appears generally poor.

  1. Rayleigh Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Kareema Abed Al-Kadim

    2017-12-01

    Full Text Available In this paper the Rayleigh Pareto distribution, denoted R_PD, is introduced. We state some useful functions and give some of its properties, such as the entropy function, mean, mode, median, variance, the r-th moments about the mean and about the origin, the reliability and hazard functions, and the coefficients of variation, skewness and kurtosis. Finally, we estimate the parameters; the aim of this work is to introduce a new distribution.

  2. Scaling of Precipitation Extremes Modelled by Generalized Pareto Distribution

    Science.gov (United States)

    Rajulapati, C. R.; Mujumdar, P. P.

    2017-12-01

    Precipitation extremes are often modelled with data from annual maximum series or peaks over threshold series. The Generalized Pareto Distribution (GPD) is commonly used to fit the peaks over threshold series. Scaling of precipitation extremes from larger time scales to smaller time scales when the extremes are modelled with the GPD is burdened with difficulties arising from varying thresholds for different durations. In this study, the scale invariance theory is used to develop a disaggregation model for precipitation extremes exceeding specified thresholds. A scaling relationship is developed for a range of thresholds obtained from a set of quantiles of non-zero precipitation of different durations. The GPD parameters and exceedance rate parameters are modelled by the Bayesian approach and the uncertainty in scaling exponent is quantified. A quantile based modification in the scaling relationship is proposed for obtaining the varying thresholds and exceedance rate parameters for shorter durations. The disaggregation model is applied to precipitation datasets of Berlin City, Germany and Bangalore City, India. From both the applications, it is observed that the uncertainty in the scaling exponent has a considerable effect on uncertainty in scaled parameters and return levels of shorter durations.

  3. Tsallis-Pareto like distributions in hadron-hadron collisions

    International Nuclear Information System (INIS)

    Barnafoeldi, G G; Uermoessy, K; Biro, T S

    2011-01-01

    Non-extensive thermodynamics is a novel approach in high energy physics. In high-energy heavy-ion, and especially in proton-proton collisions, we are far from a canonical thermal state described by Boltzmann-Gibbs statistics. In these reactions low and intermediate transverse momentum spectra are extremely well reproduced by the Tsallis-Pareto distribution, but the physical origin of the Tsallis parameters is still an unsettled question. Here, we analyze whether the Tsallis-Pareto energy distribution overlaps with hadron spectra at high pT. We fitted data measured in proton-proton (proton-antiproton) collisions over a wide center-of-mass energy range, from 200 GeV RHIC up to 7 TeV LHC energies. Furthermore, our test is extended to an investigation of a possible √s-dependence of the power in the Tsallis-Pareto distribution, motivated by QCD evolution equations. We found that Tsallis-Pareto distributions fit high-pT data well over the wide center-of-mass energy range. Deviations from the fits appear at pT > 20-30 GeV/c, especially in CDF data. Introducing a pT-scaling ansatz, the fits at low and intermediate transverse momenta still remain good, and the deviations tend to disappear at the highest-pT data.
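
    A toy fit of the commonly used Tsallis-Pareto form f(pT) ∝ (1 + pT/(nT))^(−n) to synthetic spectrum points. The data here are fabricated purely for illustration, and the parameter names follow the usual convention, not necessarily the paper's; fitting in log space keeps the steeply falling tail from being swamped by the low-pT points.

```python
import numpy as np
from scipy.optimize import curve_fit

def tsallis_pareto(pt, C, n, T):
    """Commonly used Tsallis-Pareto spectrum: C * (1 + pt/(n*T))**(-n)."""
    return C * (1.0 + pt / (n * T)) ** (-n)

# Synthetic "spectrum" spanning several decades (illustration only)
pt = np.linspace(0.5, 10.0, 40)
rng = np.random.default_rng(5)
y = tsallis_pareto(pt, 100.0, 7.0, 0.15) * rng.normal(1.0, 0.05, pt.size)

# Fit in log space for numerical stability over the large dynamic range
log_model = lambda q, C, n, T: np.log(tsallis_pareto(q, C, n, T))
(C, n, T), _ = curve_fit(log_model, pt, np.log(y), p0=(50.0, 6.0, 0.2))
print("C, n, T =", C, n, T)
```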

  4. Analysis of a Pareto Mixture Distribution for Maritime Surveillance Radar

    Directory of Open Access Journals (Sweden)

    Graham V. Weinberg

    2012-01-01

    Full Text Available The Pareto distribution has been shown to be an excellent model for X-band high-resolution maritime surveillance radar clutter returns. Given the success of mixture distributions in radar, it is thus of interest to consider the effect of Pareto mixture models. This paper introduces a formulation of a Pareto intensity mixture distribution and investigates coherent multilook radar detector performance using this new clutter model. Clutter parameter estimates are derived from data sets produced by the Defence Science and Technology Organisation's Ingara maritime surveillance radar.

  5. Comparison of Two Methods Used to Model Shape Parameters of Pareto Distributions

    Science.gov (United States)

    Liu, C.; Charpentier, R.R.; Su, J.

    2011-01-01

    Two methods are compared for estimating the shape parameters of Pareto field-size (or pool-size) distributions for petroleum resource assessment. Both methods assume mature exploration in which most of the larger fields have been discovered. Both methods use the sizes of larger discovered fields to estimate the numbers and sizes of smaller fields: (1) the tail-truncated method uses a plot of field size versus size rank, and (2) the log-geometric method uses data binned in field-size classes and the ratios of adjacent bin counts. Simulation experiments were conducted using discovered oil and gas pool-size distributions from four petroleum systems in Alberta, Canada and using Pareto distributions generated by Monte Carlo simulation. The estimates of the shape parameters of the Pareto distributions, calculated by both the tail-truncated and log-geometric methods, generally stabilize where discovered pool numbers are greater than 100. However, with fewer than 100 discoveries, these estimates can vary greatly with each new discovery. The estimated shape parameters of the tail-truncated method are more stable and larger than those of the log-geometric method where the number of discovered pools is more than 100. Both methods, however, tend to underestimate the shape parameter. Monte Carlo simulation was also used to create sequences of discovered pool sizes by sampling from a Pareto distribution with a discovery process model using a defined exploration efficiency (in order to show how biased the sampling was in favor of larger fields being discovered first). A higher (more biased) exploration efficiency gives better estimates of the Pareto shape parameters. © 2011 International Association for Mathematical Geosciences.
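
    Both estimators exploit, in essence, the straight-line behavior of a Pareto sample on a log rank-log size plot. A generic Zipf-style regression sketch of that idea follows; it is not the authors' exact tail-truncated or log-geometric procedure, and like them it is biased for small samples.

```python
import numpy as np

rng = np.random.default_rng(3)
sizes = rng.pareto(1.2, 400) + 1.0      # synthetic Pareto(alpha = 1.2) pool sizes

# Zipf-style regression: log rank vs log size has slope ~ -alpha
s = np.sort(sizes)[::-1]
rank = np.arange(1, s.size + 1)
slope, _ = np.polyfit(np.log(s), np.log(rank), 1)
print("alpha_hat =", -slope)
```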

  6. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Institute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.
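
    A minimal sketch of an AIC-driven threshold scan, assuming the criterion is evaluated on the exceedances of each candidate threshold; the paper's exact construction using the overall samples may differ, and all names and values here are illustrative.

```python
import numpy as np
from scipy.stats import genpareto

def aic_threshold(data, candidates, min_exc=50):
    """Return the candidate threshold whose GPD tail fit minimizes AIC."""
    best_aic, best_u = np.inf, None
    for u in candidates:
        exc = data[data > u] - u
        if exc.size < min_exc:               # need enough tail samples
            continue
        xi, _, sigma = genpareto.fit(exc, floc=0.0)
        loglik = genpareto.logpdf(exc, xi, scale=sigma).sum()
        aic = 2 * 2 - 2 * loglik             # two fitted parameters
        if aic < best_aic:
            best_aic, best_u = aic, u
    return best_u

data = genpareto.rvs(0.1, scale=1.0, size=2000, random_state=2)
grid = np.quantile(data, np.linspace(0.50, 0.95, 10))
print("selected threshold:", aic_threshold(data, grid))
```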

  7. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.

  8. Pareto Distribution of Firm Size and Knowledge Spillover Process as a Network

    OpenAIRE

    Tomohiko Konno

    2013-01-01

    The firm size distribution is considered to follow a Pareto distribution. In the present paper, we show that the Pareto distribution of firm size results from the spillover network model introduced in Konno (2010).

  9. A Pareto upper tail for capital income distribution

    Science.gov (United States)

    Oancea, Bogdan; Pirjol, Dan; Andrei, Tudorel

    2018-02-01

    We present a study of the capital income distribution and of its contribution to the total income (capital income share) using individual tax income data in Romania, for 2013 and 2014. Using a parametric representation we show that the capital income is Pareto distributed in the upper tail, with a Pareto coefficient α ∼ 1.44, which is much smaller than the corresponding coefficient for wage- and non-wage-income (excluding capital income), of α ∼ 2.53. Including the capital income contribution has the effect of increasing the overall inequality measures.

  10. Income inequality in Romania: The exponential-Pareto distribution

    Science.gov (United States)

    Oancea, Bogdan; Andrei, Tudorel; Pirjol, Dan

    2017-03-01

    We present a study of the distribution of the gross personal income and income inequality in Romania, using individual tax income data, and both non-parametric and parametric methods. Comparing with official results based on household budget surveys (the Family Budgets Survey and the EU-SILC data), we find that the latter underestimate the income share of the high income region, and the overall income inequality. A parametric study shows that the income distribution is well described by an exponential distribution in the low and middle incomes region, and by a Pareto distribution in the high income region with Pareto coefficient α = 2.53. We note an anomaly in the distribution in the low incomes region (∼9,250 RON), and present a model which explains it in terms of partial income reporting.

  11. [Origination of Pareto distribution in complex dynamic systems].

    Science.gov (United States)

    Chernavskiĭ, D S; Nikitin, A P; Chernavskaia, O D

    2008-01-01

    The Pareto distribution, whose probability density function can be approximated at sufficiently large x as ρ(x) ∼ x^(−α), where α ≥ 2, is of crucial importance from both the theoretical and practical point of view. The main reason is its qualitative distinction from the normal (Gaussian) distribution: the probability of high deviations is significantly higher. The notion of the universal applicability of the Gauss law remains widespread despite the lack of objective confirmation in a variety of application areas. The origin of the Pareto distribution in dynamic systems embedded in a Gaussian noise field is considered. A simple one-dimensional model is discussed in which the system response, over a rather wide interval of the variable, can be quite precisely approximated by this distribution.
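
    As a quick numerical illustration of that qualitative distinction (a generic comparison, not taken from the paper): matching a Gaussian to the mean and standard deviation of a Pareto law with α = 2.5 and comparing tail probabilities shows the Pareto tail mass is larger by tens of orders of magnitude at large deviations.

```python
from scipy.stats import norm, pareto

alpha = 2.5
X = pareto(alpha)                       # Pareto law, support x >= 1
G = norm(loc=X.mean(), scale=X.std())   # Gaussian matched in mean and std

for t in (5, 10, 20):                   # compare tail probabilities P(X > t)
    print(t, X.sf(t), G.sf(t))
```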

  12. On the size distribution of cities: an economic interpretation of the Pareto coefficient.

    Science.gov (United States)

    Suh, S H

    1987-01-01

    "Both the hierarchy and the stochastic models of size distribution of cities are analyzed in order to explain the Pareto coefficient by economic variables. In hierarchy models, it is found that the rate of variation in the productivity of cities and that in the probability of emergence of cities can explain the Pareto coefficient. In stochastic models, the productivity of cities is found to explain the Pareto coefficient. New city-size distribution functions, in which the Pareto coefficient is decomposed by economic variables, are estimated." excerpt

  13. Generalized Pareto optimum and semi-classical spinors

    Science.gov (United States)

    Rouleux, M.

    2018-02-01

    In 1971, S. Smale presented a generalization of the Pareto optimum he called the critical Pareto set. The underlying motivation was to extend Morse theory to several functions, i.e. to find a Morse theory for m differentiable functions defined on a manifold M of dimension ℓ. We use this framework to take a 2 × 2 Hamiltonian ℋ = ℋ(p) with entries in C^∞(T*R^2) to its normal form near a singular point of the Fresnel surface. Namely, we say that ℋ has the Pareto property if it decomposes, locally, up to conjugation with regular matrices, as ℋ(p) = u′(p)C(p)(u′(p))*, where u : R^2 → R^2 has singularities of codimension 1 or 2, and C(p) is a regular Hermitian matrix (“integrating factor”). In particular this applies in certain cases to the matrix Hamiltonian of elasticity theory and its (relative) perturbations of order 3 in momentum at the origin.

  14. Improved Shape Parameter Estimation in Pareto Distributed Clutter with Neural Networks

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2016-12-01

    Full Text Available The main problem faced by naval radars is the elimination of clutter, a distortion signal that appears mixed with target reflections. Recently, the Pareto distribution has been related to sea clutter measurements, suggesting that it may provide a better fit than other traditional distributions. The authors propose a new method for estimating the Pareto shape parameter based on artificial neural networks. The solution achieves a precise estimation of the parameter, has a low computational cost, and outperforms the classic method based on Maximum Likelihood Estimates (MLE). The presented scheme contributes to the development of the NATE detector for Pareto clutter, which uses knowledge of clutter statistics to improve the stability of detection, among other applications.

  15. Seasonal and Non-Seasonal Generalized Pareto Distribution to Estimate Extreme Significant Wave Height in The Banda Sea

    Science.gov (United States)

    Nursamsiah; Nugroho Sugianto, Denny; Suprijanto, Jusup; Munasik; Yulianto, Bambang

    2018-02-01

    Information on extreme wave height return levels is required for maritime planning and management. Extreme waves are commonly modelled with the Generalized Pareto Distribution (GPD), and seasonal variation is often considered in extreme wave models. This research aims to identify the best GPD model by considering the seasonal variation of extreme waves. Using the 95th percentile as the threshold for extreme significant wave height, seasonal and non-seasonal GPD models were fitted. The Kolmogorov-Smirnov test was applied to assess the goodness of fit of the GPD models. The return values from the seasonal and non-seasonal GPD were compared using the definition of the return value as the criterion. The Kolmogorov-Smirnov test shows that the GPD fits the data very well for both the seasonal and non-seasonal models. The seasonal return values give better information about the wave height characteristics.
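
    A compact sketch of the non-seasonal pipeline described (95th-percentile threshold, GPD fit, Kolmogorov-Smirnov check) on synthetic stand-in wave heights. The data and distribution are fabricated for illustration, and the KS p-value is optimistic since the parameters are fitted on the same data; a bootstrap would be needed for a calibrated test.

```python
import numpy as np
from scipy.stats import genpareto, kstest

rng = np.random.default_rng(7)
hs = 2.0 * rng.weibull(1.5, size=3000)   # stand-in significant wave heights (m)

u = np.percentile(hs, 95)                # 95th-percentile threshold
exc = hs[hs > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0.0)

# KS goodness-of-fit check of the fitted GPD against the exceedances
print(kstest(exc, genpareto(xi, scale=sigma).cdf))
```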

  16. Zipf's law and influential factors of the Pareto exponent of the city size distribution: Evidence from China

    OpenAIRE

    GAO Hongying; WU Kangping

    2007-01-01

    This paper estimates the Pareto exponent of the city size (population size and economy size) distribution for all provinces and three regions in China in 1997, 2000 and 2003 by OLS, comparatively analyzes the Pareto exponent across sections and over time, and empirically analyzes the factors which impact the Pareto exponents of provinces. Our analyses show that the size distributions of cities in China follow the Pareto distribution and exhibit structural features. Variations in the value of the P...

  17. Distributed approximation of Pareto surfaces in multicriteria radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Bokrantz, Rasmus

    2013-01-01

    We consider multicriteria radiation therapy treatment planning by navigation over the Pareto surface, implemented by interpolation between discrete treatment plans. Current state of the art for calculation of a discrete representation of the Pareto surface is to sandwich this set between inner and outer approximations that are updated one point at a time. In this paper, we generalize this sequential method to an algorithm that permits parallelization. The principle of the generalization is to apply the sequential method to an approximation of an inexpensive model of the Pareto surface. The information gathered from the model is subsequently used for the calculation of points from the exact Pareto surface, which are processed in parallel. The model is constructed according to the current inner and outer approximations, and given a shape that is difficult to approximate, in order to avoid that parts of the Pareto surface are incorrectly disregarded. Approximations of comparable quality to those generated by the sequential method are demonstrated when the degree of parallelization is up to twice the number of dimensions of the objective space. For practical applications, the number of dimensions is typically at least five, so that a speed-up of one order of magnitude is obtained. (paper)

  18. Distributed approximation of Pareto surfaces in multicriteria radiation therapy treatment planning.

    Science.gov (United States)

    Bokrantz, Rasmus

    2013-06-07

    We consider multicriteria radiation therapy treatment planning by navigation over the Pareto surface, implemented by interpolation between discrete treatment plans. Current state of the art for calculation of a discrete representation of the Pareto surface is to sandwich this set between inner and outer approximations that are updated one point at a time. In this paper, we generalize this sequential method to an algorithm that permits parallelization. The principle of the generalization is to apply the sequential method to an approximation of an inexpensive model of the Pareto surface. The information gathered from the model is subsequently used for the calculation of points from the exact Pareto surface, which are processed in parallel. The model is constructed according to the current inner and outer approximations, and given a shape that is difficult to approximate, in order to avoid that parts of the Pareto surface are incorrectly disregarded. Approximations of comparable quality to those generated by the sequential method are demonstrated when the degree of parallelization is up to twice the number of dimensions of the objective space. For practical applications, the number of dimensions is typically at least five, so that a speed-up of one order of magnitude is obtained.

  19. Generalized Extreme Value (GEV) and Generalized Pareto (GP) Distributions for Estimating Extreme Rainfall in the DKI Jakarta Region

    Directory of Open Access Journals (Sweden)

    Achi Rinaldi

    2016-06-01

    Full Text Available Extreme events such as extreme rainfall have been analyzed and are of major concern for countries all around the world. There are two common distributions for extreme values: the Generalized Extreme Value (GEV) distribution and the Generalized Pareto (GP) distribution. These two distributions have shown good performance in estimating the parameters of extreme values. This research aims to estimate parameters of extreme values using the GEV and GP distributions, and also to characterize the effects of extreme events such as floods. The rainfall data were taken from BMKG for 5 locations in DKI Jakarta. Both distributions showed good performance. The results show that the Tanjung Priok station has the largest location parameter for the GEV and also the largest scale parameter for the GP, meaning it has the highest probability of flooding as an effect of extreme rainfall.

  20. Random phenotypic variation of yeast (Saccharomyces cerevisiae) single-gene knockouts fits a double pareto-lognormal distribution.

    Science.gov (United States)

    Graham, John H; Robb, Daniel T; Poe, Amy R

    2012-01-01

    Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails. If our assumptions are true, the DPLN distribution should provide a better fit to random phenotypic variation in a large series of single-gene knockout lines than other skewed or symmetrical distributions. We fit a large published data set of single-gene knockout lines in Saccharomyces cerevisiae to seven different probability distributions: DPLN, right Pareto-lognormal (RPLN), left Pareto-lognormal (LPLN), normal, lognormal, exponential, and Pareto. The best model was judged by the Akaike Information Criterion (AIC). Phenotypic variation among gene knockouts in S. cerevisiae fits a double Pareto-lognormal (DPLN) distribution better than any of the alternative distributions, including the right Pareto-lognormal and lognormal distributions. A DPLN distribution is consistent with the hypothesis that developmental stability is mediated, in part, by distributed robustness, the resilience of gene regulatory, metabolic, and protein-protein interaction networks. Alternatively, multiplicative cell growth, and the mixing of

  1. Income dynamics with a stationary double Pareto distribution.

    Science.gov (United States)

    Toda, Alexis Akira

    2011-04-01

    Once controlled for the trend, the distribution of personal income appears to be double Pareto, a distribution that obeys the power law exactly in both the upper and the lower tails. I propose a model of income dynamics with a stationary distribution that is consistent with this fact. Using US male wage data for 1970-1993, I estimate the power law exponent in two ways: (i) from each cross section, assuming that the distribution has converged to the stationary distribution, and (ii) from a panel, directly estimating the parameters of the income dynamics model. Both approaches yield the same value of 8.4.
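
    One standard way to estimate a power-law exponent from a single cross section is the Hill estimator over the k largest observations; the sketch below is that generic estimator on synthetic data with a tail exponent of 8.4, not the paper's panel procedure, and the choice of k is an assumption.

```python
import numpy as np

def hill_exponent(x, k):
    """Hill estimator of the Pareto exponent from the k largest observations."""
    s = np.sort(x)[::-1]
    return k / np.sum(np.log(s[:k] / s[k]))

rng = np.random.default_rng(4)
wages = rng.pareto(8.4, 100_000) + 1.0   # synthetic sample, tail exponent 8.4
print("alpha_hat =", hill_exponent(wages, k=2000))
```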

  2. Use of the truncated shifted Pareto distribution in assessing size distribution of oil and gas fields

    Science.gov (United States)

    Houghton, J.C.

    1988-01-01

    The truncated shifted Pareto (TSP) distribution, a variant of the two-parameter Pareto distribution, in which one parameter is added to shift the distribution right and left and the right-hand side is truncated, is used to model size distributions of oil and gas fields for resource assessment. Assumptions about limits to the left-hand and right-hand side reduce the number of parameters to two. The TSP distribution has advantages over the more customary lognormal distribution because it has a simple analytic expression, allowing exact computation of several statistics of interest, has a "J-shape," and has more flexibility in the thickness of the right-hand tail. Oil field sizes from the Minnelusa play in the Powder River Basin, Wyoming and Montana, are used as a case study. Probability plotting procedures allow easy visualization of the fit and help the assessment. © 1988 International Association for Mathematical Geology.

  3. Computing the Distribution of Pareto Sums Using Laplace Transformation and Stehfest Inversion

    Science.gov (United States)

    Harris, C. K.; Bourne, S. J.

    2017-05-01

    In statistical seismology, the properties of distributions of total seismic moment are important for constraining seismological models, such as the strain partitioning model (Bourne et al. J Geophys Res Solid Earth 119(12): 8991-9015, 2014). This work was motivated by the need to develop appropriate seismological models for the Groningen gas field in the northeastern Netherlands, in order to address the issue of production-induced seismicity. The total seismic moment is the sum of the moments of individual seismic events, which in common with many other natural processes, are governed by Pareto or "power law" distributions. The maximum possible moment for an induced seismic event can be constrained by geomechanical considerations, but rather poorly, and for Groningen it cannot be reliably inferred from the frequency distribution of moment magnitude pertaining to the catalogue of observed events. In such cases it is usual to work with the simplest form of the Pareto distribution without an upper bound, and we follow the same approach here. In the case of seismicity, the exponent β appearing in the power-law relation is small enough for the variance of the unbounded Pareto distribution to be infinite, which renders standard statistical methods concerning sums of statistical variables, based on the central limit theorem, inapplicable. Determinations of the properties of sums of moderate to large numbers of Pareto-distributed variables with infinite variance have traditionally been addressed using intensive Monte Carlo simulations. This paper presents a novel method for determining the properties of such sums that is accurate, fast and easily implemented, and is applicable to Pareto-distributed variables for which the power-law exponent β lies within the interval [0, 1]. It is based on shifting the original variables so that a non-zero density is obtained exclusively for non-negative values of the parameter and is identically zero elsewhere, a property
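
    The Stehfest inversion machinery itself is compact enough to sketch. Below is a generic Gaver-Stehfest implementation, validated on the known transform pair F(s) = 1/(s+1), f(t) = e^(−t); it is not the paper's specific Pareto-sum transform, and N is limited to roughly 12-16 in double precision.

```python
import math
import numpy as np

def stehfest_weights(N):
    """Gaver-Stehfest weights V_k for even N."""
    V = np.zeros(N)
    half = N // 2
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j ** half * math.factorial(2 * j)
                  / (math.factorial(half - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V[k - 1] = (-1) ** (k + half) * s
    return V

def stehfest_invert(F, t, N=14):
    """Approximate f(t) from its Laplace transform F(s)."""
    V = stehfest_weights(N)
    k = np.arange(1, N + 1)
    return math.log(2) / t * np.sum(V * F(k * math.log(2) / t))

# Sanity check on a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
print(stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0), math.exp(-1.0))
```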

  4. Critical review and hydrologic application of threshold detection methods for the generalized Pareto (GP) distribution

    Science.gov (United States)

    Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto

    2016-04-01

    Estimation of extreme rainfall from data constitutes one of the most important issues in statistical hydrology, as it is associated with the design of hydraulic structures and flood water management. To that extent, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing methods to fit a generalized Pareto (GP) distribution model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data, graphical methods where one studies the dependence of GP distribution parameters (or related metrics) on the threshold level u, and Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u that a GP distribution model is applicable. In this work, we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 daily rainfall records from the NOAA-NCDC open-access database, with more than 110 years of data. We find that non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while methods that are based on asymptotic properties of the upper distribution tail lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low; i.e. on the order of 0.1–0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on pre-asymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2 and 12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the

  5. A hybrid pareto mixture for conditional asymmetric fat-tailed distributions.

    Science.gov (United States)

    Carreau, Julie; Bengio, Yoshua

    2009-07-01

    In many cases, we observe some variables X that contain predictive information over a scalar variable of interest Y , with (X,Y) pairs observed in a training set. We can take advantage of this information to estimate the conditional density p(Y|X = x). In this paper, we propose a conditional mixture model with hybrid Pareto components to estimate p(Y|X = x). The hybrid Pareto is a Gaussian whose upper tail has been replaced by a generalized Pareto tail. A third parameter, in addition to the location and spread parameters of the Gaussian, controls the heaviness of the upper tail. Using the hybrid Pareto in a mixture model results in a nonparametric estimator that can adapt to multimodality, asymmetry, and heavy tails. A conditional density estimator is built by modeling the parameters of the mixture estimator as functions of X. We use a neural network to implement these functions. Such conditional density estimators have important applications in many domains such as finance and insurance. We show experimentally that this novel approach better models the conditional density in terms of likelihood, compared to competing algorithms: conditional mixture models with other types of components and a classical kernel-based nonparametric model.

  6. Active learning of Pareto fronts.

    Science.gov (United States)

    Campigotto, Paolo; Passerini, Andrea; Battiti, Roberto

    2014-03-01

    This paper introduces the active learning of Pareto fronts (ALP) algorithm, a novel approach to recover the Pareto front of a multiobjective optimization problem. ALP casts the identification of the Pareto front into a supervised machine learning task. This approach enables an analytical model of the Pareto front to be built. The computational effort in generating the supervised information is reduced by an active learning strategy. In particular, the model is learned from a set of informative training objective vectors. The training objective vectors are approximated Pareto-optimal vectors obtained by solving different scalarized problem instances. The experimental results show that ALP achieves an accurate Pareto front approximation with a lower computational effort than state-of-the-art estimation of distribution algorithms and widely known genetic techniques.

  7. Accelerated life testing design using geometric process for pareto distribution

    OpenAIRE

    Mustafa Kamal; Shazia Zarrin; Arif Ul Islam

    2013-01-01

    In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for the Pareto distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained by using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution using the Fisher information matrix are also obtained. The statistical properties of the parameters ...

  8. Kullback-Leibler divergence and the Pareto-Exponential approximation.

    Science.gov (United States)

    Weinberg, G V

    2016-01-01

    Recent radar research interests in the Pareto distribution as a model for X-band maritime surveillance radar clutter returns have resulted in analysis of the asymptotic behaviour of this clutter model. In particular, it is of interest to understand when the Pareto distribution is well approximated by an Exponential distribution. The justification for this is that under the latter clutter model assumption, simpler radar detection schemes can be applied. An information theory approach is introduced to investigate the Pareto-Exponential approximation. By analysing the Kullback-Leibler divergence between the two distributions it is possible to not only assess when the approximation is valid, but to determine, for a given Pareto model, the optimal Exponential approximation.
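
    To see the idea concretely, take a Pareto model of Lomax type with shape α and scale β (one common radar clutter parameterization, not necessarily the paper's exact one). Differentiating KL(p‖q) with respect to the exponential rate λ shows the divergence is minimized at λ = 1/E[X], and the resulting divergence can be evaluated numerically:

```python
import numpy as np
from scipy import integrate
from scipy.stats import lomax, expon

alpha, beta = 4.0, 3.0                  # Pareto (Lomax) shape and scale
mean = beta / (alpha - 1.0)             # finite for alpha > 1
lam = 1.0 / mean                        # KL-optimal exponential rate

def kl_integrand(x):
    p = lomax.pdf(x, alpha, scale=beta)
    q = expon.pdf(x, scale=1.0 / lam)
    return p * np.log(p / q)

kl, _ = integrate.quad(kl_integrand, 0, np.inf)
print("KL(Pareto || best Exponential) =", kl)
```

    Larger shape parameters give smaller divergences, matching the intuition that the Exponential approximation is valid only when the Pareto tail is not too heavy.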

  9. Comparison of Threshold Detection Methods for the Generalized Pareto Distribution (GPD): Application to the NOAA-NCDC Daily Rainfall Dataset

    Science.gov (United States)

    Deidda, Roberto; Mamalakis, Antonis; Langousis, Andreas

    2015-04-01

    One of the most crucial issues in statistical hydrology is the estimation of extreme rainfall from data. To that extent, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing methods to fit a Generalized Pareto Distribution (GPD) model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches that can be grouped into three basic classes: a) non-parametric methods that locate the changing point between extreme and non-extreme regions of the data, b) graphical methods where one studies the dependence of the GPD parameters (or related metrics) to the threshold level u, and c) Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u that a GPD model is applicable. In this work, we review representative methods for GPD threshold detection, discuss fundamental differences in their theoretical bases, and apply them to daily rainfall records from the NOAA-NCDC open-access database (http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/). We find that non-parametric methods that locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while graphical methods and GoF metrics that rely on limiting arguments for the upper distribution tail lead to unrealistically high thresholds u. The latter is expected, since one checks the validity of the limiting arguments rather than the applicability of a GPD distribution model. Better performance is demonstrated by graphical methods and GoF metrics that rely on GPD properties. Finally, we discuss the effects of data quantization (common in hydrologic applications) on the estimated thresholds. Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General

  10. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  11. A Pareto scale-inflated outlier model and its Bayesian analysis

    OpenAIRE

    Scollnik, David P. M.

    2016-01-01

    This paper develops a Pareto scale-inflated outlier model. This model is intended for use when data from some standard Pareto distribution of interest is suspected to have been contaminated with a relatively small number of outliers from a Pareto distribution with the same shape parameter but with an inflated scale parameter. The Bayesian analysis of this Pareto scale-inflated outlier model is considered and its implementation using the Gibbs sampler is discussed. The paper contains three wor...

  12. Origin of Pareto-like spatial distributions in ecosystems.

    Science.gov (United States)

    Manor, Alon; Shnerb, Nadav M

    2008-12-31

    Recent studies of cluster distribution in various ecosystems revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that this patch statistics is a manifestation of the law of proportionate effect. Mapping the stochastic model to a Markov birth-death process, the transition rates are shown to scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (such as desertification) manifest themselves in a drastic change of the stability properties of spatial colonies.

  13. PARETO OPTIMAL SOLUTIONS FOR MULTI-OBJECTIVE GENERALIZED ASSIGNMENT PROBLEM

    Directory of Open Access Journals (Sweden)

    S. Prakash

    2012-01-01

    Full Text Available

    The Multi-Objective Generalized Assignment Problem (MGAP) with two objectives, where one objective is linear and the other non-linear, has been considered, with the constraint that a job is assigned to only one worker – though a worker may be assigned more than one job, depending upon the time available to him. An algorithm is proposed to find the set of Pareto optimal solutions of the problem, determining assignments of jobs to workers with two objectives without setting priorities for them. The two objectives are to minimise the total cost of the assignment and to reduce the time taken to complete all the jobs.


  14. A Note on Parameter Estimation in the Composite Weibull–Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Enrique Calderín-Ojeda

    2018-02-01

    Full Text Available Composite models have received much attention in the recent actuarial literature to describe heavy-tailed insurance loss data. One of the models that performs well in describing this kind of data is the composite Weibull–Pareto (CWL) distribution. In this note, this distribution is revisited to carry out estimation of parameters via the mle and mle2 optimization functions in R. The results are compared with those obtained in a previous paper using the nlm function, in terms of analytical and graphical methods of model selection. In addition, the consistency of the parameter estimation is examined via a simulation study.

  15. Computing the Moments of Order Statistics from Truncated Pareto Distributions Based on the Conditional Expectation

    Directory of Open Access Journals (Sweden)

    Gökhan Gökdere

    2014-05-01

    Full Text Available In this paper, closed-form expressions for the moments of truncated Pareto order statistics are obtained by using the conditional distribution. We also derive some results for the moments which will be useful for moment computations based on ordered data.

  16. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when the vector (X1, X2) follows a general bivariate distribution. Such distributions include the bivariate compound Weibull, bivariate compound Gompertz, and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.

  17. Pareto-Lognormal Modeling of Known and Unknown Metal Resources. II. Method Refinement and Further Applications

    International Nuclear Information System (INIS)

    Agterberg, Frits

    2017-01-01

    Pareto-lognormal modeling of worldwide metal deposit size–frequency distributions was proposed in an earlier paper (Agterberg in Nat Resour Res 26:3–20, 2017). In the current paper, the approach is applied to four metals (Cu, Zn, Au and Ag) and a number of model improvements are described and illustrated in detail for copper and gold. The new approach has become possible because of the very large inventory of worldwide metal deposit data recently published by Patiño Douce (Nat Resour Res 25:97–124, 2016c). Worldwide metal deposits for Cu, Zn and Ag follow basic lognormal size–frequency distributions that form straight lines on lognormal Q–Q plots. Au deposits show a departure from the straight-line model in the vicinity of their median size. Both largest and smallest deposits for the four metals taken as examples exhibit hyperbolic size–frequency relations and their Pareto coefficients are determined by fitting straight lines on log rank–log size plots. As originally pointed out by Patiño Douce (Nat Resour Res 25:365–387, 2016d), the upper Pareto tail cannot be distinguished clearly from the tail of what would be a secondary lognormal distribution. The method previously used in Agterberg (2017) for fitting the bridge function separating the largest deposit size–frequency Pareto tail from the basic lognormal is significantly improved in this paper. A new method is presented for estimating the approximate deposit size value at which the upper tail Pareto comes into effect. Although a theoretical explanation of the proposed Pareto-lognormal distribution model is not a required condition for its applicability, it is shown that existing double Pareto-lognormal models based on Brownian motion generalizations of the multiplicative central limit theorem are not applicable to worldwide metal deposits. Neither are various upper tail frequency amplification models in their present form. Although a physicochemical explanation remains possible, it is argued that

  18. Pareto-Lognormal Modeling of Known and Unknown Metal Resources. II. Method Refinement and Further Applications

    Energy Technology Data Exchange (ETDEWEB)

    Agterberg, Frits, E-mail: agterber@nrcan.gc.ca [Geological Survey of Canada (Canada)

    2017-07-01

    Pareto-lognormal modeling of worldwide metal deposit size–frequency distributions was proposed in an earlier paper (Agterberg in Nat Resour Res 26:3–20, 2017). In the current paper, the approach is applied to four metals (Cu, Zn, Au and Ag) and a number of model improvements are described and illustrated in detail for copper and gold. The new approach has become possible because of the very large inventory of worldwide metal deposit data recently published by Patiño Douce (Nat Resour Res 25:97–124, 2016c). Worldwide metal deposits for Cu, Zn and Ag follow basic lognormal size–frequency distributions that form straight lines on lognormal Q–Q plots. Au deposits show a departure from the straight-line model in the vicinity of their median size. Both largest and smallest deposits for the four metals taken as examples exhibit hyperbolic size–frequency relations and their Pareto coefficients are determined by fitting straight lines on log rank–log size plots. As originally pointed out by Patiño Douce (Nat Resour Res 25:365–387, 2016d), the upper Pareto tail cannot be distinguished clearly from the tail of what would be a secondary lognormal distribution. The method previously used in Agterberg (2017) for fitting the bridge function separating the largest deposit size–frequency Pareto tail from the basic lognormal is significantly improved in this paper. A new method is presented for estimating the approximate deposit size value at which the upper tail Pareto comes into effect. Although a theoretical explanation of the proposed Pareto-lognormal distribution model is not a required condition for its applicability, it is shown that existing double Pareto-lognormal models based on Brownian motion generalizations of the multiplicative central limit theorem are not applicable to worldwide metal deposits. Neither are various upper tail frequency amplification models in their present form. Although a physicochemical explanation remains possible, it is argued that

  19. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.

    Directory of Open Access Journals (Sweden)

    Sophie Bertrand

    Full Text Available How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.

  20. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.

    Science.gov (United States)

    Bertrand, Sophie; Joo, Rocío; Fablet, Ronan

    2015-01-01

    How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.
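
    A minimal sketch of the fitting step described above, assuming scipy and synthetic step lengths rather than the authors' tracking data: fit a GPD to observed move steps and read the walk type off the sign of the fitted shape parameter.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        steps = rng.exponential(scale=2.0, size=5000)  # hypothetical step lengths (km)

        # Fit the GPD with the location pinned at zero, so the scale and shape
        # parameters keep their movement-ecology interpretation.
        shape, loc, scale = stats.genpareto.fit(steps, floc=0.0)
        print(f"shape={shape:.3f}, scale={scale:.3f}")

        # shape ~ 0 : exponential steps (Brownian / Poisson-like walk)
        # shape > 0 : power-law tail (Levy-like walk)
        # shape < 0 : bounded step lengths
        ll_gpd = stats.genpareto.logpdf(steps, shape, loc, scale).sum()
        ll_exp = stats.expon.logpdf(steps, scale=steps.mean()).sum()
        print(f"AIC: GPD={4 - 2 * ll_gpd:.1f}  exponential={2 - 2 * ll_exp:.1f}")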

  1. Entropies of negative incomes, Pareto-distributed loss, and financial crises.

    Science.gov (United States)

    Gao, Jianbo; Hu, Jing; Mao, Xiang; Zhou, Mi; Gurbaxani, Brian; Lin, Johnny

    2011-01-01

    Health monitoring of the world economy is an important issue, especially in a time of profound economic difficulty worldwide. The most important aspect of health monitoring is to accurately predict economic downturns. To gain insights into how economic crises develop, we present two metrics, positive and negative income entropy and distribution analysis, to analyze the collective "spatial" and temporal dynamics of companies in nine sectors of the world economy over a 19-year period from 1990 to 2008. These metrics provide accurate predictive skill with a very low false-positive rate in predicting downturns. The new metrics also provide evidence of phase transition-like behavior prior to the onset of recessions. Such a transition occurs when negative pretax incomes prior to or during economic recessions transition from a thin-tailed exponential distribution to the higher entropy Pareto distribution, and develop even heavier tails than those of the positive pretax incomes. These features propagate from the crisis-initiating sector of the economy to other sectors.

  2. Entropies of negative incomes, Pareto-distributed loss, and financial crises.

    Directory of Open Access Journals (Sweden)

    Jianbo Gao

    Full Text Available Health monitoring of the world economy is an important issue, especially in a time of profound economic difficulty worldwide. The most important aspect of health monitoring is to accurately predict economic downturns. To gain insights into how economic crises develop, we present two metrics, positive and negative income entropy and distribution analysis, to analyze the collective "spatial" and temporal dynamics of companies in nine sectors of the world economy over a 19-year period from 1990 to 2008. These metrics provide accurate predictive skill with a very low false-positive rate in predicting downturns. The new metrics also provide evidence of phase transition-like behavior prior to the onset of recessions. Such a transition occurs when negative pretax incomes prior to or during economic recessions transition from a thin-tailed exponential distribution to the higher entropy Pareto distribution, and develop even heavier tails than those of the positive pretax incomes. These features propagate from the crisis-initiating sector of the economy to other sectors.
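
    As an illustration of the tail comparison above (synthetic loss magnitudes, not the study's income data), one can fit both candidate models and evaluate their closed-form differential entropies:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical magnitudes of negative pretax incomes (losses).
        losses = (rng.pareto(a=1.8, size=4000) + 1.0) * 10.0

        # Candidate tail models: thin-tailed exponential vs heavy-tailed Pareto.
        scale_exp = losses.mean()                           # MLE scale of the exponential
        b, loc, scale = stats.pareto.fit(losses, floc=0.0)  # Pareto tail index b

        # The study reads a shift of the best-fitting model from exponential to
        # Pareto, together with a change in entropy, as a warning signal; here we
        # merely compute both entropies for the fitted models.
        h_exp = float(stats.expon(scale=scale_exp).entropy())
        h_par = float(stats.pareto(b, loc=loc, scale=scale).entropy())
        print(f"exponential entropy = {h_exp:.2f}, Pareto entropy = {h_par:.2f}")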

  3. An Investigation of the Pareto Distribution as a Model for High Grazing Angle Clutter

    Science.gov (United States)

    2011-03-01

    Clutter models are needed to test radar detection schemes under controlled conditions, but complicated clutter models result in mathematical difficulties in the determination of optimal detectors; models whose densities involve modified Bessel functions, in particular, are difficult to employ in radar detection schemes. The Pareto distribution was originally used to describe the distribution of wealth in a population [7]. It has been used in the modelling of actuarial data; an example is excess-of-loss quotations in insurance [8]. Its usefulness here is that the Pareto distribution is amenable to mathematical analysis.

  4. Multi-choice stochastic transportation problem involving general form of distributions.

    Science.gov (United States)

    Quddoos, Abdul; Ull Hasan, Md Gulzar; Khalid, Mohammad Masood

    2014-01-01

    Many authors have presented studies of the multi-choice stochastic transportation problem (MCSTP) where availability and demand parameters follow a particular probability distribution (such as exponential, Weibull, Cauchy or extreme value). In this paper an MCSTP is considered where availability and demand parameters follow a general form of distribution, and a generalized equivalent deterministic model (GMCSTP) of the MCSTP is obtained. It is also shown that all previous models obtained by different authors can be deduced with the help of the GMCSTP. MCSTPs with Pareto, power function or Burr-XII distributions are also considered and equivalent deterministic models are obtained. To illustrate the proposed model, two numerical examples are presented and solved using the LINGO 13.0 software package.

  5. Strong Convergence Bound of the Pareto Index Estimator under Right Censoring

    Directory of Open Access Journals (Sweden)

    Peng Zuoxiang

    2010-01-01

    Full Text Available Let {Xn, n≥1} be a sequence of positive independent and identically distributed random variables with common Pareto-type distribution function F(x) = 1 − x^(−1/γ) lF(x), where γ > 0 and lF(x) represents a slowly varying function at infinity. In this note we study the strong convergence bound of a kind of right censored Pareto index estimator under second-order regularly varying conditions.

  6. Pareto printsiip

    Index Scriptorium Estoniae

    2011-01-01

    On how the Italian economist Vilfredo Pareto arrived at his famous principle, and on the influence of that principle on present-day management. According to the Pareto principle, the greater part of our activity does not help us reach the result, but is a waste of time. Diagram.

  7. Approximating convex Pareto surfaces in multiobjective radiotherapy planning

    International Nuclear Information System (INIS)

    Craft, David L.; Halabi, Tarek F.; Shih, Helen A.; Bortfeld, Thomas R.

    2006-01-01

    Radiotherapy planning involves inherent tradeoffs: the primary mission, to treat the tumor with a high, uniform dose, is in conflict with normal tissue sparing. We seek to understand these tradeoffs on a case-by-case basis, by computing for each patient a database of Pareto optimal plans. A treatment plan is Pareto optimal if no other plan is at least as good in every measurable dimension and strictly better in at least one. The set of all such plans is called the Pareto optimal surface. This article presents an algorithm for computing well-distributed points on the (convex) Pareto optimal surface of a multiobjective programming problem. The algorithm is applied to intensity-modulated radiation therapy inverse planning problems, and results of a prostate case and a skull base case are presented, in three and four dimensions, investigating tradeoffs between tumor coverage and critical organ sparing

  8. An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index

    DEFF Research Database (Denmark)

    Dierckx, Goedele; Goegebeur, Yuri; Guillou, Armelle

    2013-01-01

    We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency...

  9. Pareto utility

    NARCIS (Netherlands)

    Ikefuji, M.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.H.M.

    2013-01-01

    In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility

  10. Stable power laws in variable economies; Lotka-Volterra implies Pareto-Zipf

    Science.gov (United States)

    Solomon, S.; Richmond, P.

    2002-05-01

    In recent years we have found that logistic systems of the Generalized Lotka-Volterra type (GLV), describing statistical systems of auto-catalytic elements, possess power law distributions of the Pareto-Zipf type. In particular, when applied to economic systems, GLV leads to power laws in the relative individual wealth distribution and in market returns. These power laws and their exponent α are invariant to arbitrary variations in the total wealth of the system and to other endogenously and exogenously induced variations.
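
    A toy GLV iteration (assumed coefficient values, not those of the paper) showing how multiplicative random growth plus redistributive coupling drives the wealth distribution toward a Pareto tail:

        import numpy as np

        rng = np.random.default_rng(3)
        N, T = 1000, 20_000
        w = np.ones(N)  # relative wealths

        # GLV-style update: w_i <- lambda_i * w_i + a * mean(w) - c * mean(w) * w_i
        for _ in range(T):
            lam = rng.normal(1.0, 0.1, size=N)  # multiplicative random growth
            wbar = w.mean()
            w = lam * w + 0.05 * wbar - 0.05 * wbar * w
            w = np.maximum(w, 1e-12)            # keep wealths positive

        # On a log rank - log size plot, a straight line with slope -1/alpha
        # signals a Pareto(-Zipf) tail.
        tail = np.sort(w)[::-1][:100]
        slope = np.polyfit(np.log(np.arange(1, 101)), np.log(tail), 1)[0]
        print(f"estimated Pareto exponent alpha ~ {-1.0 / slope:.2f}")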

  11. Pareto optimality in organelle energy metabolism analysis.

    Science.gov (United States)

    Angione, Claudio; Carapezza, Giovanni; Costanza, Jole; Lió, Pietro; Nicosia, Giuseppe

    2013-01-01

    In lower and higher eukaryotes, energy is collected or transformed in compartments, the organelles. The rich variety of size, characteristics, and density of the organelles makes it difficult to build a general picture. In this paper, we make use of the Pareto-front analysis to investigate the optimization of energy metabolism in mitochondria and chloroplasts. Using the Pareto optimality principle, we compare models of organelle metabolism on the basis of single- and multiobjective optimization, approximation techniques (the Bayesian Automatic Relevance Determination), robustness, and pathway sensitivity analysis. Finally, we report the first analysis of the metabolic model for the hydrogenosome of Trichomonas vaginalis, which is found in several protozoan parasites. Our analysis has shown the importance of Pareto optimality for such comparisons and for insights into the evolution of metabolism from cytoplasmic to organelle-bound, involving a model-order reduction. We report that Pareto fronts represent an asymptotic analysis useful to describe the metabolism of an organism aimed at maximizing concurrently two or more metabolite concentrations.

  12. Strong Convergence Bound of the Pareto Index Estimator under Right Censoring

    Directory of Open Access Journals (Sweden)

    Bao Tao

    2010-01-01

    Full Text Available Let {Xn, n≥1} be a sequence of positive independent and identically distributed random variables with common Pareto-type distribution function F(x) = 1 − x^(−1/γ) lF(x), where γ > 0 and lF(x) represents a slowly varying function at infinity. In this note we study the strong convergence bound of a kind of right censored Pareto index estimator under second-order regularly varying conditions.

  13. Evaluation of Preanalytical Quality Indicators by Six Sigma and Pareto's Principle.

    Science.gov (United States)

    Kulkarni, Sweta; Ramesh, R; Srinivasan, A R; Silvia, C R Wilma Delphine

    2018-01-01

    Preanalytical steps are the major sources of error in the clinical laboratory. Analytical errors can be corrected by quality control procedures, but there is a need for stringent quality checks in the preanalytical area, as these processes are done outside the laboratory. The sigma value depicts the performance of a laboratory and its quality measures. Hence, in the present study, Six Sigma and Pareto's principle were applied to preanalytical quality indicators to evaluate clinical biochemistry laboratory performance. This observational study was carried out for a period of 1 year, from November 2015 to November 2016. A total of 144,208 samples and 54,265 test requisition forms were screened for preanalytical errors, such as missing patient information or sample collection details in forms, and hemolysed, lipemic, inappropriate, or insufficient samples; the total number of errors was calculated and converted into defects per million and the sigma scale. A Pareto chart was drawn using the total number of errors and cumulative percentages. In 75% of test requisition forms the diagnosis was not mentioned, yielding a sigma value of 0.9; for other errors, such as sample receiving time, stat requests, and sample type, the sigma values were 2.9, 2.6, and 2.8, respectively. For insufficient sample and improper ratio of blood to anticoagulant, the sigma value was 4.3. The Pareto chart shows that roughly 80% of the errors in requisition forms are contributed by a small share of causes, chiefly missing information such as the diagnosis. The development of quality indicators and the application of Six Sigma and Pareto's principle are quality measures by which not only the preanalytical phase but the total testing process can be improved.
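
    For concreteness, sigma values of this kind can be reproduced from raw error counts with the conventional defects-per-million conversion; the counts below are illustrative stand-ins, not the published figures, and the 1.5-sigma shift is the usual industry convention.

        from scipy.stats import norm

        def sigma_level(defects: int, opportunities: int) -> float:
            """Convert a defect count into a (short-term) sigma level.

            Uses the conventional 1.5-sigma shift:
            sigma = z(1 - DPMO / 1e6) + 1.5
            """
            dpmo = defects / opportunities * 1_000_000
            return norm.ppf(1 - dpmo / 1e6) + 1.5

        # Hypothetical: missing diagnosis on 75% of 54,265 requisition forms,
        # which lands near the sigma value of about 0.9 reported above.
        print(round(sigma_level(int(0.75 * 54_265), 54_265), 1))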

  14. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    Science.gov (United States)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of the world's two thousand leading or strongest publicly traded companies (G-2000), based on four independent metrics: sales or revenues, profits, assets and market value. Every one of these wealth metrics yields particular information on the corporate size or wealth size of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power-law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto is about 49% in sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be adjusted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms into 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, then the 82 points of the aggregate wealth distribution by industry per employee can be well adjusted by quasi-exponential curves for the four metrics.

  15. Pareto optimization in algebraic dynamic programming.

    Science.gov (United States)

    Saule, Cédric; Giegerich, Robert

    2015-01-01

    Pareto optimization combines independent objectives by computing the Pareto front of its search space, defined as the set of all solutions for which no other candidate solution scores better under all objectives. This gives, in a precise sense, better information than an artificial amalgamation of different scores into a single objective, but is more costly to compute. Pareto optimization naturally occurs with genetic algorithms, albeit in a heuristic fashion. Non-heuristic Pareto optimization so far has been used only with a few applications in bioinformatics. We study exact Pareto optimization for two objectives in a dynamic programming framework. We define a binary Pareto product operator [Formula: see text] on arbitrary scoring schemes. Independent of a particular algorithm, we prove that for two scoring schemes A and B used in dynamic programming, the scoring scheme [Formula: see text] correctly performs Pareto optimization over the same search space. We study different implementations of the Pareto operator with respect to their asymptotic and empirical efficiency. Without artificial amalgamation of objectives, and with no heuristics involved, Pareto optimization is faster than computing the same number of answers separately for each objective. For RNA structure prediction under the minimum free energy versus the maximum expected accuracy model, we show that the empirical size of the Pareto front remains within reasonable bounds. Pareto optimization lends itself to the comparative investigation of the behavior of two alternative scoring schemes for the same purpose. For the above scoring schemes, we observe that the Pareto front can be seen as a composition of a few macrostates, each consisting of several microstates that differ in the same limited way. We also study the relationship between abstract shape analysis and the Pareto front, and find that they extract information of a different nature from the folding space and can be meaningfully combined.
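
    The dominance filter underlying such a Pareto front can be sketched for two maximized objectives as follows; this is not the authors' Pareto product operator itself, which composes two scoring schemes inside dynamic programming recurrences, but the core comparison it relies on.

        def pareto_front(candidates):
            """Return the Pareto front of (score_a, score_b) pairs, both maximized.

            Sort by the first objective descending; keep a candidate iff it
            strictly improves the best second objective seen so far.
            """
            best_b = float("-inf")
            front = []
            for a, b in sorted(set(candidates), key=lambda p: (-p[0], -p[1])):
                if b > best_b:
                    front.append((a, b))
                    best_b = b
            return front

        print(pareto_front([(3, 1), (2, 5), (3, 4), (1, 6), (2, 2)]))
        # -> [(3, 4), (2, 5), (1, 6)]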

  16. Lettere di Vilfredo Pareto all’amico Roberto Michels: confini e confine nel Trattato di Sociologia Generale del 1916

    Directory of Open Access Journals (Sweden)

    Raffaele Federici

    2017-08-01

    Full Text Available In this search for meaning between the end of an era and a new vision of the world there is, in the two authors, what might be called a betweenness: Pareto, almost a Franco-Italian, and Michels, an Italian-German, indeed more than Italian. Along the fault line represented by the First World War, the two sociologists stand in a double inner relation, Franco-Italian for Pareto and Italo-German for Michels, and in an outer relation between the world of yesterday and the world that followed the cataclysm of the First World War, when four colossal empires were dismembered (the Russian, German, Austro-Hungarian and Ottoman Empires), at the same time as Emile Durkheim looked with unease at the disintegration of the old traditional communities, when the sense of the crisis of the age invested not only people and behaviours but the logical world itself. The epistolary exchange takes place in the same land: Pareto at Celigny, on Lake Geneva, and Michels at Basel, on the banks of the Rhine. Between the two sociologists there is a deep respect, which would lead Robert Michels to dedicate an important work such as "Problemi di sociologia applicata", published only three years after the master's Trattato di Sociologia Generale, to the "scientist and friend Vilfredo Pareto, with veneration". In this anthology of essays, probably composed between 1914 and 1917, in the years of the great cataclysm, indeed conceived before "the installation of that terrible supreme court of cassation of all our ideologies that is the war", and thus contemporary with the Trattato, the master is cited three times, like Max Weber, but de facto Pareto's presence is continuous. In particular, the reference to the master follows two lines of research: on the one hand, the reality of sociological research and its very broad spectrum of analysis; on the other, the theory of the circulation of elites. It is precisely

  17. Identification of Climate Change with Generalized Extreme Value (GEV) Distribution Approach

    International Nuclear Information System (INIS)

    Rahayu, Anita

    2013-01-01

    Extreme weather and climate change are events that are difficult to avoid and that considerably affect humans and the environment. Many problems require knowledge about the behavior of extreme values, and one of the methods used is Extreme Value Theory (EVT). EVT is used to design reliable systems under a variety of conditions, so as to minimize the risk of a major disaster. There are two methods for identifying extreme values: Block Maxima, with the Generalized Extreme Value (GEV) distribution approach, and Peaks over Threshold (POT), with the Generalized Pareto Distribution (GPD) approach. This research uses data from Indramayu for the period January 1961 to December 2003, and the method used is Block Maxima with the GEV distribution approach. The results showed no evidence of climate change in Indramayu over this period.
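
    A small sketch of the Block Maxima/GEV step described above, using synthetic daily rainfall in place of the Indramayu record; note that scipy parametrizes the GEV shape as c = -xi relative to the usual convention.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Hypothetical daily rainfall (mm), 43 years x 365 days.
        daily = rng.gamma(shape=0.4, scale=12.0, size=(43, 365))
        annual_maxima = daily.max(axis=1)  # block maxima, one block per year

        c, loc, scale = stats.genextreme.fit(annual_maxima)
        print(f"xi={-c:.3f}, mu={loc:.1f}, sigma={scale:.1f}")

        # 100-year return level: the 1 - 1/100 quantile of the fitted GEV.
        print(f"100-yr level: {stats.genextreme.ppf(0.99, c, loc, scale):.1f} mm")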

  18. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    Science.gov (United States)

    Abu-Zinadah, Hanaa H.

    2017-08-01

    In several industrial fields the product comes from more than one production line, which calls for comparative life tests. This problem requires sampling of the different production lines, from which the joint censoring scheme arises. In this article we consider the lifetime Pareto distribution with a jointly type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals, of the model parameters are obtained. Also, Bayesian point and credible intervals of the model parameters are presented. A lifetime data set is analyzed for illustrative purposes. Monte Carlo results from simulation studies are presented to assess the performance of our proposed method.

  19. Designing Pareto-superior demand-response rate options

    International Nuclear Information System (INIS)

    Horowitz, I.; Woo, C.K.

    2006-01-01

    We explore three voluntary service options (real-time pricing, time-of-use pricing, and curtailable/interruptible service) that a local distribution company might offer its customers in order to encourage them to alter their electricity usage in response to changes in the electricity spot-market price. These options are simple and practical, and make minimal information demands. We show that each of the options is Pareto-superior ex ante, in that it benefits both the participants and the company offering it, while not affecting the non-participants. The options are shown to be Pareto-superior ex post as well, except under certain exceptional circumstances. (author)

  20. Pareto vs Simmel: residui ed emozioni

    Directory of Open Access Journals (Sweden)

    Silvia Fornari

    2017-08-01

    Full Text Available One hundred years after the publication of the Trattato di sociologia generale (Pareto 1988), we keep Paretian scholarship alive and current with a contemporary rereading of his thought. Remembered by economists for his great intellectual versatility, he remains a rigorous and analytical scientist whose contributions are still discussed internationally. We analyse the aspects that brought him close to the sociological approach, with the introduction of his well-known distinction of social action: logical and non-logical. This dichotomy is used to account for social changes concerning the modes of action of men and women. As is well known, logical actions are those concerning behaviour moved by logic and reasoning, in which there is a direct cause-effect relation; such actions are the object of study of economists, and sociologists do not deal with them. Non-logical actions cover all the types of human action that fall within the remit of the social sciences, and they represent the larger part of social action. They are actions guided by feelings, emotion, superstition, etc., illustrated by Pareto in the Trattato di sociologia generale and in later essays, where he also takes up the concept of the heterogenesis of ends, first formulated by Giambattista Vico. According to this concept, human history, while retaining in potential the realization of certain ends, is not linear, and along its evolutionary path it can happen that man, in trying to reach one end, arrives at the opposite conclusion. Pareto connects the Neapolitan philosopher's definition to the typologies of social action and to their distinction (logical, non-logical). For Pareto, the heterogenesis of ends is thus the outcome of a particular type of non-logical action of the human being and of the collectivity.

  1. Multiobjective Optimization of Linear Cooperative Spectrum Sensing: Pareto Solutions and Refinement.

    Science.gov (United States)

    Yuan, Wei; You, Xinge; Xu, Jing; Leung, Henry; Zhang, Tianhang; Chen, Chun Lung Philip

    2016-01-01

    In linear cooperative spectrum sensing, the weights of secondary users and detection threshold should be optimally chosen to minimize missed detection probability and to maximize secondary network throughput. Since these two objectives are not completely compatible, we study this problem from the viewpoint of multiple-objective optimization. We aim to obtain a set of evenly distributed Pareto solutions. To this end, here, we introduce the normal constraint (NC) method to transform the problem into a set of single-objective optimization (SOO) problems. Each SOO problem usually results in a Pareto solution. However, NC does not provide any solution method to these SOO problems, nor any indication on the optimal number of Pareto solutions. Furthermore, NC has no preference over all Pareto solutions, while a designer may be only interested in some of them. In this paper, we employ a stochastic global optimization algorithm to solve the SOO problems, and then propose a simple method to determine the optimal number of Pareto solutions under a computational complexity constraint. In addition, we extend NC to refine the Pareto solutions and select the ones of interest. Finally, we verify the effectiveness and efficiency of the proposed methods through computer simulations.

  2. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    Science.gov (United States)

    Otero-Muras, Irene; Banga, Julio R

    2017-07-21

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.

  3. Pareto-Zipf law in growing systems with multiplicative interactions

    Science.gov (United States)

    Ohtsuki, Toshiya; Tanimoto, Satoshi; Sekiyama, Makoto; Fujihara, Akihiro; Yamamoto, Hiroshi

    2018-06-01

    Numerical simulations of multiplicatively interacting stochastic processes with weighted selections were conducted. A feedback mechanism to control the weight w of selections was proposed. It becomes evident that when w is moderately controlled around 0, such systems spontaneously exhibit the Pareto-Zipf distribution. The simulation results are universal in the sense that microscopic details, such as parameter values and the type of control and weight, are irrelevant. The central ingredient of the Pareto-Zipf law is argued to be the mild control of interactions.

  4. Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.

    Science.gov (United States)

    Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel

    2014-01-01

    Novelty detection involves the construction of a "model of normality", and then classifies test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
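
    The univariate peaks-over-threshold baseline that this work extends to high dimensions can be sketched as follows; the novelty scores and the 95% threshold choice are assumptions for illustration, not the patient-monitoring pipeline.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical novelty scores for training data (higher = more abnormal).
        scores = rng.normal(size=20_000)

        # Peaks-over-threshold: fit a GPD to exceedances over a high quantile u.
        u = np.quantile(scores, 0.95)
        exceed = scores[scores > u] - u
        xi, _, beta = stats.genpareto.fit(exceed, floc=0.0)

        # Decision boundary: the score exceeded with probability p_target
        # under the tail model, using P(X > x) = P(X > u) * (1 - F_GPD(x - u)).
        p_target = 1e-4
        p_u = exceed.size / scores.size
        threshold = u + stats.genpareto.ppf(1 - p_target / p_u, xi, scale=beta)
        print(f"novelty threshold ~ {threshold:.2f}")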

  5. Diversity comparison of Pareto front approximations in many-objective optimization.

    Science.gov (United States)

    Li, Miqing; Yang, Shengxiang; Liu, Xiaohui

    2014-12-01

    Diversity assessment of Pareto front approximations is an important issue in the stochastic multiobjective optimization community. Most of the diversity indicators in the literature were designed to work for any number of objectives of Pareto front approximations in principle, but in practice many of these indicators are infeasible or not workable when the number of objectives is large. In this paper, we propose a diversity comparison indicator (DCI) to assess the diversity of Pareto front approximations in many-objective optimization. DCI evaluates relative quality of different Pareto front approximations rather than provides an absolute measure of distribution for a single approximation. In DCI, all the concerned approximations are put into a grid environment so that there are some hyperboxes containing one or more solutions. The proposed indicator only considers the contribution of different approximations to nonempty hyperboxes. Therefore, the computational cost does not increase exponentially with the number of objectives. In fact, the implementation of DCI is of quadratic time complexity, which is fully independent of the number of divisions used in grid. Systematic experiments are conducted using three groups of artificial Pareto front approximations and seven groups of real Pareto front approximations with different numbers of objectives to verify the effectiveness of DCI. Moreover, a comparison with two diversity indicators used widely in many-objective optimization is made analytically and empirically. Finally, a parametric investigation reveals interesting insights of the division number in grid and also offers some suggested settings to the users with different preferences.

  6. Calculating complete and exact Pareto front for multiobjective optimization: a new deterministic approach for discrete problems.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel

    2013-06-01

    Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.

  7. Agent-Based Modelling of the Evolution of the Russian Party System Based on Pareto and Hotelling Distributions. Part II

    Directory of Open Access Journals (Sweden)

    Владимир Геннадьевич Иванов

    2015-12-01

    Full Text Available The given article presents research on the evolution of the Russian party system. The chosen methodology is based on the heuristic potential of agent-based modelling. The author analyzes various scenarios of parties' competition (applying the Pareto distribution) in connection with the recent increase in the number of political parties. In addition, the author predicts the level of ideological diversity of the parties' platforms (applying the principles of Hotelling distribution) in order to evaluate their potential competitiveness in the struggle for voters.

  8. Axiomatizations of Pareto Equilibria in Multicriteria Games

    NARCIS (Netherlands)

    Voorneveld, M.; Vermeulen, D.; Borm, P.E.M.

    1997-01-01

    We focus on axiomatizations of the Pareto equilibrium concept in multicriteria games based on consistency. Axiomatizations of the Nash equilibrium concept by Peleg and Tijs (1996) and Peleg, Potters, and Tijs (1996) have immediate generalizations. The axiomatization of Norde et al. (1996) cannot be

  9. Pareto optimal pairwise sequence alignment.

    Science.gov (United States)

    DeRonne, Kevin W; Karypis, George

    2013-01-01

    Sequence alignment using evolutionary profiles is a commonly employed tool when investigating a protein. Many profile-profile scoring functions have been developed for use in such alignments, but there has not yet been a comprehensive study of Pareto optimal pairwise alignments for combining multiple such functions. We show that the problem of generating Pareto optimal pairwise alignments has an optimal substructure property, and develop an efficient algorithm for generating Pareto optimal frontiers of pairwise alignments. All possible sets of two, three, and four profile scoring functions are used from a pool of 11 functions and applied to 588 pairs of proteins in the ce_ref data set. The performance of the best objective combinations on ce_ref is also evaluated on an independent set of 913 protein pairs extracted from the BAliBASE RV11 data set. Our dynamic-programming-based heuristic approach produces approximated Pareto optimal frontiers of pairwise alignments that contain comparable alignments to those on the exact frontier, but on average in less than 1/58th the time in the case of four objectives. Our results show that the Pareto frontiers contain alignments whose quality is better than the alignments obtained by single objectives. However, the task of identifying a single high-quality alignment among those in the Pareto frontier remains challenging.

  10. Best Statistical Distribution of flood variables for Johor River in Malaysia

    Science.gov (United States)

    Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.

    2012-12-01

    A complex flood event is always characterized by a few characteristics, such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distribution of peakflow, flood duration and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years and analysed by water year (July to June). Five distributions, namely Log Normal, Generalized Pareto, Log Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distribution of all three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distributions of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model under the Anderson-Darling test, while the Kolmogorov-Smirnov test suggests that the GEV is best for peakflow. The results of this research can be used to improve flood frequency analysis; the cumulative distribution functions of peakflow under the Generalized Extreme Value, Generalized Pareto and Log Pearson distributions are compared.
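
    A goodness-of-fit screen of this kind can be mimicked with scipy; only the Kolmogorov-Smirnov test is shown, since scipy's Anderson-Darling test covers just a few distributions, and synthetic peakflows stand in for the Rantau Panjang record.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        peakflow = rng.gumbel(200.0, 60.0, size=45)  # hypothetical annual peaks (m^3/s)

        candidates = {
            "GEV": stats.genextreme,
            "GPD": stats.genpareto,
            "lognorm": stats.lognorm,
            "pearson3": stats.pearson3,
            "normal": stats.norm,
        }
        for name, dist in candidates.items():
            params = dist.fit(peakflow)              # MLE fit of each candidate
            ks = stats.kstest(peakflow, dist.cdf, args=params)
            print(f"{name:9s} KS p-value = {ks.pvalue:.3f}")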

  11. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    Energy Technology Data Exchange (ETDEWEB)

    Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H [Department of Optimization, Fraunhofer Institute for Industrial Mathematics (ITWM), Fraunhofer Platz 1, 67663 Kaiserslautern (Germany); Thieke, C, E-mail: katrin.teichert@itwm.fhg.de [Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)

    2011-06-21

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.

  12. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning.

    Science.gov (United States)

    Teichert, K; Süss, P; Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2011-06-21

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g., photons versus protons) than with the classical method of comparing single treatment plans.

  13. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H; Thieke, C

    2011-01-01

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.

  14. Existence of pareto equilibria for multiobjective games without compactness

    OpenAIRE

    Shiraishi, Yuya; Kuroiwa, Daishi

    2013-01-01

    In this paper, we investigate the existence of Pareto and weak Pareto equilibria for multiobjective games without compactness. By employing an existence theorem of Pareto equilibria due to Yu and Yuan [10], several existence theorems of Pareto and weak Pareto equilibria for multiobjective games are established in a manner similar to Flores-Bazán.

  15. Analysis of extreme drinking in patients with alcohol dependence using Pareto regression.

    Science.gov (United States)

    Das, Sourish; Harel, Ofer; Dey, Dipak K; Covault, Jonathan; Kranzler, Henry R

    2010-05-20

    We developed a novel Pareto regression model with an unknown shape parameter to analyze extreme drinking in patients with Alcohol Dependence (AD). We used the generalized linear model (GLM) framework and the log-link to include the covariate information through the scale parameter of the generalized Pareto distribution. We proposed a Bayesian method based on a Ridge prior and Zellner's g-prior for the regression coefficients. A simulation study indicated that the proposed Bayesian method performs better than the existing likelihood-based inference for the Pareto regression. We examined two issues of importance in the study of AD. First, we tested whether a single nucleotide polymorphism within the GABRA2 gene, which encodes a subunit of the GABA(A) receptor and has been associated with AD, influences 'extreme' alcohol intake, and second, the efficacy of three psychotherapies for alcoholism in treating extreme drinking behavior. We found an association between extreme drinking behavior and GABRA2. We also found that, at baseline, men with a high-risk GABRA2 allele had a significantly higher probability of extreme drinking than men with no high-risk allele. However, men with a high-risk allele responded to the therapy better than those with two copies of the low-risk allele. Women with high-risk alleles also responded to the therapy better than those with two copies of the low-risk allele, while women who received the cognitive behavioral therapy had better outcomes than those receiving either of the other two therapies. Among men, motivational enhancement therapy was the best for the treatment of the extreme drinking behavior. Copyright 2010 John Wiley & Sons, Ltd.
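
    A frequentist sketch of a GPD regression with a log-link on the scale parameter, in the spirit of the model above; the paper's inference is Bayesian with Ridge and Zellner g-priors, and the covariate and coefficients below are made up for illustration.

        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(9)
        n = 500
        X = np.column_stack([np.ones(n), rng.integers(0, 2, n)])  # intercept + binary covariate
        beta_true, xi_true = np.array([0.0, 0.7]), 0.3
        y = stats.genpareto.rvs(xi_true, scale=np.exp(X @ beta_true), random_state=rng)

        def nll(params):
            # Negative log-likelihood: GPD with scale = exp(X beta), shape xi.
            beta, xi = params[:-1], params[-1]
            return -stats.genpareto.logpdf(y, xi, scale=np.exp(X @ beta)).sum()

        res = optimize.minimize(nll, x0=np.zeros(X.shape[1] + 1), method="Nelder-Mead")
        print(res.x)  # [b0, b1, xi], expected near [0, 0.7, 0.3]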

  16. Multi-agent Pareto appointment exchanging in hospital patient scheduling

    NARCIS (Netherlands)

    I.B. Vermeulen (Ivan); S.M. Bohte (Sander); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2007-01-01

    We present a dynamic and distributed approach to the hospital patient scheduling problem, in which patients can have multiple appointments that have to be scheduled to different resources. To efficiently solve this problem we develop a multi-agent Pareto-improvement appointment

  17. Multi-agent Pareto appointment exchanging in hospital patient scheduling

    NARCIS (Netherlands)

    Vermeulen, I.B.; Bohté, S.M.; Somefun, D.J.A.; Poutré, La J.A.

    2007-01-01

    We present a dynamic and distributed approach to the hospital patient scheduling problem, in which patients can have multiple appointments that have to be scheduled to different resources. To efficiently solve this problem we develop a multi-agent Pareto-improvement appointment exchanging algorithm:

  18. Pareto-Efficiency, Hayek’s Marvel, and the Invisible Executor

    OpenAIRE

    Kakarot-Handtke, Egmont

    2014-01-01

    This non-technical contribution to the RWER-Blog deals with the interrelations of market clearing, efficient information processing through the price system, and distribution. The point of entry is a transparent example of Pareto-efficiency taken from the popular book How Markets Fail.

  19. COMPROMISE, OPTIMAL AND TRACTIONAL ACCOUNTS ON PARETO SET

    Directory of Open Access Journals (Sweden)

    V. V. Lahuta

    2010-11-01

    Full Text Available The problem of optimum traction calculations is considered as a problem of optimum distribution of a resource. The dynamic programming solution is based on a step-by-step calculation of the set of Pareto-optimal values of a criterion function (energy expenses) and a resource (time).

  20. RNA-Pareto: interactive analysis of Pareto-optimal RNA sequence-structure alignments.

    Science.gov (United States)

    Schnattinger, Thomas; Schöning, Uwe; Marchfelder, Anita; Kestler, Hans A

    2013-12-01

    Incorporating secondary structure information into the alignment process improves the quality of RNA sequence alignments. Instead of using fixed weighting parameters, sequence and structure components can be treated as different objectives and optimized simultaneously. The result is not a single, but a Pareto-set of equally optimal solutions, which all represent different possible weighting parameters. We now provide the interactive graphical software tool RNA-Pareto, which allows a direct inspection of all feasible results to the pairwise RNA sequence-structure alignment problem and greatly facilitates the exploration of the optimal solution set.

  1. Using the Pareto Distribution to Improve Estimates of Topcoded Earnings

    OpenAIRE

    Philip Armour; Richard V. Burkhauser; Jeff Larrimore

    2014-01-01

    Inconsistent censoring in the public-use March Current Population Survey (CPS) limits its usefulness in measuring labor earnings trends. Using Pareto estimation methods with less-censored internal CPS data, we create an enhanced cell-mean series to capture top earnings in the public-use CPS. We find that previous approaches for imputing topcoded earnings systematically understate top earnings. Annual earnings inequality trends since 1963 using our series closely approximate those found by Kop...

  2. Efficiently approximating the Pareto frontier: Hydropower dam placement in the Amazon basin

    Science.gov (United States)

    Wu, Xiaojian; Gomes-Selman, Jonathan; Shi, Qinru; Xue, Yexiang; Garcia-Villacorta, Roosevelt; Anderson, Elizabeth; Sethi, Suresh; Steinschneider, Scott; Flecker, Alexander; Gomes, Carla P.

    2018-01-01

    Real-world problems are often not fully characterized by a single optimal solution, as they frequently involve multiple competing objectives; it is therefore important to identify the so-called Pareto frontier, which captures solution trade-offs. We propose a fully polynomial-time approximation scheme based on Dynamic Programming (DP) for computing a polynomially succinct curve that approximates the Pareto frontier to within an arbitrarily small ε > 0 on tree-structured networks. Given a set of objectives, our approximation scheme runs in time polynomial in the size of the instance and 1/ε. We also propose a Mixed Integer Programming (MIP) scheme to approximate the Pareto frontier. The DP and MIP Pareto frontier approaches have complementary strengths and are surprisingly effective. We provide empirical results showing that our methods outperform other approaches in efficiency and accuracy. Our work is motivated by a problem in computational sustainability concerning the proliferation of hydropower dams throughout the Amazon basin. Our goal is to support decision-makers in evaluating impacted ecosystem services on the full scale of the Amazon basin. Our work is general and can be applied to approximate the Pareto frontier of a variety of multiobjective problems on tree-structured networks.
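
    The standard trick behind such fully polynomial-time approximation schemes is to round objective values onto a geometric grid so that only polynomially many Pareto points are retained per subproblem; a bare-bones sketch of the rounding step (illustrative only, not the authors' DP over tree-structured networks):

        import math

        def eps_prune(front, eps=0.05):
            """Keep one representative per epsilon-cell.

            Rounding each objective onto a geometric grid (1+eps)^k keeps the
            curve within a (1+eps) factor of the true frontier while bounding
            its size polynomially in 1/eps.
            """
            seen, pruned = set(), []
            for point in front:
                key = tuple(
                    math.floor(math.log(max(v, 1e-12)) / math.log1p(eps))
                    for v in point
                )
                if key not in seen:
                    seen.add(key)
                    pruned.append(point)
            return pruned

        dense = [(1 + 0.001 * i, 2 - 0.001 * i) for i in range(1000)]
        print(len(eps_prune(dense)))  # far fewer than 1000 representatives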

  3. Probability distribution of extreme share returns in Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share prices data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data, which provide a clue on the likely candidates for the best fitting distribution. Next, the suitability of six distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness of fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best fitting distributions for the weekly and monthly maximum share returns in the Malaysian stock market during the studied period, respectively.
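
    The L-moment step can be made concrete. A hedged sketch of the method of L-moments for the generalized Pareto distribution in Hosking's parametrization (shape k, scale a, location xi; note k has the opposite sign to the ξ used in many other GPD conventions), tested on a synthetic exponential sample, which is the k = 0 special case:

```python
import numpy as np

def gpd_lmoments(x):
    """Method-of-L-moments GPD fit (Hosking's k, a, xi parametrization)."""
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()                                       # sample prob.-weighted moments
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0  # sample L-moments
    t3 = l3 / l2                                        # L-skewness
    k = (1 - 3 * t3) / (1 + t3)
    a = l2 * (1 + k) * (2 + k)
    return k, a, l1 - a / (1 + k)

rng = np.random.default_rng(1)
print(gpd_lmoments(rng.standard_exponential(5000)))  # k ~ 0, a ~ 1, xi ~ 0
```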

  4. The size distributions of all Indian cities

    Science.gov (United States)

    Luckstead, Jeff; Devadoss, Stephen; Danforth, Diana

    2017-05-01

    We apply five distributions (lognormal, double-Pareto lognormal, lognormal-upper tail Pareto, Pareto tails-lognormal, and Pareto tails-lognormal with differentiability restrictions) to estimate the size distribution of all Indian cities. Since India contains numerous small cities, it is important to explicitly model the lower-tail behavior for studying the distribution of all Indian cities. Our results rigorously confirm, using both graphical and formal statistical tests, that among these five distributions, Pareto tails-lognormal is a better suited parametrization of the Indian city size data, verifying that the Indian city size distribution exhibits a strong reverse Pareto in the lower tail, lognormal in the mid-range body, and Pareto in the upper tail.

  5. Robustness analysis of bogie suspension components Pareto optimised values

    Science.gov (United States)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of the bogie dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of the bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as the five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of the bogie suspension is robust against uncertainties in the design parameters, and the probability of failure is small for parameter uncertainties with COV up to 0.1.
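
    The lognormal perturbation step admits a one-line parametrization: a lognormal with mean m and coefficient of variation c has sigma^2 = ln(1 + c^2) and mu = ln(m) - sigma^2/2. A minimal sketch with a hypothetical nominal stiffness value (not the study's data):

```python
import numpy as np

def lognormal_around(nominal, cov, size, rng):
    """Sample around a nominal value with a given coefficient of variation."""
    sigma = np.sqrt(np.log(1.0 + cov ** 2))
    mu = np.log(nominal) - 0.5 * sigma ** 2
    return rng.lognormal(mu, sigma, size)

rng = np.random.default_rng(4)
stiffness = lognormal_around(nominal=1.2e6, cov=0.1, size=100_000, rng=rng)
print(stiffness.mean(), stiffness.std() / stiffness.mean())  # ~1.2e6 and ~0.1
```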

  6. Pareto-optimal multi-objective design of airplane control systems

    Science.gov (United States)

    Schy, A. A.; Johnson, K. G.; Giesy, D. P.

    1980-01-01

    A constrained minimization algorithm for the computer aided design of airplane control systems to meet many requirements over a set of flight conditions is generalized using the concept of Pareto-optimization. The new algorithm yields solutions on the boundary of the achievable domain in objective space in a single run, whereas the older method required a sequence of runs to approximate such a limiting solution. However, Pareto-optimality does not guarantee a satisfactory design, since such solutions may emphasize some objectives at the expense of others. The designer must still interact with the program to obtain a well-balanced set of objectives. Using the example of a fighter lateral stability augmentation system (SAS) design over five flight conditions, several effective techniques are developed for obtaining well-balanced Pareto-optimal solutions. For comparison, one of these techniques is also used in a recently developed algorithm of Kreisselmeier and Steinhauser, which replaces the hard constraints with soft constraints, using a special penalty function. It is shown that comparable results can be obtained.

  7. Modeling air quality in main cities of Peninsular Malaysia by using a generalized Pareto model.

    Science.gov (United States)

    Masseran, Nurulkamal; Razali, Ahmad Mahir; Ibrahim, Kamarulzaman; Latif, Mohd Talib

    2016-01-01

    The air pollution index (API) is an important figure used for measuring the quality of air in the environment. The API is determined based on the highest average value of the individual indices for all the variables, which include sulfur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), ozone (O3), and suspended particulate matter (PM10), at a particular hour. API values that exceed the limit of 100 units indicate an unhealthy status for the exposed environment. This study investigates the risk of occurrences of API values greater than 100 units for eight urban areas in Peninsular Malaysia for the period January 2004 to December 2014. An extreme value model, known as the generalized Pareto distribution (GPD), has been fitted to the API values found. Based on the fitted model, the return period for describing the occurrences of API exceeding 100 in the different cities has been computed as the indicator of risk. The results obtained indicate that most of the urban areas considered have a very small risk of occurrence of the unhealthy events, except for Kuala Lumpur, Malacca, and Klang. However, among these three cities, it is found that Klang has the highest risk. Based on all the results obtained, the air quality in urban areas of Peninsular Malaysia falls within limits healthy for human beings.
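
    The return-period computation is the standard peaks-over-threshold identity: the probability that an observation exceeds a level x is the empirical exceedance rate of the threshold times the fitted GPD survival function. A hedged sketch with made-up parameter values, not the fitted Malaysian ones (assumes a nonzero shape and daily observations):

```python
def return_period_years(x, u, shape, scale, zeta, obs_per_year=365):
    """Return period (years) of exceeding level x, GPD fitted above threshold u.

    zeta is the empirical fraction of observations exceeding u; shape != 0.
    """
    p_exceed = zeta * (1 + shape * (x - u) / scale) ** (-1 / shape)
    return 1 / (p_exceed * obs_per_year)

# Illustrative values only: threshold 75, GPD(shape=0.1, scale=12), 5% exceedance rate.
print(return_period_years(x=100, u=75, shape=0.1, scale=12.0, zeta=0.05))
```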

  8. The Forbes 400, the Pareto power-law and efficient markets

    Science.gov (United States)

    Klass, O. S.; Biham, O.; Levy, M.; Malcai, O.; Solomon, S.

    2007-01-01

    Statistical regularities at the top end of the wealth distribution in the United States are examined using the Forbes 400 lists of richest Americans, published between 1988 and 2003. It is found that the wealths are distributed according to a power-law (Pareto) distribution. This result is explained using a simple stochastic model of multiple investors that incorporates the efficient market hypothesis as well as the multiplicative nature of financial market fluctuations.

  9. Bayesian modeling to paired comparison data via the Pareto distribution

    Directory of Open Access Journals (Sweden)

    Nasir Abbas

    2017-12-01

    A probabilistic approach to building models for paired comparison experiments based on the comparison of two Pareto variables is considered. Analysis of the proposed model is carried out in classical as well as Bayesian frameworks. Informative and uninformative priors are employed to accommodate the prior information. A simulation study is conducted to assess the suitability and performance of the model under theoretical conditions. Appropriateness of fit of the model is also examined. The entire inferential procedure is illustrated by comparing certain cricket teams using a real dataset.

  10. Feasibility of identification of gamma knife planning strategies by identification of pareto optimal gamma knife plans.

    Science.gov (United States)

    Giller, C A

    2011-12-01

    The use of conformity indices to optimize Gamma Knife planning is common, but does not address important tradeoffs between dose to tumor and normal tissue. Pareto analysis has been used for this purpose in other applications, but not for Gamma Knife (GK) planning. The goal of this work is to use computer models to show that Pareto analysis may be feasible for GK planning to identify dosimetric tradeoffs. We define a GK plan A to be Pareto dominant to B if the prescription isodose volume of A covers more tumor but not more normal tissue than B, or if A covers less normal tissue but not less tumor than B. A plan is Pareto optimal if it is not dominated by any other plan. Two different Pareto optimal plans represent different tradeoffs between dose to tumor and normal tissue, because neither plan dominates the other. 'GK simulator' software calculated dose distributions for GK plans, and was called repetitively by a genetic algorithm to calculate Pareto dominant plans. Three irregular tumor shapes were tested in 17 trials using various combinations of shots. The mean number of Pareto dominant plans/trial was 59 ± 17 (sd). Different planning strategies were identified by large differences in shot positions, and 70 of the 153 coordinate plots (46%) showed differences of 5 mm or more. The Pareto dominant plans dominated other nearby plans. Pareto dominant plans represent different dosimetric tradeoffs and can be systematically calculated using genetic algorithms. Automatic identification of non-intuitive planning strategies may be feasible with these methods.

  11. Market Ecology, Pareto Wealth Distribution and Leptokurtic Returns in Microscopic Simulation of the LLS Stock Market Model

    Science.gov (United States)

    Solomon, Sorin; Levy, Moshe

    2001-06-01

    The LLS stock market model (see Levy, Levy and Solomon, Academic Press 2000, "Microscopic Simulation of Financial Markets: From Investor Behavior to Market Phenomena", for a review) is a model of heterogeneous quasi-rational investors operating in a complex environment about which they have incomplete information. We review the main features of this model and several of its extensions. We study the effects of investor heterogeneity and show that predation, competition, or symbiosis may occur between different investor populations. The dynamics of the LLS model lead to the empirically observed Pareto wealth distribution. Many properties observed in actual markets appear as natural consequences of the LLS dynamics: truncated Lévy distribution of short-term returns, excess volatility, a return autocorrelation "U-shape" pattern, and a positive correlation between volume and absolute returns.

  12. Post Pareto optimization-A case

    Science.gov (United States)

    Popov, Stoyan; Baeva, Silvia; Marinova, Daniela

    2017-12-01

    Simulation performance may be evaluated according to multiple quality measures that are in competition and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it with multiple stochastic quality measures. We formulate performance criteria of this use-case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto Frontier, we analyze it and prescribe preference-dependent configurations for the optimal simulation training.

  13. Monopoly, Pareto and Ramsey mark-ups

    OpenAIRE

    Ten Raa, T.

    2009-01-01

    Monopoly prices are too high. It is a price level problem, in the sense that the relative mark-ups have Ramsey optimal proportions, at least for independent constant elasticity demands. I show that this feature of monopoly prices breaks down the moment one demand is replaced by the textbook linear demand or, even within the constant elasticity framework, dependence is introduced. The analysis provides a single Generalized Inverse Elasticity Rule for the problems of monopoly, Pareto and Ramsey.

  14. Pareto fronts in clinical practice for pinnacle.

    Science.gov (United States)

    Janssen, Tomas; van Kesteren, Zdenko; Franssen, Gijs; Damen, Eugène; van Vliet, Corine

    2013-03-01

    Our aim was to develop a framework to objectively perform treatment planning studies using Pareto fronts. The Pareto front represents all optimal possible tradeoffs among several conflicting criteria and is an ideal tool with which to study the possibilities of a given treatment technique. The framework should require minimal user interaction and should resemble and be applicable to daily clinical practice. To generate the Pareto fronts, we used the native scripting language of Pinnacle(3) (Philips Healthcare, Andover, MA). The framework generates thousands of plans automatically from which the Pareto front is generated. As an example, the framework is applied to compare intensity modulated radiation therapy (IMRT) with volumetric modulated arc therapy (VMAT) for prostate cancer patients. For each patient and each technique, 3000 plans are generated, resulting in a total of 60,000 plans. The comparison is based on 5-dimensional Pareto fronts. Generating 3000 plans for 10 patients in parallel requires on average 96 h for IMRT and 483 h for VMAT. Using VMAT, compared to IMRT, the maximum dose of the boost PTV was reduced by 0.4 Gy (P=.074), the mean dose in the anal sphincter by 1.6 Gy (P=.055), the conformity index of the 95% isodose (CI(95%)) by 0.02 (P=.005), and the rectal wall V(65 Gy) by 1.1% (P=.008). We showed the feasibility of automatically generating Pareto fronts with Pinnacle(3). Pareto fronts provide a valuable tool for performing objective comparative treatment planning studies. We compared VMAT with IMRT in prostate patients and found VMAT had a dosimetric advantage over IMRT.

  15. Pareto Fronts in Clinical Practice for Pinnacle

    International Nuclear Information System (INIS)

    Janssen, Tomas; Kesteren, Zdenko van; Franssen, Gijs; Damen, Eugène; Vliet, Corine van

    2013-01-01

    Purpose: Our aim was to develop a framework to objectively perform treatment planning studies using Pareto fronts. The Pareto front represents all optimal possible tradeoffs among several conflicting criteria and is an ideal tool with which to study the possibilities of a given treatment technique. The framework should require minimal user interaction and should resemble and be applicable to daily clinical practice. Methods and Materials: To generate the Pareto fronts, we used the native scripting language of Pinnacle(3) (Philips Healthcare, Andover, MA). The framework generates thousands of plans automatically from which the Pareto front is generated. As an example, the framework is applied to compare intensity modulated radiation therapy (IMRT) with volumetric modulated arc therapy (VMAT) for prostate cancer patients. For each patient and each technique, 3000 plans are generated, resulting in a total of 60,000 plans. The comparison is based on 5-dimensional Pareto fronts. Results: Generating 3000 plans for 10 patients in parallel requires on average 96 h for IMRT and 483 h for VMAT. Using VMAT, compared to IMRT, the maximum dose of the boost PTV was reduced by 0.4 Gy (P=.074), the mean dose in the anal sphincter by 1.6 Gy (P=.055), the conformity index of the 95% isodose (CI(95%)) by 0.02 (P=.005), and the rectal wall V(65 Gy) by 1.1% (P=.008). Conclusions: We showed the feasibility of automatically generating Pareto fronts with Pinnacle(3). Pareto fronts provide a valuable tool for performing objective comparative treatment planning studies. We compared VMAT with IMRT in prostate patients and found VMAT had a dosimetric advantage over IMRT.

  16. An Evolutionary Efficiency Alternative to the Notion of Pareto Efficiency

    NARCIS (Netherlands)

    I.P. van Staveren (Irene)

    2012-01-01

    The paper argues that the notion of Pareto efficiency builds on two normative assumptions: the more general consequentialist norm of any efficiency criterion, and the strong no-harm principle of the prohibition of any redistribution during the economic process that hurts at least one

  17. Pareto Optimal Solutions for Network Defense Strategy Selection Simulator in Multi-Objective Reinforcement Learning

    Directory of Open Access Journals (Sweden)

    Yang Sun

    2018-01-01

    Using Pareto optimization in Multi-Objective Reinforcement Learning (MORL) leads to better learning results for network defense games. This is particularly useful for network security agents, who must often balance several goals when choosing what action to take in defense of a network. If the defender knows his preferred reward distribution, the advantages of Pareto optimization can be retained by using a scalarization algorithm prior to the implementation of the MORL. In this paper, we simulate a network defense scenario by creating a multi-objective zero-sum game and using Pareto optimization and MORL to determine optimal solutions and compare those solutions to different scalarization approaches. We build a Pareto Defense Strategy Selection Simulator (PDSSS) system for assisting network administrators on decision-making, specifically on defense strategy selection, and the experiment results show that the Satisficing Trade-Off Method (STOM) scalarization approach performs better than linear scalarization or the GUESS method. The results of this paper can aid network security agents attempting to find an optimal defense policy for network security games.

  18. Rank distributions: A panoramic macroscopic outlook

    Science.gov (United States)

    Eliazar, Iddo I.; Cohen, Morrel H.

    2014-01-01

    This paper presents a panoramic macroscopic outlook of rank distributions. We establish a general framework for the analysis of rank distributions, which classifies them into five macroscopic "socioeconomic" states: monarchy, oligarchy-feudalism, criticality, socialism-capitalism, and communism. Oligarchy-feudalism is shown to be characterized by discrete macroscopic rank distributions, and socialism-capitalism is shown to be characterized by continuous macroscopic size distributions. Criticality is a transition state between oligarchy-feudalism and socialism-capitalism, which can manifest allometric scaling with multifractal spectra. Monarchy and communism are extreme forms of oligarchy-feudalism and socialism-capitalism, respectively, in which the intrinsic randomness vanishes. The general framework is applied to three different models of rank distributions—top-down, bottom-up, and global—and unveils each model's macroscopic universality and versatility. The global model yields a macroscopic classification of the generalized Zipf law, an omnipresent form of rank distributions observed across the sciences. An amalgamation of the three models establishes a universal rank-distribution explanation for the macroscopic emergence of a prevalent class of continuous size distributions, ones governed by unimodal densities with both Pareto and inverse-Pareto power-law tails.

  19. Coordinated Voltage Control in Distribution Network with the Presence of DGs and Variable Loads Using Pareto and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    José Raúl Castro

    2016-02-01

    This paper presents an efficient algorithm to solve the multi-objective (MO) voltage control problem in distribution networks. The proposed algorithm minimizes the following three objectives: voltage variation on pilot buses, reactive power production ratio deviation, and generator voltage deviation. This work leverages two optimization techniques: fuzzy logic to find the optimum value of the reactive power of the distributed generation (DG), and Pareto optimization to find the optimal value of the pilot bus voltage so that this produces lower losses under the constraint that the voltage remains within established limits. Variable loads and DGs are taken into account in this paper. The algorithm is tested on an IEEE 13-node test feeder and the results show the effectiveness of the proposed model.

  20. TopN-Pareto Front Search

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-21

    The JMP Add-In TopN-PFS provides an automated tool for finding layered Pareto fronts to identify the top N solutions from an enumerated list of candidates subject to optimizing multiple criteria. The approach constructs the N layers of Pareto fronts, and then provides a suite of graphical tools to explore the alternatives based on different prioritizations of the criteria. The tool is designed to provide a set of alternatives from which the decision-maker can select the best option for their study goals.
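
    The layered construction is non-dominated sorting: peel off the current Pareto front, then repeat on the remainder until N layers are collected. A minimal Python sketch of the idea (the JMP add-in itself is not reproduced); all criteria are minimized:

```python
def dominates(a, b):
    """a dominates b when a is no worse everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def layered_fronts(points, n_layers):
    remaining, layers = list(points), []
    for _ in range(n_layers):
        front = [p for p in remaining if not any(dominates(q, p) for q in remaining)]
        layers.append(front)
        remaining = [p for p in remaining if p not in front]
        if not remaining:
            break
    return layers

candidates = [(1, 9), (3, 3), (9, 1), (4, 4), (5, 5), (8, 8)]
print(layered_fronts(candidates, n_layers=3))
# [[(1, 9), (3, 3), (9, 1)], [(4, 4)], [(5, 5)]]
```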

  1. Comprehensive preference optimization of an irreversible thermal engine using pareto based mutable smart bee algorithm and generalized regression neural network

    DEFF Research Database (Denmark)

    Mozaffari, Ahmad; Gorji-Bandpy, Mofid; Samadian, Pendar

    2013-01-01

    Optimizing and controlling complex engineering systems is a phenomenon that has attracted the increasing interest of numerous scientists. Until now, a variety of intelligent optimizing and controlling techniques such as neural networks, fuzzy logic, game theory, support vector machines...... and stochastic algorithms have been proposed to facilitate controlling of the engineering systems. In this study, an extended version of the mutable smart bee algorithm (MSBA) called Pareto based mutable smart bee (PBMSB) is proposed to cope with multi-objective problems. Besides, a set of benchmark problems and four...... well-known Pareto based optimizing algorithms, i.e. the multi-objective bee algorithm (MOBA), multi-objective particle swarm optimization (MOPSO) algorithm, non-dominated sorting genetic algorithm (NSGA-II), and strength Pareto evolutionary algorithm (SPEA 2), are utilized to confirm the acceptable...

  2. Pareto joint inversion of 2D magnetotelluric and gravity data

    Science.gov (United States)

    Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek

    2015-04-01

    In this contribution, the first results of the "Innovative technology of petrophysical parameters estimation of geological media using joint inversion algorithms" project are described. At this stage of the development, a Pareto joint inversion scheme for 2D MT and gravity data was used. Additionally, seismic data were provided to set some constraints for the inversion. A Sharp Boundary Interface (SBI) approach and a model description with a set of polygons were used to limit the dimensionality of the solution space. The main engine was based on modified Particle Swarm Optimization (PSO). This algorithm was properly adapted to handle two or more target functions at once. An additional algorithm was used to eliminate non-realistic solution proposals. Because PSO is a method of stochastic global optimization, it requires a lot of proposals to be evaluated to find a single Pareto solution and then compose a Pareto front. To optimize this stage, parallel computing was used for both the inversion engine and the 2D MT forward solver. There are many advantages of the proposed approach to joint inversion problems. First of all, the Pareto scheme eliminates cumbersome rescaling of the target functions, which can strongly affect the final solution. Secondly, the whole set of solutions is created in one optimization run, providing a choice of the final solution. This choice can be based on qualitative data, which are usually very hard to incorporate into a regular inversion scheme. The SBI parameterisation not only limits the problem of dimensionality, but also makes constraining of the solution easier. At this stage of work, the decision was made to test the approach using MT and gravity data, because this combination is often used in practice. It is important to mention that the general solution is not limited to these two methods and is flexible enough to be used with more than two sources of data. The presented results were obtained for synthetic models, imitating real geological conditions, where

  3. Model-based problem solving through symbolic regression via pareto genetic programming

    NARCIS (Netherlands)

    Vladislavleva, E.

    2008-01-01

    Pareto genetic programming methodology is extended by additional generic model selection and generation strategies that (1) drive the modeling engine to creation of models of reduced non-linearity and increased generalization capabilities, and (2) improve the effectiveness of the search for robust

  4. Estimating extreme dry-spell risk in the Middle Ebro valley (Northeastern Spain). a comparative analysis of partial duration series with a General Pareto distribution and annual maxima series with a Gumbel distribution

    NARCIS (Netherlands)

    Vicente-Serrano, S.; Beguería, S.

    2003-01-01

    This paper analyses fifty-year time series of daily precipitation in a region of the middle Ebro valley (northern Spain) in order to predict extreme dry-spell risk. A comparison of observed and estimated maximum dry spells (50-year return period) showed that the Generalised Pareto (GP)

  5. Bivariate generalized Pareto distribution for extreme atmospheric particulate matter

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-02-01

    The high particulate matter (PM10) level is a prominent issue, causing various impacts on human health and seriously affecting the economy. The asymptotic theory of extreme values is applied to analyse the relation of extreme PM10 data from two nearby air quality monitoring stations. The series of daily maxima of PM10 for the Johor Bahru and Pasir Gudang stations are considered for the years 2001 to 2010. The 85% and 95% marginal quantiles are applied to determine the threshold values and hence construct the series of exceedances over the chosen threshold. The logistic, asymmetric logistic, negative logistic and asymmetric negative logistic models are considered as the dependence function for the joint distribution of a bivariate observation. Maximum likelihood estimation is employed for parameter estimation. The best fitting model is chosen based on the Akaike Information Criterion and quantile plots. It is found that the asymmetric logistic model gives the best fit for the bivariate extreme PM10 data and shows weak dependence between the two stations.
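
    The thresholding step is simple to sketch: take the 85% and 95% marginal quantiles as thresholds and form the series of exceedances above each. Synthetic data stand in for the two stations' daily maxima:

```python
import numpy as np

rng = np.random.default_rng(5)
pm10 = rng.gamma(shape=4.0, scale=15.0, size=3650)  # synthetic daily maxima of PM10

for q in (0.85, 0.95):
    u = np.quantile(pm10, q)
    exceedances = pm10[pm10 > u] - u                # input series for the GPD margin
    print(f"quantile {q:.0%}: threshold {u:.1f}, {exceedances.size} exceedances")
```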

  6. The application of analytical methods to the study of Pareto - optimal control systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2014-01-01

    The subject of this article is methods of multicriteria optimization and their application to the parametric synthesis of double-circuit control systems under mutually inconsistent individual criteria. The basis for solving multicriteria problems is a fundamental principle of multi-criteria choice, the Edgeworth-Pareto principle. Obtaining Pareto-optimal variants under inconsistent individual criteria does not mean reaching a final decision; the set of these options is only offered to the designer (the decision maker, DM). An important issue with traditional numerical methods is their computational cost. An example is the use of methods for probing the parameter space, including those using uniform grids and uniformly distributed sequences. Computer approximation of the Pareto bounds is a very complex computational task. The purpose of this work is the development of fairly simple search methods for Pareto-optimal solutions for the case of criteria given in analytical form. The proposed solution is based on studying the properties of the analytical dependences of the criteria. A case not covered so far in the literature is treated, namely a task topology in which the indifference curves (level lines) do not touch. It is shown that compromise solutions can be identified for such tasks. The angular position of the antigradient to the indifference curves in the parameter space, relative to the coordinate axes, is used. Propositions on the comonotonicity and contramonotonicity characteristics and the angular characteristics of the antigradient for determining Pareto optimal solutions are formulated. A general calculation algorithm is considered: determine the range of permissible parameter values; investigate the comonotonicity and contramonotonicity properties; build the level (indifference) curves; determine the touch type: one-sided (the task is not strictly multicriteria) or two-sided (the objective relates to the Pareto

  7. Projections onto the Pareto surface in multicriteria radiation therapy optimization

    International Nuclear Information System (INIS)

    Bokrantz, Rasmus; Miettinen, Kaisa

    2015-01-01

    Purpose: To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. Methods: The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested where dose–volume histogram constraints are used to prevent that the projection causes a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window and spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate and a head and neck case. Results: The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose–volume histogram constraints were used. No consistent improvements in target homogeneity were observed. Conclusions: There are situations when Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan

  8. Projections onto the Pareto surface in multicriteria radiation therapy optimization.

    Science.gov (United States)

    Bokrantz, Rasmus; Miettinen, Kaisa

    2015-10-01

    To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested where dose-volume histogram constraints are used to prevent that the projection causes a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window and spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate and a head and neck case. The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose-volume histogram constraints were used. No consistent improvements in target homogeneity were observed. There are situations when Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan.

  9. A Pareto Optimal Auction Mechanism for Carbon Emission Rights

    Directory of Open Access Journals (Sweden)

    Mingxi Wang

    2014-01-01

    The carbon emission rights do not fit well into the framework of existing multi-item auction mechanisms because of their own unique features. This paper proposes a new auction mechanism which converges to a unique Pareto optimal equilibrium in a finite number of periods. In the proposed auction mechanism, the assignment outcome is Pareto efficient and the carbon emission rights’ resources are efficiently used. For commercial application and theoretical completeness, both discrete and continuous markets—represented by discrete and continuous bid prices, respectively—are examined, and the results show the existence of a Pareto optimal equilibrium under the constraint of individual rationality. With no ties, the Pareto optimal equilibrium can be further proven to be unique.

  10. Determination of Pareto frontier in multi-objective maintenance optimization

    International Nuclear Information System (INIS)

    Certa, Antonella; Galante, Giacomo; Lupo, Toni; Passannanti, Gianfranco

    2011-01-01

    The objective of a maintenance policy generally is the minimization of the global maintenance cost, which involves not only the direct costs of both the maintenance actions and the spare parts, but also those due to the system stops for preventive maintenance and the downtime for failure. For some operating systems, the failure event can be dangerous, so they are required to operate assuring a very high reliability level between two consecutive fixed stops. The present paper attempts to identify the set of elements on which to perform maintenance actions so that the system can assure the required reliability level until the next fixed stop for maintenance, minimizing both the global maintenance cost and the total maintenance time. In order to solve this constrained multi-objective optimization problem, an effective approach is proposed to obtain the best solutions (that is, the Pareto optimal frontier) among which the decision maker will choose the most suitable one. As is well known, describing the whole Pareto optimal frontier generally is a troublesome task. The paper proposes an algorithm able to rapidly overcome this problem, and its effectiveness is shown by an application to a case study regarding a complex series-parallel system.

  11. The Pareto Principle in the documentary control of television news programmes: implications for Media Asset Management

    Directory of Open Access Journals (Sweden)

    Jorge Caldera-Serrano

    2015-09-01

    The reuse of the audiovisual collections of television networks is analysed in order to determine whether the Pareto index holds, providing mechanisms for the control and exploitation of the least used part of the audiovisual collection. It is found that the Pareto correlation is established not only in use but also in the presence of thematic and onomastic elements in the archive and in the diffusion of content, so forms of control are proposed for the integration of information into the collection and of resources in diffusion. The Pareto index, Media Asset Management and the paradigm shift to digital are also described, as fundamental elements for understanding the problems, and the solutions for eliminating them, in retrieval and in the shaping of the collection. Keywords: Information processing. Television. Electronic media. Information systems evaluation.

  12. Modeling fractal structure of city-size distributions using correlation functions.

    Science.gov (United States)

    Chen, Yanguang

    2011-01-01

    Zipf's law is one of the most conspicuous empirical facts for cities; however, there is no convincing explanation for the scaling relation between rank and size and its scaling exponent. Using the idea from general fractals and scaling, I propose a dual competition hypothesis of city development to explain the value intervals and the special value, 1, of the power exponent. Zipf's law and Pareto's law can be mathematically transformed into one another, but represent different processes of urban evolution. Based on the Pareto distribution, a frequency correlation function can be constructed. By scaling analysis and multifractal spectra, the parameter interval of the Pareto exponent is derived as (0.5, 1]; based on the Zipf distribution, a size correlation function can be built, and it is opposite to the first one. By the second correlation function and the multifractal notion, the Pareto exponent interval is derived as [1, 2). Thus the process of urban evolution is governed by two effects: one is the Pareto effect, indicating city number increase (external complexity), and the other the Zipf effect, indicating city size growth (internal complexity). Because of the struggle between the two effects, the scaling exponent varies from 0.5 to 2; but if the two effects reach equilibrium with each other, the scaling exponent approaches 1. A series of mathematical experiments on hierarchical correlation are employed to verify the models, and the conclusion can be drawn that if cities in a given region follow Zipf's law, the frequency and size correlations will follow the scaling law. This theory can be generalized to interpret the inverse power-law distributions in various fields of the physical and social sciences.
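
    The mutual transformability of Zipf's law and Pareto's law is visible directly in rank-size data: for a Pareto tail with exponent α, log size falls against log rank with slope −1/α, so α = 1 gives the classical Zipf slope of −1. A minimal sketch on synthetic city sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
sizes = np.sort(10_000 * (1 + rng.pareto(1.0, size=2000)))[::-1]  # Pareto, alpha = 1
ranks = np.arange(1, sizes.size + 1)
slope, _ = np.polyfit(np.log(ranks), np.log(sizes), 1)
print(f"rank-size slope: {slope:.2f}")  # close to -1, i.e. Zipf's law
```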

  13. Distribution of scholarly publications among academic radiology departments.

    Science.gov (United States)

    Morelli, John N; Bokhari, Danial

    2013-03-01

    The aim of this study was to determine whether the distribution of publications among academic radiology departments in the United States is Gaussian (i.e., the bell curve) or Paretian. The search affiliation feature of the PubMed database was used to search for publications in 3 general radiology journals with high Impact Factors, originating at radiology departments in the United States affiliated with residency training programs. The distribution of the number of publications among departments was examined using χ² test statistics to determine whether it followed a Pareto or a Gaussian distribution more closely. A total of 14,219 publications contributed since 1987 by faculty members in 163 departments with residency programs were available for assessment. The data acquired were more consistent with a Pareto (χ² = 80.4) than a Gaussian (χ² = 659.5) distribution. The mean number of publications per department was 79.9 ± 146 (range, 0-943). The median number of publications was 16.5. The majority (>50%) of major radiology publications from academic departments with residency programs followed a Pareto rather than a Gaussian distribution.
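
    The comparison itself is a binned chi-square computation and can be sketched as follows. Synthetic counts stand in for the PubMed data, and the quantile-based binning is an assumption rather than the paper's exact procedure; the smaller statistic marks the better-fitting model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
counts = 5 * (1 + rng.pareto(1.2, size=163))       # synthetic per-department counts

edges = np.quantile(counts, np.linspace(0, 1, 9))  # 8 data-driven bins
observed, _ = np.histogram(counts, bins=edges)

for name, dist in [("Pareto", stats.pareto), ("Gaussian", stats.norm)]:
    params = dist.fit(counts)
    expected = counts.size * np.diff(dist.cdf(edges, *params))
    chi2 = np.sum((observed - expected) ** 2 / expected)
    print(f"{name}: chi-square = {chi2:.1f}")
```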

  14. Pareto optimality in infinite horizon linear quadratic differential games

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2013-01-01

    In this article we derive conditions for the existence of Pareto optimal solutions for linear quadratic infinite horizon cooperative differential games. First, we present a necessary and sufficient characterization for Pareto optimality which translates to solving a set of constrained optimal

  15. Tsallis distribution as a standard maximum entropy solution with 'tail' constraint

    International Nuclear Information System (INIS)

    Bercher, J.-F.

    2008-01-01

    We show that Tsallis' distributions can be derived from the standard (Shannon) maximum entropy setting, by incorporating a constraint on the divergence between the distribution and another distribution imagined as its tail. In this setting, we find an underlying entropy which is the Renyi entropy. Furthermore, escort distributions and generalized means appear as a direct consequence of the construction. Finally, the 'maximum entropy tail distribution' is identified as a Generalized Pareto Distribution
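
    The identification can be made concrete by matching functional forms. A hedged sketch in standard parametrizations (not the paper's notation): the Tsallis q-exponential density on [0, ∞) has the generalized Pareto form, with

```latex
p(x) = (2-q)\,\beta\,\bigl[1 + (q-1)\,\beta x\bigr]^{-1/(q-1)}, \qquad
f(x) = \frac{1}{\sigma}\left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi - 1},
\qquad 1 < q < 2,
```

    and matching exponents and scales gives 1/(q−1) = 1/ξ + 1, i.e. q = (1 + 2ξ)/(1 + ξ) and (q−1)β = ξ/σ.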

  16. Pareto 80/20 Law: Derivation via Random Partitioning

    Science.gov (United States)

    Lipovetsky, Stan

    2009-01-01

    The Pareto 80/20 Rule, also known as the Pareto principle or law, states that a small number of causes (20%) is responsible for a large percentage (80%) of the effect. Although widely recognized as a heuristic rule, this proportion has not been theoretically based. The article considers derivation of this 80/20 rule and some other standard…

  17. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

    Interesting properties and propositions in many branches of science, such as economics, have been obtained from the property that the cumulative distribution function of a random variable is a concave function. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice differentiable real-valued function g whose domain is an interval on the extended real line. The function g is said to be log-concave on the interval (a,b) if ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b), that is, (ln(g))'' ≤ 0. These authors have obtained log-concavity results for distributions such as the normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, found the log-concavity for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. These cases have been calculated for the Burr family as well, for each distribution that belongs to it. Also, log-concavity results for distributions such as generalized gamma distributions, Feller-Pareto distributions, generalized inverse Gaussian distributions and generalized log-normal distributions have been obtained.
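
    The defining inequality is easy to check numerically: a density g is log-concave exactly when (ln g)'' ≤ 0 on its support. A minimal sketch comparing two of the listed distributions; the normal passes, while the Pareto density is log-convex and fails (the cited works also treat related variants not shown here):

```python
import numpy as np
from scipy import stats

def logconcave_on_grid(pdf, grid):
    """Finite-difference check that (ln g)'' <= 0 on the grid."""
    second = np.gradient(np.gradient(np.log(pdf(grid)), grid), grid)
    return bool(np.all(second <= 1e-8))

x = np.linspace(0.1, 10.0, 2000)
print(logconcave_on_grid(stats.norm.pdf, x))                                # True
print(logconcave_on_grid(lambda t: stats.pareto.pdf(t, 2, scale=0.05), x))  # False
```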

  18. Derivative-free generation and interpolation of convex Pareto optimal IMRT plans

    Science.gov (United States)

    Hoffmann, Aswin L.; Siem, Alex Y. D.; den Hertog, Dick; Kaanders, Johannes H. A. M.; Huizenga, Henk

    2006-12-01

    In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning.

  19. Derivative-free generation and interpolation of convex Pareto optimal IMRT plans

    International Nuclear Information System (INIS)

    Hoffmann, Aswin L; Siem, Alex Y D; Hertog, Dick den; Kaanders, Johannes H A M; Huizenga, Henk

    2006-01-01

    In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning

  20. Studies on generalized kinetic model and Pareto optimization of a product-driven self-cycling bioprocess.

    Science.gov (United States)

    Sun, Kaibiao; Kasperski, Andrzej; Tian, Yuan

    2014-10-01

    The aim of this study is the optimization of a product-driven self-cycling bioprocess and the presentation of a way to determine the best possible decision variables out of a set of alternatives based on the designed model. Initially, a product-driven generalized kinetic model, which allows a flexible choice of the most appropriate kinetics, is designed and analysed. The optimization problem is given as a bi-objective one, where maximization of biomass productivity and minimization of unproductive loss of substrate are the objective functions. Then, the Pareto fronts are calculated for exemplary kinetics. It is found that in the designed bioprocess, a decrease of the emptying/refilling fraction and an increase of the substrate feeding concentration cause an increase of the biomass productivity. An increase of the emptying/refilling fraction and a decrease of the substrate feeding concentration cause a decrease of the unproductive loss of substrate. The preferred solutions are calculated using the minimum distance from an ideal solution method, while giving proposals of their modifications derived from a decision maker's reactions to the generated solutions.

  1. Improving Polyp Detection Algorithms for CT Colonography: Pareto Front Approach.

    Science.gov (United States)

    Huang, Adam; Li, Jiang; Summers, Ronald M; Petrick, Nicholas; Hara, Amy K

    2010-03-21

    We investigated a Pareto front approach to improving polyp detection algorithms for CT colonography (CTC). A dataset of 56 CTC colon surfaces with 87 proven positive detections of 53 polyps sized 4 to 60 mm was used to evaluate the performance of a one-step and a two-step curvature-based region growing algorithm. The algorithmic performance was statistically evaluated and compared based on the Pareto optimal solutions from 20 experiments by evolutionary algorithms. The false positive rate was lower (p < 0.05), and the results show that the Pareto optimization process can effectively help in fine-tuning and redesigning polyp detection algorithms.

  2. Can we reach Pareto optimal outcomes using bottom-up approaches?

    NARCIS (Netherlands)

    V. Sanchez-Anguix (Victor); R. Aydoğan (Reyhan); T. Baarslag (Tim); C.M. Jonker (Catholijn)

    2016-01-01

    Classically, disciplines like negotiation and decision making have focused on reaching Pareto optimal solutions due to their stability and efficiency properties. Despite the fact that many practical and theoretical algorithms have successfully attempted to provide Pareto optimal solutions,

  3. Pareto Improving Price Regulation when the Asset Market is Incomplete

    NARCIS (Netherlands)

    Herings, P.J.J.; Polemarchakis, H.M.

    1999-01-01

    When the asset market is incomplete, competitive equilibria are constrained suboptimal, which provides a scope for pareto improving interventions. Price regulation can be such a pareto improving policy, even when the welfare effects of rationing are taken into account. An appealing aspect of price

  4. Performance-based Pareto optimal design

    NARCIS (Netherlands)

    Sariyildiz, I.S.; Bittermann, M.S.; Ciftcioglu, O.

    2008-01-01

    A novel approach for performance-based design is presented, where Pareto optimality is pursued. Design requirements may contain linguistic information, which is difficult to bring into computation or make consistent their impartial estimations from case to case. Fuzzy logic and soft computing are

  5. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques

    International Nuclear Information System (INIS)

    Ottosson, Rickard O.; Sjoestroem, David; Behrens, Claus F.; Karlsson, Anna; Engstroem, Per E.; Knoeoes, Tommy; Ceberg, Crister

    2009-01-01

    Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head and neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered

  6. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques.

    Science.gov (United States)

    Ottosson, Rickard O; Engstrom, Per E; Sjöström, David; Behrens, Claus F; Karlsson, Anna; Knöös, Tommy; Ceberg, Crister

    2009-01-01

    Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head & neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered.

  7. Statement of Problem of Pareto Frontier Management and Its Solution in the Analysis and Synthesis of Optimal Systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2015-01-01

    The article concerns multi-criteria optimization (MCO), which assumes that the operation quality criteria of a system are independent, and specifies a way to improve the values of these criteria. Mutual contradiction of some criteria is a major problem in MCO. One of the most important areas of research is to obtain the so-called Pareto-optimal options. The subject of this research is the Pareto front, also called the Pareto frontier. The article discusses front classifications by geometric representation for the case of a two-criterion task. It presents a mathematical description of the front characteristics using gradients and their projections. A review of current domestic and foreign literature has revealed that the aim of published work on constructing the Pareto frontier is to conduct research under uncertainty, in a stochastic statement, without restrictions. Topologies in both the two- and the three-dimensional case are under consideration. The targets of modern applications are multi-agent systems and groups of players in differential games. However, none of the considered works addresses active management of the front. The objective of this article is to discuss the Pareto frontier research problem in a new formulation, namely, with the system developers and/or the decision makers (DM) actively managing the Pareto frontier. Such a formulation differs from the traditionally accepted approach based on the analysis of already existing solutions. The article discusses three ways to describe the quality of an object management system. The first is to use direct quality criteria for a closed-system model of the general oscillatory form. The second is to study a specific two-loop aircraft control system using the angular velocity and normal acceleration loops. The third is the use of integrated quality criteria. In all three cases, the selected criteria are
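
    The gradient machinery that this analysis builds on can be sketched directly: for two objectives to be minimized, a point is Pareto-critical when some convex combination of the two gradients vanishes (the gradients are anti-parallel or one is zero). A minimal sketch with illustrative quadratic objectives, not the article's aircraft criteria:

```python
import numpy as np

def pareto_critical(g1, g2, tol=1e-9):
    """True if some convex combination lam*g1 + (1-lam)*g2 is (near) zero."""
    d = g1 - g2
    lam = 0.5 if d @ d < tol else np.clip((g2 @ (g2 - g1)) / (d @ d), 0.0, 1.0)
    return np.linalg.norm(lam * g1 + (1 - lam) * g2) < tol

grad_f1 = lambda x: 2 * (x - np.array([1.0, 0.0]))  # f1 = ||x - (1, 0)||^2
grad_f2 = lambda x: 2 * (x - np.array([0.0, 1.0]))  # f2 = ||x - (0, 1)||^2
x_on = np.array([0.5, 0.5])    # on the segment joining the two minima
x_off = np.array([2.0, 2.0])   # away from it
print(pareto_critical(grad_f1(x_on), grad_f2(x_on)))    # True
print(pareto_critical(grad_f1(x_off), grad_f2(x_off)))  # False
```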

  8. PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning.

    Science.gov (United States)

    Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew

    2011-09-01

    In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows promise in optimizing the number

  9. Simultaneous navigation of multiple Pareto surfaces, with an application to multicriteria IMRT planning with multiple beam angle configurations.

    Science.gov (United States)

    Craft, David; Monz, Michael

    2010-02-01

    To introduce a method to simultaneously explore a collection of Pareto surfaces. The method will allow radiotherapy treatment planners to interactively explore treatment plans for different beam angle configurations as well as different treatment modalities. The authors assume a convex optimization setting and represent the Pareto surface for each modality or given beam set by a set of discrete points on the surface. Weighted averages of these discrete points produce a continuous representation of each Pareto surface. The authors calculate a set of Pareto surfaces and use linear programming to navigate across the individual surfaces, allowing switches between surfaces. The switches are organized such that the plan improves in the requested way, while keeping the change in dose as small as possible. The system is demonstrated on a phantom pancreas IMRT case using 100 different five-beam configurations and a multicriteria formulation with six objectives. The system has intuitive behavior and is easy to control. Also, because the underlying linear programs are small, the system is fast enough to offer real-time exploration of the Pareto surfaces of the given beam configurations. The system presented offers a sound starting point for building clinical systems for multicriteria exploration of different modalities and offers a controllable way to explore hundreds of beam angle configurations in IMRT planning, allowing the users to focus their attention on the dose distribution and treatment planning objectives instead of spending excessive time on the technicalities of delivery.

  10. Level Diagrams analysis of Pareto Front for multiobjective system redundancy allocation

    International Nuclear Information System (INIS)

    Zio, E.; Bazzo, R.

    2011-01-01

    Reliability-based and risk-informed design, operation, maintenance and regulation lead to multiobjective (multicriteria) optimization problems. In this context, the Pareto Front and Set found in a multiobjective optimality search provide a family of solutions among which the decision maker has to look for the best choice according to his or her preferences. Efficient visualization techniques for Pareto Front and Set analyses are needed for helping decision makers in the selection task. In this paper, we consider the multiobjective optimization of system redundancy allocation and use the recently introduced Level Diagrams technique for graphically representing the resulting Pareto Front and Set. Each objective and decision variable is represented on separate diagrams where the points of the Pareto Front and Set are positioned according to their proximity to ideally optimal points, as measured by a metric of normalized objective values. All diagrams are synchronized across all objectives and decision variables. On the basis of the analysis of the Level Diagrams, we introduce a procedure for reducing the number of solutions in the Pareto Front; from the reduced set of solutions, the decision maker can more easily identify his or her preferred solution.
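
    A brief sketch of the coordinate that Level Diagrams assign to each Pareto point, under the usual construction (objectives normalized to [0, 1] over the front, then reduced to a single norm measuring proximity to the ideal point); the objective values below are hypothetical:

        import numpy as np

        def level_diagram_norms(front, ord=2):
            # front: (n, m) array of Pareto-front objective values
            # (minimization).  Each objective is normalized to [0, 1]
            # over the front; the returned norm is 0 at the ideal point.
            f = np.asarray(front, dtype=float)
            lo, hi = f.min(axis=0), f.max(axis=0)
            z = (f - lo) / np.where(hi > lo, hi - lo, 1.0)
            return np.linalg.norm(z, ord=ord, axis=1)

        front = np.array([[0.1, 9.0], [0.4, 4.0], [0.8, 1.5]])
        # One value per solution, shared as the vertical coordinate in
        # every objective and decision-variable diagram.
        print(level_diagram_norms(front))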

  11. Pareto-depth for multiple-query image retrieval.

    Science.gov (United States)

    Hsiao, Ko-Jen; Calder, Jeff; Hero, Alfred O

    2015-02-01

    Most content-based image retrieval systems consider either one single query, or multiple queries that include the same object or represent the same semantic information. In this paper, we consider the content-based image retrieval problem for multiple query images corresponding to different image semantics. We propose a novel multiple-query information retrieval algorithm that combines the Pareto front method with efficient manifold ranking. We show that our proposed algorithm outperforms state-of-the-art multiple-query retrieval algorithms on real-world image databases. We attribute this performance improvement to concavity properties of the Pareto fronts, and prove a theoretical result that characterizes the asymptotic concavity of the fronts.
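
    The Pareto front method referred to here can be sketched as repeated peeling of non-dominated fronts ("Pareto depth"); in the multiple-query setting the minimized criteria would be, e.g., dissimilarities of each database item to two different queries. A toy illustration with invented data:

        import numpy as np

        def pareto_depths(points):
            # Depth 1 is the first non-dominated front; depth 2 is the
            # front left after removing depth-1 points, and so on
            # (all criteria minimized).
            pts = np.asarray(points, dtype=float)
            depth = np.zeros(len(pts), dtype=int)
            remaining = np.arange(len(pts))
            d = 0
            while remaining.size:
                d += 1
                sub = pts[remaining]
                nd = [i for i, p in enumerate(sub)
                      if not np.any(np.all(sub <= p, axis=1)
                                    & np.any(sub < p, axis=1))]
                depth[remaining[nd]] = d
                remaining = np.delete(remaining, nd)
            return depth

        # dissimilarity of six items to query A and query B (hypothetical)
        items = [(0.2, 0.9), (0.8, 0.1), (0.5, 0.5),
                 (0.6, 0.6), (0.9, 0.9), (0.3, 0.8)]
        print(pareto_depths(items))  # items of low depth are returned first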

  12. A new mechanism for maintaining diversity of Pareto archive in multi-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Hájek, J.; Szöllös, A.; Šístek, Jakub

    2010-01-01

    Roč. 41, 7-8 (2010), s. 1031-1057 ISSN 0965-9978 R&D Projects: GA AV ČR IAA100760702 Institutional research plan: CEZ:AV0Z10190503 Keywords : multi-objective optimization * micro-genetic algorithm * diversity * Pareto archive Subject RIV: BA - General Mathematics Impact factor: 1.004, year: 2010 http://www.sciencedirect.com/science/article/pii/S0965997810000451

  14. Searching for the Pareto frontier in multi-objective protein design.

    Science.gov (United States)

    Nanda, Vikas; Belure, Sandeep V; Shir, Ofer M

    2017-08-01

    The goal of protein engineering and design is to identify sequences that adopt three-dimensional structures of desired function. Often, this is treated as a single-objective optimization problem, identifying the sequence-structure solution with the lowest computed free energy of folding. However, many design problems are multi-state, multi-specificity, or otherwise require concurrent optimization of multiple objectives. There may be tradeoffs among objectives, where improving one feature requires compromising another. The challenge lies in determining solutions that are part of the Pareto optimal set: designs where no further improvement can be achieved in any of the objectives without degrading one of the others. Pareto optimality problems are found in all areas of study, from economics to engineering to biology, and computational methods have been developed specifically to identify the Pareto frontier. We review progress in multi-objective protein design and the development of Pareto optimization methods, and present a specific case study using multi-objective optimization methods to model the tradeoff among three parameters (stability, specificity, and complexity) of a set of interacting synthetic collagen peptides.

  15. Optimal transmitter power of an intersatellite optical communication system with reciprocal Pareto fading.

    Science.gov (United States)

    Liu, Xian

    2010-02-10

    This paper shows that optical signal transmission over intersatellite links with swaying transmitters can be described as an equivalent fading model. In this model, the instantaneous signal-to-noise ratio is stochastic and follows the reciprocal Pareto distribution. With this model, we show that the transmitter power can be minimized, subject to a specified outage probability, by appropriately adjusting some system parameters, such as the transmitter gain.
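
    A back-of-the-envelope sketch of the kind of power minimization described, under an assumed link model in which the instantaneous SNR is gamma = c*P/X with X Pareto-distributed; the model form, parameter names and values below are all hypothetical, chosen only to make the outage calculation concrete:

        import numpy as np

        alpha, xm = 3.0, 1.0   # Pareto shape and scale of the sway term X
        c = 2.0e3              # lumped link gain (antennas, loss, noise)
        gamma_th = 10.0        # SNR threshold (linear scale)
        eps = 1e-3             # target outage probability

        def outage(P):
            # P(SNR < gamma_th) = P(X > c*P/gamma_th) for transmit power P,
            # valid when c*P/gamma_th >= xm.
            return (xm * gamma_th / (c * P)) ** alpha

        # Minimum power meeting the outage target, in closed form.
        P_min = xm * gamma_th / (c * eps ** (1.0 / alpha))
        print(P_min, outage(P_min))          # outage(P_min) equals eps

        # Monte Carlo sanity check of the closed form.
        X = xm * np.random.default_rng(0).random(200000) ** (-1.0 / alpha)
        print(np.mean(c * P_min / X < gamma_th))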

  16. Pareto front estimation for decision making.

    Science.gov (United States)

    Giagkiozis, Ioannis; Fleming, Peter J

    2014-01-01

    The set of available multi-objective optimisation algorithms continues to grow. This fact can be partially attributed to their widespread use and applicability. However, this increase also suggests several issues remain to be addressed satisfactorily. One such issue is the diversity and the number of solutions available to the decision maker (DM). Even for algorithms very well suited to a particular problem, it is difficult, mainly due to the computational cost, to use a population large enough to ensure the likelihood of obtaining a solution close to the DM's preferences. In this paper we present a novel methodology that produces additional Pareto optimal solutions from a Pareto optimal set obtained at the end of a run of any multi-objective optimisation algorithm, for two-objective and three-objective problem instances.

  17. PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning

    International Nuclear Information System (INIS)

    Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew

    2011-01-01

    Purpose: In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. Methods: pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. Results: pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows

  18. Vilfredo Pareto. L'economista alla luce delle lettere a Maffeo Pantaleoni. (Vilfredo Pareto. The economist in the light of his letters to Maffeo Pantaleoni)

    Directory of Open Access Journals (Sweden)

    E. SCHNEIDER

    2014-07-01

    The article is part of a special issue published on the occasion of the release of the entire scientific correspondence of Vilfredo Pareto with Maffeo Pantaleoni. The author reconstructs the beginning of their correspondence and their debate in pure mathematical economics, and draws the main conclusions on Pareto's differing views with respect to Marshall, Edgeworth and Fisher. JEL: B16, B31, C02, C60

  19. The Pareto Analysis for Establishing Content Criteria in Surgical Training.

    Science.gov (United States)

    Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N

    2016-01-01

    Current surgical training is still highly dependent on expensive operating room (OR) experience. Although there have been many attempts to transfer more training to the skills laboratory, little research is focused on which technical behaviors can lead to the highest profit when they are trained outside the OR. The Pareto principle states that in any population that contributes to a common effect, a few account for the bulk of the effect. This principle has been widely used in business management to increase company profits. This study uses the Pareto principle for establishing content criteria for more efficient surgical training. A retrospective study was conducted to assess verbal guidance provided by 9 supervising surgeons to 12 trainees performing 64 laparoscopic cholecystectomies in the OR. The verbal corrections were documented, tallied, and clustered according to the aimed change in novice behavior. The corrections were rank ordered, and a cumulative distribution curve was used to calculate which corrections accounted for 80% of the total number of verbal corrections. In total, 253 different verbal corrections were uttered 1587 times and were categorized into 40 different clusters of aimed changes in novice behaviors. The 35 highest-ranking verbal corrections (14%) and the 11 highest-ranking clusters (28%) accounted for 80% of the total number of given verbal corrections. Following the Pareto principle, we were able to identify the aspects of trainee behavior that account for most corrections given by supervisors during a laparoscopic cholecystectomy on humans. This strategy can be used for the development of new training programs to prepare the trainee in advance for the challenges encountered in the clinical setting in an OR. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
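
    The cumulative-distribution step of such a Pareto analysis is easy to reproduce; in the following sketch the cluster names and tallies are invented (chosen so the total matches the 1587 corrections reported), and the loop reports the point at which the running share first reaches 80%:

        from collections import Counter

        # hypothetical tallies: cluster of aimed behavior change -> count
        tallies = Counter({"instrument handling": 420, "camera view": 310,
                           "dissection plane": 250, "tissue traction": 180,
                           "clip placement": 120, "remaining clusters": 307})

        total = sum(tallies.values())
        running = 0
        for cluster, n in tallies.most_common():
            running += n
            print(f"{cluster:20s} {n:4d}  cumulative {running / total:5.1%}")
            if running / total >= 0.80:
                break  # clusters up to here account for ~80% of corrections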

  20. An approach to multiobjective optimization of rotational therapy. II. Pareto optimal surfaces and linear combinations of modulated blocked arcs for a prostate geometry.

    Science.gov (United States)

    Pardo-Montero, Juan; Fenwick, John D

    2010-06-01

    The purpose of this work is twofold: to further develop an approach to multiobjective optimization of rotational therapy treatments recently introduced by the authors [J. Pardo-Montero and J. D. Fenwick, "An approach to multiobjective optimization of rotational therapy," Med. Phys. 36, 3292-3303 (2009)], especially regarding its application to realistic geometries, and to study the quality (Pareto optimality) of plans obtained using such an approach by comparing them with Pareto optimal plans obtained through inverse planning. In the authors' previous work, a methodology was proposed for constructing a large number of plans, with different compromises between the objectives involved, from a small number of geometrically based arcs, each arc prioritizing different objectives. Here, this method has been further developed and studied. Two different techniques for constructing these arcs are investigated, one based on image-reconstruction algorithms and the other based on more common gradient-descent algorithms. The difficulty of dealing with organs abutting the target, briefly reported in previous work of the authors, has been investigated using partial OAR unblocking. Optimality of the solutions has been investigated by comparison with a Pareto front obtained from inverse planning. A relative Euclidean distance has been used to measure the distance of these plans to the Pareto front, and dose volume histogram comparisons have been used to gauge the clinical impact of these distances. A prostate geometry has been used for the study. For geometries where a blocked OAR abuts the target, moderate OAR unblocking can substantially improve the target dose distribution and minimize hot spots while not overly compromising dose sparing of the organ. Image-reconstruction and gradient-descent blocked-arc computations generate similar results. The Pareto front for the prostate geometry, reconstructed using a large number of inverse plans, presents a hockey-stick shape.

  1. Diversity shrinkage: Cross-validating pareto-optimal weights to enhance diversity via hiring practices.

    Science.gov (United States)

    Song, Q Chelsea; Wee, Serena; Newman, Daniel A

    2017-12-01

    To reduce adverse impact potential and improve diversity outcomes from personnel selection, one promising technique is De Corte, Lievens, and Sackett's (2007) Pareto-optimal weighting strategy. De Corte et al.'s strategy has been demonstrated on (a) a composite of cognitive and noncognitive (e.g., personality) tests (De Corte, Lievens, & Sackett, 2008) and (b) a composite of specific cognitive ability subtests (Wee, Newman, & Joseph, 2014). Both studies illustrated how Pareto weighting (in contrast to unit weighting) could lead to substantial improvement in diversity outcomes (i.e., diversity improvement), sometimes more than doubling the number of job offers for minority applicants. The current work addresses a key limitation of the technique: the possibility of shrinkage, especially diversity shrinkage, in the Pareto-optimal solutions. Using Monte Carlo simulations, sample size and predictor combinations were varied and cross-validated Pareto-optimal solutions were obtained. Although diversity shrinkage was sizable for a composite of cognitive and noncognitive predictors when sample size was at or below 500, diversity shrinkage was typically negligible for a composite of specific cognitive subtest predictors when sample size was at least 100. Diversity shrinkage was larger when the Pareto-optimal solution suggested substantial diversity improvement. When sample size was at least 100, cross-validated Pareto-optimal weights typically outperformed unit weights, suggesting that diversity improvement is often possible, despite diversity shrinkage. Implications for Pareto-optimal weighting, adverse impact, sample size of validation studies, and optimizing the diversity-job performance tradeoff are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Kinetics of wealth and the Pareto law.

    Science.gov (United States)

    Boghosian, Bruce M

    2014-04-01

    An important class of economic models involve agents whose wealth changes due to transactions with other agents. Several authors have pointed out an analogy with kinetic theory, which describes molecules whose momentum and energy change due to interactions with other molecules. We pursue this analogy and derive a Boltzmann equation for the time evolution of the wealth distribution of a population of agents for the so-called Yard-Sale Model of wealth exchange. We examine the solutions to this equation by a combination of analytical and numerical methods and investigate its long-time limit. We study an important limit of this equation for small transaction sizes and derive a partial integrodifferential equation governing the evolution of the wealth distribution in a closed economy. We then describe how this model can be extended to include features such as inflation, production, and taxation. In particular, we show that the model with taxation exhibits the basic features of the Pareto law, namely, a lower cutoff to the wealth density at small values of wealth, and approximate power-law behavior at large values of wealth.
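
    A minimal Monte Carlo sketch of the Yard-Sale exchange rule analyzed in the paper; the stake fraction, tax rate, and redistribution scheme below are hypothetical simplifications (the paper works with the Boltzmann-equation limit rather than a direct simulation), but the flat redistributive tax plays the stabilizing role the abstract describes:

        import numpy as np

        rng = np.random.default_rng(0)
        N, steps = 1000, 200000
        beta, tau = 0.1, 0.02        # stake fraction and tax rate (hypothetical)
        w = np.ones(N)               # all agents start with unit wealth

        for t in range(steps):
            i, j = rng.choice(N, size=2, replace=False)
            stake = beta * min(w[i], w[j])   # yard-sale rule: the poorer
            if rng.random() < 0.5:           # agent limits the stake
                w[i] += stake; w[j] -= stake
            else:
                w[i] -= stake; w[j] += stake
            if t % N == 0:                   # flat wealth tax, redistributed
                w = (1 - tau) * w + tau * w.mean()   # equally (conserves total)

        w.sort()
        top1 = w[-N // 100:].sum() / w.sum()
        print(f"top 1% wealth share: {top1:.2f}")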

  3. Towards a seascape typology. I. Zipf versus Pareto laws

    Science.gov (United States)

    Seuront, Laurent; Mitchell, James G.

    Two data analysis methods, referred to as the Zipf and Pareto methods, initially introduced in economics and linguistics two centuries ago and subsequently used in a wide range of fields (word frequency in languages and literature, human demographics, finance, city formation, genomics and physics), are described and proposed here as a potential tool to classify space-time patterns in marine ecology. The aim of this paper is, first, to present the theoretical bases of the Zipf and Pareto laws, and to demonstrate that they are strictly equivalent. In that way, we provide a one-to-one correspondence between their characteristic exponents and argue that the choice of technique is a matter of convenience. Second, we argue that the appeal of this technique is that it is assumption-free with respect to the distribution of the data and the regularity of the sampling interval, as well as being extremely easy to implement. Finally, in order to allow marine ecologists to identify and classify any structure in their data sets, we provide a step-by-step overview of the characteristic shapes expected for Zipf's law for the cases of randomness, power law behavior, power law behavior contaminated by internal and external noise, and competing power laws, illustrated on the basis of typical ecological situations such as mixing processes involving non-interacting and interacting species, phytoplankton growth processes and differential grazing by zooplankton.
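
    The equivalence of the two descriptions can be checked numerically: for a sample whose Pareto (complementary cumulative) exponent is a, the Zipf rank-size plot has slope -b with b = 1/a. A quick illustration on synthetic data:

        import numpy as np

        rng = np.random.default_rng(1)
        a = 1.5                            # Pareto (CCDF) exponent
        x = rng.pareto(a, 5000) + 1.0      # samples with P(X > x) = x**-a, x >= 1

        # Zipf view: size against rank on log-log axes, slope -b.
        s = np.sort(x)[::-1]
        r = np.arange(1, len(s) + 1)
        b = -np.polyfit(np.log(r), np.log(s), 1)[0]

        # Pareto view: maximum-likelihood (Hill) estimate of a.
        a_hat = len(x) / np.log(x).sum()

        print(f"Zipf slope b = {b:.2f}, 1/b = {1/b:.2f}, Pareto a = {a_hat:.2f}")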

  4. Regular distributive efficiency and the distributive liberal social contract.

    OpenAIRE

    Jean Mercier Ythier

    2009-01-01

    We consider abstract social systems of private property, made of n individuals endowed with non-paternalistic interdependent preferences, who interact through exchanges on competitive markets and Pareto-efficient lump-sum transfers. The transfers follow from a distributive liberal social contract, defined as a redistribution of initial endowments such that the resulting market equilibrium allocation is both Pareto-efficient relative to individual interdependent preferences and unanimously we...

  5. Determining the distribution of fitness effects using a generalized Beta-Burr distribution.

    Science.gov (United States)

    Joyce, Paul; Abdo, Zaid

    2017-07-12

    In Beisel et al. (2007), a likelihood framework, based on extreme value theory (EVT), was developed for determining the distribution of fitness effects for adaptive mutations. In this paper we extend this framework beyond the extreme distributions and develop a likelihood framework for testing whether or not extreme value theory applies. By making two simple adjustments to the Generalized Pareto Distribution (GPD) we introduce a new, simple, five-parameter probability density function that incorporates nearly every common (continuous) probability model ever used. This means that all of the common models are nested, which has important implications in model selection beyond determining the distribution of fitness effects. We demonstrate the use of this distribution with likelihood ratio testing to evaluate alternative distributions to the Gumbel and Weibull domains of attraction of fitness effects. We use a bootstrap strategy, utilizing importance sampling, to determine where in the parameter space the test will be most powerful in detecting deviations from these domains and at what sample size, with a focus on small sample sizes (n < 20). Our results indicate that the likelihood ratio test is most powerful in detecting deviation from the Gumbel domain when the shape parameters of the model are small, while the test is more powerful in detecting deviations from the Weibull domain when these parameters are large. As expected, an increase in sample size improves the power of the test. This improvement is observed to occur quickly with sample size: n ≥ 10 in tests related to the Gumbel domain and n ≥ 15 in the case of the Weibull domain. This manuscript is in tribute to the contributions of Dr. Paul Joyce to the areas of Population Genetics, Probability Theory and Mathematical Statistics. A Tribute section is provided at the end that includes Paul's original writing in the first iterations of this manuscript. The Introduction and Alternatives to the GPD sections
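
    The nested-model comparison described here can be sketched with scipy: under the Gumbel domain of attraction the GPD shape parameter is zero, so the null model is an exponential, and a likelihood ratio statistic compares it against the full GPD. The sample below is synthetic, and the chi-square reference is only the asymptotic stand-in for the bootstrap the paper actually uses:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        x = rng.gamma(2.0, 1.0, 15)        # hypothetical small sample of effects

        # Null: GPD with shape 0, i.e. an exponential (Gumbel domain).
        ll0 = stats.expon.logpdf(x, loc=0, scale=x.mean()).sum()

        # Alternative: full GPD with a free shape parameter.
        c, loc, scale = stats.genpareto.fit(x, floc=0)
        ll1 = stats.genpareto.logpdf(x, c, loc=loc, scale=scale).sum()

        lrt = 2.0 * (ll1 - ll0)
        print(lrt, stats.chi2.sf(lrt, df=1))   # asymptotic p-value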

  6. Multiclass gene selection using Pareto-fronts.

    Science.gov (United States)

    Rajapakse, Jagath C; Mundra, Piyushkumar A

    2013-01-01

    Filter methods are often used for the selection of genes in multiclass sample classification with microarray data. Such techniques usually tend to be biased toward a few classes that are easily distinguishable from the others, owing to imbalances of strong features and of the sample sizes of different classes. This can lead to the selection of redundant genes while relevant genes are missed, and hence to poor classification of tissue samples. In this manuscript, we propose to decompose multiclass ranking statistics into class-specific statistics and then use Pareto-front analysis for the selection of genes. This alleviates the bias induced by the intrinsic characteristics of dominating classes. The use of Pareto-front analysis is demonstrated on two filter criteria commonly used for gene selection: F-score and KW-score. A significant improvement in classification performance and a reduction in redundancy among top-ranked genes were achieved in experiments with both synthetic and real benchmark data sets.

  7. A Knowledge-Informed and Pareto-Based Artificial Bee Colony Optimization Algorithm for Multi-Objective Land-Use Allocation

    Directory of Open Access Journals (Sweden)

    Lina Yang

    2018-02-01

    Land-use allocation is of great significance in urban development. This type of allocation is usually considered to be a complex multi-objective spatial optimization problem, whose optimized result is a set of Pareto-optimal solutions (the Pareto front) reflecting different tradeoffs among several objectives. However, obtaining a Pareto front is a challenging task, and the Pareto fronts obtained by state-of-the-art algorithms are still not sufficient. To achieve better Pareto solutions, taking the grid-representative land-use allocation problem with two objectives as an example, an artificial bee colony optimization algorithm for multi-objective land-use allocation (ABC-MOLA) is proposed. In this algorithm, the traditional ABC's search direction guiding scheme and solution maintaining process are modified. In addition, a knowledge-informed neighborhood search strategy, which utilizes auxiliary knowledge of natural geography and spatial structures to facilitate the neighborhood spatial search around each solution, is developed to further improve the quality of the Pareto front. A series of comparison experiments (a simulated experiment with a small data volume and a real-world data experiment for a large area) shows that the Pareto fronts obtained by ABC-MOLA totally dominate the Pareto fronts of the other algorithms, which demonstrates ABC-MOLA's effectiveness in achieving Pareto fronts of high quality.

  8. A Collaborative Neurodynamic Approach to Multiple-Objective Distributed Optimization.

    Science.gov (United States)

    Yang, Shaofu; Liu, Qingshan; Wang, Jun

    2018-04-01

    This paper is concerned with multiple-objective distributed optimization. Based on objective weighting and decision space decomposition, a collaborative neurodynamic approach to multiobjective distributed optimization is presented. In the approach, a system of collaborative neural networks is developed to search for Pareto optimal solutions, where each neural network is associated with one objective function and given constraints. Sufficient conditions are derived for ascertaining the convergence to a Pareto optimal solution of the collaborative neurodynamic system. In addition, it is proved that each connected subsystem can generate a Pareto optimal solution when the communication topology is disconnected. Then, a switching-topology-based method is proposed to compute multiple Pareto optimal solutions for discretized approximation of Pareto front. Finally, simulation results are discussed to substantiate the performance of the collaborative neurodynamic approach. A portfolio selection application is also given.

  9. Tractable Pareto Optimization of Temporal Preferences

    Science.gov (United States)

    Morris, Robert; Morris, Paul; Khatib, Lina; Venable, Brent

    2003-01-01

    This paper focuses on temporal constraint problems where the objective is to optimize a set of local preferences for when events occur. In previous work, a subclass of these problems has been formalized as a generalization of Temporal CSPs, and a tractable strategy for optimization has been proposed, where global optimality is defined as maximizing the minimum of the component preference values. This criterion for optimality, which we call 'Weakest Link Optimization' (WLO), is known to have limited practical usefulness because solutions are compared only on the basis of their worst value; thus, there is no requirement to improve the other values. To address this limitation, we introduce a new algorithm that re-applies WLO iteratively in a way that leads to improvement of all the values. We show the value of this strategy by proving that, with suitable preference functions, the resulting solutions are Pareto Optimal.

  10. Competition and fragmentation: a simple model generating lognormal-like distributions

    International Nuclear Information System (INIS)

    Schwaemmle, V; Queiros, S M D; Brigatti, E; Tchumatchenko, T

    2009-01-01

    The current distribution of language size in terms of speaker population is generally described using a lognormal distribution. Analyzing the original real data we show how the double-Pareto lognormal distribution can give an alternative fit that indicates the existence of a power law tail. A simple Monte Carlo model is constructed based on the processes of competition and fragmentation. The results reproduce the power law tails of the real distribution well and give better results for a poorly connected topology of interactions.
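
    The kind of Monte Carlo model sketched in the abstract can be caricatured in a few lines: multiplicative growth produces a lognormal body, while fragmentation and competition (absorption) events feed a power-law tail. Everything below, from the event probabilities to the update rules, is an invented illustration rather than the authors' model:

        import numpy as np

        rng = np.random.default_rng(3)
        sizes = list(rng.lognormal(0.0, 0.5, 200))     # initial community

        for _ in range(20000):
            k = rng.integers(len(sizes))
            sizes[k] *= np.exp(rng.normal(0.0, 0.1))   # multiplicative growth
            if rng.random() < 0.05 and len(sizes) > 2: # fragmentation: split
                s = sizes.pop(rng.integers(len(sizes)))
                f = rng.random()
                sizes += [f * s, (1.0 - f) * s]
            if rng.random() < 0.05 and len(sizes) > 2: # competition: absorb
                i, j = sorted(rng.choice(len(sizes), 2, replace=False))
                sizes[i] += sizes[j]
                del sizes[j]                           # j > i, so i is intact

        sizes = np.array(sizes)
        print(len(sizes), np.median(sizes), sizes.max())  # heavy upper tail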

  11. Spectral-Efficiency - Illumination Pareto Front for Energy Harvesting Enabled VLC System

    KAUST Repository

    Abdelhady, Amr Mohamed Abdelaziz

    2017-12-13

    The continuous improvement in optical energy harvesting devices motivates visible light communication (VLC) system developers to utilize such available free energy sources. An outdoor VLC system is considered where an optical base station sends data to multiple users that are capable of harvesting the optical energy. The proposed VLC system serves multiple users using time division multiple access (TDMA) with unequal time and power allocation, which are allocated to improve the system performance. The adopted optical system provides users with illumination and data communication services. The outdoor optical design objective is to maximize the illumination, while the communication design objective is to maximize the spectral efficiency (SE). The design objectives are shown to be conflicting; therefore, a multiobjective optimization problem is formulated to obtain the Pareto front performance curve for the proposed system. To this end, the marginal optimization problems are solved first using low-complexity algorithms. Then, based on the proposed algorithms, a low-complexity algorithm is developed to obtain an inner bound of the Pareto front for the illumination-SE tradeoff. The inner bound is shown to be close to the optimal Pareto frontier in several simulation scenarios for different system parameters.

  12. Pareto navigation: algorithmic foundation of interactive multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Monz, M; Kuefer, K H; Bortfeld, T R; Thieke, C

    2008-01-01

    Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed, from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes, whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle, a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and of the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations, the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interaction, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far.

  14. A New Methodology to Select the Preferred Solutions from the Pareto-optimal Set: Application to Polymer Extrusion

    International Nuclear Information System (INIS)

    Ferreira, Jose C.; Gaspar-Cunha, Antonio; Fonseca, Carlos M.

    2007-01-01

    Most real-world optimization problems involve multiple, usually conflicting, optimization criteria. Generating Pareto optimal solutions plays an important role in multi-objective optimization, and the problem is considered to be solved when the Pareto optimal set is found, i.e., the set of non-dominated solutions. Multi-Objective Evolutionary Algorithms based on the principle of Pareto optimality are designed to produce the complete set of non-dominated solutions. However, this is not always enough, since the aim is not only to know the Pareto set but also to obtain one solution from it. Thus, a methodology able to select a single solution from the set of non-dominated solutions (or a region of the Pareto frontier), taking into account the preferences of a Decision Maker (DM), is necessary. A different method, based on a weighted stress function, is proposed. It is able to integrate the user's preferences in order to find the region of the Pareto frontier that best accords with these preferences. This method was tested on some benchmark test problems, with two and three criteria, and on a polymer extrusion problem. This methodology is able to efficiently select the best Pareto-frontier region for the specified relative importance of the criteria.

  15. Pareto-Optimal Model Selection via SPRINT-Race.

    Science.gov (United States)

    Zhang, Tiantian; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2018-02-01

    In machine learning, the notion of multi-objective model selection (MOMS) refers to the problem of identifying the set of Pareto-optimal models that simultaneously optimize more than one predefined objective by compromising among them. This paper introduces SPRINT-Race, the first multi-objective racing algorithm in a fixed-confidence setting, based on the sequential probability ratio test with an indifference zone. SPRINT-Race addresses the problem of MOMS with multiple stochastic optimization objectives in the proper Pareto-optimality sense. In SPRINT-Race, a pairwise dominance or non-dominance relationship is statistically inferred via a non-parametric, ternary-decision, dual-sequential probability ratio test. The overall probability of falsely eliminating any Pareto-optimal models or mistakenly returning any clearly dominated models is strictly controlled by a sequential Holm's step-down family-wise error rate control method. As a fixed-confidence model selection algorithm, the objective of SPRINT-Race is to minimize the computational effort required to achieve a prescribed confidence level about the quality of the returned models. The performance of SPRINT-Race is first examined via an artificially constructed MOMS problem with known ground truth. Subsequently, SPRINT-Race is applied to two real-world applications: 1) hybrid recommender system design and 2) multi-criteria stock selection. The experimental results verify that SPRINT-Race is an effective and efficient tool for such MOMS problems. The code of SPRINT-Race is available at https://github.com/watera427/SPRINT-Race.

  16. Pareto-optimal phylogenetic tree reconciliation.

    Science.gov (United States)

    Libeskind-Hadas, Ran; Wu, Yi-Chieh; Bansal, Mukul S; Kellis, Manolis

    2014-06-15

    Phylogenetic tree reconciliation is a widely used method for reconstructing the evolutionary histories of gene families and species, hosts and parasites, and other dependent pairs of entities. Reconciliation is typically performed using maximum parsimony, in which each evolutionary event type is assigned a cost and the objective is to find a reconciliation of minimum total cost. It is generally understood that reconciliations are sensitive to event costs, but little is understood about the relationship between event costs and solutions. Moreover, choosing appropriate event costs is a notoriously difficult problem. We address this problem by giving an efficient algorithm for computing Pareto-optimal sets of reconciliations, thus providing the first systematic method for understanding the relationship between event costs and reconciliations. This, in turn, results in new techniques for computing event support values and, for cophylogenetic analyses, performing robust statistical tests. We provide new software tools and demonstrate their use on a number of datasets from evolutionary genomic and cophylogenetic studies. Our Python tools are freely available at www.cs.hmc.edu/~hadas/xscape. © The Author 2014. Published by Oxford University Press.

  17. Sensitivity versus accuracy in multiclass problems using memetic Pareto evolutionary neural networks.

    Science.gov (United States)

    Fernández Caballero, Juan Carlos; Martínez, Francisco José; Hervás, César; Gutiérrez, Pedro Antonio

    2010-05-01

    This paper proposes a multiclassification algorithm using multilayer perceptron neural network models. It tries to boost two conflicting main objectives of multiclassifiers: a high correct classification rate and a high classification rate for each class. This last objective is not usually optimized in classification, but is considered here given the need to obtain high precision in each class in real problems. To solve this machine learning problem, we use a Pareto-based multiobjective optimization methodology based on a memetic evolutionary algorithm. We consider a memetic Pareto evolutionary approach based on the NSGA2 evolutionary algorithm (MPENSGA2). Once the Pareto front is built, two strategies for automatic individual selection are used: the best model in accuracy and the best model in sensitivity (the extremes of the Pareto front). These methodologies are applied to solve 17 classification benchmark problems obtained from the University of California at Irvine (UCI) repository and one complex real classification problem. The models obtained show high accuracy and a high classification rate for each class.

  18. TU-C-17A-01: A Data-Based Development for Practical Pareto Optimality Assessment and Identification

    International Nuclear Information System (INIS)

    Ruan, D; Qi, S; DeMarco, J; Kupelian, P; Low, D

    2014-01-01

    Purpose: To develop an efficient Pareto optimality assessment scheme to support plan comparison and the practical determination of best-achievable treatment plan goals. Methods: Pareto efficiency reflects the tradeoffs among competing target coverage and normal tissue sparing in multi-criterion optimization (MCO) based treatment planning. Assessing and understanding Pareto optimality provides insightful guidance for future planning. However, current MCO-driven Pareto estimation makes relaxed assumptions about the Pareto structure and insufficiently accounts for practical limitations in beam complexity, leading to performance upper bounds that may be unachievable. This work proposes an alternative data-driven approach that implicitly incorporates the practical limitations and identifies the Pareto frontier subset by eliminating dominated plans incrementally using the Edgeworth Pareto hull (EPH). The exactness of this elimination process also permits the development of a hierarchical procedure for speedup when the plan cohort size is large, by partitioning the cohort and performing elimination in each subset before a final aggregated elimination. The developed algorithm was first tested in 2D and 3D, where accuracy can be reliably assessed. As a specific application, the algorithm was applied to compare systematic plan quality for the lower head-and-neck amongst 4 competing treatment modalities. Results: The algorithm agrees exactly with brute-force pairwise comparison and visual inspection in low dimensions. The hierarchical algorithm shows a sqrt(k)-fold speedup, with k the number of data points in the plan cohort, demonstrating good efficiency enhancement for heavy testing tasks. Application to plan performance comparison showed the superiority of tomotherapy plans for the lower head-and-neck, and revealed a potentially nonconvex Pareto frontier structure. Conclusion: An accurate and efficient scheme to identify the Pareto frontier from a plan cohort has been
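
    The partition-then-aggregate elimination described in the Methods can be sketched generically (this is not the authors' EPH implementation, just a plain dominance filter arranged hierarchically; the exactness argument is that a plan dominated within its block is dominated globally, and dominance is transitive, so the final pass removes everything else):

        import numpy as np

        def nondominated(pts):
            # Indices of non-dominated rows (all objectives minimized).
            keep = [i for i, p in enumerate(pts)
                    if not np.any(np.all(pts <= p, axis=1)
                                  & np.any(pts < p, axis=1))]
            return np.array(keep, dtype=int)

        def pareto_hierarchical(pts, blocks=8):
            # Eliminate dominated plans inside each block, then run a
            # final elimination on the pooled survivors.
            pts = np.asarray(pts, dtype=float)
            idx = np.arange(len(pts))
            survivors = np.concatenate(
                [b[nondominated(pts[b])] for b in np.array_split(idx, blocks)])
            return survivors[nondominated(pts[survivors])]

        plans = np.random.default_rng(4).random((400, 3))  # synthetic metrics
        print(len(pareto_hierarchical(plans)))             # frontier plan count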

  20. Microergodicity effects on ebullition of methane modelled by Mixed Poisson process with Pareto mixing variable

    Czech Academy of Sciences Publication Activity Database

    Jordanova, P.; Dušek, Jiří; Stehlík, M.

    2013-01-01

    Roč. 128, OCT 15 (2013), s. 124-134 ISSN 0169-7439 R&D Projects: GA ČR(CZ) GAP504/11/1151; GA MŠk(CZ) ED1.1.00/02.0073 Institutional support: RVO:67179843 Keywords : environmental chemistry * ebullition of methane * mixed poisson processes * renewal process * pareto distribution * moving average process * robust statistics * sedge–grass marsh Subject RIV: EH - Ecology, Behaviour Impact factor: 2.381, year: 2013

  1. Application of extreme value distribution function in the determination of standard meteorological parameters for nuclear power plants

    International Nuclear Information System (INIS)

    Jiang Haimei; Liu Xinjian; Qiu Lin; Li Fengju

    2014-01-01

    Based on meteorological data from weather stations around several domestic nuclear power plants, the statistical results for extreme minimum temperatures, minimum central pressures of tropical cyclones and some other parameters are calculated using the extreme value type I distribution function (EV-I), the generalized extreme value distribution function (GEV) and the generalized Pareto distribution function (GP), respectively. The influence of different distribution functions and parameter estimation methods on the statistical results for the extreme values is investigated. Results indicate that the generalized extreme value function has better applicability than the other two distribution functions in the determination of standard meteorological parameters for nuclear power plants. (authors)
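
    A minimal sketch of such a comparison using scipy, with a synthetic series of annual extremes standing in for the station data; the return-period level and AIC are common ways to contrast the EV-I (Gumbel) and GEV fits:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        annual_max = rng.gumbel(25.0, 4.0, 40)   # hypothetical 40-year record

        gum = stats.gumbel_r.fit(annual_max)     # EV-I: (location, scale)
        gev = stats.genextreme.fit(annual_max)   # GEV: (shape, loc, scale)

        p = 1.0 - 1.0 / 100.0                    # 100-year return level
        print("EV-I:", stats.gumbel_r.ppf(p, *gum))
        print("GEV :", stats.genextreme.ppf(p, *gev))

        aic = lambda ll, k: 2 * k - 2 * ll       # GEV pays for its extra shape
        print(aic(stats.gumbel_r.logpdf(annual_max, *gum).sum(), 2),
              aic(stats.genextreme.logpdf(annual_max, *gev).sum(), 3))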

  2. Pareto Optimal Design for Synthetic Biology.

    Science.gov (United States)

    Patanè, Andrea; Santoro, Andrea; Costanza, Jole; Carapezza, Giovanni; Nicosia, Giuseppe

    2015-08-01

    Recent advances in synthetic biology call for robust, flexible and efficient in silico optimization methodologies. We present a Pareto design approach for the bi-level optimization problem associated with the overproduction of specific metabolites in Escherichia coli. Our method efficiently explores the high-dimensional genetic manipulation space, finding a number of trade-offs between synthetic and biological objectives, hence furnishing deeper biological insight into the addressed problem and important results for industrial purposes. We demonstrate the computational capabilities of our Pareto-oriented approach by comparing it with state-of-the-art heuristics on the overproduction problems of i) 1,4-butanediol, ii) myristoyl-CoA, iii) malonyl-CoA, iv) acetate and v) succinate. We show that our algorithms are able to gracefully adapt and scale to more complex models and more biologically relevant simulations of the genetic manipulations allowed. The results obtained for 1,4-butanediol overproduction significantly outperform previous results in terms of the 1,4-butanediol to biomass formation ratio and knock-out costs. In particular, the overproduction improvement is +662.7%, from 1.425 mmol h⁻¹ gDW⁻¹ (wild type) to 10.869 mmol h⁻¹ gDW⁻¹, with a knockout cost of 6. Moreover, the Pareto-optimal designs we have found in the fatty acid optimizations strictly dominate those obtained by the other methodologies, e.g., biomass and myristoyl-CoA exportation improvements of +21.43% (0.17 h⁻¹) and +5.19% (1.62 mmol h⁻¹ gDW⁻¹), respectively. Furthermore, the CPU time required by our heuristic approach is more than halved. Finally, we implement pathway-oriented sensitivity analysis, epsilon-dominance analysis and robustness analysis to enhance our biological understanding of the problem and to improve the optimization algorithm's capabilities.

  3. The Urbanik generalized convolutions in the non-commutative ...

    Indian Academy of Sciences (India)

    Now we apply this construction to the Kendall convolution case, starting with the weakly stable measure δ1. Example 1. Let △ be the Kendall convolution, i.e. the generalized convolution with the probability kernel δ1 △ δa = (1 − a)δ1 + aπ2 for a ∈ [0, 1], where π2 is the Pareto distribution.

  4. Fitting statistical distributions the generalized lambda distribution and generalized bootstrap methods

    CERN Document Server

    Karian, Zaven A

    2000-01-01

    Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from (all with their own formulas, tables, diagrams, and general properties) continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...

  5. Risk finance for catastrophe losses with Pareto-calibrated Lévy-stable severities.

    Science.gov (United States)

    Powers, Michael R; Powers, Thomas Y; Gao, Siwei

    2012-11-01

    For catastrophe losses, the conventional risk finance paradigm of enterprise risk management identifies transfer, as opposed to pooling or avoidance, as the preferred solution. However, this analysis does not necessarily account for differences between light- and heavy-tailed characteristics of loss portfolios. Of particular concern are the decreasing benefits of diversification (through pooling) as the tails of severity distributions become heavier. In the present article, we study a loss portfolio characterized by nonstochastic frequency and a class of Lévy-stable severity distributions calibrated to match the parameters of the Pareto II distribution. We then propose a conservative risk finance paradigm that can be used to prepare the firm for worst-case scenarios with regard to both (1) the firm's intrinsic sensitivity to risk and (2) the heaviness of the severity's tail. © 2012 Society for Risk Analysis.

  6. Optimization of Wind Turbine Airfoil Using Nondominated Sorting Genetic Algorithm and Pareto Optimal Front

    Directory of Open Access Journals (Sweden)

    Ziaul Huque

    2012-01-01

    A Computational Fluid Dynamics (CFD) and response-surface-based multiobjective design optimization was performed for six different 2D airfoil profiles, and the Pareto optimal front of each airfoil is presented. FLUENT, a commercial CFD simulation code, was used to determine the relevant aerodynamic loads. Lift Coefficient (CL) and Drag Coefficient (CD) data over a range of 0° to 12° angles of attack (α) and at three different Reynolds numbers (Re = 68,459; 479,210; and 958,422) were obtained for all six airfoils. A realizable k-ε turbulence model with a second-order upwind solution method was used in the simulations. The standard least-squares method was used to generate the response surfaces with the statistical code JMP. The elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) was used to determine the Pareto optimal set based on the response surfaces. Each Pareto optimal solution represents a different compromise between the design objectives, giving the designer the choice of a design compromise that best suits the requirements from a set of optimal solutions. The Pareto solution set is presented in the form of a Pareto optimal front.

  8. A divide-and-conquer approach to determine the Pareto frontier for optimization of protein engineering experiments.

    Science.gov (United States)

    He, Lu; Friedman, Alan M; Bailey-Kellogg, Chris

    2012-03-01

    In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability versus novelty, affinity versus specificity, activity versus immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not "dominated"; that is, no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, Protein Engineering Pareto FRontier (PEPFR), that hierarchically subdivides the objective space, using appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. Copyright © 2011 Wiley Periodicals, Inc.
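
    The linear relationship between optimizer calls and frontier size can be seen in a minimal two-objective sketch (not PEPFR itself, which hierarchically subdivides higher-dimensional objective spaces): sweep a threshold on the first objective and let a constrained optimizer return the lexicographically best design under it. The brute-force minimum stands in for the dynamic or integer programming subroutines, and the design list is hypothetical:

        # Each oracle call yields one new Pareto optimal design (or certifies
        # completeness), so calls grow linearly with the frontier size.
        def constrained_opt(designs, f1_cap):
            """Lexicographic min of (f2, f1) subject to f1 < f1_cap (both minimized)."""
            feasible = [d for d in designs if d[0] < f1_cap]
            return min(feasible, key=lambda d: (d[1], d[0])) if feasible else None

        def pareto_frontier(designs):
            frontier, cap = [], float("inf")
            while True:
                best = constrained_opt(designs, cap)
                if best is None:
                    return frontier        # final call certifies completeness
                frontier.append(best)
                cap = best[0]              # tighten the bound on objective 1

        designs = [(1, 9), (2, 7), (3, 7), (4, 3), (6, 2), (5, 1)]
        print(pareto_frontier(designs))    # [(5, 1), (4, 3), (2, 7), (1, 9)]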

  9. Concentration and size distribution of particles in abstracted groundwater.

    Science.gov (United States)

    van Beek, C G E M; de Zwart, A H; Balemans, M; Kooiman, J W; van Rosmalen, C; Timmer, H; Vandersluys, J; Stuyfzand, P J

    2010-02-01

    Particle number concentrations have been counted and particle size distributions calculated in groundwater derived from abstraction wells. Both concentration and size distribution are governed by the discharge rate: the higher this rate, the higher the concentration and the higher the proportion of larger particles. However, the particle concentration in groundwater derived from abstraction wells, with high groundwater flow velocities, is much lower than in groundwater from monitor wells, with minimal flow velocities. This inconsistency points to exhaustion of the particle supply in the aquifer around wells due to groundwater abstraction over many years. The particle size distribution can be described with the help of a power law or Pareto distribution. Comparing the measured particle size distribution with the Pareto distribution shows that particles with a diameter >7 microm are under-represented. As the particle size distribution is dependent on the flow velocity, so is the value of the "Pareto" slope beta. (c) 2009 Elsevier Ltd. All rights reserved.
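
    The Pareto slope of such a size distribution is commonly read off a log-log plot of the complementary cumulative count against diameter. A minimal sketch with synthetic data (not the study's measurements; a least-squares fit on the log-log CCDF is a quick estimate, and likelihood-based estimators are preferable in practice):

        import numpy as np

        rng = np.random.default_rng(0)
        beta_true, d_min = 2.5, 1.0          # slope and smallest counted diameter
        diam = d_min * (1.0 + rng.pareto(beta_true, 5000))   # classical Pareto sample

        d_sorted = np.sort(diam)
        ccdf = np.arange(len(d_sorted), 0, -1) / len(d_sorted)  # fraction >= d
        slope, _ = np.polyfit(np.log(d_sorted), np.log(ccdf), 1)
        print(f"fitted Pareto slope beta = {-slope:.2f}")        # close to 2.5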

  10. Ranking of microRNA target prediction scores by Pareto front analysis.

    Science.gov (United States)

    Sahoo, Sudhakar; Albrecht, Andreas A

    2010-12-01

    Over the past ten years, a variety of microRNA target prediction methods have been developed, and many of the methods are constantly improved and adapted to recent insights into miRNA-mRNA interactions. In a typical scenario, different methods return different rankings of putative targets, even if the ranking is reduced to selected mRNAs that are related to a specific disease or cell type. For the experimental validation it is then difficult to decide in which order to process the predicted miRNA-mRNA bindings, since each validation is a laborious task and therefore only a limited number of mRNAs can be analysed. We propose a new ranking scheme that combines ranked predictions from several methods and - unlike standard thresholding methods - utilises the concept of Pareto fronts as defined in multi-objective optimisation. In the present study, we attempt a proof of concept by applying the new ranking scheme to hsa-miR-21, hsa-miR-125b, and hsa-miR-373 and prediction scores supplied by PITA and RNAhybrid. The scores are interpreted as a two-objective optimisation problem, and the elements of the Pareto front are ranked by the STarMir score with a subsequent re-calculation of the Pareto front after removal of the top-ranked mRNA from the basic set of prediction scores. The method is evaluated on validated targets of the three miRNAs, and the ranking is compared to scores from DIANA-microT and TargetScan. We observed that the new ranking method performs well and consistently, and the first validated targets are elements of Pareto fronts at a relatively early stage of the recurrent procedure, which encourages further research towards a higher-dimensional analysis of Pareto fronts. Copyright © 2010 Elsevier Ltd. All rights reserved.
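
    A minimal sketch of the recurrent ranking idea (hypothetical scores, not PITA/RNAhybrid output): compute the Pareto front of two prediction scores, rank the front by a secondary score standing in for STarMir, remove the top-ranked target, and recompute:

        def pareto_front(items, s1, s2):
            """Items not dominated when larger s1 and s2 are better."""
            return [it for i, it in enumerate(items)
                    if not any(s1(o) >= s1(it) and s2(o) >= s2(it) and
                               (s1(o) > s1(it) or s2(o) > s2(it))
                               for j, o in enumerate(items) if j != i)]

        def recurrent_rank(items, s1, s2, tiebreak):
            ranking, pool = [], list(items)
            while pool:
                front = pareto_front(pool, s1, s2)
                top = max(front, key=tiebreak)   # stands in for the STarMir score
                ranking.append(top)
                pool.remove(top)
            return ranking

        targets = [("mRNA_a", 0.9, 0.2, 0.5), ("mRNA_b", 0.6, 0.8, 0.7),
                   ("mRNA_c", 0.4, 0.3, 0.9)]
        order = recurrent_rank(targets, s1=lambda t: t[1], s2=lambda t: t[2],
                               tiebreak=lambda t: t[3])
        print([t[0] for t in order])             # ['mRNA_b', 'mRNA_c', 'mRNA_a']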

  11. Pareto optimal design of sectored toroidal superconducting magnet for SMES

    Science.gov (United States)

    Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok

    2014-10-01

    A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering the practical engineering constraints. The objectives include the minimization of necessary superconductor length and torus overall size or volume, which determines a significant part of the cost towards realization of SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, and vice versa. The final choice among Pareto optimal configurations can be made in relation to other issues such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low temperature niobium-titanium based Rutherford type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with a reasonably acceptable accuracy.

  12. How Well Do We Know Pareto Optimality?

    Science.gov (United States)

    Mathur, Vijay K.

    1991-01-01

    Identifies sources of ambiguity in economics textbooks' discussion of the condition for efficient output mix. Points out that diverse statements without accompanying explanations create confusion among students. Argues that conflicting views concerning the concept of Pareto optimality are one source of ambiguity. Suggests clarifying additions to…

  13. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    Science.gov (United States)

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.
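
    The contrast between weighted-sum selection and the Pareto frontier can be made concrete with a minimal sketch (hypothetical calibration errors, one per target; smaller is better). Different weightings pick different "best-fitting" input sets, while the frontier covers all of them without choosing weights:

        input_sets = {"theta1": (0.10, 0.80), "theta2": (0.30, 0.30),
                      "theta3": (0.70, 0.12), "theta4": (0.60, 0.60)}

        def weighted_best(w1, w2, k=2):
            """Top-k input sets under a weighted-sum goodness-of-fit score."""
            return sorted(input_sets, key=lambda s: w1 * input_sets[s][0]
                                                    + w2 * input_sets[s][1])[:k]

        def pareto_frontier():
            keys = list(input_sets)
            return [a for a in keys
                    if not any(all(x <= y for x, y in zip(input_sets[b], input_sets[a]))
                               and input_sets[b] != input_sets[a]
                               for b in keys if b != a)]

        print(weighted_best(1.0, 0.1))   # ['theta1', 'theta2']
        print(weighted_best(0.1, 1.0))   # ['theta3', 'theta2']
        print(pareto_frontier())         # ['theta1', 'theta2', 'theta3']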

  14. An encoding technique for multiobjective evolutionary algorithms applied to power distribution system reconfiguration.

    Science.gov (United States)

    Guardado, J L; Rivas-Davalos, F; Torres, J; Maximov, S; Melgoza, E

    2014-01-01

    Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  15. Spatial redistribution of irregularly-spaced Pareto fronts for more intuitive navigation and solution selection

    NARCIS (Netherlands)

    A. Bouter (Anton); K. Pirpinia (Kleopatra); T. Alderliesten (Tanja); P.A.N. Bosman (Peter)

    2017-01-01

    A multi-objective optimization approach is often followed by an a posteriori decision-making process, during which the most appropriate solution of the Pareto set is selected by a professional in the field. Conventional visualization methods do not correct for Pareto fronts with

  16. The Aggregation of Individual Distributive Preferences through the Distributive Liberal Social Contract : Normative Analysis.

    OpenAIRE

    Jean Mercier-Ythier

    2010-01-01

    We consider abstract social systems of private property, made of n individuals endowed with non-paternalistic interdependent preferences, who interact through exchanges on competitive markets and Pareto-efficient lump-sum transfers. The transfers follow from a distributive liberal social contract, defined as a redistribution of initial endowments such that the resulting market equilibrium allocation is both Pareto-efficient relative to individual interdependent preferences, and unanimously weak...

  17. Transmuted Generalized Inverse Weibull Distribution

    OpenAIRE

    Merovci, Faton; Elbatal, Ibrahim; Ahmed, Alaa

    2013-01-01

    A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) in order to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base value distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties including explicit expression...

  18. Computing the Pareto-Nash equilibrium set in finite multi-objective mixed-strategy games

    Directory of Open Access Journals (Sweden)

    Victoria Lozan

    2013-10-01

    Full Text Available The Pareto-Nash equilibrium set (PNES) is described as the intersection of graphs of efficient response mappings. The problem of PNES computation in finite multi-objective mixed-strategy games (Pareto-Nash games) is considered. A method for PNES computation is studied. Mathematics Subject Classification 2010: 91A05, 91A06, 91A10, 91A43, 91A44.

  19. Estimations of parameters in Pareto reliability model in the presence of masked data

    International Nuclear Information System (INIS)

    Sarhan, Ammar M.

    2003-01-01

    Estimation of the parameters of the individual lifetime distributions of system components in a series system is considered in this paper, based on masked system life test data. We consider a series system of two independent components, each with a Pareto distributed lifetime. The maximum likelihood and Bayes estimators for the parameters and for the reliability of the system's components at a specific time are obtained. Symmetrical triangular prior distributions are assumed for the unknown parameters in deriving their Bayes estimators. Large simulation studies are carried out in order to: (i) explain how one can utilize the theoretical results obtained; (ii) compare the maximum likelihood and Bayes estimates of the underlying parameters; and (iii) study the influence of the masking level and the sample size on the accuracy of the estimates obtained.

  20. An Encoding Technique for Multiobjective Evolutionary Algorithms Applied to Power Distribution System Reconfiguration

    Directory of Open Access Journals (Sweden)

    J. L. Guardado

    2014-01-01

    Full Text Available Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  1. Pareto-optimal electricity tariff rates in the Republic of Armenia

    International Nuclear Information System (INIS)

    Kaiser, M.J.

    2000-01-01

    The economic impact of electricity tariff rates on the residential sector of Yerevan, Armenia, is examined. The effect of tariff design on revenue generation and equity measures is considered, and the combination of energy pricing and compensatory social policies which provides the best mix of efficiency and protection for poor households is examined. An equity measure is defined in terms of a cumulative distribution function which describes the percent of the population that spends x percent or less of their income on electricity consumption. An optimal (Pareto-efficient) tariff is designed based on the analysis of survey data and an econometric model, and the Armenian tariff rate effective 1 January 1997 to 15 September 1997 is shown to be non-optimal relative to this rate. 22 refs

  2. Pareto-Optimal Estimates of California Precipitation Change

    Science.gov (United States)

    Langenbrunner, Baird; Neelin, J. David

    2017-12-01

    In seeking constraints on global climate model projections under global warming, one commonly finds that different subsets of models perform well under different objective functions, and these trade-offs are difficult to weigh. Here a multiobjective approach is applied to a large set of subensembles generated from the Climate Model Intercomparison Project phase 5 ensemble. We use observations and reanalyses to constrain tropical Pacific sea surface temperatures, upper level zonal winds in the midlatitude Pacific, and California precipitation. An evolutionary algorithm identifies the set of Pareto-optimal subensembles across these three measures, and these subensembles are used to constrain end-of-century California wet season precipitation change. This methodology narrows the range of projections throughout California, increasing confidence in estimates of positive mean precipitation change. Finally, we show how this technique complements and generalizes emergent constraint approaches for restricting uncertainty in end-of-century projections within multimodel ensembles using multiple criteria for observational constraints.

  3. Generalized least squares and empirical Bayes estimation in regional partial duration series index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional estimation procedure that combines the index-flood concept with an empirical Bayes method for inferring regional information is introduced. The model is based on the partial duration series approach with generalized Pareto (GP) distributed exceedances. The prior information of the model parameters is inferred from regional data using generalized least squares (GLS) regression. Two different Bayesian T-year event estimators are introduced: a linear estimator that requires only some moments of the prior distributions to be specified and a parametric estimator that is based on specified...

  4. The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space

    Science.gov (United States)

    Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri

    2015-01-01

    When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes—phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass. PMID:26465336

  5. A Regionalization Approach to select the final watershed parameter set among the Pareto solutions

    Science.gov (United States)

    Park, G. H.; Micheletty, P. D.; Carney, S.; Quebbeman, J.; Day, G. N.

    2017-12-01

    The calibration of hydrological models often results in model parameters that are inconsistent with those from neighboring basins. Considering that physical similarity exists within neighboring basins, some of the physically related parameters should be consistent among them. Traditional manual calibration techniques require an iterative process to make the parameters consistent, which takes additional effort in model calibration. We developed a multi-objective optimization procedure to calibrate the National Weather Service (NWS) Research Distributed Hydrological Model (RDHM), using the Nondominated Sorting Genetic Algorithm (NSGA-II) with expert knowledge of the model parameter interrelationships as one objective function. The multi-objective algorithm enables us to obtain diverse parameter sets that are equally acceptable with respect to the objective functions and to choose one from the pool of parameter sets during a subsequent regionalization step. Although all Pareto solutions are non-inferior, we exclude parameter sets that show extreme values for any of the objective functions to expedite the selection process. We use an a priori model parameter set derived from the physical properties of the watershed (Koren et al., 2000) to assess the similarity of a given parameter across basins. Each parameter is assigned a weight based on its assumed similarity, such that parameters that are similar across basins are given higher weights. The parameter weights are used to compute a closeness measure between Pareto sets of nearby basins. The regionalization approach chooses the Pareto parameter set that minimizes the closeness measure of the basin being regionalized. The presentation will describe the results of applying the regionalization approach to a set of pilot basins in the Upper Colorado basin as part of a NASA-funded project.
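
    A minimal sketch of the described selection step (hypothetical parameter vectors and weights): among a basin's retained Pareto sets, choose the one with the smallest similarity-weighted distance to a neighboring basin's parameters, with higher weights on parameters expected to be similar across basins:

        import numpy as np

        pareto_sets = np.array([[0.9, 1.4, 0.30],   # candidate parameter vectors
                                [1.1, 1.0, 0.25],   # (one row per Pareto solution)
                                [1.4, 0.9, 0.45]])
        neighbor = np.array([1.0, 1.1, 0.40])       # neighboring basin's parameters
        weights = np.array([1.0, 0.5, 2.0])         # similarity-based weights

        closeness = np.sqrt(((pareto_sets - neighbor) ** 2 * weights).sum(axis=1))
        print(pareto_sets[np.argmin(closeness)])    # selects the second row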

  6. The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space.

    Science.gov (United States)

    Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri

    2015-10-01

    When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes--phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass.

  7. Beam configuration selection for robust intensity-modulated proton therapy in cervical cancer using Pareto front comparison.

    Science.gov (United States)

    van de Schoot, A J A J; Visser, J; van Kesteren, Z; Janssen, T M; Rasch, C R N; Bel, A

    2016-02-21

    The Pareto front reflects the optimal trade-offs between conflicting objectives and can be used to quantify the effect of different beam configurations on plan robustness and dose-volume histogram parameters. Therefore, our aim was to develop and implement a method to automatically approach the Pareto front in robust intensity-modulated proton therapy (IMPT) planning. Additionally, clinically relevant Pareto fronts based on different beam configurations will be derived and compared to enable beam configuration selection in cervical cancer proton therapy. A method to iteratively approach the Pareto front by automatically generating robustly optimized IMPT plans was developed. To verify plan quality, IMPT plans were evaluated on robustness by simulating range and position errors and recalculating the dose. For five retrospectively selected cervical cancer patients, this method was applied for IMPT plans with three different beam configurations using two, three and four beams. 3D Pareto fronts were optimized on target coverage (CTV D99%) and OAR doses (rectum V30Gy; bladder V40Gy). Per patient, proportions of non-approved IMPT plans were determined and differences between patient-specific Pareto fronts were quantified in terms of CTV D99%, rectum V30Gy and bladder V40Gy to perform beam configuration selection. Per patient and beam configuration, Pareto fronts were successfully sampled based on 200 IMPT plans of which on average 29% were non-approved plans. In all patients, IMPT plans based on the 2-beam set-up were completely dominated by plans with the 3-beam and 4-beam configurations. Compared to the 3-beam set-up, the 4-beam set-up increased the median CTV D99% on average by 0.2 Gy and decreased the median rectum V30Gy and median bladder V40Gy on average by 3.6% and 1.3%, respectively. This study demonstrates a method to automatically derive Pareto fronts in robust IMPT planning. For all patients, the defined four-beam configuration was found optimal

  8. Beam configuration selection for robust intensity-modulated proton therapy in cervical cancer using Pareto front comparison

    International Nuclear Information System (INIS)

    Van de Schoot, A J A J; Visser, J; Van Kesteren, Z; Rasch, C R N; Bel, A; Janssen, T M

    2016-01-01

    The Pareto front reflects the optimal trade-offs between conflicting objectives and can be used to quantify the effect of different beam configurations on plan robustness and dose-volume histogram parameters. Therefore, our aim was to develop and implement a method to automatically approach the Pareto front in robust intensity-modulated proton therapy (IMPT) planning. Additionally, clinically relevant Pareto fronts based on different beam configurations will be derived and compared to enable beam configuration selection in cervical cancer proton therapy. A method to iteratively approach the Pareto front by automatically generating robustly optimized IMPT plans was developed. To verify plan quality, IMPT plans were evaluated on robustness by simulating range and position errors and recalculating the dose. For five retrospectively selected cervical cancer patients, this method was applied for IMPT plans with three different beam configurations using two, three and four beams. 3D Pareto fronts were optimized on target coverage (CTV D99%) and OAR doses (rectum V30Gy; bladder V40Gy). Per patient, proportions of non-approved IMPT plans were determined and differences between patient-specific Pareto fronts were quantified in terms of CTV D99%, rectum V30Gy and bladder V40Gy to perform beam configuration selection. Per patient and beam configuration, Pareto fronts were successfully sampled based on 200 IMPT plans of which on average 29% were non-approved plans. In all patients, IMPT plans based on the 2-beam set-up were completely dominated by plans with the 3-beam and 4-beam configurations. Compared to the 3-beam set-up, the 4-beam set-up increased the median CTV D99% on average by 0.2 Gy and decreased the median rectum V30Gy and median bladder V40Gy on average by 3.6% and 1.3%, respectively. This study demonstrates a method to automatically derive Pareto fronts in robust IMPT planning. For all patients, the defined four-beam configuration was found optimal in

  9. A Pareto-Improving Minimum Wage

    OpenAIRE

    Eliav Danziger; Leif Danziger

    2014-01-01

    This paper shows that a graduated minimum wage, in contrast to a constant minimum wage, can provide a strict Pareto improvement over what can be achieved with an optimal income tax. The reason is that a graduated minimum wage requires high-productivity workers to work more to earn the same income as low-productivity workers, which makes it more difficult for the former to mimic the latter. In effect, a graduated minimum wage allows the low-productivity workers to benefit from second-degree pr...

  10. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
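
    A minimal sketch of Pareto depth by front peeling (synthetic dyads, two dissimilarity criteria, smaller is better): repeatedly remove the first Pareto front and record the iteration at which each dyad leaves. Dyads of anomalous samples tend to have large dissimilarities under all criteria and therefore land on deep fronts:

        def first_front(points):
            return [p for p in points
                    if not any(all(qi <= pi for qi, pi in zip(q, p)) and q != p
                               for q in points)]

        def pareto_depths(points):
            depth, remaining, k = {}, list(points), 1
            while remaining:
                front = first_front(remaining)
                for p in front:
                    depth[p] = k
                remaining = [p for p in remaining if p not in front]
                k += 1
            return depth

        dyads = [(0.1, 0.2), (0.4, 0.1), (0.3, 0.3), (0.8, 0.7), (0.2, 0.9)]
        print(pareto_depths(dyads))   # (0.8, 0.7) is deepest, depth 3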

  11. Bi-objective optimization for multi-modal transportation routing planning problem based on Pareto optimality

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2015-09-01

    Full Text Available Purpose: The purpose of this study is to solve the multi-modal transportation routing planning problem, which aims to select an optimal route to move a consignment of goods from its origin to its destination through the multi-modal transportation network, with the optimization carried out from two viewpoints: cost and time. Design/methodology/approach: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. Minimizing the total transportation cost and the total transportation time are set as the optimization objectives of the model. In order to balance the benefit between the two objectives, Pareto optimality is utilized to solve the model by obtaining its Pareto frontier, computed by the normalized normal constraint method, which can provide the multi-modal transportation operator (MTO) and customers with better decision support. Then, an experimental case study is designed to verify the feasibility of the model and Pareto optimality by using the mathematical programming software Lingo. Finally, a sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case. Findings: The calculation results indicate that the proposed model and Pareto optimality perform well in dealing with the bi-objective optimization. The sensitivity analysis also clearly shows the influence of variations in demand and supply on the multi-modal transportation organization. Therefore, this method can be promoted to practice. Originality/value: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem, and a Pareto frontier-based sensitivity analysis of demand and supply in the multi-modal transportation organization is performed based on the designed case.

  12. Calculation of Pareto-optimal solutions to multiple-objective problems using threshold-of-acceptability constraints

    Science.gov (United States)

    Giesy, D. P.

    1978-01-01

    A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage to both limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
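
    A minimal sketch of this scheme on a toy bi-objective problem (hypothetical objectives, not the paper's formulation): each stage minimizes one objective subject to a threshold-of-acceptability constraint on the other, and tightening the threshold traces out Pareto optimal points:

        import numpy as np
        from scipy.optimize import minimize

        f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2         # first objective
        f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2         # second objective

        pareto_points = []
        for cap in np.linspace(2.0, 0.2, 7):                 # thresholds on f2
            res = minimize(f1, x0=[0.5, 0.5], method="SLSQP",
                           constraints=[{"type": "ineq",
                                         "fun": lambda x, c=cap: c - f2(x)}])
            if res.success:
                pareto_points.append((round(f1(res.x), 3), round(f2(res.x), 3)))
        print(pareto_points)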

  13. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques

    DEFF Research Database (Denmark)

    Ottosson, Rickard O; Engstrom, Per E; Sjöström, David

    2008-01-01

    constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics...

  14. Cylinder Symmetric Measures with the Tail Property

    NARCIS (Netherlands)

    Balkema, A.A.

    2006-01-01

    Abstract: A Pareto distribution has the property that any tail of the distribution has the same shape as the original distribution. The exponential distribution and the uniform distribution have the tail property too. The tail property characterizes the univariate generalized Pareto distributions.
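
    The tail property can be stated explicitly for the generalized Pareto family; the following is the standard excess-distribution identity (a textbook result, not taken from the record). If $X \sim \mathrm{GPD}(\xi, \sigma)$ with survival function $\Pr(X > x) = (1 + \xi x/\sigma)^{-1/\xi}$, then for any threshold $u$ in the support,

        \Pr(X - u > y \mid X > u)
          = \frac{\bigl(1 + \xi (u + y)/\sigma\bigr)^{-1/\xi}}{\bigl(1 + \xi u/\sigma\bigr)^{-1/\xi}}
          = \Bigl(1 + \frac{\xi y}{\sigma + \xi u}\Bigr)^{-1/\xi},

    so the excess $X - u$ is again generalized Pareto with the same shape $\xi$ and rescaled scale $\sigma + \xi u$. The Pareto ($\xi > 0$), exponential ($\xi = 0$, interpreted as the limit $e^{-x/\sigma}$, for which the excess distribution is unchanged), and uniform ($\xi = -1$) cases are exactly the examples named in the abstract.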

  15. Strength Pareto particle swarm optimization and hybrid EA-PSO for multi-objective optimization.

    Science.gov (United States)

    Elhossini, Ahmed; Areibi, Shawki; Dony, Robert

    2010-01-01

    This paper proposes an efficient particle swarm optimization (PSO) technique that can handle multi-objective optimization problems. It is based on the strength Pareto approach originally used in evolutionary algorithms (EA). The proposed modified particle swarm algorithm is used to build three hybrid EA-PSO algorithms to solve different multi-objective optimization problems. This algorithm and its hybrid forms are tested using seven benchmarks from the literature and the results are compared to the strength Pareto evolutionary algorithm (SPEA2) and a competitive multi-objective PSO using several metrics. The proposed algorithm shows a slower convergence, compared to the other algorithms, but requires less CPU time. Combining PSO and evolutionary algorithms leads to superior hybrid algorithms that outperform SPEA2, the competitive multi-objective PSO (MO-PSO), and the proposed strength Pareto PSO based on different metrics.

  16. The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space.

    Directory of Open Access Journals (Sweden)

    Pablo Szekely

    2015-10-01

    Full Text Available When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes--phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass.

  17. Comment 1 on workshop in economics - a note on benefit-cost analysis and the distribution of benefits: The greenhouse effect

    International Nuclear Information System (INIS)

    Quinn, K.G.

    1992-01-01

    Benefit-cost analysis is a very useful tool for environmental problems in general and, as demonstrated by Kosobud, for global warming in particular. Depending upon the limitations of the available data, benefit-cost analysis can offer information to society about how to improve its condition. However, beyond the criticism of its estimate of the Pareto optimal point, benefit-cost analysis suffers from a fundamental weakness: It cannot speak to the distribution of the net benefits of implementation of an international greenhouse policy. Within an individual country, debate on a particular policy intervention can effectively separate the issues of achieving a potential Pareto optimum and distributing the benefits necessary to actually accomplish Pareto optimality. This situation occurs because (theoretically, anyway) these decisions are made in the presence of a binding enforcement regime that can redistribute benefits as seen fit. A policy can then be introduced in the manner that achieves the best overall net benefits, and the allocation of these benefits can be treated as a stand-alone problem

  18. Decomposition and Simplification of Multivariate Data using Pareto Sets.

    Science.gov (United States)

    Huettenberger, Lars; Heine, Christian; Garth, Christoph

    2014-12-01

    Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.

  19. Calculating and controlling the error of discrete representations of Pareto surfaces in convex multi-criteria optimization.

    Science.gov (United States)

    Craft, David

    2010-10-01

    A discrete set of points and their convex combinations can serve as a sparse representation of the Pareto surface in multiple objective convex optimization. We develop a method to evaluate the quality of such a representation, and show by example that in multiple objective radiotherapy planning, the number of Pareto optimal solutions needed to represent Pareto surfaces of up to five dimensions grows at most linearly with the number of objectives. The method described is also applicable to the representation of convex sets. Copyright © 2009 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  20. Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Serna, J I; Monz, M; Kuefer, K H; Thieke, C

    2009-01-01

    One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.

  1. Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning.

    Science.gov (United States)

    Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2009-10-21

    One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.

  2. Effects of the financial crisis on the wealth distribution of Korea's companies

    Science.gov (United States)

    Lim, Kyuseong; Kim, Soo Yong; Swanson, Todd; Kim, Jooyun

    2017-02-01

    We investigated the distribution functions of Korea's top-rated companies during two financial crises. A power-law scaling for the rank distribution, as well as for the cumulative probability distribution, was found and observed as a general pattern. Similar distributions appear in other studies of wealth and income. In our study, the Pareto exponents characterizing the distribution differed before and after the crisis. The companies covered in this research are divided into two subgroups for the period in which the subprime mortgage crisis occurred. Various industrial sectors of Korea's companies were found to respond differently during the two financial crises, especially the construction sector, financial sectors, and insurance groups.

  3. Analysis of the same day of the week increases in peak electricity ...

    African Journals Online (AJOL)

    Modelling of the same day of the week increases in peak electricity demand improves the reliability of a power network if an accurate assessment of the level and frequency of future extreme load forecasts is carried out. Key words: Gibbs sampling, generalized single Pareto, generalized Pareto distribution, Pareto quantile ...

  4. Pareto optimal design of sectored toroidal superconducting magnet for SMES

    Energy Technology Data Exchange (ETDEWEB)

    Bhunia, Uttam, E-mail: ubhunia@vecc.gov.in; Saha, Subimal; Chakrabarti, Alok

    2014-10-15

    Highlights: • The optimization approach minimizes both the magnet size and the necessary cable length of a sectored toroidal SMES unit. • The design approach is suitable for low temperature superconducting cable in a medium size SMES unit. • It investigates coil parameters with respect to practical engineering aspects. - Abstract: A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering the practical engineering constraints. The objectives include the minimization of necessary superconductor length and torus overall size or volume, which determines a significant part of the cost towards realization of SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, and vice versa. The final choice among Pareto optimal configurations can be made in relation to other issues such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low temperature niobium–titanium based Rutherford type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with a reasonably acceptable accuracy.

  5. Pareto optimal design of sectored toroidal superconducting magnet for SMES

    International Nuclear Information System (INIS)

    Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok

    2014-01-01

    Highlights: • The optimization approach minimizes both the magnet size and the necessary cable length of a sectored toroidal SMES unit. • The design approach is suitable for low temperature superconducting cable in a medium size SMES unit. • It investigates coil parameters with respect to practical engineering aspects. - Abstract: A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering the practical engineering constraints. The objectives include the minimization of necessary superconductor length and torus overall size or volume, which determines a significant part of the cost towards realization of SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, and vice versa. The final choice among Pareto optimal configurations can be made in relation to other issues such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low temperature niobium–titanium based Rutherford type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with a reasonably acceptable accuracy.

  6. The Transmuted Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Faton Merovci

    2014-05-01

    Full Text Available A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) in order to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base value distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties, including explicit expressions for the moments, quantiles, and moment generating function of the new distribution, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the generalized inverse Weibull distribution.
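
    A minimal sketch of the quadratic rank transmutation map (illustrative, not the paper's code): given any base CDF G, the transmuted CDF is F(x) = (1 + λ)G(x) − λG(x)², with |λ| ≤ 1 and λ = 0 recovering the base model. The plain inverse Weibull CDF G(x) = exp(−(β/x)^α) stands in here for the generalized inverse Weibull base:

        import math

        def inverse_weibull_cdf(x, alpha, beta):
            # Base CDF; stands in for the generalized inverse Weibull.
            return math.exp(-((beta / x) ** alpha)) if x > 0 else 0.0

        def transmuted_cdf(x, alpha, beta, lam):
            # Quadratic rank transmutation map applied to the base CDF.
            g = inverse_weibull_cdf(x, alpha, beta)
            return (1.0 + lam) * g - lam * g * g

        for x in (0.5, 1.0, 2.0, 5.0):       # lam != 0 reweights the tails
            print(x, round(transmuted_cdf(x, alpha=2.0, beta=1.0, lam=0.5), 4))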

  7. Improving predicted protein loop structure ranking using a Pareto-optimality consensus method.

    Science.gov (United States)

    Li, Yaohang; Rata, Ionel; Chiu, See-wing; Jakobsson, Eric

    2010-07-20

    Accurate protein loop structure models are important to understand the functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. We have developed a Pareto Optimal Consensus (POC) method, which is a consensus model ranking approach that integrates multiple knowledge- or physics-based scoring functions. The procedure of identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops of 4 to 12 residues in length, using a functional space composed of several carefully selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of approximately 20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding the best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation and identifying near-native models (RMSD within given cutoffs). Based on Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set.

  8. Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing

    NARCIS (Netherlands)

    Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.

    2006-01-01

    The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval

  9. Evaluation of a compound distribution based on weather pattern subsampling for extreme rainfall in Norway

    Directory of Open Access Journals (Sweden)

    J. Blanchet

    2015-12-01

    SCHADEX method for extreme flood estimation. Regional scores of evaluation are used in a split sample framework to compare the MEWP distribution with more general heavy-tailed distributions, in this case the Multi Generalized Pareto Weather Pattern (MGPWP) distribution. The analysis shows the clear benefit obtained from seasonal and weather pattern-based subsampling for extreme value estimation. The MEWP distribution is found to have an overall better performance as compared with the MGPWP, which tends to overfit the data and lacks robustness. Finally, we take advantage of the split sample framework to present evidence for an increase in extreme rainfall in the southwestern part of Norway during the period 1979–2009, relative to 1948–1978.

  10. On quasistability radius of a vector trajectorial problem with a principle of optimality generalizing Pareto and lexicographic principles

    Directory of Open Access Journals (Sweden)

    Sergey E. Bukhtoyarov

    2005-05-01

    Full Text Available A multicriterion linear combinatorial problem with a parametric principle of optimality is considered. This principle is defined by a partitioning of the partial criteria into groups, with the Pareto preference relation within each group and the lexicographic preference relation between groups. Quasistability of the problem is investigated. This type of stability is a discrete analog of Hausdorff lower semi-continuity of the multiple-valued mapping that defines the choice function. A formula for the quasistability radius is derived for the case of the metric l∞. Some known results are stated as corollaries. Mathematics Subject Classification 2000: 90C05, 90C10, 90C29, 90C31.

  11. Efficient approximation of black-box functions and Pareto sets

    NARCIS (Netherlands)

    Rennen, G.

    2009-01-01

    In the case of time-consuming simulation models or other so-called black-box functions, we determine a metamodel which approximates the relation between the input- and output-variables of the simulation model. To solve multi-objective optimization problems, we approximate the Pareto set, i.e. the

  12. Pareto-optimal alloys

    DEFF Research Database (Denmark)

    Bligaard, Thomas; Johannesson, Gisli Holmar; Ruban, Andrei

    2003-01-01

    Large databases that can be used in the search for new materials with specific properties remain an elusive goal in materials science. The problem is complicated by the fact that the optimal material for a given application is usually a compromise between a number of materials properties and the cost. In this letter we present a database consisting of the lattice parameters, bulk moduli, and heats of formation for over 64 000 ordered metallic alloys, which has been established by direct first-principles density-functional-theory calculations. Furthermore, we use a concept from economic theory, the Pareto-optimal set, to determine optimal alloy solutions for the compromise between low compressibility, high stability, and cost.

  13. Minimizing Harmonic Distortion Impact at Distribution System with Considering Large-Scale EV Load Behaviour Using Modified Lightning Search Algorithm and Pareto-Fuzzy Approach

    Directory of Open Access Journals (Sweden)

    S. N. Syed Nasir

    2018-01-01

    Full Text Available This research focuses on the optimal placement and sizing of multiple variable passive filters (VPF) to mitigate harmonic distortion due to charging stations (CS) in a 449-bus distribution network. There are 132 units of CS, scheduled based on user behaviour over 24 hours at 15-minute intervals. Considering the varying CS patterns and their harmonic impact, the Modified Lightning Search Algorithm (MLSA) is used to find the coordination of 22 units of VPF, so that fewer harmonics are injected from the 415 V bus into the medium-voltage network and power loss is also reduced. The power system harmonic flow, VPF, CS, and battery, as well as the analysis, are modelled on the MATLAB/m-file platform. High Performance Computing (HPC) is used to speed up the simulation. A Pareto-Fuzzy technique is used to obtain the sizing of VPF from all nondominated solutions. From the results, the optimal placements and sizes of VPF are able to reduce the maximum THD for voltage, the maximum THD for current, and the total apparent losses by up to 39.14%, 52.5%, and 2.96%, respectively. Therefore, it can be concluded that MLSA is a suitable method for mitigating harmonics and is beneficial in minimizing the impact of aggressive CS installation in the distribution network.

  14. Undersampling power-law size distributions: effect on the assessment of extreme natural hazards

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.

    2014-01-01

    The effect of undersampling on estimating the size of extreme natural hazards from historical data is examined. Tests using synthetic catalogs indicate that the tail of an empirical size distribution sampled from a pure Pareto probability distribution can range from having one to several unusually large events to appearing depleted, relative to the parent distribution. Both of these effects are artifacts caused by limited catalog length. It is more difficult to diagnose the artificially depleted empirical distributions, since one expects that a pure Pareto distribution is physically limited in some way. Using maximum likelihood methods and the method of moments, we estimate the power-law exponent and the corner size parameter of tapered Pareto distributions for several natural hazard examples: tsunamis, floods, and earthquakes. Each of these examples has a varying catalog length and measurement threshold, relative to the largest event sizes. In many cases where there are only several orders of magnitude between the measurement threshold and the largest events, joint two-parameter estimation techniques are necessary to account for estimation dependence between the power-law scaling exponent and the corner size parameter. Results indicate that whereas the corner size parameter of a tapered Pareto distribution can be estimated, its upper confidence bound cannot be determined, and the estimate itself is often unstable with time. Correspondingly, one cannot statistically reject a pure Pareto null hypothesis using natural hazard catalog data. Although physical limits to the hazard source size and attenuation mechanisms from source to site constrain the maximum hazard size, historical data alone often cannot reliably determine the corner size parameter. Probabilistic assessments incorporating theoretical constraints on source size and propagation effects are preferred over deterministic assessments of extreme natural hazards based on historical data.
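
    As an illustration of the undersampling effect described above (a sketch under assumed parameter values, not the authors' code), one can draw short synthetic catalogs from a pure Pareto distribution and watch how strongly the largest observed event depends on catalog length:

```python
import numpy as np

rng = np.random.default_rng(42)
x_min, b = 1.0, 1.0  # assumed threshold and power-law exponent (illustrative)

def pareto_sample(n):
    # Inverse-CDF sampling from a pure Pareto: S(x) = (x_min / x)**b
    u = rng.uniform(size=n)
    return x_min * u ** (-1.0 / b)

# Short catalogs give wildly varying "largest events" relative to the parent law.
for n in (50, 500, 5000):
    maxima = [pareto_sample(n).max() for _ in range(1000)]
    print(f"catalog length {n:5d}: median largest event {np.median(maxima):9.1f}, "
          f"90th percentile {np.percentile(maxima, 90):9.1f}")
```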

  15. Using the Pareto principle in genome-wide breeding value estimation.

    Science.gov (United States)

    Yu, Xijiang; Meuwissen, Theo H E

    2011-11-01

    Genome-wide breeding value (GWEBV) estimation methods can be classified based on the prior distribution assumptions of marker effects. Genome-wide BLUP methods assume a normal prior distribution for all markers with a constant variance, and are computationally fast. In Bayesian methods, more flexible prior distributions of SNP effects are applied that allow for very large SNP effects although most are small or even zero, but these methods are often computationally demanding as they rely on Markov chain Monte Carlo sampling. In this study, we adopted the Pareto principle to weight the available marker loci, i.e., we consider that x% of the loci explain (100 - x)% of the total genetic variance. Assuming this principle, it is also possible to define the variances of the prior distributions of the 'big' and 'small' SNP. The relatively few large SNP explain a large proportion of the genetic variance, while the majority of the SNP show small effects and explain a minor proportion of the genetic variance. We name this method MixP, where the prior distribution is a mixture of two normal distributions, i.e. one with a big variance and one with a small variance. Simulation results, using a real Norwegian Red cattle pedigree, show that MixP is at least as accurate as the other methods in all studied cases. The method also reduces the hyper-parameters of the prior distribution from 2 (proportion and variance of SNP with big effects) to 1 (proportion of SNP with big effects), assuming the overall genetic variance is known. The mixture-of-normals prior made it possible to solve the equations iteratively, which reduced computation loads by two orders of magnitude. In the era of marker densities reaching million(s) and whole-genome sequence data, MixP provides a computationally feasible Bayesian method of analysis.
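
    The variance split implied by the Pareto principle is simple arithmetic; the sketch below (an illustration, not the authors' implementation) derives the two prior variances of the MixP mixture from the proportion p of 'big' SNP, a known total genetic variance V_g, and the number of loci m:

```python
# If a proportion p of m loci explains a fraction (1 - p) of the total
# genetic variance V_g, the per-SNP prior variances of the two mixture
# components follow directly.
def mixp_prior_variances(p, V_g, m):
    var_big = (1.0 - p) * V_g / (p * m)      # few loci, large effects
    var_small = p * V_g / ((1.0 - p) * m)    # many loci, small effects
    return var_big, var_small

# Example: 10% of 50,000 loci explaining 90% of a unit genetic variance.
print(mixp_prior_variances(p=0.10, V_g=1.0, m=50_000))
```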

  16. Multiple Criteria Decision Making by Generalized Data Envelopment Analysis Introducing Aspiration Level Method

    International Nuclear Information System (INIS)

    Yun, Yeboon; Arakawa, Masao; Hiroshi, Ishikawa; Nakayama, Hirotaka

    2002-01-01

    For problems with two objective functions, genetic algorithms (GAs) have proved well suited to generating Pareto optimal solutions, and decision making can then easily be performed on the basis of visualized Pareto optimal solutions. However, it is difficult to visualize Pareto optimal solutions with GAs when the number of objective functions exceeds four. Hence, it is troublesome to grasp the trade-off among many objective functions, and decision makers hesitate to choose a final solution from a large number of Pareto optimal solutions. In order to solve these problems, we suggest an aspiration level approach using generalized data envelopment analysis and GAs. We show that the proposed method supports decision makers in choosing their desired solution from many Pareto optimal solutions. Furthermore, it will be seen that engineering design can be done effectively by the proposed method, which generates several Pareto optimal solutions close to the aspiration level and makes trade-off analysis easy.

  17. Diphoton generalized distribution amplitudes

    International Nuclear Information System (INIS)

    El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.

    2008-01-01

    We calculate the leading order diphoton generalized distribution amplitudes by calculating the amplitude of the process γ*γ→γγ in the low energy and high photon virtuality region at the Born order and in the leading logarithmic approximation. As in the case of the anomalous photon structure functions, the γγ generalized distribution amplitudes exhibit a characteristic ln Q² behavior and obey inhomogeneous QCD evolution equations.

  18. Solving multi-objective job shop problem using nature-based algorithms: new Pareto approximation features

    Directory of Open Access Journals (Sweden)

    Jarosław Rudy

    2015-01-01

    Full Text Available In this paper the job shop scheduling problem (JSP) in which two criteria are minimized simultaneously is considered. JSP is a frequently used model in real-world applications of combinatorial optimization. Multi-objective job shop problems (MOJSP) have rarely been studied. We implement and compare two multi-agent nature-based methods, namely ant colony optimization (ACO) and a genetic algorithm (GA), for MOJSP. Both methods employ a certain technique, taken from multi-criteria decision analysis, in order to establish a ranking of solutions. ACO and GA differ in the method of keeping information about previously found solutions and their quality, which affects the course of the search. As a result, new features of the Pareto approximations provided by said algorithms are observed: aside from the slight superiority of the ACO method, the Pareto frontier approximations provided by the two methods are disjoint sets. Thus, both methods can be used to search mutually exclusive areas of the Pareto frontier.

  19. Optimal Placement and Sizing of Renewable Distributed Generations and Capacitor Banks into Radial Distribution Systems

    Directory of Open Access Journals (Sweden)

    Mahesh Kumar

    2017-06-01

    Full Text Available In recent years, renewable types of distributed generation in the distribution system have been much appreciated due to their enormous technical and environmental advantages. This paper proposes a methodology for optimal placement and sizing of renewable distributed generation(s) (i.e., wind, solar and biomass) and capacitor banks into a radial distribution system. The intermittency of wind speed and solar irradiance is handled with multi-state modeling using suitable probability distribution functions. Three objective functions, i.e., power loss reduction, voltage stability improvement, and voltage deviation minimization, are optimized using an advanced Pareto-front non-dominated sorting multi-objective particle swarm optimization method. First, a set of non-dominated Pareto-front solutions is obtained from the algorithm. Then, a fuzzy decision technique is applied to extract the trade-off solution set. The effectiveness of the proposed methodology is tested on the standard IEEE 33 test system. The overall results reveal that the combination of renewable distributed generations and capacitor banks is dominant in power loss reduction, voltage stability and voltage profile improvement.
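
    The abstract applies a fuzzy decision technique to the non-dominated set without spelling out its form; a commonly used linear-membership min-max rule (an assumption here, not necessarily the exact rule of the paper) can be sketched as follows:

```python
import numpy as np

def fuzzy_compromise(F):
    """Pick a compromise solution from a Pareto set F (rows = solutions,
    columns = objectives, all to be minimized) by maximizing the weakest
    linear membership value."""
    f_min, f_max = F.min(axis=0), F.max(axis=0)
    mu = (f_max - F) / (f_max - f_min)       # 1 = best, 0 = worst per objective
    return int(np.argmax(mu.min(axis=1)))

# Hypothetical non-dominated solutions: (power loss, voltage deviation)
F = np.array([[0.10, 5.0], [0.12, 3.5], [0.20, 2.0]])
print("compromise solution index:", fuzzy_compromise(F))
```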

  20. Computing gap free Pareto front approximations with stochastic search algorithms.

    Science.gov (United States)

    Schütze, Oliver; Laumanns, Marco; Tantar, Emilia; Coello, Carlos A Coello; Talbi, El-Ghazali

    2010-01-01

    Recently, a convergence proof of stochastic search algorithms toward finite size Pareto set approximations of continuous multi-objective optimization problems has been given. The focus was on obtaining a finite approximation that captures the entire solution set in some suitable sense, which was defined by the concept of epsilon-dominance. Though bounds on the quality of the limit approximation, which are entirely determined by the archiving strategy and the value of epsilon, have been obtained, the strategies do not guarantee a gap free approximation of the Pareto front. That is, such approximations A can reveal gaps in the sense that points f in the Pareto front can exist such that the distance of f to any image point F(a), a ∈ A, is "large." Since such gap free approximations are desirable in certain applications, and the related archiving strategies can be advantageous when memetic strategies are included in the search process, we aim in this work for such methods. We present two novel strategies that accomplish this task in the probabilistic sense and under mild assumptions on the stochastic search algorithm. In addition to the convergence proofs, we give some numerical results to visualize the behavior of the different archiving strategies. Finally, we demonstrate the potential for a possible hybridization of a given stochastic search algorithm with a particular local search strategy, multi-objective continuation methods, by showing that the concept of epsilon-dominance can be integrated into this approach in a suitable way.
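
    A minimal sketch of one possible epsilon-dominance archive update for minimization (the paper's archiving strategies differ in detail; this only illustrates the concept):

```python
import numpy as np

def eps_dominates(f, g, eps):
    # f epsilon-dominates g if f - eps is at least as good everywhere, better somewhere
    f, g = np.asarray(f), np.asarray(g)
    return np.all(f - eps <= g) and np.any(f - eps < g)

def update_archive(archive, candidate, eps=0.05):
    if any(eps_dominates(a, candidate, eps) for a in archive):
        return archive                       # candidate adds nothing new
    # drop archived points the candidate epsilon-dominates, then add it
    return [a for a in archive if not eps_dominates(candidate, a, eps)] + [candidate]

archive = []
for point in [(1.0, 2.0), (0.98, 2.01), (0.5, 3.0), (2.0, 0.5)]:
    archive = update_archive(archive, point)
print(archive)
```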

  1. Probabilistic Models For Earthquakes With Large Return Periods In Himalaya Region

    Science.gov (United States)

    Chaudhary, Chhavi; Sharma, Mukat Lal

    2017-12-01

    Determination of the frequency of large earthquakes is of paramount importance for seismic risk assessment, as large events contribute a significant fraction of the total deformation, and these long-return-period events with a low probability of occurrence are not easily captured by classical distributions. Generally, with a small catalogue, these larger events follow a different distribution function from the smaller and intermediate events. It is thus of special importance to use statistical methods that analyse as closely as possible the range of extreme values, or the tail of the distributions, in addition to the main distributions. The generalized Pareto distribution family is widely used for modelling events which cross a specified threshold value. The Pareto, Truncated Pareto, and Tapered Pareto are special cases of the generalized Pareto family. In this work, the probability of earthquake occurrence has been estimated using the Pareto, Truncated Pareto, and Tapered Pareto distributions. As a case study we consider the Himalayas, whose orogeny gives rise to large earthquakes and which is one of the most active zones of the world. The whole Himalayan region has been divided into five seismic source zones according to seismotectonics and the clustering of events. Estimated probabilities of occurrence of earthquakes have also been compared with the modified Gutenberg-Richter distribution and the characteristic recurrence distribution. The statistical analysis reveals that the Tapered Pareto distribution describes seismicity better for the seismic source zones than the other distributions considered in the present study.
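
    The tapered Pareto form is not written out in the abstract; assuming the Kagan-style parameterization common in seismicity studies, with survival function S(x) = (u/x)^beta * exp((u - x)/theta) for x >= u, a maximum likelihood fit can be sketched as follows (illustrative synthetic data, not the study's catalogue):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, x, u):
    beta, theta = params
    if beta <= 0 or theta <= 0:
        return np.inf
    # density f(x) = (beta/x + 1/theta) * (u/x)**beta * exp((u - x)/theta)
    return -np.sum(np.log(beta / x + 1.0 / theta)
                   + beta * np.log(u / x) + (u - x) / theta)

rng = np.random.default_rng(0)
u = 1.0
x = u * (rng.pareto(1.2, size=500) + 1.0)    # stand-in events above threshold u
fit = minimize(neg_log_lik, x0=[1.0, 50.0], args=(x, u), method="Nelder-Mead")
print("beta, theta (corner size):", fit.x)   # pure-Pareto input drives theta large
```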

  2. The use of generalized functions and distributions in general relativity

    International Nuclear Information System (INIS)

    Steinbauer, R; Vickers, J A

    2006-01-01

    We review the extent to which one can use classical distribution theory in describing solutions of Einstein's equations. We show that there are a number of physically interesting cases which cannot be treated using distribution theory but require a more general concept. We describe a mathematical theory of nonlinear generalized functions based on Colombeau algebras and show how this may be applied in general relativity. We end by discussing the concept of singularity in general relativity and show that certain solutions with weak singularities may be regarded as distributional solutions of Einstein's equations. (topical review)

  3. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    International Nuclear Information System (INIS)

    Meenachi, N. Madurai; Baba, M. Sai

    2017-01-01

    This article describes the need for ontology matching and the methods to achieve it. Effort has been put into the implementation of a semantic web based knowledge management system for the nuclear domain, which necessitated the development of methods for ontology matching. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms such as the Jaro-Winkler distance, the Needleman-Wunsch algorithm, Bigram, and the Kullback and Cosine divergences are employed to demonstrate ontology matching. A case study on diversity in the nuclear reactor domain was carried out to analyse the ontology matching, and the same is illustrated.
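
    Of the string measures listed, the bigram (Dice) similarity is the simplest to show; the sketch below is an illustration with hypothetical concept labels, not the authors' implementation:

```python
def bigrams(s):
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bigram_similarity(a, b):
    # Dice coefficient over character bigrams: 2|A & B| / (|A| + |B|)
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2.0 * len(ba & bb) / (len(ba) + len(bb))

# Hypothetical labels from two reactor-domain ontologies:
print(bigram_similarity("coolant pump", "primary coolant pump"))
```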

  4. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    Energy Technology Data Exchange (ETDEWEB)

    Meenachi, N. Madurai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Planning and Human Resource Management Div.; Baba, M. Sai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Resources Management Group

    2017-12-15

    This article describes the need for ontology matching and the methods to achieve it. Effort has been put into the implementation of a semantic web based knowledge management system for the nuclear domain, which necessitated the development of methods for ontology matching. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms such as the Jaro-Winkler distance, the Needleman-Wunsch algorithm, Bigram, and the Kullback and Cosine divergences are employed to demonstrate ontology matching. A case study on diversity in the nuclear reactor domain was carried out to analyse the ontology matching, and the same is illustrated.

  5. From microscopic taxation and redistribution models to macroscopic income distributions

    Science.gov (United States)

    Bertotti, Maria Letizia; Modanese, Giovanni

    2011-10-01

    We present here a general framework, expressed by a system of nonlinear differential equations, suitable for modeling taxation and redistribution in a closed society. This framework allows one to describe the evolution of the income distribution over the population and to explain the emergence of collective features based on knowledge of the individual interactions. By making different choices of the framework parameters, we construct different models, whose long-time behavior is then investigated. Asymptotic stationary distributions are found, which enjoy properties similar to those observed in empirical distributions. In particular, they exhibit power-law tails of Pareto type, and their Lorenz curves and Gini indices are consistent with some real-world ones.

  6. Statistical distributions of extreme dry spell in Peninsular Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Jemain, Abdul Aziz

    2010-11-01

    Statistical distributions of annual extreme (AE) series and partial duration (PD) series for dry-spell events are analyzed for a database of daily rainfall records from 50 rain-gauge stations in Peninsular Malaysia, with the recording period extending from 1975 to 2004. The three-parameter generalized extreme value (GEV) and generalized Pareto (GP) distributions are considered to model both series. In both cases, the parameters of the two distributions are fitted by means of the L-moments method, which provides a robust estimation of them. The goodness-of-fit (GOF) between empirical data and theoretical distributions is then evaluated by means of the L-moment ratio diagram and several goodness-of-fit tests for each of the 50 stations. It is found that for the majority of stations, the AE and PD series are well fitted by the GEV and GP models, respectively. Based on the models that have been identified, we can reasonably predict the risks associated with extreme dry spells for various return periods.
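
    The L-moments method is named but not spelled out; as a sketch (Hosking-style estimators assumed, with location xi, scale alpha and shape k, and synthetic data standing in for the dry-spell lengths), a GP fit by sample L-moments looks like this:

```python
import numpy as np

def sample_lmoments(x):
    # First three L-moments via unbiased probability-weighted moments
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2                   # l3/l2 is the L-skewness t3

def fit_gpd_lmom(x):
    l1, l2, t3 = sample_lmoments(x)
    k = (1 - 3 * t3) / (1 + t3)              # shape (Hosking's convention)
    alpha = l2 * (1 + k) * (2 + k)           # scale
    xi = l1 - alpha / (1 + k)                # location
    return xi, alpha, k

rng = np.random.default_rng(1)
data = rng.exponential(scale=10.0, size=1000)  # stand-in dry-spell lengths (days)
print("xi, alpha, k:", fit_gpd_lmom(data))     # expect roughly (0, 10, 0)
```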

  7. Approximations of the Generalized Wilks' Distribution

    NARCIS (Netherlands)

    Raats, V.M.

    2004-01-01

    Wilks' lambda and the corresponding Wilks' distribution are well known concepts in testing in multivariate regression models. The topic of this paper is a generalization of the Wilks' distribution. This generalized Wilks' distribution is relevant for testing in multivariate regression models with

  8. Experiments to Distribute Map Generalization Processes

    Science.gov (United States)

    Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas

    2018-05-01

    Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make it scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a generalized map feature need to be known to generalize the feature, which is a problem as distribution might partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate past propositions to distribute map generalization and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective in taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results, as it better integrates the geographical context.

  9. Application of the Pareto chart and Ishikawa diagram for the identification of major defects in metal composite castings

    OpenAIRE

    K. Gawdzińska

    2011-01-01

    This author discusses the use of selected quality management tools, i.e. the Pareto chart and the Ishikawa fishbone diagram, for the description of composite casting defects. The Pareto chart allows one to determine the priority of defects related to metallic composite castings, while the Ishikawa diagram indicates the causes of defect formation and enables calculating defect weights.

  10. Optimal allocation and adaptive VAR control of PV-DG in distribution networks

    International Nuclear Information System (INIS)

    Fu, Xueqian; Chen, Haoyong; Cai, Runqing; Yang, Ping

    2015-01-01

    Highlights: • A methodology for optimal PV-DG allocation based on a combination of algorithms. • Dealing with the randomness of solar power energy using CCSP. • Presenting a VAR control strategy to balance the technical demands. • Finding the Pareto solutions using MOPSO and SVM. • Evaluating the Pareto solutions using WRSR. - Abstract: The development of distributed generation (DG) has brought new challenges to power networks. One that has attracted extensive attention is the voltage regulation problem in distribution networks caused by DG. Optimal allocation of DG in distribution networks is another well-known, widely investigated problem. This paper proposes a new method for the optimal allocation of photovoltaic distributed generation (PV-DG) considering the non-dispatchable characteristics of PV units. An adaptive reactive power control model is introduced into PV-DG allocation to balance the trade-off between the improvement of voltage quality and the minimization of power loss in a distribution network integrated with PV-DG units. The optimal allocation problem is formulated as a chance-constrained stochastic programming (CCSP) model for dealing with the randomness of solar power energy. A novel algorithm combining multi-objective particle swarm optimization (MOPSO) with support vector machines (SVM) is proposed to find the Pareto front consisting of a set of possible solutions. The Pareto solutions are further evaluated using the weighted rank sum ratio (WRSR) method to help the decision-maker obtain the desired solution. Simulation results on a 33-bus radial distribution system show that the optimal allocation method can fully take into account the time-variant characteristics and probability distribution of PV-DG, and obtain the best allocation scheme.

  11. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    Science.gov (United States)

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.

  12. La narrazione dell’azione sociale: spunti dal Trattato di Vilfredo Pareto

    Directory of Open Access Journals (Sweden)

    Ilaria Riccioni

    2017-08-01

    Full Text Available Rereading the classics always involves a twofold operation: on the one hand, a return to reflections, rhythms, and historical settings that often seem already superseded; on the other, the rediscovery of the origins of contemporary phenomena from points of view that outline their deep interconnections, no longer visible at the stage of development at which we observe them today. This greater clarity is perhaps due to the fact that every phenomenon in its dawning phase is more clearly identifiable than in its later phases, where the primary characteristics tend to dissolve into the dominant features of the present, getting lost in the everyday practices that conceal their provenance. If sociology is a process of gaining knowledge of the reality of phenomena, the central point of social science must be sought by distinguishing between those sciences that schematize the real into functional and functioning formal equations (the economic and normative systems) and the social sciences that deal with reality and its complexity, which as sciences must concern themselves not so much with what reality ought to be as with what reality is, how it presents itself, and how it manifests the deep, desiring movements of collective life beyond the system that manages its functioning. The point that Pareto seems to glimpse, with extreme lucidity, is the need to overturn the role of economic logic in social organization, from a science that dictates reality to a science that proposes a scheme for managing it: economics seeks to dictate reality, but economics, from the modern Greek Oikòs, Oikòsgeneia (house and generation, the term used to define the family unit), is not in fact "reality", Pareto seems to tell us in several digressions, but rather the art and science of managing family and productive units. Reality remains in shadow and can only be "approached" by a science that records it, and possibly

  13. The Successor Function and Pareto Optimal Solutions of Cooperative Differential Systems with Concavity. I

    DEFF Research Database (Denmark)

    Andersen, Kurt Munk; Sandqvist, Allan

    1997-01-01

    We investigate the domain of definition and the domain of values for the successor function of a cooperative differential system x'=f(t,x), where the coordinate functions are concave in x for any fixed value of t. Moreover, we give a characterization of a weakly Pareto optimal solution.

  14. Two sample Bayesian prediction intervals for order statistics based on the inverse exponential-type distributions using right censored sample

    Directory of Open Access Journals (Sweden)

    M.M. Mohie El-Din

    2011-10-01

    Full Text Available In this paper, two-sample Bayesian prediction intervals for order statistics (OS) are obtained. The prediction is based on a certain class of the inverse exponential-type distributions using a right censored sample. A general class of prior density functions is used, and the predictive cumulative function is obtained in the two-sample case. The class of inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model, such as the inverse exponential model and the inverse Rayleigh model, are considered.

  15. Application of the Pareto chart and Ishikawa diagram for the identification of major defects in metal composite castings

    Directory of Open Access Journals (Sweden)

    K. Gawdzińska

    2011-04-01

    Full Text Available This author discusses the use of selected quality management tools, i.e. the Pareto chart and the Ishikawa fishbone diagram, for the description of composite casting defects. The Pareto chart allows one to determine the priority of defects related to metallic composite castings, while the Ishikawa diagram indicates the causes of defect formation and enables calculating defect weights.

  16. Approximating the Pareto set of multiobjective linear programs via robust optimization

    NARCIS (Netherlands)

    Gorissen, B.L.; den Hertog, D.

    2012-01-01

    We consider problems with multiple linear objectives and linear constraints and use adjustable robust optimization and polynomial optimization as tools to approximate the Pareto set with polynomials of arbitrarily large degree. The main difference with existing techniques is that we optimize a

  17. Approximating the Pareto Set of Multiobjective Linear Programs via Robust Optimization

    NARCIS (Netherlands)

    Gorissen, B.L.; den Hertog, D.

    2012-01-01

    Abstract: The Pareto set of a multiobjective optimization problem consists of the solutions for which one or more objectives can not be improved without deteriorating one or more other objectives. We consider problems with linear objectives and linear constraints and use Adjustable Robust

  18. Income- and energy-taxation for redistribution in general equilibrium

    International Nuclear Information System (INIS)

    FitzRoy, F.R.

    1993-01-01

    In a 3-factor General Equilibrium (GE) model with a continuum of ability, the employed choose optimal labour supply, and equilibrium unemployment is determined by benefits funded by wage and energy taxes. Aggregate labour and the net wage may increase or decrease with taxation (and unemployment), and conditions for a reduction in redistributive wage taxes to be Pareto-improving are derived. A small energy tax always raises the net wage, provided the wage tax is reduced to maintain constant employment and a balanced budget. High-ability households prefer higher energy taxes when externalities are uniformly distributed and non-distorting. (author)

  19. Necessary and Sufficient Conditions for Pareto Optimality in Infinite Horizon Cooperative Differential Games - Replaced by CentER DP 2011-041

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2010-01-01

    In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for an N player cooperative infinite horizon differential game. Firstly, we write the problem of finding Pareto candidates as solving N constrained optimal control subproblems. We derive some

  20. Tapped density optimisation for four agricultural wastes - Part II: Performance analysis and Taguchi-Pareto

    Directory of Open Access Journals (Sweden)

    Ajibade Oluwaseyi Ayodele

    2016-01-01

    Full Text Available In this second part of the discussion on tapped density optimisation for four agricultural wastes (particles of coconut, periwinkle, palm kernel and egg shells), a performance analysis is made for comparison purposes. This paper pioneers a study direction in which optimisation of process variables is pursued using the Taguchi method integrated with the Pareto 80-20 rule. Negative percentage improvements resulted when the optimal tapped density was compared with the average tapped density. However, the performance analysis between the optimal tapped density and the peak tapped density values yielded positive percentage improvements for the four filler particles. The performance analysis results validate the effectiveness of using the Taguchi method in improving the tapped density properties of the filler particles. The application of the Pareto 80-20 rule to the table of parameters and levels produced revised tables of parameters and levels, which helped to identify the factor-level position of each parameter that is economical to optimality. The Pareto 80-20 rule also produced revised S/N response tables, which were used to identify the S/N ratios relevant to optimality.
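
    The revised S/N response tables rest on the standard Taguchi signal-to-noise ratios; the definitions below are the textbook forms (a sketch, not the authors' code), applied to hypothetical replicate tapped-density measurements:

```python
import numpy as np

def sn_larger_the_better(y):
    # Taguchi S/N for responses to be maximized (e.g., tapped density)
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_the_better(y):
    # Taguchi S/N for responses to be minimized
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

# Hypothetical replicates for one factor-level combination (g/cm^3):
print(round(sn_larger_the_better([0.62, 0.65, 0.63]), 2))
```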

  1. Small Sample Robust Testing for Normality against Pareto Tails

    Czech Academy of Sciences Publication Activity Database

    Stehlík, M.; Fabián, Zdeněk; Střelec, L.

    2012-01-01

    Roč. 41, č. 7 (2012), s. 1167-1194 ISSN 0361-0918 Grant - others:Aktion(CZ-AT) 51p7, 54p21, 50p14, 54p13 Institutional research plan: CEZ:AV0Z10300504 Keywords : consistency * Hill estimator * t-Hill estimator * location functional * Pareto tail * power comparison * returns * robust tests for normality Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.295, year: 2012

  2. SU-F-J-105: Towards a Novel Treatment Planning Pipeline Delivering Pareto- Optimal Plans While Enabling Inter- and Intrafraction Plan Adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Kontaxis, C; Bol, G; Lagendijk, J; Raaymakers, B [University Medical Center Utrecht, Utrecht (Netherlands); Breedveld, S; Sharfo, A; Heijmen, B [Erasmus University Medical Center Rotterdam, Rotterdam (Netherlands)

    2016-06-15

    Purpose: To develop a new IMRT treatment planning methodology suitable for the new generation of MR-linear accelerator machines. The pipeline is able to deliver Pareto-optimal plans and can be utilized for conventional treatments as well as for inter- and intrafraction plan adaptation based on real-time MR-data. Methods: A Pareto-optimal plan is generated using the automated multicriterial optimization approach Erasmus-iCycle. The resulting dose distribution is used as input to the second part of the pipeline, an iterative process which generates deliverable segments that target the latest anatomical state and gradually converges to the prescribed dose. This process continues until a certain percentage of the dose has been delivered. Under a conventional treatment, a Segment Weight Optimization (SWO) is then performed to ensure convergence to the prescribed dose. In the case of inter- and intrafraction adaptation, post-processing steps like SWO cannot be employed due to the changing anatomy. This is instead addressed by transferring the missing/excess dose to the input of the subsequent fraction. In this work, the resulting plans were delivered on a Delta4 phantom as a final Quality Assurance test. Results: A conventional static SWO IMRT plan was generated for two prostate cases. The sequencer faithfully reproduced the input dose for all volumes of interest. For the two cases the mean relative dose difference of the PTV between the ideal input and sequenced dose was 0.1% and −0.02% respectively. Both plans were delivered on a Delta4 phantom and passed the clinical Quality Assurance procedures by achieving 100% pass rate at a 3%/3mm gamma analysis. Conclusion: We have developed a new sequencing methodology capable of online plan adaptation. In this work, we extended the pipeline to support Pareto-optimal input and clinically validated that it can accurately achieve these ideal distributions, while its flexible design enables inter- and intrafraction plan

  3. SU-F-J-105: Towards a Novel Treatment Planning Pipeline Delivering Pareto- Optimal Plans While Enabling Inter- and Intrafraction Plan Adaptation

    International Nuclear Information System (INIS)

    Kontaxis, C; Bol, G; Lagendijk, J; Raaymakers, B; Breedveld, S; Sharfo, A; Heijmen, B

    2016-01-01

    Purpose: To develop a new IMRT treatment planning methodology suitable for the new generation of MR-linear accelerator machines. The pipeline is able to deliver Pareto-optimal plans and can be utilized for conventional treatments as well as for inter- and intrafraction plan adaptation based on real-time MR-data. Methods: A Pareto-optimal plan is generated using the automated multicriterial optimization approach Erasmus-iCycle. The resulting dose distribution is used as input to the second part of the pipeline, an iterative process which generates deliverable segments that target the latest anatomical state and gradually converges to the prescribed dose. This process continues until a certain percentage of the dose has been delivered. Under a conventional treatment, a Segment Weight Optimization (SWO) is then performed to ensure convergence to the prescribed dose. In the case of inter- and intrafraction adaptation, post-processing steps like SWO cannot be employed due to the changing anatomy. This is instead addressed by transferring the missing/excess dose to the input of the subsequent fraction. In this work, the resulting plans were delivered on a Delta4 phantom as a final Quality Assurance test. Results: A conventional static SWO IMRT plan was generated for two prostate cases. The sequencer faithfully reproduced the input dose for all volumes of interest. For the two cases the mean relative dose difference of the PTV between the ideal input and sequenced dose was 0.1% and −0.02% respectively. Both plans were delivered on a Delta4 phantom and passed the clinical Quality Assurance procedures by achieving 100% pass rate at a 3%/3mm gamma analysis. Conclusion: We have developed a new sequencing methodology capable of online plan adaptation. In this work, we extended the pipeline to support Pareto-optimal input and clinically validated that it can accurately achieve these ideal distributions, while its flexible design enables inter- and intrafraction plan

  4. Accident investigation of construction sites in Qom city using Pareto chart (2009-2012)

    Directory of Open Access Journals (Sweden)

    M. H. Beheshti

    2015-07-01

    Conclusions: Employing Pareto charts as a method for analyzing and identifying accident causes can play an effective role in the management of work-related accidents and the proper allocation of funds and time.

  5. The geometry of the Pareto front in biological phenotype space

    Science.gov (United States)

    Sheftel, Hila; Shoval, Oren; Mayo, Avi; Alon, Uri

    2013-01-01

    When organisms perform a single task, selection leads to phenotypes that maximize performance at that task. When organisms need to perform multiple tasks, a trade-off arises because no phenotype can optimize all tasks. Recent work addressed this question, and assumed that the performance at each task decays with distance in trait space from the best phenotype at that task. Under this assumption, the best-fitness solutions (termed the Pareto front) lie on simple low-dimensional shapes in trait space: line segments, triangles and other polygons. The vertices of these polygons are specialists at a single task. Here, we generalize this finding, by considering performance functions of general form, not necessarily functions that decay monotonically with distance from their peak. We find that, except for performance functions with highly eccentric contours, simple shapes in phenotype space are still found, but with mildly curving edges instead of straight ones. In a wide range of systems, complex data on multiple quantitative traits, which might be expected to fill a high-dimensional phenotype space, is predicted instead to collapse onto low-dimensional shapes; phenotypes near the vertices of these shapes are predicted to be specialists, and can thus suggest which tasks may be at play. PMID:23789060

  6. On Usage of Pareto curves to Select Wind Turbine Controller Tunings to the Wind Turbulence Level

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh

    2015-01-01

    Model predictive control has in recent publications shown its potential for lowering the cost of energy of modern wind turbines. Pareto curves can be used to evaluate the performance of these controllers under the multiple conflicting objectives of power and fatigue loads. In this paper an approach is presented to update a model predictive wind turbine controller tuning as the wind turbulence increases, since increased turbulence levels result in higher loads for the same controller tuning. The Pareto curves are computed using an industrial high-fidelity aero-elastic model. Simulations show

  7. Pareto Optimization of a Half Car Passive Suspension Model Using a Novel Multiobjective Heat Transfer Search Algorithm

    Directory of Open Access Journals (Sweden)

    Vimal Savsani

    2017-01-01

    Full Text Available Most of the modern multiobjective optimization algorithms are based on the search technique of genetic algorithms; however the search techniques of other recently developed metaheuristics are emerging topics among researchers. This paper proposes a novel multiobjective optimization algorithm named the multiobjective heat transfer search (MOHTS) algorithm, which is based on the search technique of the heat transfer search (HTS) algorithm. MOHTS employs the elitist nondominated sorting and crowding distance approach of the elitist-based nondominated sorting genetic algorithm-II (NSGA-II) for obtaining different nondomination levels and for preserving diversity among the optimal set of solutions, respectively. The capability of MOHTS to yield a Pareto front as close as possible to the true Pareto front has been tested on the multiobjective optimization problem of vehicle suspension design, which is governed by a set of five second-order linear ordinary differential equations. A half-car passive ride model with two different sets of five objectives is employed for optimizing the suspension parameters using MOHTS and NSGA-II. The optimization studies demonstrate that MOHTS achieves a better nondominated Pareto front with a widespread (diverse) set of optimal solutions as compared to NSGA-II, and further comparison of the extreme points of the obtained Pareto fronts reveals the dominance of MOHTS over NSGA-II, the multiobjective uniform diversity genetic algorithm (MUGA), and a combined PSO-GA based MOEA.
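
    MOHTS reuses NSGA-II's crowding distance to preserve diversity along a front; a minimal sketch of that measure (illustration only) is:

```python
import numpy as np

def crowding_distance(F):
    """NSGA-II crowding distance for a front F (rows = solutions,
    columns = objectives). Boundary solutions get infinite distance."""
    n, m = F.shape
    d = np.zeros(n)
    for j in range(m):
        order = np.argsort(F[:, j])
        d[order[0]] = d[order[-1]] = np.inf
        span = F[order[-1], j] - F[order[0], j]
        if span == 0:
            continue
        d[order[1:-1]] += (F[order[2:], j] - F[order[:-2], j]) / span
    return d

F = np.array([[1.0, 5.0], [2.0, 3.0], [3.0, 2.5], [4.0, 1.0]])
print(crowding_distance(F))
```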

  8. McDonald Generalized Linear Failure Rate Distribution

    Directory of Open Access Journals (Sweden)

    Ibrahim Elbatal

    2014-10-01

    Full Text Available We introduce in this paper a new six-parameter generalized version of the generalized linear failure rate (GLFR) distribution, called the McDonald Generalized Linear Failure Rate (McGLFR) distribution. The new distribution is quite flexible and can be used effectively in modeling survival data and reliability problems. It can have a constant, decreasing, increasing, upside-down-bathtub- or bathtub-shaped failure rate function depending on its parameters. It includes some well-known lifetime distributions as special sub-models. Some structural properties of the new distribution are studied. Moreover, we discuss maximum likelihood estimation of the unknown parameters of the new model.

  9. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array.

    Science.gov (United States)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-20

    This research proposes various versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, an adaptive inertia weight to control the position exploration of the potential best host nests (solutions), and a dynamic discovery rate to manage the fraction probability of finding the best host nests in the 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined on Zitzler-Deb-Thiele's (ZDT's) test functions for MO optimization. Pareto optimum trade-offs are made to generate a set of three non-dominated solutions, namely the locations, excitation amplitudes, and excitation phases of the array elements. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.

  10. Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives

    International Nuclear Information System (INIS)

    Warmflash, Aryeh; Siggia, Eric D; Francois, Paul

    2012-01-01

    The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input–output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria. (paper)

  11. Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives.

    Science.gov (United States)

    Warmflash, Aryeh; Francois, Paul; Siggia, Eric D

    2012-10-01

    The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input-output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria.

  12. Using neural networks and extreme value distributions to model electricity pool prices: Evidence from the Australian National Electricity Market 1998–2013

    International Nuclear Information System (INIS)

    Dev, Priya; Martin, Michael A.

    2014-01-01

    Highlights: • Neural nets are unable to properly capture the spiky price behavior found in the electricity market. • We modeled electricity price data from the Australian National Electricity Market over 15 years. • Neural nets need to be augmented with other modeling techniques to capture price spikes. • We fit a Generalized Pareto Distribution to price spikes using a peaks-over-threshold approach. - Abstract: Competitors in the electricity supply industry desire accurate predictions of electricity spot prices to hedge against financial risks. Neural networks are commonly used for forecasting such prices, but certain features of spot price series, such as extreme price spikes, present critical challenges for such modeling. We investigate the predictive capacity of neural networks for electricity spot prices using Australian National Electricity Market data. Following neural net modeling of the data, we explore extreme price spikes through extreme value modeling, fitting a Generalized Pareto Distribution to price peaks over an estimated threshold. While neural nets capture the smoother aspects of spot price data, they are unable to capture the local, volatile features that characterize electricity spot price data. Price spikes can, however, be modeled successfully through extreme value modeling.
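
    The peaks-over-threshold step can be sketched with scipy's generalized Pareto fit; the lognormal data and the 98th-percentile threshold below are arbitrary illustrations, not the Australian market data:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
prices = rng.lognormal(mean=3.5, sigma=0.6, size=20_000)  # stand-in spot prices
threshold = np.quantile(prices, 0.98)  # threshold choice needs diagnostics in practice
exceedances = prices[prices > threshold] - threshold

shape, loc, scale = genpareto.fit(exceedances, floc=0.0)
print(f"threshold={threshold:.1f}, shape={shape:.3f}, scale={scale:.1f}")

# Implied 1-in-10,000 price level from the fitted tail:
p_exceed = len(exceedances) / len(prices)
print(threshold + genpareto.ppf(1 - 1e-4 / p_exceed, shape, scale=scale))
```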

  13. Characterization of distributions by conditional expectation of record values

    Directory of Open Access Journals (Sweden)

    A.H. Khan

    2016-01-01

    Full Text Available A family of continuous probability distributions has been characterized by two conditional expectations of record statistics conditioned on a non-adjacent record value. Besides various deductions, this work extends the result of Lee [8], in which the Pareto distribution was characterized.

  14. A clinical distance measure for evaluating treatment plan quality difference with Pareto fronts in radiotherapy

    Directory of Open Access Journals (Sweden)

    Kristoffer Petersson

    2017-07-01

    Full Text Available We present a clinical distance measure for Pareto front evaluation studies in radiotherapy, which we show strongly correlates (r = 0.74 and 0.90) with clinical plan quality evaluation. For five prostate cases, sub-optimal treatment plans located at a clinical distance value of >0.32 (0.28–0.35) from fronts of Pareto optimal plans were assessed to be of lower plan quality by our 12 observers (p < .05). In conclusion, the clinical distance measure can be used to determine whether the difference between a front and a given plan (or between different fronts) corresponds to a clinically significant difference in plan quality.

  15. A κ-generalized statistical mechanics approach to income analysis

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
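
    A sketch of the resulting survival function, assuming the form common in the kappa-statistics literature, S(x) = exp_kappa(-beta * x**alpha) with exp_kappa the Kaniadakis exponential (parameter values below are arbitrary illustrations):

```python
import numpy as np

def exp_kappa(x, kappa):
    # Kaniadakis exponential: reduces to exp(x) as kappa -> 0
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha=2.0, beta=1.0, kappa=0.7):
    # Exponential-like decay at low incomes, Pareto power law ~ x**(-alpha/kappa) in the tail
    return exp_kappa(-beta * x**alpha, kappa)

x = np.array([0.5, 1.0, 2.0, 5.0])
print(survival(x))
```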

  16. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2009-01-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful

  17. The Reduction of Modal Sensor Channels through a Pareto Chart Methodology

    Directory of Open Access Journals (Sweden)

    Kaci J. Lemler

    2015-01-01

    Full Text Available Presented herein is a new experimental sensor placement procedure developed to assist in placing sensors in key locations in an efficient method to reduce the number of channels for a full modal analysis. It is a fast, noncontact method that uses a laser vibrometer to gather a candidate set of sensor locations. These locations are then evaluated using a Pareto chart to obtain a reduced set of sensor locations that still captures the motion of the structure. The Pareto chart is employed to identify the points on a structure that have the largest reaction to an input excitation and thus reduce the number of channels while capturing the most significant data. This method enhances the correct and efficient placement of sensors which is crucial in modal testing. Previously this required the development and/or use of a complicated model or set of equations. This new technique is applied in a case study on a small unmanned aerial system. The test procedure is presented and the results are discussed.
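
    The reduction step itself amounts to ranking candidate locations by response magnitude and keeping the smallest set that covers a chosen share of the total response; a sketch with hypothetical amplitudes (not the study's data):

```python
import numpy as np

def pareto_select(response, share=0.80):
    order = np.argsort(response)[::-1]               # largest responses first
    cum = np.cumsum(response[order]) / response.sum()
    return order[: np.searchsorted(cum, share) + 1]  # smallest covering set

response = np.array([9.1, 0.4, 3.2, 0.2, 5.5, 1.1, 0.3])  # hypothetical RMS amplitudes
print("channels to keep:", pareto_select(response))
```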

  18. Application of the Pareto principle to identify and address drug-therapy safety issues.

    Science.gov (United States)

    Müller, Fabian; Dormann, Harald; Pfistermeister, Barbara; Sonst, Anja; Patapovas, Andrius; Vogler, Renate; Hartmann, Nina; Plank-Kiegele, Bettina; Kirchner, Melanie; Bürkle, Thomas; Maas, Renke

    2014-06-01

    Adverse drug events (ADE) and medication errors (ME) are common causes of morbidity in patients presenting at emergency departments (ED). Recognition of ADE as being drug related and prevention of ME are key to enhancing pharmacotherapy safety in the ED. We assessed the applicability of the Pareto principle (~80% of effects result from 20% of causes) to address locally relevant problems of drug therapy. In 752 cases consecutively admitted to the nontraumatic ED of a major regional hospital, ADE, ME, contributing drugs, preventability, and detection rates of ADE by ED staff were investigated. Symptoms, errors, and drugs were sorted by frequency in order to apply the Pareto principle. In total, 242 ADE were observed, and 148 (61.2%) were assessed as preventable. ADE contributed to 110 inpatient hospitalizations. The ten most frequent symptoms were causally involved in 88 (80.0%) inpatient hospitalizations. Only 45 (18.6%) ADE were recognized as drug-related problems before discharge from the ED. A limited set of 33 drugs accounted for 184 (76.0%) ADE; ME contributed to 57 ADE. Frequency-based listing of ADE, ME, and the drugs involved allowed identification of the most relevant problems and the development of easy-to-implement safety measures, such as wall and pocket charts. The Pareto principle provides a method for identifying the locally most relevant ADE, ME, and involved drugs. This permits subsequent development of interventions to increase patient safety in the ED admission process that best suit local needs.

  19. Global WASF-GA: An Evolutionary Algorithm in Multiobjective Optimization to Approximate the Whole Pareto Optimal Front.

    Science.gov (United States)

    Saborido, Rubén; Ruiz, Ana B; Luque, Mariano

    2017-01-01

    In this article, we propose a new evolutionary algorithm for multiobjective optimization called Global WASF-GA (global weighting achievement scalarizing function genetic algorithm), which falls within the aggregation-based evolutionary algorithms. The main purpose of Global WASF-GA is to approximate the whole Pareto optimal front. Its fitness function is defined by an achievement scalarizing function (ASF) based on the Tchebychev distance, in which two reference points are considered (both the utopian and the nadir objective vectors) and the weight vector used is taken from a set of weight vectors whose inverses are well-distributed. At each iteration, all individuals are classified into different fronts. Each front is formed by the solutions with the lowest values of the ASF for the different weight vectors in the set, using the utopian vector and the nadir vector as reference points simultaneously. Varying the weight vector in the ASF while considering the utopian and the nadir vectors at the same time enables the algorithm to obtain a final set of nondominated solutions that approximates the whole Pareto optimal front. We compared Global WASF-GA to MOEA/D (different versions) and NSGA-II on two-, three-, and five-objective problems. The computational results obtained permit us to conclude that Global WASF-GA achieves better performance, regarding the hypervolume metric and the epsilon indicator, than the other two algorithms in many cases, especially on three- and five-objective problems.
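
    The core of the fitness assignment is a weighted Tchebychev-type achievement scalarizing function evaluated against both reference points; a minimal sketch (the augmentation term and exact weight handling of the paper are omitted):

```python
import numpy as np

def asf(F, reference, weights):
    # Weighted Tchebychev distance of objective vectors F to a reference point
    return np.max(weights * (F - reference), axis=-1)

F = np.array([[0.2, 0.9], [0.5, 0.5], [0.9, 0.1]])  # objective vectors (minimize)
utopian, nadir = np.array([0.0, 0.0]), np.array([1.0, 1.0])
w = np.array([0.5, 0.5])

# Global WASF-GA classifies individuals using both reference points:
print("ASF to utopian:", asf(F, utopian, w))
print("ASF to nadir:  ", asf(F, nadir, w))
```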

  20. A Pareto Algorithm for Efficient De Novo Design of Multi-functional Molecules.

    Science.gov (United States)

    Daeyaert, Frits; Deem, Micheal W

    2017-01-01

    We have introduced a Pareto sorting algorithm into Synopsis, a de novo design program that generates synthesizable molecules with desirable properties. We give a detailed description of the algorithm and illustrate how it works in two different de novo design settings: the design of putative dual and selective FGFR and VEGFR inhibitors, and the successful design of organic structure directing agents (OSDAs) for the synthesis of zeolites. We show that the introduction of Pareto sorting not only enables the simultaneous optimization of multiple properties but also greatly improves the performance of the algorithm in generating molecules with hard-to-meet constraints. This in turn allows us to suggest approaches to address the problem of false positive hits in de novo structure-based drug design by introducing structural and physicochemical constraints into the designed molecules, and by forcing essential interactions between these molecules and their target receptor. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
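
    A minimal sketch of the non-dominated (Pareto) sorting step on which such multi-objective de novo design rests, assuming all property scores are to be minimized; the Synopsis-specific scoring functions and synthesis-route generation are not shown.

    ```python
    import numpy as np

    def pareto_front(scores):
        """Return indices of non-dominated rows (all objectives minimized).

        scores : (n_molecules, n_objectives) array of property scores.
        """
        scores = np.asarray(scores, dtype=float)
        nondominated = []
        for i in range(len(scores)):
            others = np.delete(scores, i, axis=0)
            # i is dominated if some other point is <= everywhere and < somewhere
            dominated = np.any(
                np.all(others <= scores[i], axis=1)
                & np.any(others < scores[i], axis=1)
            )
            if not dominated:
                nondominated.append(i)
        return nondominated

    # Two objectives, e.g. predicted FGFR and VEGFR scores (lower = better).
    print(pareto_front([[0.1, 0.9], [0.5, 0.5], [0.6, 0.6], [0.9, 0.1]]))
    # -> [0, 1, 3]; the point [0.6, 0.6] is dominated by [0.5, 0.5]
    ```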

  1. Single Cell Dynamics Causes Pareto-Like Effect in Stimulated T Cell Populations.

    Science.gov (United States)

    Cosette, Jérémie; Moussy, Alice; Onodi, Fanny; Auffret-Cariou, Adrien; Neildez-Nguyen, Thi My Anh; Paldi, Andras; Stockholm, Daniel

    2015-12-09

    Cell fate choice during the process of differentiation may obey deterministic or stochastic rules. In order to discriminate between these two strategies, we used time-lapse microscopy of individual murine CD4+ T cells, which allows the dynamics of proliferation and fate commitment to be investigated. We observed highly heterogeneous division and death rates between individual clones, resulting in a Pareto-like dominance of a few clones at the end of the experiment. Commitment to the Treg fate was monitored using the expression of a GFP reporter gene under the control of the endogenous Foxp3 promoter. All possible combinations of proliferation and differentiation were observed and resulted in exclusively GFP-, exclusively GFP+, or mixed-phenotype clones of very different population sizes. We simulated the process of proliferation and differentiation using a simple mathematical model of stochastic decision-making based on the experimentally observed parameters. The simulations show that a stochastic scenario is fully compatible with the observed Pareto-like imbalance in the final population.

  2. Birds shed RNA-viruses according to the pareto principle.

    Directory of Open Access Journals (Sweden)

    Mark D Jankowski

    Full Text Available A major challenge in disease ecology is to understand the role of individual variation of infection load on disease transmission dynamics and how this influences the evolution of resistance or tolerance mechanisms. Such information will improve our capacity to understand, predict, and mitigate pathogen-associated disease in all organisms. In many host-pathogen systems, particularly macroparasites and sexually transmitted diseases, it has been found that approximately 20% of the population is responsible for approximately 80% of the transmission events. Although host contact rates can account for some of this pattern, pathogen transmission dynamics also depend upon host infectiousness, an area that has received relatively little attention. Therefore, we conducted a meta-analysis of pathogen shedding rates of 24 host (avian) - pathogen (RNA-virus) studies, including 17 bird species and five important zoonotic viruses. We determined that viral count data followed the Weibull distribution, the mean Gini coefficient (an index of inequality) was 0.687 (0.036 SEM), and that 22.0% (0.90 SEM) of the birds shed 80% of the virus across all studies, suggesting an adherence of viral shedding counts to the Pareto Principle. The relative position of a bird in a distribution of viral counts was affected by factors extrinsic to the host, such as exposure to corticosterone and to a lesser extent reduced food availability, but not to intrinsic host factors including age, sex, and migratory status. These data provide a quantitative view of heterogeneous virus shedding in birds that may be used to better parameterize epidemiological models and understand transmission dynamics.

  3. Birds shed RNA-viruses according to the pareto principle.

    Science.gov (United States)

    Jankowski, Mark D; Williams, Christopher J; Fair, Jeanne M; Owen, Jennifer C

    2013-01-01

    A major challenge in disease ecology is to understand the role of individual variation of infection load on disease transmission dynamics and how this influences the evolution of resistance or tolerance mechanisms. Such information will improve our capacity to understand, predict, and mitigate pathogen-associated disease in all organisms. In many host-pathogen systems, particularly macroparasites and sexually transmitted diseases, it has been found that approximately 20% of the population is responsible for approximately 80% of the transmission events. Although host contact rates can account for some of this pattern, pathogen transmission dynamics also depend upon host infectiousness, an area that has received relatively little attention. Therefore, we conducted a meta-analysis of pathogen shedding rates of 24 host (avian) - pathogen (RNA-virus) studies, including 17 bird species and five important zoonotic viruses. We determined that viral count data followed the Weibull distribution, the mean Gini coefficient (an index of inequality) was 0.687 (0.036 SEM), and that 22.0% (0.90 SEM) of the birds shed 80% of the virus across all studies, suggesting an adherence of viral shedding counts to the Pareto Principle. The relative position of a bird in a distribution of viral counts was affected by factors extrinsic to the host, such as exposure to corticosterone and to a lesser extent reduced food availability, but not to intrinsic host factors including age, sex, and migratory status. These data provide a quantitative view of heterogeneous virus shedding in birds that may be used to better parameterize epidemiological models and understand transmission dynamics.
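
    A minimal sketch, with synthetic data, of the two heterogeneity summaries reported above: the Gini coefficient and the smallest fraction of hosts accounting for 80% of total shedding. The Weibull shape parameter used to simulate per-bird viral counts is an illustrative assumption, not a value from the study.

    ```python
    import numpy as np

    def gini(x):
        """Gini coefficient of non-negative values via the sorted-rank formula."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        ranks = np.arange(1, n + 1)
        return (2.0 * np.sum(ranks * x)) / (n * np.sum(x)) - (n + 1.0) / n

    def top_share_fraction(x, share=0.8):
        """Smallest fraction of hosts that account for `share` of the total."""
        x = np.sort(np.asarray(x, dtype=float))[::-1]  # largest shedders first
        cum = np.cumsum(x) / np.sum(x)
        k = np.searchsorted(cum, share) + 1
        return k / len(x)

    rng = np.random.default_rng(0)
    counts = rng.weibull(0.5, size=1000)  # heavy-tailed shedding, Weibull as in the study
    print(gini(counts), top_share_fraction(counts))
    ```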

  4. Distributing Correlation Coefficients of Linear Structure-Activity/Property Models

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACA

    2011-12-01

    Full Text Available Quantitative structure-activity/property relationships are mathematical relationships linking chemical structure and activity/property in a quantitative manner. These in silico approaches are frequently used to reduce animal testing and risk assessment, as well as to increase time- and cost-effectiveness in the characterization and identification of active compounds. The aim of our study was to investigate the pattern of the distribution of correlation coefficients associated with simple linear relationships linking compound structure with activity. A set of the most common ordnance compounds found at naval facilities, with a limited data set of toxicities on aquatic ecosystems and a set of seven properties, was studied. Statistically significant models were selected and investigated. The probability density function of the correlation coefficients was investigated using a series of possible continuous distribution laws. Almost 48% of the correlation coefficients proved to fit the Beta distribution, 40% the Generalized Pareto distribution, and 12% the Pert distribution.

  5. Size-biased distributions in the generalized beta distribution family, with applications to forestry

    Science.gov (United States)

    Mark J. Ducey; Jeffrey H. Gove

    2015-01-01

    Size-biased distributions arise in many forestry applications, as well as other environmental, econometric, and biomedical sampling problems. We examine the size-biased versions of the generalized beta of the first kind, generalized beta of the second kind and generalized gamma distributions. These distributions include, as special cases, the Dagum (Burr Type III),...

  6. Applying Pareto multi-criteria decision making in concurrent engineering: A case study of polyethylene industry

    Directory of Open Access Journals (Sweden)

    Akbar A. Tabriz

    2011-07-01

    Full Text Available Concurrent engineering (CE) is one of the most widely known techniques for the simultaneous planning of product and process design. In concurrent engineering, design processes are often complicated, with multiple conflicting criteria and discrete sets of feasible alternatives. Thus, multi-criteria decision making (MCDM) techniques are integrated into CE to perform concurrent design. This paper proposes a design framework governed by an MCDM technique for criteria that conflict in the sense of competing for common resources to achieve different performance objectives (financial, functional, environmental, etc.). The Pareto MCDM model is applied to the concurrent design of a polyethylene pipe, governed by four criteria, to determine the best alternative, i.e., the Pareto-compromise design.

  7. Pareto-optimal multi-objective dimensionality reduction deep auto-encoder for mammography classification.

    Science.gov (United States)

    Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan

    2017-07-01

    Feature reduction is an essential stage in computer aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives: MRE and mean classification error (MCE) for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy over the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective and optimizing for finding Pareto-optimal solutions, using evolutionary multi-objective optimization, results in producing more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Weighted Lomax distribution.

    Science.gov (United States)

    Kilany, N M

    2016-01-01

    The Lomax distribution (Pareto Type-II) is widely applicable in reliability and life testing problems in engineering, as well as in survival analysis as an alternative distribution. In this paper, the Weighted Lomax distribution is proposed and studied. The density function and its behavior, moments, hazard and survival functions, mean residual life and reversed failure rate, extreme value distributions and order statistics are derived and studied. The parameters of this distribution are estimated by the method of moments and the maximum likelihood estimation method, and the observed information matrix is derived. Moreover, simulation schemes are derived. Finally, an application of the model to a real data set is presented and compared with some other well-known distributions.
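
    For reference, a sketch in LaTeX of the construction involved, assuming the standard Lomax parameterization; the specific weight function adopted in the paper is not reproduced here, so the length-biased case w(x) = x in the last line is only an illustrative choice.

    ```latex
    % Lomax (Pareto Type-II) density
    f(x;\alpha,\lambda)=\frac{\alpha}{\lambda}\Bigl(1+\frac{x}{\lambda}\Bigr)^{-(\alpha+1)},
    \qquad x\ge 0,\ \alpha,\lambda>0.

    % Generic weighted density for a weight function w(x)
    f_w(x)=\frac{w(x)\,f(x;\alpha,\lambda)}{\operatorname{E}[w(X)]}.

    % Illustrative length-biased case, w(x)=x (requires \alpha>1, where E[X]=\lambda/(\alpha-1))
    f_w(x)=\frac{x\,f(x;\alpha,\lambda)}{\lambda/(\alpha-1)}.
    ```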

  9. Classification as clustering: a Pareto cooperative-competitive GP approach.

    Science.gov (United States)

    McIntyre, Andrew R; Heywood, Malcolm I

    2011-01-01

    Intuitively, population-based algorithms such as genetic programming provide a natural environment for supporting solutions that learn to decompose the overall task between multiple individuals, or a team. This work presents a framework for evolving teams without recourse to prespecifying the number of cooperating individuals. To do so, each individual evolves a mapping to a distribution of outcomes that, following clustering, establishes the parameterization of a (Gaussian) local membership function. This gives individuals the opportunity to represent subsets of tasks, where the overall task is that of classification under the supervised learning domain. Thus, rather than each team member representing an entire class, individuals are free to identify unique subsets of the overall classification task. The framework is supported by techniques from evolutionary multiobjective optimization (EMO) and Pareto competitive coevolution. EMO establishes the basis for encouraging individuals to provide accurate yet nonoverlapping behaviors, whereas competitive coevolution provides the mechanism for scaling to potentially large unbalanced datasets. Benchmarking is performed against recent examples of nonlinear SVM classifiers over 12 UCI datasets with between 150 and 200,000 training instances. Solutions from the proposed coevolutionary multiobjective GP framework appear to provide a good balance between classification performance and model complexity, especially as the dataset instance count increases.

  10. Pareto-Optimization of HTS CICC for High-Current Applications in Self-Field

    Directory of Open Access Journals (Sweden)

    Giordano Tomassetti

    2018-01-01

    Full Text Available The ENEA superconductivity laboratory developed a novel design for Cable-in-Conduit Conductors (CICCs) comprised of stacks of 2nd-generation REBCO coated conductors. In its original version, the cable was made up of 150 HTS tapes distributed in five slots and twisted along an aluminum core. In this work, taking advantage of a 2D finite element model able to estimate the cable's current distribution in the cross-section, a multiobjective optimization procedure was implemented. The aim of the optimization was to simultaneously maximize both the engineering current density and the total current flowing inside the tapes when operating in self-field, by varying the cross-section layout. Since the optimization process involved both integer and real geometrical variables, an evolutionary search algorithm was required, and in the multiobjective setting a nonstandard, fast-converging evolutionary optimization algorithm was adopted. By means of this algorithm, the Pareto frontiers for the different configurations were calculated, providing the designer with a powerful tool to achieve the desired preliminary operating conditions in terms of engineering current density and/or total current, depending on the specific application field, that is, power transmission cables or bus bar systems.

  11. Implementation of strength pareto evolutionary algorithm II in the multiobjective burnable poison placement optimization of KWU pressurized water reactor

    International Nuclear Information System (INIS)

    Gharari, Rahman; Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi

    2016-01-01

    In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is searched for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (K_eff) for gaining possibly longer operation cycles, along with a flatter fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor

  12. Implementation of strength pareto evolutionary algorithm II in the multiobjective burnable poison placement optimization of KWU pressurized water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gharari, Rahman [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of); Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi [Nuclear Engineering Dept, Shahid Beheshti University, Tehran (Iran, Islamic Republic of)

    2016-10-15

    In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is searched for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (K_eff) for gaining possibly longer operation cycles, along with a flatter fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor.

  13. Necessary and Sufficient Conditions for Pareto Optimality in Infinite Horizon Cooperative Differential Games

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2011-01-01

    In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for infinite horizon cooperative differential games. We consider games defined by non-autonomous and discounted autonomous systems. The obtained results are used to analyze the regular

  14. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    Science.gov (United States)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient of the model tends to identify the underlying nonlinear system with expressions simpler than those of classical monolithic GP, and finally the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using daily streamflow records from a station on Senoz Stream, Turkey. Compared with stand-alone GP, MGGP, and conventional multiple linear regression prediction models used as benchmarks, the proposed Pareto-optimal MA-MGGP model yielded a parsimonious solution of noteworthy practical importance. In addition, the approach allows the user to bring human insight into the problem, examining evolved models and picking out the best-performing programs for further analysis.
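
    A minimal sketch of the moving-average pre-processing step described above, assuming a simple trailing window; the window length k = 3 and the variable names are illustrative, and the downstream MGGP modelling is not shown.

    ```python
    import numpy as np

    def moving_average(x, k=3):
        """Trailing moving average: y[t] = mean(x[t-k+1 .. t]).

        The first k-1 values are returned as NaN since the window is incomplete;
        in practice those time steps would be dropped before model training.
        """
        x = np.asarray(x, dtype=float)
        y = np.full_like(x, np.nan)
        kernel = np.ones(k) / k
        y[k - 1:] = np.convolve(x, kernel, mode="valid")
        return y

    streamflow = np.array([5.0, 7.0, 6.0, 9.0, 12.0, 8.0])
    print(moving_average(streamflow))  # [nan nan 6.0 7.33 9.0 9.67]
    ```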

  15. Multiobjective constraints for climate model parameter choices: Pragmatic Pareto fronts in CESM1

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2017-09-01

    Global climate models (GCMs) are examples of high-dimensional input-output systems, where model output is a function of many variables, and an update in model physics commonly improves performance in one objective function (i.e., measure of model performance) at the expense of degrading another. Here concepts from multiobjective optimization in the engineering literature are used to investigate parameter sensitivity and optimization in the face of such trade-offs. A metamodeling technique called cut high-dimensional model representation (cut-HDMR) is leveraged in the context of multiobjective optimization to improve GCM simulation of the tropical Pacific climate, focusing on seasonal precipitation, column water vapor, and skin temperature. An evolutionary algorithm is used to solve for Pareto fronts, which are surfaces in objective function space along which trade-offs in GCM performance occur. This approach allows the modeler to visualize trade-offs quickly and identify the physics at play. In some cases, Pareto fronts are small, implying that trade-offs are minimal, optimal parameter value choices are more straightforward, and the GCM is well-functioning. In all cases considered here, the control run was found not to be Pareto-optimal (i.e., not on the front), highlighting an opportunity for model improvement through objectively informed parameter selection. Taylor diagrams illustrate that these improvements occur primarily in field magnitude, not spatial correlation, and they show that specific parameter updates can improve fields fundamental to tropical moist processes—namely precipitation and skin temperature—without significantly impacting others. These results provide an example of how basic elements of multiobjective optimization can facilitate pragmatic GCM tuning processes.

  16. Multi-objective genetic algorithm optimization of 2D- and 3D-Pareto fronts for vibrational quantum processes

    International Nuclear Information System (INIS)

    Gollub, C; De Vivie-Riedle, R

    2009-01-01

    A multi-objective genetic algorithm is applied to optimize picosecond laser fields, driving vibrational quantum processes. Our examples are state-to-state transitions and unitary transformations. The approach allows features of the shaped laser fields and of the excitation mechanisms to be controlled simultaneously with the quantum yield. Within the parameter range accessible to the experiment, we focus on short pulse durations and low pulse energies to optimize preferably robust laser fields. Multidimensional Pareto fronts for these conflicting objectives could be constructed. Comparison with previous work showed that the solutions from Pareto optimizations and from optimal control theory match very well.

  17. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    Science.gov (United States)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used to set the threshold of the generalized Pareto distribution, and in the second stage, EVT is applied with the wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting in terms of the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
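
    A minimal sketch of the peaks-over-threshold step underlying such models: fit a generalized Pareto distribution to threshold exceedances and read off a value-at-risk estimate. Here a fixed empirical quantile stands in for the paper's wavelet-based threshold, and the Student-t returns are a synthetic stand-in for market data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    returns = rng.standard_t(df=4, size=5000)  # heavy-tailed stand-in for returns
    losses = -returns                          # work with the loss tail

    u = np.quantile(losses, 0.95)              # fixed threshold (wavelet-based in the paper)
    exceedances = losses[losses > u] - u

    # Fit the GPD to the exceedances, pinning location at 0 (peaks-over-threshold).
    shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)

    # 99% value-at-risk via the standard POT quantile formula.
    p, n, n_u = 0.99, len(losses), len(exceedances)
    var_99 = u + (scale / shape) * ((n / n_u * (1.0 - p)) ** (-shape) - 1.0)
    print(shape, scale, var_99)
    ```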

  18. Unraveling hadron structure with generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Andrei Belitsky; Anatoly Radyushkin

    2004-10-01

    The recently introduced generalized parton distributions have emerged as a universal tool to describe hadrons in terms of quark and gluonic degrees of freedom. They combine the features of form factors, parton densities and distribution amplitudes - the functions used for a long time in studies of hadronic structure. Generalized parton distributions are analogous to the phase-space Wigner quasi-probability function of non-relativistic quantum mechanics which encodes full information on a quantum-mechanical system. We give an extensive review of main achievements in the development of this formalism. We discuss physical interpretation and basic properties of generalized parton distributions, their modeling and QCD evolution in the leading and next-to-leading orders. We describe how these functions enter a wide class of exclusive reactions, such as electro- and photo-production of photons, lepton pairs, or mesons.

  19. Evolutionary tradeoffs, Pareto optimality and the morphology of ammonite shells.

    Science.gov (United States)

    Tendler, Avichai; Mayo, Avraham; Alon, Uri

    2015-03-07

    Organisms that need to perform multiple tasks face a fundamental tradeoff: no design can be optimal at all tasks at once. Recent theory based on Pareto optimality showed that such tradeoffs lead to a highly defined range of phenotypes, which lie in low-dimensional polyhedra in the space of traits. The vertices of these polyhedra are called archetypes: the phenotypes that are optimal at a single task. Rigorously testing this theory requires measurements of thousands of species over hundreds of millions of years of evolution. Ammonoid fossil shells provide an excellent model system for this purpose. Ammonoids have a well-defined geometry that can be parameterized using three dimensionless features of their logarithmic-spiral-shaped shells. Their evolutionary history includes repeated mass extinctions. We find that ammonoids fill out a pyramid in morphospace, suggesting five specific tasks, one for each vertex of the pyramid. After mass extinctions, surviving species evolve to refill essentially the same pyramid, suggesting that the tasks are unchanging. We infer putative tasks for each archetype, related to economy of shell material, rapid shell growth, hydrodynamics and compactness. These results support Pareto optimality theory as an approach to studying evolutionary tradeoffs, and demonstrate how this approach can be used to infer the putative tasks that may shape the natural selection of phenotypes.

  20. New generalized functions and multiplication of distributions

    International Nuclear Information System (INIS)

    Colombeau, J.F.

    1984-01-01

    Since its conception, Quantum Field Theory has been based on 'heuristic' computations (in particular, products of distributions) that, despite much effort, remained meaningless from a mathematical viewpoint. In this book the author presents a new mathematical theory giving a rigorous mathematical sense to these heuristic computations and, from a mathematical viewpoint, to all products of distributions. This new mathematical theory is a new theory of Generalized Functions defined on any open subset Ω of R^n, which are much more general than the distributions on Ω. (Auth.)

  1. Efficient Intra-household Allocations and Distribution Factors

    DEFF Research Database (Denmark)

    Browning, Martin; Bourguignon, François; Chiappori, Pierre-André

    and general test of the Pareto efficiency hypothesis, which is consistent with all possible assumptions on the private or public nature of goods, all possible consumption externalities between household members, and all types of interdependent individual preferences and domestic production technology...

  2. Modelling and Pareto optimization of mechanical properties of friction stir welded AA7075/AA5083 butt joints using neural network and particle swarm algorithm

    International Nuclear Information System (INIS)

    Shojaeefard, Mohammad Hasan; Behnagh, Reza Abdi; Akbari, Mostafa; Givi, Mohammad Kazem Besharati; Farhani, Foad

    2013-01-01

    Highlights: ► Defect-free friction stir welds have been produced for AA5083-O/AA7075-O. ► Back-propagation was sufficient for predicting hardness and tensile strength. ► A hybrid multi-objective algorithm is proposed to deal with this MOP. ► Multi-objective particle swarm optimization was used to find the Pareto solutions. ► TOPSIS is used to rank the given alternatives of the Pareto solutions. -- Abstract: Friction Stir Welding (FSW) has been successfully used to weld similar and dissimilar cast and wrought aluminium alloys, especially aircraft aluminium alloys, which generally exhibit low weldability in traditional fusion welding processes. This paper focuses on the microstructural and mechanical properties of friction stir welded AA7075-O to AA5083-O aluminium alloys. Weld microstructures, hardness and tensile properties were evaluated in the as-welded condition. Tensile tests indicated that the mechanical properties of the joint were better than those of the base metals. An Artificial Neural Network (ANN) model was developed to simulate the correlation between the friction stir welding parameters and the mechanical properties. Performance of the ANN model was excellent, and the model was employed to predict the ultimate tensile strength and hardness of the butt joint of AA7075–AA5083 as functions of weld and rotational speeds. Multi-objective particle swarm optimization was used to obtain the Pareto-optimal set. Finally, the Technique for Order Preference by Similarity to the Ideal Solution (TOPSIS) was applied to determine the best compromise solution.

  3. Distributions of Journal Citations in Small Collections of Reading Research.

    Science.gov (United States)

    Mayes, Bea

    The distribution of reading-research citations was investigated in three populations of journals. The rule of Pareto-like distribution was confirmed as appropriate for determining the number of journals that would contribute half the citations in populations of 26 to 112 journals. In populations of 42 to 112 journals, 24% to 29% of the…

  4. Optimal PMU Placement with Uncertainty Using Pareto Method

    Directory of Open Access Journals (Sweden)

    A. Ketabi

    2012-01-01

    Full Text Available This paper proposes a method for optimal placement of Phasor Measurement Units (PMUs) in state estimation considering uncertainty. State estimation has first been turned into an optimization exercise in which the objective function is selected to be the number of unobservable buses, which is determined based on Singular Value Decomposition (SVD). For the normal condition, the Differential Evolution (DE) algorithm is used to find the optimal placement of PMUs. By considering uncertainty, a multiobjective optimization exercise is hence formulated. To achieve this, a DE algorithm based on the Pareto optimum method has been proposed here. The suggested strategy is applied to the IEEE 30-bus test system in several case studies to evaluate the optimal PMU placement.

  5. On Generalized Type 1 Logistic Distribution | Ahsanullah | Afrika ...

    African Journals Online (AJOL)

    Some distributional properties of the generalized type 1 logistic distribution are given. Based on these distributional properties, a characterization of this distribution is presented. Key words: Conditional Expectation; Reversed Hazard Rate; Characterization.

  6. Transmuted New Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Shuaib Khan

    2017-06-01

    Full Text Available This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are formulated, such as the quantile function, Rényi entropy, mean deviations, moments, moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution using reliability data.
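
    For reference, the quadratic rank transmutation map of Shaw et al. applied above: a baseline cdf G(x) (here, the new generalized inverse Weibull cdf) is mapped to the transmuted cdf F(x), and the baseline is recovered at λ = 0.

    ```latex
    F(x) = (1+\lambda)\,G(x) - \lambda\,G(x)^{2}, \qquad |\lambda| \le 1,
    \qquad
    f(x) = g(x)\bigl[\,1+\lambda-2\lambda\,G(x)\,\bigr].
    ```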

  7. Role of selective interaction in wealth distribution

    International Nuclear Information System (INIS)

    Gupta, A.K.

    2005-08-01

    In our simplified description, 'money' is wealth. A kinetic theory model of money is investigated in which two agents interact (trade) selectively and exchange a random amount of money between them while keeping the total money of all agents constant. The probability distribution of individual money (P(m) vs. m) is seen to be influenced by certain modes of selective interaction. The distributions shift away from the Boltzmann-Gibbs-like exponential distribution, and in some cases distributions emerge with power-law tails, known as Pareto's law (P(m) ∝ m^{-(1+α)}). (author)
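
    A minimal sketch of the non-selective baseline of such kinetic exchange models: agents are paired at random and randomly repartition their combined money, with the total conserved, and the stationary P(m) relaxes toward the Boltzmann-Gibbs exponential. The selective-interaction rules that push the tail toward a Pareto law are model-specific and not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, steps = 2_000, 200_000
    money = np.ones(N)              # one unit each; total money is conserved

    for _ in range(steps):
        i, j = rng.integers(0, N, size=2)
        if i == j:
            continue
        pot = money[i] + money[j]   # pooled money of the trading pair
        eps = rng.random()          # random repartition fraction
        money[i], money[j] = eps * pot, (1.0 - eps) * pot

    # The stationary distribution is close to exponential: P(m) ~ exp(-m/<m>).
    hist, _ = np.histogram(money, bins=40, range=(0.0, 8.0), density=True)
    print(money.mean(), hist[:5])
    ```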

  8. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.

  9. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions; the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is derived using entropy theory for the first time in this paper. Entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation provided a new way for frequency analysis of hydrometeorological extremes.

  10. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations in order to remain functional and sustainable, safety must remain a priority and respective efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of

  11. From form factors to generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, Markus

    2013-06-15

    I present an extraction of generalized parton distributions from selected data on the electromagnetic nucleon form factors. The extracted distributions can in particular be used to quantify the contribution to the proton spin from the total angular momentum carried by valence quarks, as well as their transverse spatial distribution inside the proton.

  12. Optimal Reinsurance Design for Pareto Optimum: From the Perspective of Multiple Reinsurers

    Directory of Open Access Journals (Sweden)

    Xing Rong

    2016-01-01

    Full Text Available This paper investigates optimal reinsurance strategies for an insurer which cedes the insured risk to multiple reinsurers. Assume that the insurer and every reinsurer apply coherent risk measures. Then, we find the necessary and sufficient conditions for the reinsurance market to achieve Pareto optimality; that is, every ceded-loss function and the retention function are in the form of "multiple-layer reinsurance."

  13. Project management under uncertainty beyond beta: The generalized bicubic distribution

    Directory of Open Access Journals (Sweden)

    José García Pérez

    2016-01-01

    Full Text Available The beta distribution has traditionally been employed in the PERT methodology and is generally used for modeling bounded continuous random variables based on expert judgment. The impossibility of estimating four parameters from the three values provided by the expert, when the beta distribution is assumed to be the underlying distribution, has been widely debated. This paper presents the generalized bicubic distribution as a good alternative to the beta distribution since, when the variance depends on the mode, the generalized bicubic distribution approximates the kurtosis of the Gaussian distribution better than the beta distribution does. In addition, this distribution presents good properties in the PERT methodology with regard to the moderation and conservatism criteria. Two empirical applications are presented to demonstrate the adequacy of this new distribution.
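
    For context, the classical PERT moment formulas behind the three-value elicitation (optimistic a, most likely m, pessimistic b) are sketched below; these two equations cannot identify all four parameters of a general beta distribution, which is precisely the estimation gap the generalized bicubic distribution is proposed to fill.

    ```latex
    \operatorname{E}[X] \approx \frac{a + 4m + b}{6},
    \qquad
    \operatorname{Var}[X] \approx \left(\frac{b-a}{6}\right)^{2}.
    ```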

  14. Bi-objective Job Shop Scheduling Using Simulated Annealing and a Pareto Approach

    Directory of Open Access Journals (Sweden)

    Juan Carlos Osorio

    2012-12-01

    Full Text Available The scheduling problem is one of the most widely treated problems in the literature; however, it is a complex NP-hard problem. When more than one objective is involved, it becomes one of the most complex problems in the field of operations research. A bi-objective model for the job shop scheduling problem is therefore presented, including the makespan and the mean flow time as objectives. To solve the model, an approach combining the Simulated Annealing (SA) metaheuristic with a Pareto-based approach is used. The model is evaluated on three problems from the literature of sizes 6x6, 10x5 and 10x10. The results of the model are compared with other metaheuristics, and the model is found to give good results on all three problems evaluated.

  15. Multiobjective optimization of the inspection intervals of a nuclear safety system: A clustering-based framework for reducing the Pareto Front

    International Nuclear Information System (INIS)

    Zio, E.; Bazzo, R.

    2010-01-01

    In this paper, a framework is developed for identifying a limited number of representative solutions of a multiobjective optimization problem concerning the inspection intervals of the components of a safety system of a nuclear power plant. Pareto Front solutions are first clustered into 'families', which are then synthetically represented by a 'head of the family' solution. Three clustering methods are analyzed. Level Diagrams are then used to represent, analyse and interpret the Pareto Fronts reduced to their head-of-the-family solutions. Two decision situations are considered: without or with decision maker preferences, the latter implying the introduction of a scoring system to rank the solutions with respect to the different objectives; a fuzzy preference assignment is employed for this purpose. The results of the application of the framework of analysis to the problem of optimizing the inspection intervals of a nuclear power plant safety system show that the clustering-based reduction maintains the Pareto Front shape and relevant characteristics, while making it easier for the decision maker to select the final solution.

  16. Multi-fractal measures of city-size distributions based on the three-parameter Zipf model

    International Nuclear Information System (INIS)

    Chen Yanguang; Zhou Yixing

    2004-01-01

    A multi-fractal framework of urban hierarchies is presented to address the rank-size distribution of cities. The three-parameter Zipf model based on a pair of exponential-type scaling laws is generalized to multi-scale fractal measures. Then, according to the equivalence between Zipf's law and the Pareto distribution, a set of multi-fractal equations is derived using dual conversion and the Legendre transform. US city population data from the 2000 census are employed to verify the multi-fractal models, and the results are satisfactory. The multi-fractal measures reveal some strange symmetry regularity of urban systems. While partially explaining the remnants of the hierarchical step-like frequency distribution of city sizes suggested by central place theory, the mathematical framework can be interpreted with the entropy-maximizing principle and some related ideas from self-organization
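
    The Zipf-Pareto equivalence invoked above can be sketched for the plain one-exponent rank-size rule: the rank r of a city of size P is, up to a constant, the number of cities at least that large, which is exactly a Pareto counter-cumulative law. The three-parameter version adds a shift and a scale, but the dual exponents relate in the same way.

    ```latex
    P(r) = P_{1}\, r^{-q}
    \;\Longleftrightarrow\;
    r(P) \propto P^{-1/q},
    \qquad\text{so}\qquad
    \Pr(\text{size} \ge P) \propto P^{-\alpha}, \quad \alpha = 1/q .
    ```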

  17. Mapping the Pareto optimal design space for a functionally deimmunized biotherapeutic candidate.

    Science.gov (United States)

    Salvat, Regina S; Parker, Andrew S; Choi, Yoonjoo; Bailey-Kellogg, Chris; Griswold, Karl E

    2015-01-01

    The immunogenicity of biotherapeutics can bottleneck development pipelines and poses a barrier to widespread clinical application. As a result, there is a growing need for improved deimmunization technologies. We have recently described algorithms that simultaneously optimize proteins for both reduced T cell epitope content and high-level function. In silico analysis of this dual objective design space reveals that there is no single global optimum with respect to protein deimmunization. Instead, mutagenic epitope deletion yields a spectrum of designs that exhibit tradeoffs between immunogenic potential and molecular function. The leading edge of this design space is the Pareto frontier, i.e. the undominated variants for which no other single design exhibits better performance in both criteria. Here, the Pareto frontier of a therapeutic enzyme has been designed, constructed, and evaluated experimentally. Various measures of protein performance were found to map a functional sequence space that correlated well with computational predictions. These results represent the first systematic and rigorous assessment of the functional penalty that must be paid for pursuing progressively more deimmunized biotherapeutic candidates. Given this capacity to rapidly assess and design for tradeoffs between protein immunogenicity and functionality, these algorithms may prove useful in augmenting, accelerating, and de-risking experimental deimmunization efforts.

  18. A life cycle multi-objective economic and environmental assessment of distributed generation in buildings

    International Nuclear Information System (INIS)

    Safaei, Amir; Freire, Fausto; Henggeler Antunes, Carlos

    2015-01-01

    Highlights: • A lifecycle optimization model for distributed energy systems is developed. • The model estimates the costs and environmental impacts of meeting the building energy demand. • Design and operating strategies to reduce costs and environmental impacts are discussed. • Pareto frontiers of costs vis-à-vis environmental impacts are presented. • Distributed generation can reduce the environmental impacts of the building sector. - Abstract: Distributed generation, namely cogeneration and solar technologies, is expected to play an important role in the future energy supply mix in buildings. This calls for a methodological framework to assess the economic and environmental performance of the building sector when such technologies are employed. A life-cycle model has been developed, combining distributed generation and conventional sources, to calculate the cost and environmental impacts of meeting the building energy demand over a defined planning period. Three types of cogeneration technologies, solar photovoltaic and thermal systems, and conventional boilers, along with the Portuguese electricity generation mix, comprise the energy systems modeled. Pareto optimal frontiers are derived, showing the trade-offs between different types of impacts (non-renewable cumulative energy demand, greenhouse gas emissions, acidification, eutrophication) and the cost of meeting the energy demand of a commercial building. Our analysis shows that, depending on the objective pursued in employing distributed generation (reducing cost or environmental impacts), a specific design and operational strategy for the energy systems should be adopted. The strategies to minimize each type of impact, and the associated cost trade-offs, are discussed by exploring the solutions located on the Pareto optimal frontiers

  19. Pareto-path multitask multiple kernel learning.

    Science.gov (United States)

    Li, Cong; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2015-01-01

    A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving a better classification performance, when compared with other similar MTL approaches.

  20. Nucleon generalized parton distributions from full lattice QCD

    International Nuclear Information System (INIS)

    Haegler, P.; Schroers, W.; Bratt, J.; Negele, J.W.; Pochinsky, A.V.

    2007-07-01

    We present a comprehensive study of the lowest moments of nucleon generalized parton distributions in N_f = 2+1 lattice QCD using domain wall valence quarks and improved staggered sea quarks. Our investigation includes helicity dependent and independent generalized parton distributions for pion masses as low as 350 MeV and volumes as large as (3.5 fm)^3. (orig.)

  1. Pareto law of the expenditure of a person in convenience stores

    Science.gov (United States)

    Mizuno, Takayuki; Toriyama, Masahiro; Terano, Takao; Takayasu, Misako

    2008-06-01

    We study the statistical laws of the expenditure of a person in convenience stores by analyzing around 100 million receipts. The density function of expenditure exhibits a fat tail that follows a power law. Using the Lorenz curve, the Gini coefficient is estimated to be 0.70; this implies that loyal customers contribute significantly to a store’s sales. We observe the Pareto principle where both the top 25% and 2% of the customers account for 80% and 25% of the store’s sales, respectively.
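
    If the tail is treated as a pure Pareto law, the reported shares can be checked analytically: for P(m) ∝ m^{-(1+α)} with α > 1, the share of total expenditure contributed by the top fraction q of customers is q^{1-1/α}. Calibrating to the quoted 25% → 80% figure gives α ≈ 1.19; since only the tail of the empirical density follows a power law, this is a tail approximation rather than an exact fit.

    ```latex
    S(q) = q^{\,1-1/\alpha},
    \qquad
    S(0.25)=0.8
    \;\Longrightarrow\;
    1-\tfrac{1}{\alpha}=\frac{\ln 0.8}{\ln 0.25}\approx 0.161
    \;\Longrightarrow\;
    \alpha \approx 1.19 .
    ```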

  2. Interevent Time Distribution of Renewal Point Process, Case Study: Extreme Rainfall in South Sulawesi

    Science.gov (United States)

    Sunusi, Nurtiti

    2018-03-01

    The study of the time distribution of occurrences of extreme rain phenomena plays a very important role in analysis and weather forecasting for an area. The timing of extreme rainfall is difficult to predict because its occurrence is random. This paper aims to determine the interevent time distribution of extreme rain events, and the minimum waiting time until the occurrence of the next extreme event, through a point process approach. The phenomenon of extreme rain events over a given period of time follows a renewal process in which the time between events is a random variable τ. The distribution of the random variable τ is assumed to be Pareto, Log Normal, or Gamma, and a moment method is used to estimate the model parameters. Consider R_t, the time elapsed since the last extreme rain event at a location: if no extreme rain event has occurred up to time t_0, there is an opportunity for an extreme rainfall event in (t_0, t_0 + δt_0). From the three models reviewed, the minimum waiting time until the next extreme rainfall is then determined. The results show that the Log Normal model is better than the Pareto and Gamma models for predicting the next extreme rainfall in South Sulawesi, while the Pareto model cannot be used.
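
    A minimal sketch of the moment-method fits mentioned above, assuming the standard two-parameter forms of each model; the waiting-time sample is illustrative, not the South Sulawesi data.

    ```python
    import numpy as np

    def fit_moments(tau):
        """Method-of-moments parameters for Gamma, Log Normal and Pareto(I)
        interevent-time models, from the sample mean and variance."""
        m, v = np.mean(tau), np.var(tau)
        gamma_shape, gamma_scale = m**2 / v, v / m
        sigma2 = np.log(1 + v / m**2)              # log normal
        mu = np.log(m) - sigma2 / 2
        alpha = 1 + np.sqrt(1 + m**2 / v)          # Pareto(I); estimator gives alpha > 2
        x_m = m * (alpha - 1) / alpha
        return {"gamma": (gamma_shape, gamma_scale),
                "lognormal": (mu, np.sqrt(sigma2)),
                "pareto": (alpha, x_m)}

    # Illustrative waiting times between extreme-rain events (days)
    tau = np.array([12.0, 30.0, 7.0, 55.0, 21.0, 90.0, 14.0, 40.0])
    print(fit_moments(tau))
    ```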

  3. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then, DSM's linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
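
    A minimal sketch of the linear-combination step at the core of DSM: if the observed mixture is p_mix = λ·p_seed + (1−λ)·p_rel, then for a known coefficient λ the relevance distribution follows by direct subtraction. DSM's actual estimation of λ and its correlation- and divergence-based analysis are omitted, so the fixed λ below is an assumption.

    ```python
    import numpy as np

    def separate(p_mix, p_seed, lam):
        """Recover p_rel from p_mix = lam * p_seed + (1 - lam) * p_rel.

        All inputs are probability vectors over the same vocabulary.
        Negative entries from estimation noise are clipped, then renormalized.
        """
        p_rel = (np.asarray(p_mix) - lam * np.asarray(p_seed)) / (1.0 - lam)
        p_rel = np.clip(p_rel, 0.0, None)
        return p_rel / p_rel.sum()

    p_seed = np.array([0.5, 0.3, 0.2])       # irrelevance (seed) distribution
    p_true = np.array([0.1, 0.2, 0.7])       # unknown relevance distribution
    p_mix = 0.4 * p_seed + 0.6 * p_true      # observed mixture
    print(separate(p_mix, p_seed, lam=0.4))  # -> [0.1, 0.2, 0.7]
    ```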

  4. Dictatorship, liberalism and the Pareto rule: Possible and impossible

    Directory of Open Access Journals (Sweden)

    Boričić Branislav

    2009-01-01

    Full Text Available The current economic crisis has shaken belief in the capacity of neoliberal 'free market' policies. Numerous supporters of state intervention have arisen, and interest in social choice theory has revived. In this paper we consider three standard properties for aggregating individual preferences into social preferences: dictatorship, liberalism and the Pareto rule, together with their formal negations. The context of pure first-order classical logic makes it possible to show how some combinations of the above-mentioned conditions, under the hypothesis of unrestricted domain, form simple and reasonable examples of possible or impossible social choice systems. Due to their simplicity, these examples, including the famous 'liberal paradox', could have particular didactic value.

  5. Pareto-optimal estimates that constrain mean California precipitation change

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2017-12-01

    Global climate model (GCM) projections of greenhouse gas-induced precipitation change can exhibit notable uncertainty at the regional scale, particularly in regions where the mean change is small compared to internal variability. This is especially true for California, which is located in a transition zone between robust precipitation increases to the north and decreases to the south, and where GCMs from the Climate Model Intercomparison Project phase 5 (CMIP5) archive show no consensus on mean change (in either magnitude or sign) across the central and southern parts of the state. With the goal of constraining this uncertainty, we apply a multiobjective approach to a large set of subensembles (subsets of models from the full CMIP5 ensemble). These constraints are based on subensemble performance in three fields important to California precipitation: tropical Pacific sea surface temperatures, upper-level zonal winds in the midlatitude Pacific, and precipitation over the state. An evolutionary algorithm is used to sort through and identify the set of Pareto-optimal subensembles across these three measures in the historical climatology, and we use this information to constrain end-of-century California wet season precipitation change. This technique narrows the range of projections throughout the state and increases confidence in estimates of positive mean change. Furthermore, these methods complement and generalize emergent constraint approaches that aim to restrict uncertainty in end-of-century projections, and they have applications to even broader aspects of uncertainty quantification, including parameter sensitivity and model calibration.

  6. Household Labour Supply in Britain and Denmark: Some Interpretations Using a Model of Pareto Optimal Behaviour

    DEFF Research Database (Denmark)

    Barmby, Tim; Smith, Nina

    1996-01-01

    This paper analyses the labour supply behaviour of households in Denmark and Britain. It employs models in which the preferences of individuals within the household are explicitly represented. The households are then assumed to decide on their labour supply in a Pareto-Optimal fashion. Describing...

  7. Modelling and Pareto optimization of heat transfer and flow coefficients in microchannels using GMDH type neural networks and genetic algorithms

    International Nuclear Information System (INIS)

    Amanifard, N.; Nariman-Zadeh, N.; Borji, M.; Khalkhali, A.; Habibdoust, A.

    2008-01-01

    Three-dimensional heat transfer characteristics and pressure drop of water flow in a set of rectangular microchannels are numerically investigated using Fluent and compared with experimental results. Two metamodels based on the evolved group method of data handling (GMDH) type neural networks are then obtained for modelling both pressure drop (ΔP) and Nusselt number (Nu) with respect to design variables such as the geometrical parameters of the microchannels, the amount of heat flux and the Reynolds number. Using the obtained polynomial neural networks, a multi-objective genetic algorithm (the non-dominated sorting genetic algorithm, NSGA-II) with a new diversity-preserving mechanism is then used for Pareto-based optimization of microchannels considering the two conflicting objectives ΔP and Nu. It is shown that some interesting and important relationships, serving as useful optimal design principles for the performance of microchannels, can be discovered by Pareto-based multi-objective optimization of the obtained polynomial metamodels representing their heat transfer and flow characteristics. Such important optimal principles would not have been obtained without the use of both GMDH type neural network modelling and the Pareto optimization approach.

  8. Heterogeneous Epidemic Model for Assessing Data Dissemination in Opportunistic Networks

    DEFF Research Database (Denmark)

    Rozanova, Liudmila; Alekseev, Vadim; Temerev, Alexander

    2014-01-01

    that the amount of data transferred between network nodes possesses a Pareto distribution, implying scale-free properties. In this context, more heterogeneity in susceptibility means a less severe epidemic progression, while more heterogeneity in infectivity leads to more severe epidemics...... — assuming that the other parameter (either heterogeneity or susceptibility) stays fixed. The results are general enough to be useful for estimating the epidemic progression with no significant acquired immunity — in the cases where the Pareto distribution holds....

  9. Optimal beam margins in linac-based VMAT stereotactic ablative body radiotherapy: a Pareto front analysis for liver metastases.

    Science.gov (United States)

    Cilla, Savino; Ianiro, Anna; Deodato, Francesco; Macchia, Gabriella; Digesù, Cinzia; Valentini, Vincenzo; Morganti, Alessio G

    2017-11-27

    We explored the Pareto front mathematical strategy to determine the optimal block margin and prescription isodose for stereotactic body radiotherapy (SBRT) treatments of liver metastases using the volumetric-modulated arc therapy (VMAT) technique. Three targets (planning target volumes [PTVs] = 20, 55, and 101 cc) were selected. A single fraction dose of 26 Gy was prescribed (prescription dose [PD]). VMAT plans were generated for 3 different beam energies. Pareto fronts based on (1) different multileaf collimator (MLC) block margins around the PTV and (2) different prescription isodose lines (IDL) were produced. For each block margin, the greatest IDL fulfilling the coverage criterion (95% of the PTV receiving 100% of the prescription dose) was considered as providing the optimal clinical plan for PTV coverage. Liver Dmean, V7Gy, and V12Gy were used against the PTV coverage to generate the fronts. Gradient indexes (GI and mGI), homogeneity index (HI), and healthy liver irradiation in terms of Dmean, V7Gy, and V12Gy were calculated to compare different plans. In addition, each target was also optimized with a full-inverse planning engine to obtain a direct comparison with the anatomy-based treatment planning system (TPS) results. About 900 plans were calculated to generate the fronts. GI and mGI show a U-shaped behavior as a function of beam margin, with minimal values obtained at a +1 mm MLC margin. For these plans, the IDL ranges from 74% to 86%. GI and mGI also show a V-shaped behavior with respect to the HI index, with minimum values at 1 mm for all metrics, independent of tumor dimensions and beam energy. Fully inverse-optimized plans gave worse results than the Pareto plans. In conclusion, Pareto fronts provide a rigorous strategy for choosing clinically optimal plans in SBRT treatments. We show that a 1-mm MLC block margin provides the best results with regard to healthy liver tissue irradiation and steepness of dose falloff. Copyright © 2017 American Association of Medical Dosimetrists.

  10. Mass hierarchy and energy scaling of the Tsallis–Pareto parameters in hadron productions at RHIC and LHC energies

    Science.gov (United States)

    Bíró, Gábor; Barnaföldi, Gergely Gábor; Biró, Tamás Sándor; Shen, Keming

    2018-02-01

    The latest high-accuracy identified-hadron spectra measurements in high-energy nuclear collisions led us to investigate strongly interacting particles and collective effects in small systems. Since microscopical processes result in a statistical Tsallis–Pareto distribution, the fit parameters q and T are well suited for identifying system-size scalings and initial conditions. Moreover, the parameter values provide information on the deviation from the extensive Boltzmann–Gibbs statistics in finite volumes. We apply here the fit procedure developed in our earlier study for proton-proton collisions [1, 2]. The observed mass and center-of-mass energy trends in hadron production are compared to RHIC dAu and LHC pPb data in different centrality/multiplicity classes. Here we present new results on the mass hierarchy in pp and pA from light to heavy hadrons.
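
    For readers who want to reproduce such a fit, a minimal sketch follows. It assumes a commonly used Tsallis–Pareto spectrum shape in transverse mass; the data points, parameter values and the default pion mass are hypothetical, and the exact normalization conventions of [1, 2] may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def tsallis_pareto(pt, A, q, T, m=0.14):
    """Common Tsallis-Pareto spectrum shape A*[1 + (q-1)*(mT - m)/T]^(-1/(q-1)),
    with transverse mass mT = sqrt(pt^2 + m^2); m defaults to the pion mass (GeV)."""
    mT = np.sqrt(pt**2 + m**2)
    return A * (1.0 + (q - 1.0) * (mT - m) / T) ** (-1.0 / (q - 1.0))

# hypothetical spectrum points (GeV, arbitrary yield units) with 3% noise
pt = np.linspace(0.3, 6.0, 25)
y = tsallis_pareto(pt, A=120.0, q=1.08, T=0.12)
y_noisy = y * np.random.default_rng(0).normal(1.0, 0.03, pt.size)

# curve_fit only adjusts A, q, T because p0 has three entries
popt, _ = curve_fit(tsallis_pareto, pt, y_noisy, p0=(100.0, 1.1, 0.15))
print("A=%.1f  q=%.3f  T=%.3f GeV" % tuple(popt))
```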

  11. Pareto optimization of an industrial ecosystem: sustainability maximization

    Directory of Open Access Journals (Sweden)

    J. G. M.-S. Monteiro

    2010-09-01

    Full Text Available This work investigates a procedure to design an Industrial Ecosystem for sequestrating CO2 and consuming glycerol in a Chemical Complex with 15 integrated processes. The Complex is responsible for the production of methanol, ethylene oxide, ammonia, urea, dimethyl carbonate, ethylene glycol, glycerol carbonate, β-carotene, 1,2-propanediol and olefins, and is simulated using UNISIM Design (Honeywell). The process environmental impact (EI) is calculated using the Waste Reduction Algorithm, while Profit (P) is estimated using classic cost correlations. MATLAB (The Mathworks Inc) is connected to UNISIM to enable optimization. The objective is to achieve maximum process sustainability, which involves finding a compromise between high profitability and low environmental impact. Sustainability maximization is therefore understood as a multi-criteria optimization problem, addressed by means of the Pareto optimization methodology for trading off P vs. EI.

  12. Robust Design in Multiobjective Systems using Taguchi’s Parameter Design Approach and a Pareto Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Enrique Canessa

    2014-01-01

    Full Text Available We present a Pareto Genetic Algorithm (PGA) that finds the Pareto front in robust design problems for multiobjective systems. The PGA was designed to be applied using Taguchi's Parameter Design method, which is the method most frequently employed by practitioners to carry out robust design. The PGA was tested with data obtained from a real system with a single response and from a multiobjective process simulator with many control and noise factors. In all cases, the PGA delivered optimal solutions that meet the objectives of robust design. Moreover, the discussion of the results shows that having such solutions helps in selecting the best ones to implement in the system under study, especially when the system has many control factors and outputs.

  13. A Pareto-based multi-objective optimization algorithm to design energy-efficient shading devices

    International Nuclear Information System (INIS)

    Khoroshiltseva, Marina; Slanzi, Debora; Poli, Irene

    2016-01-01

    Highlights: • We present a multi-objective optimization algorithm for shading design. • We combine Harmony Search and Pareto-based procedures. • Thermal and daylighting performances of external shading were considered. • We applied the optimization process to residential social housing in Madrid. - Abstract: In this paper we address the problem of designing new energy-efficient static daylight devices that will surround the external windows of a residential building in Madrid. Shading devices can in fact largely influence solar gains in a building and improve thermal and lighting comfort by selectively intercepting solar radiation and by reducing undesirable glare. A proper shading device can therefore significantly increase the thermal performance of a building by reducing its energy demand in different climate conditions. In order to identify the set of optimal shading devices that allow a low energy consumption of the dwelling while maintaining high levels of thermal and lighting comfort for the inhabitants, we derive a multi-objective optimization methodology based on Harmony Search and Pareto front approaches. The results show that the multi-objective approach proposed here is an effective procedure for designing energy-efficient shading devices when a large set of conflicting objectives characterizes the performance of the proposed solutions.

  14. Sensitivity of goodness-of-fit statistics to rainfall data rounding off

    Science.gov (United States)

    Deidda, Roberto; Puliga, Michelangelo

    An analysis based on L-moments theory suggests adopting the generalized Pareto distribution to interpret daily rainfall depths recorded by the rain-gauge network of the Hydrological Survey of the Sardinia Region. Nevertheless, a major and not yet completely resolved problem arises in the estimation of a left-censoring threshold able to ensure a good fit of the rainfall data to the generalized Pareto distribution. In order to detect an optimal threshold, while keeping the largest possible number of data, we chose to apply a “failure-to-reject” method based on goodness-of-fit tests, as proposed by Choulakian and Stephens [Choulakian, V., Stephens, M.A., 2001. Goodness-of-fit tests for the generalized Pareto distribution. Technometrics 43, 478-484]. Unfortunately, the application of the test, using the percentage points provided by Choulakian and Stephens (2001), did not succeed in detecting a useful threshold value in most of the analyzed time series. A deeper analysis revealed that these failures are mainly due to the presence of large quantities of rounded-off values among the sample data, affecting the distribution of the goodness-of-fit statistics and leading to significant departures from the percentage points expected for continuous random variables. A procedure based on Monte Carlo simulations is thus proposed to overcome these problems.
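
    The failure-to-reject idea can be sketched in a few lines: scan candidate thresholds from low to high, fit a GPD to the excesses, and keep the lowest threshold whose fit is not rejected. The sketch below substitutes a parametric-bootstrap Kolmogorov–Smirnov test for the Anderson–Darling percentage points of Choulakian and Stephens (2001), so it illustrates the scheme rather than the paper's exact procedure; the names and the synthetic data are hypothetical.

```python
import numpy as np
from scipy import stats

def gpd_threshold(data, quantiles=np.arange(0.70, 0.96, 0.05),
                  n_boot=200, alpha=0.10, seed=0):
    """Failure-to-reject threshold search: return the lowest threshold whose
    GPD fit to the excesses is not rejected by a bootstrapped KS test."""
    rng = np.random.default_rng(seed)
    for q in quantiles:
        u = np.quantile(data, q)
        exc = data[data > u] - u
        if exc.size < 30:
            break
        c, _, scale = stats.genpareto.fit(exc, floc=0.0)
        d_obs = stats.kstest(exc, stats.genpareto(c, scale=scale).cdf).statistic
        # bootstrap the KS statistic under the fitted GPD (params re-estimated)
        d_boot = []
        for _ in range(n_boot):
            sim = stats.genpareto.rvs(c, scale=scale, size=exc.size, random_state=rng)
            cb, _, sb = stats.genpareto.fit(sim, floc=0.0)
            d_boot.append(stats.kstest(sim, stats.genpareto(cb, scale=sb).cdf).statistic)
        p_value = np.mean(np.asarray(d_boot) >= d_obs)
        if p_value > alpha:          # fit not rejected: accept this threshold
            return u, c, scale
    return None                      # no acceptable threshold found

rain = stats.genpareto.rvs(0.1, scale=8.0, size=2000, random_state=1)  # synthetic depths
print(gpd_threshold(rain))
```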

  15. Modified Stieltjes Transform and Generalized Convolutions of Probability Distributions

    Directory of Open Access Journals (Sweden)

    Lev B. Klebanov

    2018-01-01

    Full Text Available The classical Stieltjes transform is modified in such a way as to generalize both Stieltjes and Fourier transforms. This transform allows the introduction of new classes of commutative and non-commutative generalized convolutions. A particular case of such a convolution for degenerate distributions appears to be the Wigner semicircle distribution.

  16. Sea-ice floe-size distribution in the context of spontaneous scaling emergence in stochastic systems

    Science.gov (United States)

    Herman, Agnieszka

    2010-06-01

    Sea-ice floe-size distribution (FSD) in ice-pack covered seas influences many aspects of ocean-atmosphere interactions. However, data concerning FSD in the polar oceans are still sparse and the processes shaping the observed FSD properties are poorly understood. Typically, power-law FSDs are assumed, although no feasible explanation has been provided either for this or for other properties of the observed distributions. Consequently, no model exists capable of predicting FSD parameters in any particular situation. Here I show that the observed FSDs can be well represented by a truncated Pareto distribution P(x) = x^(-1-α) exp[(1-α)/x], which is an emergent property of a certain group of multiplicative stochastic systems, described by the generalized Lotka-Volterra (GLV) equation. Building upon this recognition, a possibility of developing a simple agent-based GLV-type sea-ice model is considered. Contrary to simple power-law FSDs, GLV gives consistent estimates of the total floe perimeter, as well as a floe-area distribution in agreement with observations.
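
    A minimal sketch of fitting the quoted truncated-Pareto form by numerical maximum likelihood is given below. The normalizing constant is computed numerically over the observed size range, α is restricted to values above 1, and the floe-size sample is synthetic; all names are hypothetical.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

def neg_log_likelihood(alpha, x, lo, hi):
    """Negative log-likelihood for the (unnormalised) density
    p(x) ~ x**(-1-alpha) * exp((1-alpha)/x), normalised numerically on [lo, hi]."""
    logp = (-1.0 - alpha) * np.log(x) + (1.0 - alpha) / x
    Z, _ = quad(lambda t: t ** (-1.0 - alpha) * np.exp((1.0 - alpha) / t), lo, hi)
    return -(logp.sum() - x.size * np.log(Z))

def fit_alpha(x):
    # normalise over the observed support only (a deliberate simplification)
    lo, hi = x.min(), x.max()
    res = minimize_scalar(neg_log_likelihood, bounds=(1.01, 10.0),
                          args=(x, lo, hi), method="bounded")
    return res.x

# hypothetical floe sizes (dimensionless, e.g. area rescaled by its mean)
floes = np.random.default_rng(0).pareto(2.0, 500) + 0.2
print("alpha-hat = %.2f" % fit_alpha(floes))
```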

  17. Optimization of externalities using DTM measures: a Pareto optimal multi objective optimization using the evolutionary algorithm SPEA2+

    NARCIS (Netherlands)

    Wismans, Luc Johannes Josephus; van Berkum, Eric C.; Bliemer, Michiel; Allkim, T.P.; van Arem, Bart

    2010-01-01

    Multi-objective optimization of the externalities of traffic is performed by solving a network design problem in which Dynamic Traffic Management measures are used. The resulting Pareto optimal set is determined by employing the SPEA2+ evolutionary algorithm.

  18. Discrepancies between selected Pareto optimal plans and final deliverable plans in radiotherapy multi-criteria optimization.

    Science.gov (United States)

    Kyroudi, Archonteia; Petersson, Kristoffer; Ghandour, Sarah; Pachoud, Marc; Matzinger, Oscar; Ozsahin, Mahmut; Bourhis, Jean; Bochud, François; Moeckli, Raphaël

    2016-08-01

    Multi-criteria optimization provides decision makers with a range of clinical choices through Pareto plans that can be explored during real time navigation and then converted into deliverable plans. Our study shows that dosimetric differences can arise between the two steps, which could compromise the clinical choices made during navigation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Prederivatives of gamma paraconvex set-valued maps and Pareto optimality conditions for set optimization problems.

    Science.gov (United States)

    Huang, Hui; Ning, Jixian

    2017-01-01

    Prederivatives play an important role in the research of set optimization problems. First, we establish several existence theorems of prederivatives for γ-paraconvex set-valued mappings in Banach spaces with [Formula: see text]. Then, in terms of prederivatives, we establish both necessary and sufficient conditions for the existence of Pareto minimal solutions of set optimization problems.

  20. Hybrid Pareto artificial bee colony algorithm for multi-objective single machine group scheduling problem with sequence-dependent setup times and learning effects.

    Science.gov (United States)

    Yue, Lei; Guan, Zailin; Saif, Ullah; Zhang, Fei; Wang, Hao

    2016-01-01

    Group scheduling is significant for an efficient and cost-effective production system. However, setup times exist between the groups, which need to be decreased by sequencing the groups in an efficient way. The current research focuses on a sequence-dependent group scheduling problem with the aim of simultaneously minimizing the makespan and the total weighted tardiness. In most production scheduling problems, the processing time of jobs is assumed to be fixed. However, the actual processing time of jobs may be reduced due to the "learning effect". The integration of sequence-dependent group scheduling with learning effects has rarely been considered in the literature. Therefore, the current research considers a single machine group scheduling problem with sequence-dependent setup times and learning effects simultaneously. A novel hybrid Pareto artificial bee colony algorithm (HPABC) with some steps of a genetic algorithm is proposed for the current problem to obtain Pareto solutions. Furthermore, five different sizes of test problems (small, small medium, medium, large medium, large) are tested using the proposed HPABC. The Taguchi method is used to tune the effective parameters of the proposed HPABC for each problem category. The performance of HPABC is compared with three well-known multi-objective optimization algorithms: the improved strength Pareto evolutionary algorithm (SPEA2), the non-dominated sorting genetic algorithm II (NSGA-II) and particle swarm optimization (PSO). Results indicate that HPABC outperforms SPEA2, NSGA-II and PSO and gives better Pareto optimal solutions in terms of diversity and quality for almost all instances of the different problem sizes.

  1. MULTI-OBJECTIVE OPTIMAL DESIGN OF GROUNDWATER REMEDIATION SYSTEMS: APPLICATION OF THE NICHED PARETO GENETIC ALGORITHM (NPGA). (R826614)

    Science.gov (United States)

    A multiobjective optimization algorithm is applied to a groundwater quality management problem involving remediation by pump-and-treat (PAT). The multiobjective optimization framework uses the niched Pareto genetic algorithm (NPGA) and is applied to simultaneously minimize the...

  2. Pareto-Ranking Based Quantum-Behaved Particle Swarm Optimization for Multiobjective Optimization

    Directory of Open Access Journals (Sweden)

    Na Tian

    2015-01-01

    Full Text Available A study on Pareto-ranking based quantum-behaved particle swarm optimization (QPSO) for multiobjective optimization problems is presented in this paper. During the iteration, an external repository is maintained to remember the nondominated solutions, from which the global best position is chosen. A comparison between different elitist selection strategies (preference order, sigma value, and random selection) is performed on four benchmark functions and two metrics. The results demonstrate that QPSO with preference order has performance comparable to sigma value, depending on the number of objectives. Finally, QPSO with sigma value is applied to solve multiobjective flexible job-shop scheduling problems.

  3. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    , situated near Copenhagen in Denmark. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. The volume was fit with a generalized Pareto...... with duration for a given return period and name them DDF (depth-duration-frequency) curves. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration...... distribution and the duration was fit with a Pearson type III distribution for rainfall extracted using method 3. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF curves derived using the Clayton copula for depth and duration...

  4. Pareto Efficient Solutions of Attack-Defence Trees

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Nielson, Flemming

    2015-01-01

    Attack-defence trees are a promising approach for representing threat scenarios and possible countermeasures in a concise and intuitive manner. An attack-defence tree describes the interaction between an attacker and a defender, and is evaluated by assigning parameters to the nodes, such as probability or cost of attacks and defences. In case of multiple parameters most analytical methods optimise one parameter at a time, e.g., minimise cost or maximise probability of an attack. Such methods may lead to sub-optimal solutions when optimising conflicting parameters, e.g., minimising cost while maximising probability. In order to tackle this challenge, we devise automated techniques that optimise all parameters at once. Moreover, in the case of conflicting parameters our techniques compute the set of all optimal solutions, defined in terms of Pareto efficiency. The developments are carried out...

  5. A generalized statistical model for the size distribution of wealth

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2012-01-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find an excellent agreement with the data, superior to that of any other model known in the literature. (paper)

  6. A generalized statistical model for the size distribution of wealth

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find an excellent agreement with the data, superior to that of any other model known in the literature.

  7. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of the homogeneity and scale hypotheses when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed exclusively. The photoemulsion experiment analysis and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed.

  8. The interrupted power law and the size of shadow banking.

    Science.gov (United States)

    Fiaschi, Davide; Kondor, Imre; Marsili, Matteo; Volpati, Valerio

    2014-01-01

    Using public data (Forbes Global 2000) we show that the asset sizes for the largest global firms follow a Pareto distribution in an intermediate range, that is "interrupted" by a sharp cut-off in its upper tail, where it is totally dominated by financial firms. This flattening of the distribution contrasts with a large body of empirical literature which finds a Pareto distribution for firm sizes both across countries and over time. Pareto distributions are generally traced back to a mechanism of proportional random growth, based on a regime of constant returns to scale. This makes our findings of an "interrupted" Pareto distribution all the more puzzling, because we provide evidence that financial firms in our sample should operate in such a regime. We claim that the missing mass from the upper tail of the asset size distribution is a consequence of shadow banking activity and that it provides an (upper) estimate of the size of the shadow banking system. This estimate, which we propose as a shadow banking index, compares well with estimates of the Financial Stability Board until 2009, but it shows a sharper rise in shadow banking activity after 2010. Finally, we propose a proportional random growth model that reproduces the observed distribution, thereby providing a quantitative estimate of the intensity of shadow banking activity.
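
    The construction of such a missing-mass index can be illustrated with a rank-size sketch: fit the Pareto line on an intermediate rank range, extrapolate it to the top ranks, and take the shortfall of the observed top-tail assets as the estimate. This is a deliberately rough reading of the paper's idea, with synthetic data and hypothetical names.

```python
import numpy as np

def missing_upper_tail_mass(assets, fit_lo=50, fit_hi=500, top=20):
    """Fit log(size) ~ log(rank) on an intermediate rank range, extrapolate the
    Pareto line to the top ranks, and return the asset mass missing from the
    observed upper tail (a rough sketch of a 'shadow banking' index)."""
    s = np.sort(np.asarray(assets, dtype=float))[::-1]      # descending sizes
    ranks = np.arange(1, s.size + 1)
    sel = slice(fit_lo, fit_hi)                             # intermediate range
    slope, intercept = np.polyfit(np.log(ranks[sel]), np.log(s[sel]), 1)
    predicted_top = np.exp(intercept + slope * np.log(ranks[:top]))
    return float(np.sum(predicted_top - s[:top]))

# synthetic firm sizes: Pareto body with a deliberately flattened upper tail
rng = np.random.default_rng(2)
sizes = 1.0 + rng.pareto(1.1, 2000)
sizes = np.sort(sizes)[::-1]
sizes[:20] = np.minimum(sizes[:20], sizes[20])              # impose the cut-off
print("missing mass estimate: %.1f" % missing_upper_tail_mass(sizes))
```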

  9. On the probability distribution of daily streamflow in the United States

    Science.gov (United States)

    Blum, Annalise G.; Archfield, Stacey A.; Vogel, Richard M.

    2017-06-01

    Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for the period of record FDCs, representing the long-term hydrologic regime at a site, and median annual FDCs, representing the behavior of flows in a typical year.
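
    Fitting the candidate distributions named here to a daily-flow record and comparing the implied flow duration curves is straightforward with scipy; the sketch below uses scipy's kappa4 and genpareto parameterizations and a synthetic lognormal record in place of real streamflow, so the numbers are illustrative only.

```python
import numpy as np
from scipy import stats

# synthetic "daily flows": lognormal stands in for a streamflow record
flows = stats.lognorm.rvs(1.2, scale=30.0, size=5000, random_state=0)

# fit candidate distributions (scipy's generic maximum-likelihood fit)
h, k, loc_k, scale_k = stats.kappa4.fit(flows)
c, loc_g, scale_g = stats.genpareto.fit(flows)

# flow duration curve: flow equaled or exceeded with probability p
p = np.linspace(0.01, 0.99, 99)                 # exceedance probabilities
fdc_emp = np.quantile(flows, 1.0 - p)           # empirical FDC
fdc_kap = stats.kappa4.ppf(1.0 - p, h, k, loc_k, scale_k)
fdc_gpd = stats.genpareto.ppf(1.0 - p, c, loc_g, scale_g)

# crude goodness summary: median relative quantile error
for name, fdc in [("kappa4", fdc_kap), ("genpareto", fdc_gpd)]:
    err = np.median(np.abs(fdc - fdc_emp) / fdc_emp)
    print("%-9s median relative FDC error: %.3f" % (name, err))
```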

  10. Patient feature based dosimetric Pareto front prediction in esophageal cancer radiotherapy.

    Science.gov (United States)

    Wang, Jiazhou; Jin, Xiance; Zhao, Kuaike; Peng, Jiayuan; Xie, Jiang; Chen, Junchao; Zhang, Zhen; Studenski, Matthew; Hu, Weigang

    2015-02-01

    To investigate the feasibility of dosimetric Pareto front (PF) prediction based on patients' anatomic and dosimetric parameters for esophageal cancer patients. Eighty esophageal cancer patients in the authors' institution were enrolled in this study. A total of 2928 intensity-modulated radiotherapy plans were obtained and used to generate the PF for each patient. On average, each patient had 36.6 plans. The anatomic and dosimetric features were extracted from these plans. The mean lung dose (MLD), mean heart dose (MHD), spinal cord max dose, and PTV homogeneity index were recorded for each plan. Principal component analysis was used to extract overlap volume histogram (OVH) features between the PTV and other organs at risk. The full dataset was separated into two parts: a training dataset and a validation dataset. The prediction outcomes were the MHD and MLD. Spearman's rank correlation coefficient was used to evaluate the correlation between the anatomical and dosimetric features. The stepwise multiple regression method was used to fit the PF. The cross-validation method was used to evaluate the model. With 1000 repetitions, the mean prediction error of the MHD was 469 cGy. The most correlated factors were the first principal component of the OVH between heart and PTV and the overlap between heart and PTV along the Z-axis. The mean prediction error of the MLD was 284 cGy. The most correlated factors were the first principal component of the OVH between heart and PTV and the overlap between lung and PTV along the Z-axis. It is feasible to use patients' anatomic and dosimetric features to generate a predicted Pareto front. Additional samples and further studies are required to improve the prediction model.

  11. A heuristic ranking approach on capacity benefit margin determination using Pareto-based evolutionary programming technique.

    Science.gov (United States)

    Othman, Muhammad Murtadha; Abd Rahman, Nurulazmi; Musirin, Ismail; Fotuhi-Firuzabad, Mahmud; Rajabi-Ghahnavieh, Abbas

    2015-01-01

    This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account the tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that reliable access to generation through the interconnected system can be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account the system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies is conducted on the modified IEEE-RTS79 and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is the flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.

  12. A Heuristic Ranking Approach on Capacity Benefit Margin Determination Using Pareto-Based Evolutionary Programming Technique

    Directory of Open Access Journals (Sweden)

    Muhammad Murtadha Othman

    2015-01-01

    Full Text Available This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account the tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that reliable access to generation through the interconnected system can be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account the system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies is conducted on the modified IEEE-RTS79 and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is the flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.

  13. A Pareto archive floating search procedure for solving multi-objective flexible job shop scheduling problem

    Directory of Open Access Journals (Sweden)

    J. S. Sadaghiani

    2014-04-01

    Full Text Available The flexible job shop scheduling problem is a key factor in using production systems efficiently. This paper attempts to simultaneously optimize three objectives: minimization of the makespan, the total workload and the maximum workload of jobs. Since the multi-objective flexible job shop scheduling problem is strongly NP-hard, an integrated heuristic approach is used to solve it. The proposed approach is based on a floating search procedure that uses local heuristic algorithms; it decomposes the considered problem into two subproblems, assignment and sequencing. Search is first performed over the assignment space until an acceptable solution is reached, and then continues over the sequencing space based on a heuristic algorithm. This paper uses a multi-objective approach for producing Pareto solutions. The proposed approach was thus adapted to the NSGA-II algorithm and evaluated with Pareto archives. The elements and parameters of the proposed algorithms were tuned in preliminary experiments. Finally, computational results were used to analyze the efficiency of the proposed algorithm, and they show that the proposed algorithm is capable of producing efficient solutions.

  14. Multiobjective memetic estimation of distribution algorithm based on an incremental tournament local searcher.

    Science.gov (United States)

    Yang, Kaifeng; Mu, Li; Yang, Dongdong; Zou, Feng; Wang, Lei; Jiang, Qiaoyong

    2014-01-01

    A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise continuous low-dimensional manifold. Regularity-based approaches use this manifold feature only to build a probability distribution model from global statistical information of the population; the potential of promising individuals is thus not well exploited, which is not beneficial to the search and optimization process. Hereby, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Moreover, since ε-dominance is a strategy that can make a multiobjective algorithm obtain well-distributed solutions and has low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The novel memetic multiobjective estimation of distribution algorithm, MMEDA, is proposed accordingly. The algorithm is validated by experiments on twenty-two test problems with and without variable linkages of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.

  15. Multiobjective Memetic Estimation of Distribution Algorithm Based on an Incremental Tournament Local Searcher

    Directory of Open Access Journals (Sweden)

    Kaifeng Yang

    2014-01-01

    Full Text Available A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise continuous low-dimensional manifold. Regularity-based approaches use this manifold feature only to build a probability distribution model from global statistical information of the population; the potential of promising individuals is thus not well exploited, which is not beneficial to the search and optimization process. Hereby, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Moreover, since ε-dominance is a strategy that can make a multiobjective algorithm obtain well-distributed solutions and has low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The novel memetic multiobjective estimation of distribution algorithm, MMEDA, is proposed accordingly. The algorithm is validated by experiments on twenty-two test problems with and without variable linkages of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.

  16. Multi-objective optimization design of air distribution of grate cooler by entropy generation minimization and genetic algorithm

    International Nuclear Information System (INIS)

    Shao, Wei; Cui, Zheng; Cheng, Lin

    2016-01-01

    Highlights: • A multi-objective optimization model of the air distribution of a grate cooler by genetic algorithm is proposed. • The Pareto front is obtained and validated by comparison with operating data. • Optimal schemes are compared and selected according to engineering background. • Total power consumption after optimization decreases by 61.10%. • The clinker layers on the three grate plates are thinner. - Abstract: The cooling air distributions of a grate cooler exercise a great influence on the clinker cooling efficiency and the power consumption of the cooling fans. A multi-objective optimization model of the air distributions of a grate cooler with a cross-flow heat exchanger analogy is proposed in this paper. First, thermodynamic and flow models of the clinker cooling process are carried out. Then, based on entropy generation minimization analysis, modified entropy generation numbers caused by heat transfer and pressure drop are chosen as objective functions, which are optimized by a genetic algorithm. The design variables are the superficial velocities of the air chambers and the thicknesses of the clinker layers on different grate plates. A set of Pareto optimal solutions, in which the two objectives are optimized simultaneously, is achieved. Scattered distributions of the design variables, resulting from the conflict between the two objectives, are brought out. The final optimal air distribution and clinker layer thicknesses are selected from the Pareto optimal solutions based on minimization of the power consumption of the cooling fans, and validated by measurements. Compared with the actual operating scheme, the total air volume of the optimized schemes decreases by 2.4%, the total power consumption of the cooling fans decreases by 61.1% and the outlet temperature of the clinker decreases by 122.9 °C, which shows a remarkable energy-saving effect.

  17. Beyond lognormal inequality: The Lorenz Flow Structure

    Science.gov (United States)

    Eliazar, Iddo

    2016-11-01

    Observed from a socioeconomic perspective, the intrinsic inequality of the lognormal law happens to manifest a flow generated by an underlying ordinary differential equation. In this paper we extend this feature of the lognormal law to a general "Lorenz Flow Structure" of Lorenz curves, objects that quantify socioeconomic inequality. The Lorenz Flow Structure establishes a general framework of size distributions that span continuous spectra of socioeconomic states ranging from the pure-communism extreme to the absolute-monarchy extreme. This study introduces and explores the Lorenz Flow Structure, analyzes its statistical and inequality properties, unveils the unique role of the lognormal law within this general structure, and presents various examples of this general structure. Beyond the lognormal law, the examples include the inverse-Pareto and Pareto laws, which often govern the tails of composite size distributions.

  18. Chiral perturbation theory for generalized parton distributions and baryon distribution amplitudes

    Energy Technology Data Exchange (ETDEWEB)

    Wein, Philipp

    2016-05-06

    In this thesis we apply low-energy effective field theory to the first moments of generalized parton distributions and to baryon distribution amplitudes, which are both highly relevant for the parametrization of the nonperturbative part in hard processes. These quantities yield complementary information on hadron structure, since the former treat hadrons as a whole and, thus, give information about the (angular) momentum carried by an entire parton species on average, while the latter parametrize the momentum distribution within an individual Fock state. By performing one-loop calculations within covariant baryon chiral perturbation theory, we obtain sensible parametrizations of the quark mass dependence that are ideally suited for the subsequent analysis of lattice QCD data.

  19. Income distribution in the Colombian economy from an econophysics perspective

    Directory of Open Access Journals (Sweden)

    Hernando Quevedo Cubillos

    2016-09-01

    Full Text Available Recently, in econophysics, it has been shown that it is possible to analyze economic systems as equilibrium thermodynamic models. We apply statistical thermodynamics methods to analyze income distribution in the Colombian economic system. Using the data obtained in random polls, we show that income distribution in the Colombian economic system is characterized by two specific phases. The first includes about 90% of the interviewed individuals, and is characterized by an exponential Boltzmann-Gibbs distribution. The second phase, which contains the individuals with the highest incomes, can be described by means of one or two power-law density distributions that are known as Pareto distributions.
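
    The two-phase structure described here can be reproduced with a short sketch: an exponential (Boltzmann–Gibbs) fit to the bulk of incomes and a Hill-type Pareto exponent for the upper tail. The 90% split point is taken from the abstract; the data, names and parameter values below are synthetic and hypothetical.

```python
import numpy as np

def two_phase_fit(income, split=0.90):
    """Exponential fit for the bulk (below the split quantile) and a Hill
    estimate of the Pareto exponent for the upper tail."""
    income = np.sort(np.asarray(income, dtype=float))
    x_split = np.quantile(income, split)
    bulk, tail = income[income <= x_split], income[income > x_split]
    T = bulk.mean()                                      # Boltzmann-Gibbs "temperature"
    alpha = tail.size / np.sum(np.log(tail / x_split))   # Hill estimator
    return T, alpha

rng = np.random.default_rng(3)
bulk = rng.exponential(1000.0, 9000)                 # 90%: exponential incomes
tail = 3000.0 * (1.0 + rng.pareto(2.0, 1000))        # 10%: Pareto incomes
T, alpha = two_phase_fit(np.concatenate([bulk, tail]))
print("T = %.0f, Pareto alpha = %.2f" % (T, alpha))
```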

  20. General results for the Marshall and Olkin's family of distributions

    Directory of Open Access Journals (Sweden)

    WAGNER BARRETO-SOUZA

    2013-03-01

    Full Text Available Abstract Marshall and Olkin (1997) introduced an interesting method of adding a parameter to a well-established distribution. However, they did not investigate the general mathematical properties of their family of distributions. We provide for this family of distributions general expansions for the density function and explicit expressions for the moments and the moments of the order statistics. Several special models are investigated. We discuss estimation of the model parameters. An application to a real data set is presented for illustrative purposes.
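
    The Marshall–Olkin construction tilts the survival function of a baseline distribution Fbar by a parameter θ > 0: Gbar(x) = θFbar(x)/[1 − (1 − θ)Fbar(x)], and differentiating gives the density. The sketch below wraps an arbitrary scipy frozen distribution under that standard form; the class name and the exponential example are illustrative, not the paper's notation.

```python
import numpy as np
from scipy import stats

class MarshallOlkin:
    """Marshall-Olkin extension of a scipy frozen distribution:
    survival  Gbar(x) = theta*Fbar(x) / (1 - (1-theta)*Fbar(x)),
    density   g(x)    = theta*f(x)   / (1 - (1-theta)*Fbar(x))**2."""
    def __init__(self, base, theta):
        self.base, self.theta = base, theta
    def sf(self, x):
        fb = self.base.sf(x)
        return self.theta * fb / (1.0 - (1.0 - self.theta) * fb)
    def pdf(self, x):
        fb = self.base.sf(x)
        return self.theta * self.base.pdf(x) / (1.0 - (1.0 - self.theta) * fb) ** 2
    def rvs(self, size, random_state=None):
        # inverse-transform sampling: solve Gbar(x) = u for Fbar(x), then invert
        u = np.random.default_rng(random_state).uniform(size=size)
        fb = u / (self.theta + (1.0 - self.theta) * u)
        return self.base.isf(fb)

mo_exp = MarshallOlkin(stats.expon(scale=2.0), theta=0.5)   # MO-extended exponential
x = np.linspace(0.01, 10, 5)
print(mo_exp.sf(x), mo_exp.pdf(x))
```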

  1. Pareto-optimal reversed-phase chromatography separation of three insulin variants with a solubility constraint.

    Science.gov (United States)

    Arkell, Karolina; Knutson, Hans-Kristian; Frederiksen, Søren S; Breil, Martin P; Nilsson, Bernt

    2018-01-12

    With the shift of focus of the regulatory bodies from fixed process conditions towards flexible ones based on process understanding, model-based optimization is becoming an important tool for process development within the biopharmaceutical industry. In this paper, a multi-objective optimization study of the separation of three insulin variants by reversed-phase chromatography (RPC) is presented. The decision variables were the load factor, the concentrations of ethanol and KCl in the eluent, and the cut points for the product pooling. In addition to the purity constraints, a solubility constraint on the total insulin concentration was applied. The insulin solubility is a function of the ethanol concentration in the mobile phase, and the main aim was to investigate the effect of this constraint on the maximal productivity. Multi-objective optimization was performed with and without the solubility constraint, and visualized as Pareto fronts, showing the optimal combinations of the two objectives, productivity and yield, for each case. Comparison of the constrained and unconstrained Pareto fronts showed that the former diverges when the constraint becomes active, because the increase in productivity with decreasing yield is almost halted. Consequently, we suggest the operating point at which the total outlet concentration of insulin reaches the solubility limit as the most suitable one. According to the results from the constrained optimizations, the maximal productivity on the C4 adsorbent (0.41 kg/(m3 column h)) is less than half of that on the C18 adsorbent (0.87 kg/(m3 column h)). This is partly caused by the higher selectivity between the insulin variants on the C18 adsorbent, but the main reason is the difference in how the solubility constraint affects the processes. Since the optimal ethanol concentration for elution on the C18 adsorbent is higher than for the C4 one, the insulin solubility is also higher, allowing a higher pool concentration

  2. Generalized Parton Distributions and their Singularities

    Energy Technology Data Exchange (ETDEWEB)

    Anatoly Radyushkin

    2011-04-01

    A new approach to building models of generalized parton distributions (GPDs) is discussed that is based on the factorized DD (double distribution) Ansatz within the single-DD formalism. The latter was not used before, because reconstructing GPDs from the forward limit one should start in this case with a very singular function $f(\beta)/\beta$ rather than with the usual parton density $f(\beta)$. This results in a non-integrable singularity at $\beta=0$, exaggerated by the fact that $f(\beta)$'s, on their own, have a singular $\beta^{-a}$ Regge behavior for small $\beta$. It is shown that the singularity is regulated within the GPD model of Szczepaniak et al., in which the Regge behavior is implanted through a subtracted dispersion relation for the hadron-parton scattering amplitude. It is demonstrated that using a proper softening of the quark-hadron vertices in the regions of large parton virtualities results in model GPDs $H(x,\xi)$ that are finite and continuous at the "border point" $x=\xi$. Using a simple input forward distribution, we illustrate the implementation of the new approach for the explicit construction of model GPDs. As a further development, a more general method of regulating the $\beta=0$ singularities is proposed that is based on the separation of the initial single DD $f(\beta, \alpha)$ into the "plus" part $[f(\beta,\alpha)]_{+}$ and the $D$-term. It is demonstrated that the "DD+D" separation method allows one to (re)derive GPD sum rules that relate the difference between the forward distribution $f(x)=H(x,0)$ and the border function $H(x,x)$ with the $D$-term function $D(\alpha)$.

  3. An introduction to the Generalized Parton Distributions

    International Nuclear Information System (INIS)

    Michel Garcon

    2002-01-01

    The concepts of Generalized Parton Distributions (GPDs) are reviewed in an introductory and phenomenological fashion. These distributions provide a rich and unifying picture of the nucleon structure. Their physical meaning is discussed. The GPDs are in principle measurable through exclusive deeply virtual production of photons (DVCS) or of mesons (DVMP). Experiments are starting to test the validity of these concepts. First results are discussed and new experimental projects presented, with an emphasis on this program at Jefferson Lab.

  4. Test scheduling optimization for 3D network-on-chip based on cloud evolutionary algorithm of Pareto multi-objective

    Science.gov (United States)

    Xu, Chuanpei; Niu, Junhao; Ling, Jing; Wang, Suyan

    2018-03-01

    In this paper, we present a parallel test strategy for bandwidth division multiplexing under the test access mechanism bandwidth constraint. The Pareto solution set is combined with a cloud evolutionary algorithm to optimize the test time and power consumption of a three-dimensional network-on-chip (3D NoC). In the proposed method, all individuals in the population are sorted in non-dominated order and allocated to the corresponding level. Individuals with extreme and similar characteristics are then removed. To increase the diversity of the population and prevent the algorithm from becoming stuck around local optima, a competition strategy is designed for the individuals. Finally, we adopt an elite reservation strategy and update the individuals according to the cloud model. Experimental results show that the proposed algorithm converges to the optimal Pareto solution set rapidly and accurately. This not only obtains the shortest test time, but also optimizes the power consumption of the 3D NoC.

  5. Optimizing a Biobjective Production-Distribution Planning Problem Using a GRASP

    Directory of Open Access Journals (Sweden)

    Martha-Selene Casas-Ramírez

    2018-01-01

    Full Text Available This paper addresses a biobjective production-distribution planning problem. The problem is formulated as a mixed integer programming problem with two objectives. The objectives are to minimize the total costs and to balance the total workload of the supply chain, which consists of plants and depots, considering that it represents a vertically integrated company. In order to solve the model, we propose an adapted biobjective GRASP to obtain an approximation of the Pareto front. To evaluate the performance of the proposed algorithm, numerical experiments are conducted over a set of instances used for similar problems. Results indicate that the proposed GRASP obtains a relatively small number of nondominated solutions for each tested instance in very short computational time. The approximated Pareto fronts are discontinuous and nonconvex. Moreover, the solutions clearly show the compromise between both objective functions.

  6. Phase transitions in Pareto optimal complex networks.

    Science.gov (United States)

    Seoane, Luís F; Solé, Ricard

    2015-09-01

    The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks or in graphs where cost constraints are at work, as it occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need of drastic topological changes does not determine the presence or the nature of the phase transitions encountered. Instead, the functions under optimization do play a determinant role. This reinforces the view that phase transitions do not arise from intrinsic properties of a system alone, but from the interplay of that system with its external constraints.

  7. Sea-ice floe-size distribution in the context of spontaneous scaling emergence in stochastic systems.

    Science.gov (United States)

    Herman, Agnieszka

    2010-06-01

    Sea-ice floe-size distribution (FSD) in ice-pack covered seas influences many aspects of ocean-atmosphere interactions. However, data concerning FSD in the polar oceans are still sparse and the processes shaping the observed FSD properties are poorly understood. Typically, power-law FSDs are assumed, although no feasible explanation has been provided either for this or for other properties of the observed distributions. Consequently, no model exists capable of predicting FSD parameters in any particular situation. Here I show that the observed FSDs can be well represented by a truncated Pareto distribution P(x) = x^(-1-α) exp[(1-α)/x], which is an emergent property of a certain group of multiplicative stochastic systems, described by the generalized Lotka-Volterra (GLV) equation. Building upon this recognition, a possibility of developing a simple agent-based GLV-type sea-ice model is considered. Contrary to simple power-law FSDs, GLV gives consistent estimates of the total floe perimeter, as well as a floe-area distribution in agreement with observations.

  8. Characterizing the Incentive Compatible and Pareto Optimal Efficiency Space for Two Players, k Items, Public Budget and Quasilinear Utilities

    Directory of Open Access Journals (Sweden)

    Anat Lerner

    2014-04-01

    Full Text Available We characterize the efficiency space of deterministic, dominant-strategy incentive compatible, individually rational and Pareto-optimal combinatorial auctions in a model with two players and k nonidentical items. We examine a model with multidimensional types, private values and quasilinear preferences for the players, with one relaxation: one of the players is subject to a publicly known budget constraint. We show that if it is publicly known that the valuation for the largest bundle is less than the budget for at least one of the players, then Vickrey-Clarke-Groves (VCG) uniquely fulfills the basic properties of being deterministic, dominant-strategy incentive compatible, individually rational and Pareto optimal. Our characterization of the efficiency space for deterministic budget-constrained combinatorial auctions is similar in spirit to that of Maskin (2000) for Bayesian single-item constrained efficiency auctions and comparable with Ausubel and Milgrom (2002) for non-constrained combinatorial auctions.

  9. A Fiducial Approach to Extremes and Multiple Comparisons

    Science.gov (United States)

    Wandler, Damian V.

    2010-01-01

    Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem deals with the generalized Pareto distribution. The generalized Pareto…

  10. Generalization of Poisson distribution for the case of changing probability of consequential events

    International Nuclear Information System (INIS)

    Kushnirenko, E.

    1995-01-01

    The generalization of the Poisson distribution for the case of changing probabilities of the consequential events is presented. It is shown that the classical Poisson distribution is a special case of this generalized distribution, obtained when the probabilities of the consequential events are constant. Using the generalized Poisson distribution makes it possible in some cases to obtain analytical results instead of performing Monte Carlo calculations.

  11. Distance associated with marriage migration in a northern and a southern region of Bangladesh: an empirical study.

    Science.gov (United States)

    Rahman, Md Mizanur; Akter, Shamima; Rahman, Ataur

    2010-09-01

    This paper investigates the distribution of distance associated with marriage migration in the northern region of Rajshahi and the southern region of Khulna in Bangladesh. The study was conducted in 2007 on 2250 respondents who had migrated due to marriage. Of the wide variety of curves fitted to the distance-marriage/contact data, three are discussed: Pareto, exponential, and Pareto-exponential. Logistic regression models were used to identify the covariates of marriage distance migration. In general, the three functions work well for marriages, whereas the Pareto-exponential function is a superior fit for migration and marriage distance. The models disclose that the distribution of distance is significantly associated with marriage migration. The Pareto-exponential model was 100% stable and its shrinkage was 0.000000125. The main covariates associated with short-distance marriage migration were the respondent's education, father's education and religion, whereas age at the time of marriage did not play a significant role in marriage migration. The risk of short-distance migration was greater in higher- than in lower-educated Muslim families.

  12. Identifying the preferred subset of enzymatic profiles in nonlinear kinetic metabolic models via multiobjective global optimization and Pareto filters.

    Directory of Open Access Journals (Sweden)

    Carlos Pozo

    Full Text Available Optimization models in metabolic engineering and systems biology focus typically on optimizing a unique criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analysis using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail in capturing the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objectives values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study.

  13. Identifying the preferred subset of enzymatic profiles in nonlinear kinetic metabolic models via multiobjective global optimization and Pareto filters.

    Science.gov (United States)

    Pozo, Carlos; Guillén-Gosálbez, Gonzalo; Sorribas, Albert; Jiménez, Laureano

    2012-01-01

    Optimization models in metabolic engineering and systems biology focus typically on optimizing a unique criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analysis using stoichiometric descriptions and linear optimization methods has become rather popular in systems biology. However, despite being useful, these approaches fail in capturing the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study.

  14. The κ-generalized distribution: A new descriptive model for the size distribution of incomes

    Science.gov (United States)

    Clementi, F.; Di Matteo, T.; Gallegati, M.; Kaniadakis, G.

    2008-05-01

    This paper proposes the κ-generalized distribution as a model for describing the distribution and dispersion of income within a population. Formulas for the shape and moments, and standard tools for inequality measurement such as the Lorenz curve and the Gini coefficient, are given. A method for parameter estimation is also discussed. The model is shown to fit the data on personal income distribution in Australia and in the United States extremely well.
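
    A minimal sketch of the distribution's building block, assuming the κ-exponential form commonly used in this literature, exp_kappa(u) = (sqrt(1 + kappa^2 u^2) + kappa*u)^(1/kappa), with survival function 1 - F(x) = exp_kappa(-beta * x^alpha); the parameter values below are illustrative, not fitted to any income data.

        import numpy as np

        def exp_kappa(u, kappa):
            # kappa-deformed exponential; reduces to exp(u) as kappa -> 0.
            if kappa == 0:
                return np.exp(u)
            return (np.sqrt(1.0 + kappa**2 * u**2) + kappa * u) ** (1.0 / kappa)

        def kgen_survival(x, alpha, beta, kappa):
            # Survival function 1 - F(x): exponential-like body with a
            # Pareto power-law tail of index alpha/kappa for large x.
            return exp_kappa(-beta * x**alpha, kappa)

        # Illustrative parameters only (not fitted to real income data).
        x = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
        print(kgen_survival(x, alpha=2.0, beta=1.0, kappa=0.75))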

  15. Energy Optimization for Distributed Energy Resources Scheduling with Enhancements in Voltage Stability Margin

    DEFF Research Database (Denmark)

    Morais, Hugo; Sousa, Tiago; Perez, Angel

    2016-01-01

    to evaluate the resulting multiobjective optimization problem: the sum-weighted Pareto front and an adapted goal programming methodology. With this new methodology, the system operators can consider both the costs and voltage stability. Priority can be assigned to one objective function according...... to the operating scenario. Additionally, it is possible to evaluate the impact of the distributed generation and the electric vehicles in the management of voltage stability in the future electric networks. One detailed case study considering a distribution network with high penetration of distributed energy...

  16. Universal cervical length screening for singleton pregnancies with no history of preterm delivery, or the inverse of the Pareto principle.

    Science.gov (United States)

    Rozenberg, P

    2017-06-01

    Ultrasound measurement of cervical length in the general population enables the identification of women at risk for spontaneous preterm delivery. Vaginal progesterone is effective in reducing the risk of preterm delivery in this population. This screening, combined with vaginal progesterone treatment, is cost-effective. Universal screening of cervical length can therefore be considered justified. Nonetheless, this screening will not appreciably reduce the prevalence of preterm birth: in France or the UK, where the preterm delivery rate is around 7.4%, this strategy would reduce it only to 7.0%. This small benefit must be set against the considerable effort required in terms of screening ultrasound scans. Universal ultrasound screening of cervical length is the inverse of Pareto's principle: a small benefit against a considerable effort. © 2016 Royal College of Obstetricians and Gynaecologists.

  17. A new generalization of the Pareto–geometric distribution

    Directory of Open Access Journals (Sweden)

    M. Nassar

    2013-07-01

    Full Text Available In this paper we introduce a new distribution called the beta Pareto–geometric. We provide a comprehensive treatment of the mathematical properties of the proposed distribution and derive expressions for its moment generating function and its rth generalized moment. We discuss estimation of the parameters by maximum likelihood and obtain the information matrix, which is easily determined numerically. We also demonstrate the distribution's usefulness on a real data set.

  18. Pareto analysis of critical factors affecting technical institution evaluation

    Directory of Open Access Journals (Sweden)

    Victor Gambhir

    2012-08-01

    Full Text Available With the change of education policy in 1991, more and more technical institutions are being set up in India. Some of these institutions provide quality education, but others merely concentrate on quantity. Stakeholders are left in a state of confusion when deciding which institute is best for their higher education. Although various agencies, including the print media, publish rankings of these institutions every year, their results are controversial and biased. In this paper, the authors identify the critical factors for technical institution evaluation from a literature survey. A Pareto analysis has also been performed to find the intensity of these critical factors in evaluation. This will not only help stakeholders take the right decisions but will also help the management of institutions in benchmarking, by identifying the most important critical areas in which to improve the existing system. This will in turn help the Indian economy.
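
    A Pareto analysis of this kind ranks factors by their contribution and flags the "vital few" that account for roughly 80% of the total. A minimal sketch follows; the factor names and scores are hypothetical, not the paper's survey results.

        # Hypothetical factor scores; replace with survey-derived intensities.
        factors = {"faculty quality": 42, "placement record": 35, "infrastructure": 12,
                   "industry links": 6, "location": 3, "fees": 2}

        items = sorted(factors.items(), key=lambda kv: kv[1], reverse=True)
        total = sum(v for _, v in items)
        cumulative = 0
        for name, value in items:
            cumulative += value
            marker = "  <- vital few" if cumulative / total <= 0.8 else ""
            print(f"{name:18s} {value:3d}  cum {100 * cumulative / total:5.1f}%{marker}")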

  19. Modelling of snow exceedances

    Science.gov (United States)

    Jordanova, Pavlina K.; Sadovský, Zoltán; Stehlík, Milan

    2017-07-01

    Modelling of snow exceedances is of great importance and interest for ecology, civil engineering and the general public. We suggest a favorable fit for exceedances related to the exceptional snow loads from Slovakia, assuming that the data are driven by a generalized Pareto distribution or a generalized extreme value distribution. Further, the statistical dependence between the maximal snow loads and the corresponding altitudes is studied.
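
    The peaks-over-threshold approach behind such a fit can be sketched in a few lines: choose a high threshold, fit a generalized Pareto distribution to the exceedances, and read off return levels from the fitted tail. The synthetic loads and threshold choice below are illustrative assumptions, not the Slovak data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Synthetic stand-in for snow-load measurements.
        loads = stats.genpareto.rvs(c=0.2, scale=0.8, size=2000, random_state=rng)

        threshold = np.quantile(loads, 0.90)
        exceedances = loads[loads > threshold] - threshold

        # Fit a generalized Pareto distribution to the exceedances.
        shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)
        print(f"shape xi = {shape:.3f}, scale = {scale:.3f}")

        # Return level exceeded on average once per 100 observations,
        # implied by the fitted tail.
        p_exceed = exceedances.size / loads.size
        level = threshold + stats.genpareto.ppf(1 - 1 / (100 * p_exceed), shape, loc, scale)
        print(f"approx. 100-observation return level: {level:.2f}")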

  20. Finding the Pareto Optimal Equitable Allocation of Homogeneous Divisible Goods Among Three Players

    Directory of Open Access Journals (Sweden)

    Marco Dall'Aglio

    2017-01-01

    Full Text Available We consider the allocation of a finite number of homogeneous divisible items among three players. Under the assumption that each player assigns a positive value to every item, we develop a simple algorithm that returns a Pareto optimal and equitable allocation. This is based on the tight relationship between two geometric objects of fair division: the Individual Pieces Set (IPS) and the Radon–Nikodym Set (RNS). The algorithm can be considered as an extension of the Adjusted Winner procedure by Brams and Taylor to the three-player case, without the guarantee of envy-freeness.

  1. Implementation of an evolutionary algorithm in planning investment in a power distribution system

    Directory of Open Access Journals (Sweden)

    Carlos Andrés García Montoya

    2011-06-01

    Full Text Available The definition of an investment plan to implement in a power distribution system is a task constantly faced by utilities. This work presents a methodology for determining a short-term investment plan for a power distribution system, using as criteria for evaluating investment projects their associated costs and the benefit customers obtain from their implementation. Given the number of projects carried out annually on the system, defining an investment plan requires computational tools to evaluate, from a set of possibilities, the one that best suits the present needs of the system and gives the best results. That is why this work implements a multi-objective evolutionary algorithm, SPEA (Strength Pareto Evolutionary Algorithm), which, based on the principles of Pareto optimality, delivers to the planning expert the best solutions found in the optimization process. The performance of the algorithm is tested using a set of projects to determine the best among the possible plans. We also analyze the effect of the operators on the performance of the evolutionary algorithm and on the results.
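
    At the core of SPEA and similar algorithms is a Pareto-dominance test over candidate solutions. The sketch below filters a set of objective vectors down to its non-dominated subset; the toy numbers are illustrative, not the paper's project data.

        import numpy as np

        def nondominated(costs):
            # Boolean mask of Pareto non-dominated rows; costs is an
            # (n_points, n_objectives) array with every objective minimized.
            n = costs.shape[0]
            mask = np.ones(n, dtype=bool)
            for i in range(n):
                # Rows that are <= costs[i] everywhere and < somewhere dominate it.
                dominators = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
                if dominators.any():
                    mask[i] = False
            return mask

        # Toy objective vectors (both components minimized) for candidate plans.
        plans = np.array([[3.0, 5.0], [2.0, 6.0], [4.0, 4.0], [2.5, 5.5], [3.0, 6.5]])
        print(plans[nondominated(plans)])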

  2. Multi-objective optimization of water quality, pumps operation, and storage sizing of water distribution systems.

    Science.gov (United States)

    Kurek, Wojciech; Ostfeld, Avi

    2013-01-30

    A multi-objective methodology utilizing the Strength Pareto Evolutionary Algorithm (SPEA2) linked to EPANET for trading-off pumping costs, water quality, and tank sizing of water distribution systems is developed and demonstrated. The model integrates variable speed pumps for modeling the pumps operation, two water quality objectives (one based on chlorine disinfectant concentrations and one on water age), and tank sizing costs, which are assumed to vary with location and diameter. The water distribution system is subject to extended period simulations, variable energy tariffs, Kirchhoff's laws 1 and 2 for continuity of flow and pressure, tank water level closure constraints, and storage-reliability requirements. EPANET Example 3 is employed for demonstrating the methodology on two multi-objective models, which differ in the imposed water quality objective (i.e., either with disinfectant or water age considerations). Three-fold Pareto optimal fronts are presented. Sensitivity analysis on the storage-reliability constraint and its influence on pumping cost, water quality, and tank sizing is explored. The contribution of this study is in tailoring design (tank sizing), pumps operational costs, water quality of two types, and reliability through residual storage requirements, in a single multi-objective framework. The model was found to be stable in generating multi-objective three-fold Pareto fronts, while producing explainable engineering outcomes. The model can be used as a decision tool for pumps operation, water quality, required storage for reliability considerations, and tank sizing decision-making. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Momentum transfer dependence of generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Neetika [Indian Institute of Science Education and Research Mohali, S.A.S. Nagar, Punjab (India)]

    2016-11-15

    We revisit the model for parametrization of the momentum dependence of nucleon generalized parton distributions in the light of the recent MRST measurements of parton distribution functions (A.D. Martin et al., Eur. Phys. J. C 63, 189 (2009)). Our parametrization method, with a minimum set of free parameters, gives a sufficiently good description of the data for the Dirac and Pauli electromagnetic form factors of the proton and neutron at small and intermediate values of momentum transfer. We also calculate the GPDs for up- and down-quarks by decomposing the electromagnetic form factors of the nucleon using charge and isospin symmetry, and we study the evolution of the GPDs to a higher scale. We further investigate the transverse charge densities for both the unpolarized and transversely polarized nucleon and compare our results with Kelly's distribution. (orig.)

  4. Analytic hierarchy process-based approach for selecting a Pareto-optimal solution of a multi-objective, multi-site supply-chain planning problem

    Science.gov (United States)

    Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi

    2017-07-01

    The current manufacturing environment has changed from a traditional single plant to a multi-site supply chain in which multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and of the proposed approach is discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
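
    The analytic hierarchy process step can be sketched compactly: the decision maker fills a pairwise-comparison matrix on Saaty's 1-9 scale, and the principal eigenvector gives the weights used to score the Pareto-optimal solutions. The comparison values below are hypothetical.

        import numpy as np

        # Hypothetical pairwise comparisons over three objectives
        # (cost, quality, customer satisfaction) on Saaty's 1-9 scale.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 3.0],
                      [1/5, 1/3, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                           # AHP priority weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
        ri = 0.58                              # Saaty's random index for n = 3
        print("weights:", np.round(w, 3), " CR:", round(ci / ri, 3))

    A consistency ratio CR below about 0.1 is conventionally taken as acceptable; the weights can then score each Pareto-optimal solution on its normalized objectives.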

  5. Experimental studies of generalized parton distributions

    International Nuclear Information System (INIS)

    Kabuss, E.M.

    2014-01-01

    Generalized parton distributions (GPD) provide a new way to study the nucleon structure. Experimentally they can be accessed using hard exclusive processes such as deeply virtual Compton scattering and meson production. First insights to GPDs were already obtained from measurements at DESY, JLAB and CERN, while new ambitious studies are planned at the upgraded JLAB at 12 GeV and at CERN. Here, some emphasis will be put onto the planned COMPASS II programme. (author)

  6. Application of Generalized Student’s t-Distribution in Modeling the Distribution of Empirical Return Rates on Selected Stock Exchange Indexes

    Directory of Open Access Journals (Sweden)

    Purczyński Jan

    2014-07-01

    Full Text Available This paper examines the application of the so-called generalized Student’s t-distribution in modeling the distribution of empirical return rates on selected Warsaw Stock Exchange indexes. The distribution parameters are estimated by means of the method of logarithmic moments, the maximum likelihood method and the method of moments. The generalized Student’s t-distribution ensures a better fit to the empirical data than the classical Student’s t-distribution.

  7. Tradeable CO₂ emission permits: initial distribution as a justice problem

    Energy Technology Data Exchange (ETDEWEB)

    Kverndokk, S. [Stiftelsen for Samfunns- og Naeringslivsforskning, Oslo (Norway)]

    1992-11-01

    Tradeable emission permits are one of the most discussed policy instruments for implementing international agreements on CO₂ emission reductions. One characteristic of this instrument is that it separates the questions of efficiency and justice; in an idealised world, efficiency is achieved no matter how the permits are distributed. By assuming separability of inter- and intragenerational justice, the author can discuss the initial distribution of permits as an intragenerational distributive justice problem. In contrast to efficiency, where Pareto optimality is a generally accepted principle, there is no consensus on a "best" equity principle. Different principles lead to different rules for distribution. The framework is to consider what the author believes to be metaprinciples of theories of justice, ethical individualism and presentism, as well as a generally accepted principle of avoiding morally arbitrary components, as standards for distribution. Using these principles in an exclusionary way, working with a list of alternative allocation rules, a distribution proportional to population is recommended. Arguments against this rule are discussed, and special attention is paid to political feasibility. Justice and political feasibility may conflict, as they do in this case. Even if a distribution based only on population may be politically unacceptable, there may be prospects for using this criterion in combination with other rules, as well as for putting more weight on it in the future. 26 refs.

  8. Tradeable CO₂ emission permits: initial distribution as a justice problem

    Energy Technology Data Exchange (ETDEWEB)

    Kverndokk, S. (Stiftelsen for Samfunns- og Naeringslivsforskning, Oslo (Norway))

    1992-11-01

    Tradeable emission permits are one of the most discussed policy instruments for implementing international agreements on CO₂ emission reductions. One characteristic of this instrument is that it separates the questions of efficiency and justice; in an idealised world, efficiency is achieved no matter how the permits are distributed. By assuming separability of inter- and intragenerational justice, the author can discuss the initial distribution of permits as an intragenerational distributive justice problem. In contrast to efficiency, where Pareto optimality is a generally accepted principle, there is no consensus on a "best" equity principle. Different principles lead to different rules for distribution. The framework is to consider what the author believes to be metaprinciples of theories of justice, ethical individualism and presentism, as well as a generally accepted principle of avoiding morally arbitrary components, as standards for distribution. Using these principles in an exclusionary way, working with a list of alternative allocation rules, a distribution proportional to population is recommended. Arguments against this rule are discussed, and special attention is paid to political feasibility. Justice and political feasibility may conflict, as they do in this case. Even if a distribution based only on population may be politically unacceptable, there may be prospects for using this criterion in combination with other rules, as well as for putting more weight on it in the future. 26 refs.

  9. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y given values of X, are available. We…

  10. Location and Size Planning of Distributed Photovoltaic Generation in a Distribution Network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method, based on a data-driven K-means clustering analysis algorithm, to generate the planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of the distributed PV, the profit of the distributed PV and the voltage offset as objectives, and the locations and sizes of the distributed PV units as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and the solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected, after detailed analysis, according to different planning emphases. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
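
    TOPSIS ranks alternatives by closeness to an ideal solution: normalize the decision matrix, weight it, and compare distances to the best and worst values on each criterion. The sketch below is a generic implementation; the weights and the toy scheme scores are hypothetical, not the paper's data.

        import numpy as np

        def topsis(matrix, weights, benefit):
            # matrix:  (alternatives x criteria) decision matrix
            # weights: criterion weights summing to 1
            # benefit: True for criteria to maximize, False to minimize
            norm = matrix / np.linalg.norm(matrix, axis=0)   # vector normalization
            v = norm * weights
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)   # closeness: higher is better

        # Toy schemes scored on (losses, cost, profit, voltage offset).
        m = np.array([[1.2, 400.0, 90.0, 0.02],
                      [1.0, 450.0, 85.0, 0.03],
                      [1.5, 380.0, 95.0, 0.01]])
        score = topsis(m, np.array([0.3, 0.3, 0.2, 0.2]),
                       np.array([False, False, True, False]))
        print(np.argsort(score)[::-1])   # ranking of schemes, best first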

  11. Insight into nucleon structure from generalized parton distributions

    International Nuclear Information System (INIS)

    J.W. Negele; R.C. Brower; P. Dreher; R. Edwards; G. Fleming; Ph. Hagler; Th. Lippert; A.V. Pochinsky; D.B. Renner; D. Richards; K. Schilling; W. Schroers

    2004-01-01

    The lowest three moments of generalized parton distributions are calculated in full QCD and provide new insight into the behavior of nucleon electromagnetic form factors, the origin of the nucleon spin, and the transverse structure of the nucleon.

  12. Multi-objective component sizing of a power-split plug-in hybrid electric vehicle powertrain using Pareto-based natural optimization machines

    Science.gov (United States)

    Mozaffari, Ahmad; Vajedi, Mahyar; Chehresaz, Maryyeh; Azad, Nasser L.

    2016-03-01

    The urgent need to meet increasingly tight environmental regulations and new fuel economy requirements has motivated system science researchers and automotive engineers to take advantage of emerging computational techniques to further advance hybrid electric vehicle and plug-in hybrid electric vehicle (PHEV) designs. In particular, research has focused on vehicle powertrain system design optimization, to reduce the fuel consumption and total energy cost while improving the vehicle's driving performance. In this work, two different natural optimization machines, namely the synchronous self-learning Pareto strategy and the elitist non-dominated sorting genetic algorithm, are implemented for component sizing of a specific power-split PHEV platform with a Toyota plug-in Prius as the baseline vehicle. To do this, a high-fidelity model of the Toyota plug-in Prius is employed for the numerical experiments using the Autonomie simulation software. Based on the simulation results, it is demonstrated that Pareto-based algorithms can successfully optimize the design parameters of the vehicle powertrain.
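
    Elitist non-dominated sorting algorithms keep their fronts well spread by preferring solutions with a large crowding distance. A minimal sketch of that measure follows; the toy front values (trading, say, fuel consumption against component cost) are illustrative.

        import numpy as np

        def crowding_distance(front):
            # NSGA-II-style crowding distance for one non-dominated front;
            # front is an (n_points, n_objectives) array. Boundary points
            # get infinite distance so they are always retained.
            n, m = front.shape
            dist = np.zeros(n)
            for j in range(m):
                order = np.argsort(front[:, j])
                span = front[order[-1], j] - front[order[0], j]
                dist[order[0]] = dist[order[-1]] = np.inf
                if span == 0:
                    continue
                gaps = (front[order[2:], j] - front[order[:-2], j]) / span
                dist[order[1:-1]] += gaps
            return dist

        front = np.array([[1.0, 9.0], [2.0, 7.0], [4.0, 4.0], [7.0, 2.0], [9.0, 1.0]])
        print(crowding_distance(front))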

  13. Pareto-Optimal Multi-objective Inversion of Geophysical Data

    Science.gov (United States)

    Schnaidt, Sebastian; Conway, Dennis; Krieger, Lars; Heinson, Graham

    2018-01-01

    In the process of modelling geophysical properties, jointly inverting different data sets can greatly improve model results, provided that the data sets are compatible, i.e., sensitive to similar features. Such a joint inversion requires a relationship between the different data sets, which can either be analytic or structural. Classically, the joint problem is expressed as a scalar objective function that combines the misfit functions of multiple data sets and a joint term which accounts for the assumed connection between the data sets. This approach suffers from two major disadvantages: first, it can be difficult to assess the compatibility of the data sets; and second, the aggregation of misfit terms introduces a weighting of the data sets. We present a Pareto-optimal multi-objective joint inversion approach based on an existing genetic algorithm. The algorithm treats each data set as a separate objective, avoiding forced weighting and generating curves of the trade-off between the different objectives. These curves are analysed by their shape and evolution to evaluate data set compatibility. Furthermore, the statistical analysis of the generated solution population provides valuable estimates of model uncertainty.

  14. The Top Tail of the Wealth Distribution in Germany, France, Spain, and Greece

    OpenAIRE

    Bach, Stefan; Thiemann, Andreas; Zucco, Aline

    2015-01-01

    We analyze the top tail of the wealth distribution in Germany, France, Spain, and Greece based on the Household Finance and Consumption Survey (HFCS). Since top wealth is likely to be underrepresented in household surveys, we integrate the big fortunes from rich lists, estimate a Pareto distribution, and impute the missing rich. Instead of the Forbes list we mainly rely on national rich lists, since they represent a broader base for the big fortunes. As a result, the top percentile share of household wealth ...
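
    The imputation strategy rests on fitting a Pareto distribution to the tail above some cutoff. A minimal sketch of the tail fit (the Hill maximum-likelihood estimator of the exponent) and a top-share computation, on synthetic wealth data; the HFCS microdata and rich-list values are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(2)
        # Synthetic household wealth with a Pareto top tail (illustrative only).
        wealth = np.concatenate([rng.lognormal(11.0, 1.2, 9_500),
                                 (rng.pareto(1.4, 500) + 1) * 2e6])

        w_min = 2e6                           # tail threshold (rich-list cutoff)
        tail = wealth[wealth >= w_min]

        # Hill / maximum-likelihood estimate of the Pareto exponent alpha.
        alpha = tail.size / np.sum(np.log(tail / w_min))
        print(f"alpha = {alpha:.2f}")

        # Top-1% wealth share under the empirical distribution.
        top1 = np.sort(wealth)[-int(0.01 * wealth.size):]
        print(f"top 1% share: {top1.sum() / wealth.sum():.1%}")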

  15. Simulating the wealth distribution with a Richest-Following strategy on scale-free network

    Science.gov (United States)

    Hu, Mao-Bin; Jiang, Rui; Wu, Qing-Song; Wu, Yong-Hong

    2007-07-01

    In this paper, we investigate the wealth distribution with agents playing evolutionary games on a scale-free social network, adopting the Richest-Following strategy. Pareto's power-law distribution of wealth (1897) is reproduced, with a power-law exponent in agreement with that of the US or Japan. Moreover, an agent's personal wealth is proportional to its number of contacts (connectivity), and this leads to the phenomenon that the rich get richer and the poor get relatively poorer, which agrees with the Matthew effect.

  16. Optimization of light quality from color mixing light-emitting diode systems for general lighting

    DEFF Research Database (Denmark)

    Thorseth, Anders

    2012-01-01

    To address the problem of spectral light quality from color-mixing light-emitting diode systems, a method for optimizing the spectral output of a multicolor LED system with regard to standardized quality parameters has been developed. The composite spectral power distributions from the LEDs are simulated using radiometrically measured single-LED spectra. The method uses electrical input powers as input parameters and optimizes the resulting spectral power distribution with regard to color rendering index, correlated color temperature and chromaticity distance. The results indicate Pareto optimal…

  17. Pareto Optimization Identifies Diverse Set of Phosphorylation Signatures Predicting Response to Treatment with Dasatinib.

    Science.gov (United States)

    Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph

    2015-01-01

    Multivariate biomarkers that can predict the effectiveness of targeted therapy in individual patients are highly desired. Previous biomarker discovery studies have largely focused on the identification of single biomarker signatures, aimed at maximizing prediction accuracy. Here, we present a different approach that identifies multiple biomarkers by simultaneously optimizing their predictive power, number of features, and proximity to the drug target in a protein-protein interaction network. To this end, we incorporated NSGA-II, a fast and elitist multi-objective optimization algorithm that is based on the principle of Pareto optimality, into the biomarker discovery workflow. The method was applied to quantitative phosphoproteome data of 19 non-small cell lung cancer (NSCLC) cell lines from a previous biomarker study. The algorithm successfully identified a total of 77 candidate biomarker signatures predicting response to treatment with dasatinib. Through filtering and similarity clustering, this set was trimmed to four final biomarker signatures, which then were validated on an independent set of breast cancer cell lines. All four candidates reached the same good prediction accuracy (83%) as the originally published biomarker. Although the newly discovered signatures were diverse in their composition and in their size, the central protein of the originally published signature, integrin β4 (ITGB4), was also present in all four Pareto signatures, confirming its pivotal role in predicting dasatinib response in NSCLC cell lines. In summary, the method presented here allows for a robust and simultaneous identification of multiple multivariate biomarkers that are optimized for prediction performance, size, and relevance.

  18. Pareto optimality between width of central lobe and peak sidelobe intensity in the far-field pattern of lossless phase-only filters for enhancement of transverse resolution.

    Science.gov (United States)

    Mukhopadhyay, Somparna; Hazra, Lakshminarayan

    2015-11-01

    Resolution capability of an optical imaging system can be enhanced by reducing the width of the central lobe of the point spread function. Attempts to achieve the same by pupil plane filtering give rise to a concomitant increase in sidelobe intensity. The mutual exclusivity between these two objectives may be considered as a multiobjective optimization problem that does not have a unique solution; rather, a class of trade-off solutions called Pareto optimal solutions may be generated. Pareto fronts in the synthesis of lossless phase-only pupil plane filters to achieve superresolution with prespecified lower limits for the Strehl ratio are explored by using the particle swarm optimization technique.

  19. Using Pareto points for model identification in predictive toxicology

    Science.gov (United States)

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649

  20. Exponentiated Lomax Geometric Distribution: Properties and Applications

    Directory of Open Access Journals (Sweden)

    Amal Soliman Hassan

    2017-09-01

    Full Text Available In this paper, a new four-parameter lifetime distribution, called the exponentiated Lomax geometric (ELG) distribution, is introduced. The new lifetime distribution contains the Lomax geometric and exponentiated Pareto geometric distributions as new sub-models. Explicit algebraic formulas for the probability density function and the survival and hazard functions are derived. Various structural properties of the new model are derived, including the quantile function, Rényi entropy, moments, probability weighted moments, order statistics, and Lorenz and Bonferroni curves. The estimation of the model parameters is performed by the maximum likelihood method, and inference for a large sample is discussed. The flexibility and potentiality of the new model in comparison with some other distributions are shown via an application to a real data set. We hope that the new model will be an adequate model for applications in various studies.

  1. The global distribution of diet breadth in insect herbivores.

    Science.gov (United States)

    Forister, Matthew L; Novotny, Vojtech; Panorska, Anna K; Baje, Leontine; Basset, Yves; Butterill, Philip T; Cizek, Lukas; Coley, Phyllis D; Dem, Francesca; Diniz, Ivone R; Drozd, Pavel; Fox, Mark; Glassmire, Andrea E; Hazen, Rebecca; Hrcek, Jan; Jahner, Joshua P; Kaman, Ondrej; Kozubowski, Tomasz J; Kursar, Thomas A; Lewis, Owen T; Lill, John; Marquis, Robert J; Miller, Scott E; Morais, Helena C; Murakami, Masashi; Nickel, Herbert; Pardikes, Nicholas A; Ricklefs, Robert E; Singer, Michael S; Smilanich, Angela M; Stireman, John O; Villamarín-Cortez, Santiago; Vodka, Stepan; Volf, Martin; Wagner, David L; Walla, Thomas; Weiblen, George D; Dyer, Lee A

    2015-01-13

    Understanding variation in resource specialization is important for progress on issues that include coevolution, community assembly, ecosystem processes, and the latitudinal gradient of species richness. Herbivorous insects are useful models for studying resource specialization, and the interaction between plants and herbivorous insects is one of the most common and consequential ecological associations on the planet. However, uncertainty persists regarding fundamental features of herbivore diet breadth, including its relationship to latitude and plant species richness. Here, we use a global dataset to investigate host range for over 7,500 insect herbivore species covering a wide taxonomic breadth and interacting with more than 2,000 species of plants in 165 families. We ask whether relatively specialized and generalized herbivores represent a dichotomy rather than a continuum from few to many host families and species attacked and whether diet breadth changes with increasing plant species richness toward the tropics. Across geographic regions and taxonomic subsets of the data, we find that the distribution of diet breadth is fit well by a discrete, truncated Pareto power law characterized by the predominance of specialized herbivores and a long, thin tail of more generalized species. Both the taxonomic and phylogenetic distributions of diet breadth shift globally with latitude, consistent with a higher frequency of specialized insects in tropical regions. We also find that more diverse lineages of plants support assemblages of relatively more specialized herbivores and that the global distribution of plant diversity contributes to but does not fully explain the latitudinal gradient in insect herbivore specialization.
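
    The "discrete, truncated Pareto power law" for diet breadth can be illustrated with a simple zeta-type fit: maximize the likelihood of P(k) proportional to k^(-a) over k = 1..k_max. The sketch below, on synthetic host-range data, is a simplified stand-in for the paper's model, not its actual estimator.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def fit_truncated_power_law(data, k_max):
            # MLE of the exponent of a discrete truncated power law
            # P(k) ~ k^(-a), k = 1..k_max.
            k = np.arange(1, k_max + 1)
            def nll(a):
                logz = np.log(np.sum(k ** (-a)))      # normalizing constant
                return a * np.sum(np.log(data)) + data.size * logz
            res = minimize_scalar(nll, bounds=(0.01, 10.0), method="bounded")
            return res.x

        rng = np.random.default_rng(3)
        # Synthetic host ranges: most herbivores use one family, few use many.
        k = np.arange(1, 51)
        p = k ** (-2.0)
        p /= p.sum()
        sample = rng.choice(k, size=1000, p=p)
        print(f"fitted exponent: {fit_truncated_power_law(sample, 50):.2f}")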

  2. Inference for exponentiated general class of distributions based on record values

    Directory of Open Access Journals (Sweden)

    Samah N. Sindi

    2017-09-01

    Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases of distributions and exponentiated Gompertz distribution are applied to the EGC of distributions.  

  3. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
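
    The framework replaces the likelihood with a loss: beliefs are updated as posterior proportional to exp(-w * cumulative loss) times the prior, recovering Bayes' rule when the loss is a negative log-likelihood. A minimal grid-based sketch for learning a median under absolute-error loss follows; the data, prior and learning rate w are illustrative assumptions (calibrating w is its own topic in this literature).

        import numpy as np

        rng = np.random.default_rng(4)
        x = rng.standard_cauchy(200)           # heavy-tailed data; target: median

        theta = np.linspace(-5.0, 5.0, 2001)   # grid over the parameter
        prior = np.exp(-0.5 * theta**2)        # N(0, 1) prior, unnormalized

        # Loss-based update: exp(-w * cumulative absolute-error loss) * prior.
        w = 1.0
        loss = np.abs(x[None, :] - theta[:, None]).sum(axis=1)
        post = prior * np.exp(-w * (loss - loss.min()))   # shift for stability
        post /= post.sum()                                # normalize on the grid

        print(f"posterior mean for the median: {(theta * post).sum():.3f}")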

  4. Variational principle for the Pareto power law.

    Science.gov (United States)

    Chakraborti, Anirban; Patriarca, Marco

    2009-11-27

    A mechanism is proposed for the appearance of power-law distributions in various complex systems. It is shown that in a conservative mechanical system composed of subsystems with different numbers of degrees of freedom a robust power-law tail can appear in the equilibrium distribution of energy as a result of certain superpositions of the canonical equilibrium energy densities of the subsystems. The derivation only uses a variational principle based on the Boltzmann entropy, without assumptions outside the framework of canonical equilibrium statistical mechanics. Two examples are discussed, free diffusion on a complex network and a kinetic model of wealth exchange. The mechanism is illustrated in the general case through an exactly solvable mechanical model of a dimensionally heterogeneous system.

  5. Effects of heterogeneous wealth distribution on public cooperation with collective risk

    Science.gov (United States)

    Wang, Jing; Fu, Feng; Wang, Long

    2010-07-01

    The distribution of wealth among individuals in real society can be well described by the Pareto principle or “80-20 rule.” How does such heterogeneity in initial wealth distribution affect the emergence of public cooperation, when individuals, the rich and the poor, engage in a collective-risk enterprise, not to gain a profit but to avoid a potential loss? Here we address this issue by studying a simple but effective model based on threshold public goods games. We analyze the evolutionary dynamics for two distinct scenarios: one with fair sharers versus defectors and the other with altruists versus defectors. For both scenarios, in particular, we study in detail the dynamics of a population with dichotomic initial wealth: the rich versus the poor. Moreover, we demonstrate the possible steady compositions of the population and provide the conditions for stability of these steady states. We prove that in a population with heterogeneous wealth distribution, richer individuals are more likely to cooperate than poorer ones. Participants with lower initial wealth may choose to cooperate only if all players richer than them are cooperators. The emergence of public cooperation largely relies on rich individuals. Furthermore, whenever the wealth gap between the rich and the poor is sufficiently large, cooperation of a few rich individuals can substantially elevate the overall level of social cooperation, which is in line with the well-known Pareto principle. Our work may offer an insight into the emergence of cooperative behavior in real social situations, where heterogeneous distribution of wealth among individuals is omnipresent.

  6. Effects of heterogeneous wealth distribution on public cooperation with collective risk.

    Science.gov (United States)

    Wang, Jing; Fu, Feng; Wang, Long

    2010-07-01

    The distribution of wealth among individuals in real society can be well described by the Pareto principle or "80-20 rule." How does such heterogeneity in initial wealth distribution affect the emergence of public cooperation, when individuals, the rich and the poor, engage in a collective-risk enterprise, not to gain a profit but to avoid a potential loss? Here we address this issue by studying a simple but effective model based on threshold public goods games. We analyze the evolutionary dynamics for two distinct scenarios: one with fair sharers versus defectors and the other with altruists versus defectors. For both scenarios, in particular, we study in detail the dynamics of a population with dichotomic initial wealth: the rich versus the poor. Moreover, we demonstrate the possible steady compositions of the population and provide the conditions for stability of these steady states. We prove that in a population with heterogeneous wealth distribution, richer individuals are more likely to cooperate than poorer ones. Participants with lower initial wealth may choose to cooperate only if all players richer than them are cooperators. The emergence of public cooperation largely relies on rich individuals. Furthermore, whenever the wealth gap between the rich and the poor is sufficiently large, cooperation of a few rich individuals can substantially elevate the overall level of social cooperation, which is in line with the well-known Pareto principle. Our work may offer an insight into the emergence of cooperative behavior in real social situations, where heterogeneous distribution of wealth among individuals is omnipresent.

  7. Effects of introduction of new resources and fragmentation of existing resources on limiting wealth distribution in asset exchange models

    Science.gov (United States)

    Ali Saif, M.; Gade, Prashant M.

    2009-03-01

    The Pareto law, which states that the wealth distribution in societies has a power-law tail, has been the subject of intensive investigation in the statistical physics community. Several models have been employed to explain this behavior. However, most of the agent-based models assume conservation of both the number of agents and the total wealth, and both assumptions are unrealistic. In this paper, we study the limiting wealth distribution when one or both of these assumptions do not hold. Given the universality of the law, we have tried to study the wealth distribution from the point of view of asset exchange models. We consider models in which (a) new agents enter the market at a constant rate; (b) richer agents fragment with higher probability, introducing new agents into the system; and (c) both fragmentation and entry of new agents take place. While models (a) and (c) conserve neither total wealth nor the number of agents, model (b) conserves total wealth. All these models lead to a power-law tail in the wealth distribution, pointing to the possibility that more generalized asset exchange models could help us explain the emergence of a power-law tail in wealth distribution.

  8. A general algorithm for distributing information in a graph

    OpenAIRE

    Aji, Srinivas M.; McEliece, Robert J.

    1997-01-01

    We present a general “message-passing” algorithm for distributing information in a graph. This algorithm may help us to understand the approximate correctness of both the Gallager-Tanner-Wiberg algorithm, and the turbo-decoding algorithm.

  9. Carbon Lorenz Curves

    Energy Technology Data Exchange (ETDEWEB)

    Groot, L. [Utrecht University, Utrecht School of Economics, Janskerkhof 12, 3512 BL Utrecht (Netherlands)]

    2008-11-15

    The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries. These tools allow policy-makers and the general public to grasp at a single glance the impact of conventional distribution rules such as equal caps or grandfathering, or more sophisticated ones, on the distribution of greenhouse gas emissions. Second, using the Samuelson rule for the optimal provision of a public good, the Pareto-optimal distribution of carbon emissions is compared with the distribution that follows if countries follow Nash-Cournot abatement strategies. It is shown that the Pareto-optimal distribution under the Samuelson rule can be approximated by the equal cap division, represented by the diagonal in the Lorenz curve diagram.
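
    The paper's first point, that Lorenz curves and Gini indices transfer directly to emissions, can be sketched in a few lines: sort countries by per-capita emissions, accumulate population and emission shares, and integrate the resulting curve. The country blocks and numbers below are purely illustrative.

        import numpy as np

        def lorenz_gini(emissions, population):
            # Population-weighted Lorenz curve points and Gini index
            # for per-capita emissions across countries.
            per_capita = emissions / population
            order = np.argsort(per_capita)
            pop = population[order] / population.sum()
            em = emissions[order] / emissions.sum()
            cum_pop = np.concatenate([[0.0], np.cumsum(pop)])
            cum_em = np.concatenate([[0.0], np.cumsum(em)])
            # Gini = 1 - 2 * area under the Lorenz curve (trapezoidal rule).
            gini = 1.0 - np.sum((cum_pop[1:] - cum_pop[:-1]) * (cum_em[1:] + cum_em[:-1]))
            return cum_pop, cum_em, gini

        # Illustrative country blocks: emissions (GtCO2), population (billions).
        emissions = np.array([5.0, 10.0, 3.0, 1.5])
        population = np.array([0.33, 1.4, 1.4, 1.1])
        _, _, g = lorenz_gini(emissions, population)
        print(f"carbon Gini: {g:.3f}")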

  10. Pareto genealogies arising from a Poisson branching evolution model with selection.

    Science.gov (United States)

    Huillet, Thierry E

    2014-02-01

    We study a class of coalescents derived from a sampling procedure out of N i.i.d. Pareto(α) random variables, normalized by their sum, including β-size-biasing on total length effects (β < α). Depending on the range of α, the limiting process is either a discrete-time Poisson-Dirichlet (α, −β) Ξ-coalescent (α ∈ [0, 1)), or a family of continuous-time Beta (2 − α, α − β) Λ-coalescents (α ∈ [1, 2)), or the Kingman coalescent (α ≥ 2). We indicate that this class of coalescent processes (and their scaling limits) may be viewed as the genealogical processes of some forward-in-time evolving branching population models including selection effects. In such constant-size population models, the reproduction step, which is based on a fitness-dependent Poisson point process with scaling power-law(α) intensity, is coupled to a selection step consisting of sorting out the N fittest individuals issued from the reproduction step.

  11. Hybridization of Sensing Methods of the Search Domain and Adaptive Weighted Sum in the Pareto Approximation Problem

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2015-01-01

    Full Text Available We consider a relatively new and rapidly developing class of methods for solving a multi-objective optimization problem, based on a preliminarily built finite-dimensional approximation of the Pareto set, and thereby of the Pareto front, of this problem. The work investigates the efficiency of several modifications of the adaptive weighted sum (AWS) method. This method, proposed by J.-H. Ryu, S. Kim and H. Wan, is intended to build a Pareto approximation of the multi-objective optimization problem. The AWS method uses a quadratic approximation of the objective functions in the current sub-domain of the search space (the trust region), based on the gradients and Hessian matrices of the objective functions. To build the (quadratic) meta-objective functions, this work uses methods from the theory of experimental design, which involve calculating the values of these functions at the nodes of a grid covering the trust region (a sensing method for the search domain). Two groups of sensing methods are considered: hypercube-based and hypersphere-based methods. For each of these groups, a number of test multi-objective optimization problems have been used to study the efficiency of the following grids: the Latin hypercube; a grid that is uniformly random for each dimension; and a grid based on LPτ sequences.

  12. Choosing the optimal Pareto composition of the charge material for the manufacture of composite blanks

    Science.gov (United States)

    Zalazinsky, A. G.; Kryuchkov, D. I.; Nesterenko, A. V.; Titov, V. G.

    2017-12-01

    The results of an experimental study of the mechanical properties of pressed and sintered briquettes consisting of powders obtained from a high-strength VT-22 titanium alloy by plasma spraying, with additions of PTM-1 titanium powder obtained by the hydride-calcium method and of PV-N70Yu30 nickel-aluminum alloy powder, are presented. The task is to choose an optimal composition of the charge material for a composite material that provides the required mechanical characteristics and cost of the semi-finished products and items. Pareto optimal values for the composition of the composite material charge have been obtained.

  13. A Case Series of the Probability Density and Cumulative Distribution of Laryngeal Disease in a Tertiary Care Voice Center.

    Science.gov (United States)

    de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander

    2017-11-01

    To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank order the laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.

  14. Multi-objective optimization of a series–parallel system using GPSIA

    International Nuclear Information System (INIS)

    Okafor, Ekene Gabriel; Sun Youchao

    2012-01-01

    The optimal solution of a multi-objective optimization problem (MOP) corresponds to a Pareto set that is characterized by a trade-off between objectives. The Genetic Pareto Set Identification Algorithm (GPSIA), proposed for reliability-redundancy MOPs, is a hybrid technique which combines genetic and heuristic principles to generate non-dominated solutions. A series–parallel system with active redundancy is studied in this paper. Reliability and cost were the objective functions, subject to cost and weight constraints. The results reveal an evenly distributed non-dominated front. The distances between successive Pareto points were used to evaluate the general performance of the method. Plots were also used to show the computational results for the type of system studied, and the robustness of the technique is discussed in comparison with NSGA-II and SPEA-2.

  15. Multiobjective planning of distribution networks incorporating switches and protective devices using a memetic optimization

    International Nuclear Information System (INIS)

    Pombo, A. Vieira; Murta-Pina, João; Pires, V. Fernão

    2015-01-01

    A multi-objective planning approach for the reliability of electric distribution networks using a memetic optimization is presented. In this reliability optimization, the type of equipment (switches or reclosers) and its location are optimized. The multiple objectives considered to find the optimal values for these planning variables are the minimization of the total equipment cost and, at the same time, the minimization of two distribution network reliability indexes. The reliability indexes are the system average interruption frequency index (SAIFI) and the system average interruption duration index (SAIDI). To solve this problem a memetic evolutionary algorithm is proposed, which combines the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) with a local search algorithm. The obtained Pareto-optimal front contains solutions with different trade-offs with respect to the three objectives. A real distribution network is used to test the proposed algorithm. The obtained results show that this approach allows the utility to obtain the optimal type and location of the equipment to achieve the best reliability at the lowest cost. - Highlights: • Reliability indexes SAIFI and SAIDI and equipment cost are optimized. • Optimization of equipment type, number and location on an MV network. • A memetic evolutionary algorithm with a local search algorithm is proposed. • Pareto optimal front solutions with respect to the three objective functions

  16. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy)]; Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics]

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distributions of aftershocks are fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  17. The dosimetric impact of leaf interdigitation and leaf width on VMAT treatment planning in Pinnacle: comparing Pareto fronts

    International Nuclear Information System (INIS)

    Van Kesteren, Z; Janssen, T M; Damen, E; Van Vliet-Vroegindeweij, C

    2012-01-01

    To evaluate in an objective way the effect of leaf interdigitation and leaf width on volumetric modulated arc therapy plans in Pinnacle. Three multileaf collimators (MLCs) were modeled: two 10 mm leaf width MLCs, with and without interdigitating leaves, and a 5 mm leaf width MLC with interdigitating leaves. Three rectum patients and three prostate patients were used for the planning study. In order to compare treatment techniques in an objective way, a Pareto front comparison was carried out. 200 plans were generated in an automated way, per patient per MLC model, resulting in a total of 3600 plans. From these plans, Pareto-optimal plans were selected which were evaluated for various dosimetric variables. The capability of leaf interdigitation showed little dosimetric impact on the treatment plans, when comparing the 10 mm leaf width MLC with and without leaf interdigitation. When comparing the 10 mm leaf width MLC with the 5 mm leaf width MLC, both with interdigitating leaves, improvement in plan quality was observed. For both patient groups, the integral dose was reduced by 0.6 J for the thin MLC. For the prostate patients, the mean dose to the anal sphincter was reduced by 1.8 Gy and the conformity of the V(95%) was reduced by 0.02 using the thin MLC. The V(65%) of the rectum was reduced by 0.1% and the dose homogeneity by 1.5%. For rectum patients, the mean dose to the bowel was reduced by 1.4 Gy and the mean dose to the bladder by 0.8 Gy for the thin MLC. The conformity of the V(95%) was equivalent for the 10 and 5 mm leaf width MLCs for the rectum patients. We have objectively compared three types of MLCs in a planning study for prostate and rectum patients by analyzing Pareto-optimal plans which were generated in an automated way. Interdigitation of MLC leaves does not generate better plans using the SmartArc algorithm in Pinnacle. Changing the MLC leaf width from 10 to 5 mm generates better treatment plans, although the clinical relevance remains to be proven.

  18. The dosimetric impact of leaf interdigitation and leaf width on VMAT treatment planning in Pinnacle: comparing Pareto fronts.

    Science.gov (United States)

    van Kesteren, Z; Janssen, T M; Damen, E; van Vliet-Vroegindeweij, C

    2012-05-21

    To evaluate in an objective way the effect of leaf interdigitation and leaf width on volumetric modulated arc therapy plans in Pinnacle. Three multileaf collimators (MLCs) were modeled: two 10 mm leaf width MLCs, with and without interdigitating leaves, and a 5 mm leaf width MLC with interdigitating leaves. Three rectum patients and three prostate patients were used for the planning study. In order to compare treatment techniques in an objective way, a Pareto front comparison was carried out. 200 plans were generated in an automated way, per patient per MLC model, resulting in a total of 3600 plans. From these plans, Pareto-optimal plans were selected which were evaluated for various dosimetric variables. The capability of leaf interdigitation showed little dosimetric impact on the treatment plans, when comparing the 10 mm leaf width MLC with and without leaf interdigitation. When comparing the 10 mm leaf width MLC with the 5 mm leaf width MLC, both with interdigitating leaves, improvement in plan quality was observed. For both patient groups, the integral dose was reduced by 0.6 J for the thin MLC. For the prostate patients, the mean dose to the anal sphincter was reduced by 1.8 Gy and the conformity of the V(95%) was reduced by 0.02 using the thin MLC. The V(65%) of the rectum was reduced by 0.1% and the dose homogeneity by 1.5%. For rectum patients, the mean dose to the bowel was reduced by 1.4 Gy and the mean dose to the bladder by 0.8 Gy for the thin MLC. The conformity of the V(95%) was equivalent for the 10 and 5 mm leaf width MLCs for the rectum patients. We have objectively compared three types of MLCs in a planning study for prostate and rectum patients by analyzing Pareto-optimal plans which were generated in an automated way. Interdigitation of MLC leaves does not generate better plans using the SmartArc algorithm in Pinnacle. Changing the MLC leaf width from 10 to 5 mm generates better treatment plans, although the clinical relevance remains to be proven.

  19. Citation distribution profile in Brazilian journals of general medicine.

    Science.gov (United States)

    Lustosa, Luiggi Araujo; Chalco, Mario Edmundo Pastrana; Borba, Cecília de Melo; Higa, André Eizo; Almeida, Renan Moritz Varnier Rodrigues

    2012-01-01

    Impact factors are currently the bibliometric index most used for evaluating scientific journals. However, the way in which they are used, for instance concerning the study or journal types analyzed, can markedly interfere with estimate reliability. This study aimed to analyze the citation distribution pattern in three Brazilian journals of general medicine. This was a descriptive study based on numbers of citations of scientific studies published by three Brazilian journals of general medicine. The journals analyzed were São Paulo Medical Journal, Clinics and Revista da Associação Médica Brasileira. This survey used data available from the Institute for Scientific Information (ISI) platform, from which the total number of papers published in each journal in 2007-2008 and the number of citations of these papers in 2009 were obtained. From these data, the citation distribution was derived and journal impact factors (average number of citations) were estimated. These factors were then compared with those directly available from the ISI Journal of Citation Reports (JCR). Respectively, 134, 203 and 192 papers were published by these journals during the period analyzed. The observed citation distributions were highly skewed, such that many papers had few citations and a small percentage had many citations. It was not possible to identify any specific pattern for the most cited papers or to exactly reproduce the JCR impact factors. Use of measures like "impact factors", which characterize citations through averages, does not adequately represent the citation distribution in the journals analyzed.

  20. Multi-objective optimal strategy for generating and bidding in the power market

    International Nuclear Information System (INIS)

    Peng Chunhua; Sun Huijuan; Guo Jianfeng; Liu Gang

    2012-01-01

    Highlights: ► A new benefit/risk/emission comprehensive generation optimization model is established. ► A hybrid multi-objective differential evolution optimization algorithm is designed. ► Fuzzy set theory and entropy weighting method are employed to extract the general best solution. ► The proposed approach of generating and bidding is efficient for maximizing profit and minimizing both risk and emissions. - Abstract: Based on the coordinated interaction between unit output and electricity market prices, a benefit/risk/emission comprehensive generation optimization model with the objectives of maximal profit and minimal bidding risk and emissions is established. A hybrid multi-objective differential evolution optimization algorithm, which successfully integrates Pareto non-dominated sorting with the differential evolution algorithm and improves the individual crowding distance mechanism and mutation strategy to avoid premature convergence and uneven search, is designed to obtain the Pareto optimal set of this model. Moreover, fuzzy set theory and the entropy weighting method are employed to extract one of the Pareto optimal solutions as the general best solution. Several optimization runs have been carried out on different cases of generation bidding and scheduling. The results confirm the potential and effectiveness of the proposed approach in solving the multi-objective optimization problem of generation bidding and scheduling. In addition, the comparison with classical optimization algorithms demonstrates the superiorities of the proposed algorithm, such as the integrality of the Pareto front, well-distributed Pareto-optimal solutions, and high search speed.
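
The improved crowding-distance mechanism mentioned above builds on the standard NSGA-II diversity measure. As a reference point, here is a minimal sketch of that baseline measure (the paper's improved variant is not reproduced here):

```python
import numpy as np

def crowding_distance(front: np.ndarray) -> np.ndarray:
    """NSGA-II-style crowding distance for points of one
    non-dominated front; `front` is an (n, m) array of objective
    values, and a larger distance means a less crowded point."""
    n, m = front.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(front[:, j])
        dist[order[0]] = dist[order[-1]] = np.inf   # keep boundary points
        span = front[order[-1], j] - front[order[0], j]
        if span == 0:
            continue
        # interior points: normalized gap between their two neighbours
        dist[order[1:-1]] += (front[order[2:], j] - front[order[:-2], j]) / span
    return dist

front = np.array([[1.0, 9.0], [2.0, 6.0], [4.0, 4.0], [9.0, 1.0]])
print(crowding_distance(front))
```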

  1. The El Niño Southern Oscillation index and wildfire prediction in British Columbia

    NARCIS (Netherlands)

    Xu, Zhen; van Kooten, G.C.

    2014-01-01

    This study investigates the potential to predict monthly wildfires and area burned in British Columbia's interior using El Niño Southern Oscillation (ENSO). The zero-inflated negative binomial (ZINB) and the generalized Pareto (GP) distributions are used, respectively, to account for uncertainty in

  2. A GENERALIZED CLASS OF TRANSFORMATION MATRICES FOR THE RECONSTRUCTION OF SPHERE SIZE DISTRIBUTIONS FROM SECTION CIRCLE SIZE DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Willi Pabst

    2017-03-01

    Full Text Available A generalized formulation of transformation matrices is given for the reconstruction of sphere diameter distributions from their section circle diameter distributions. This generalized formulation is based on a weight shift parameter that can be adjusted from 0 to 1. It includes the well-known Saltykov and Cruz-Orive transformations as special cases (for parameter values of 0 and 0.5, respectively). The physical meaning of this generalization is explained (showing, among other things, that the Woodhead transformation should be bounded by the Saltykov transformation on one side and by our transformation on the other), and its numerical performance is investigated. In particular, it is shown that our generalized transformation is numerically highly unstable, i.e. it introduces numerical artefacts (oscillations or even unphysical negative sphere frequencies) into the reconstruction, and can lead to completely wrong results when a critical value of the parameter (usually in the range 0.7-0.9, depending on the type of distribution) is exceeded. It is shown that this numerical instability is an intrinsic feature of these transformations that depends not only on the weight shift parameter value but also on the type and the position of the distribution. It occurs in a natural way also for the Cruz-Orive and other transformations with finite weight shift parameter values and is not just caused by inadequate input data (e.g. as a consequence of an insufficient number of objects counted), as commonly assumed. Finally it is shown that an even more general class of transformation matrices can be defined that includes, in addition to the aforementioned transformations, also the Wicksell transformation.

  3. Using Coevolution Genetic Algorithm with Pareto Principles to Solve Project Scheduling Problem under Duration and Cost Constraints

    Directory of Open Access Journals (Sweden)

    Alexandr Victorovich Budylskiy

    2014-06-01

    Full Text Available This article considers a multicriteria optimization approach that uses a modified genetic algorithm to solve the project-scheduling problem under duration and cost constraints. The work surveys the available options for solving this problem and justifies the multicriteria optimization approach. The study describes the Pareto principles used in the modified genetic algorithm, defines the mathematical model of the project-scheduling problem, and introduces the modified genetic algorithm together with its ranking strategies and elitism approaches. The article concludes with an example.

  4. Feasibility of estimating generalized extreme-value distribution of floods

    International Nuclear Information System (INIS)

    Ferreira de Queiroz, Manoel Moises

    2004-01-01

    Flood frequency analysis by the generalized extreme-value probability distribution (GEV) has found increased application in recent years, given its flexibility in dealing with the three asymptotic forms of extreme distribution derived from different initial probability distributions. Estimation of higher quantiles of floods is usually accomplished by extrapolating one of the three inverse forms of the GEV distribution fitted to the experimental data for return periods much higher than those actually observed. This paper studies the feasibility of fitting the GEV distribution by moments of linear combinations of higher order statistics (LH moments) using synthetic annual flood series with varying characteristics and lengths. Since hydrologic events in nature, such as daily discharge, take finite values, their annual maxima are expected to follow the asymptotic form of the limited GEV distribution. Synthetic annual flood series were thus obtained from stochastic sequences of 365 daily discharges generated by Monte Carlo simulation on the basis of a limited probability distribution underlying the limited GEV distribution. The results show that parameter estimation by LH moments of this distribution, fitted to annual flood samples of less than 100-year length derived from an initial limited distribution, may indicate any form of extreme-value distribution, not just the expected limited form, and with large uncertainty in the fitted parameters. A frequency analysis, on the basis of the GEV distribution and LH moments, of annual flood series of lengths varying between 13 and 73 years observed at 88 gauge stations on the Parana River in Brazil indicated all three forms of the GEV distribution. (Author)

  5. Visualising Pareto-optimal trade-offs helps move beyond monetary-only criteria for water management decisions

    Science.gov (United States)

    Hurford, Anthony; Harou, Julien

    2014-05-01

    Water-related ecosystem services are important to the livelihoods of the poorest sectors of society in developing countries. Degradation or loss of these services can increase the vulnerability of people, decreasing their capacity to support themselves. New approaches are needed to help guide water resources management decisions which account for the non-market value of ecosystem goods and services. In case studies from Brazil and Kenya we demonstrate the capability of many-objective Pareto-optimal trade-off analysis to help decision makers balance economic and non-market benefits from the management of existing multi-reservoir systems. A multi-criteria search algorithm is coupled to a water resources management simulator of each basin to generate a set of Pareto-approximate trade-offs representing the best-case management decisions. In both cases, volume-dependent reservoir release rules are the management decisions being optimised. In the Kenyan case we further assess the impacts of proposed irrigation investments, and how the possibility of new investments affects the system's trade-offs. During the multi-criteria search (optimisation), the performance of different sets of management decisions (policies) is assessed against case-specific objective functions representing provision of water supply and irrigation, hydropower generation and maintenance of ecosystem services. Results are visualised as trade-off surfaces to help decision makers understand the impacts of different policies on a broad range of stakeholders and to assist in decision-making. These case studies show how the approach can reveal unexpected opportunities for win-win solutions, and quantify the trade-offs between investing to increase agricultural revenue and negative impacts on protected ecosystems which support rural livelihoods.

  6. Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity.

    Directory of Open Access Journals (Sweden)

    James D Englehardt

    Full Text Available Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponential. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study.

  7. On the distributions of annual and seasonal daily rainfall extremes in central Arizona and their spatial variability

    Science.gov (United States)

    Mascaro, Giuseppe

    2018-04-01

    This study uses daily rainfall records of a dense network of 240 gauges in central Arizona to gain insights on (i) the variability of the seasonal distributions of rainfall extremes; (ii) how the seasonal distributions affect the shape of the annual distribution; and (iii) the presence of spatial patterns and orographic control for these distributions. For this aim, recent methodological advancements in peak-over-threshold analysis and application of the Generalized Pareto Distribution (GPD) were used to assess the suitability of the GPD hypothesis and improve the estimation of its parameters, while limiting the effect of short sample sizes. The distribution of daily rainfall extremes was found to be heavy-tailed (i.e., GPD shape parameter ξ > 0) during the summer season, dominated by convective monsoonal thunderstorms. The exponential distribution (a special case of GPD with ξ = 0) was instead shown to be appropriate for modeling wintertime daily rainfall extremes, mainly caused by cold fronts transported by westerly flow. The annual distribution exhibited a mixed behavior, with lighter upper tails than those found in summer. A hybrid model mixing the two seasonal distributions was demonstrated capable of reproducing the annual distribution. Organized spatial patterns, mainly controlled by elevation, were observed for the GPD scale parameter, while ξ did not show any clear control of location or orography. The quantiles returned by the GPD were found to be very similar to those provided by the National Oceanic and Atmospheric Administration (NOAA) Atlas 14, which used the Generalized Extreme Value (GEV) distribution. Results of this work are useful to improve statistical modeling of daily rainfall extremes at high spatial resolution and provide diagnostic tools for assessing the ability of climate models to simulate extreme events.
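
As an illustration of the peaks-over-threshold workflow described above, a hedged sketch using scipy's genpareto follows. The synthetic daily series, the 98th-percentile threshold and the 30-year record length are assumptions for the example, not values from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
daily = rng.gamma(shape=0.4, scale=8.0, size=365 * 30)   # stand-in rainfall

u = np.quantile(daily, 0.98)            # threshold choice is case-specific
exceed = daily[daily > u] - u
xi, _, sigma = stats.genpareto.fit(exceed, floc=0)       # fix location at 0
print(f"shape xi = {xi:.3f}, scale sigma = {sigma:.3f}")

# 100-year daily return level: GPD quantile above the threshold
rate = exceed.size / 30                 # mean exceedances per year
p = 1 - 1 / (100 * rate)                # non-exceedance prob. for T = 100 yr
print("100-yr level:", u + stats.genpareto.ppf(p, xi, loc=0, scale=sigma))
```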

  8. General distributed control system for fusion experiments

    International Nuclear Information System (INIS)

    Klingner, P.L.; Levings, S.J.; Wilkins, R.W.

    1986-01-01

    A general control system using distributed LSI-11 microprocessors is being developed. Common software resides in each LSI-11 and is tailored to an application by control specifications downloaded from a host computer. The microprocessors, their control interfaces, and the micro-to-host communications are CAMAC based. The host computer also supports an operator interface, coordination of multiple microprocessors, and utilities to create and maintain the control specifications. Typical applications include monitoring safety interlocks as well as controlling vacuum systems, high voltage charging systems, and diagnostics

  9. Energy distributions of Bianchi type-VIh Universe in general relativity ...

    Indian Academy of Sciences (India)

    2017-03-16

    Mar 16, 2017 ... butions in Bianchi type-VIh metric for different gravitation theories. ... Bianchi VIh Universe; general relativity; teleparallel gravity; energy–momentum distribution. ... In §3, we introduce energy–momentum definitions of Einstein,.

  10. MO-G-304-04: Generating Well-Dispersed Representations of the Pareto Front for Multi-Criteria Optimization in Radiation Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Kirlik, G; Zhang, H [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To present a novel multi-criteria optimization (MCO) solution approach that generates a well-dispersed representation of the Pareto front for radiation treatment planning. Methods: Different algorithms have been proposed and implemented in commercial planning software to generate MCO plans for external-beam radiation therapy. These algorithms consider convex optimization problems. We propose a grid-based algorithm to generate well-dispersed treatment plans over the Pareto front. Our method is able to handle nonconvexity in the problem to deal with dose-volume objectives/constraints and biological objectives, such as equivalent uniform dose (EUD), tumor control probability (TCP), normal tissue complication probability (NTCP), etc. In addition, our algorithm is able to provide a single MCO plan when clinicians are targeting narrow bounds of objectives for patients. In this situation, usually none of the generated plans is within the bounds and a solution is difficult to identify via manual navigation. We use the subproblem formulation utilized in the grid-based algorithm to obtain a plan within the specified bounds. The subproblem aims to generate a solution that maps into the rectangle defined by the bounds. If such a solution does not exist, it generates the solution closest to the rectangle. We tested our method with 10 locally advanced head and neck cancer cases. Results: 8 objectives were used, including 3 different objectives for the primary target volume, high-risk and low-risk target volumes, and 5 objectives for the organs-at-risk (OARs) (two parotids, spinal cord, brain stem and oral cavity). Given tight bounds, uniform dose was achieved for all targets while as much as 26% improvement was achieved in OAR sparing compared with clinical plans without MCO and a previously proposed MCO method. Conclusion: Our method is able to obtain well-dispersed treatment plans to attain better approximations of convex and nonconvex Pareto fronts. Single treatment plan can

  11. Trading leads to scale-free self-organization

    Science.gov (United States)

    Ebert, M.; Paul, W.

    2012-12-01

    Financial markets display scale-free behavior in many different aspects. The power-law behavior of part of the distribution of individual wealth has been recognized by Pareto as early as the nineteenth century. Heavy-tailed and scale-free behavior of the distribution of returns of different financial assets has been confirmed in a series of works. The existence of a Pareto-like distribution of the wealth of market participants has been connected with the scale-free distribution of trading volumes and price-returns. The origin of the Pareto-like wealth distribution, however, remained obscure. Here we show that in a market where the imbalance of supply and demand determines the direction of price changes, it is the process of trading itself that spontaneously leads to a self-organization of the market with a Pareto-like wealth distribution for the market participants and at the same time to a scale-free behavior of return fluctuations and trading volume distributions.

  12. A Generalized Cauchy Distribution Framework for Problems Requiring Robust Behavior

    Directory of Open Access Journals (Sweden)

    Carrillo, Rafael E.

    2010-01-01

    Full Text Available Statistical modeling is at the heart of many engineering problems. The importance of statistical modeling emanates not only from the desire to accurately characterize stochastic events, but also from the fact that distributions are the central models utilized to derive sample processing theories and methods. The generalized Cauchy distribution (GCD) family has a closed-form pdf expression across the whole family as well as algebraic tails, which makes it suitable for modeling many real-life impulsive processes. This paper develops a GCD theory-based approach that allows challenging problems to be formulated in a robust fashion. Notably, the proposed framework subsumes generalized Gaussian distribution (GGD) family-based developments, thereby guaranteeing performance improvements over traditional GCD-based problem formulation techniques. This robust framework can be adapted to a variety of applications in signal processing. As examples, we formulate four practical applications under this framework: (1) filtering for power line communications, (2) estimation in sensor networks with noisy channels, (3) reconstruction methods for compressed sensing, and (4) fuzzy clustering.

  13. Power Law Distributions in the Experiment for Adjustment of the Ion Source of the NBI System

    International Nuclear Information System (INIS)

    Han Xiaopu; Hu Chundong

    2005-01-01

    The experiential adjustment process in an experiment on the ion source of the neutral beam injector system for the HT-7 Tokamak is reported in this paper. For the data obtained under the same conditions, when the arc current intensities of every shot are arranged in decreasing rank order, the distributions of the arc current intensity follow power laws, and the distribution obtained in the condition with the cryo-pump follows the double Pareto distribution. Using a similar method, the distributions of the arc duration are also found to be close to power laws. These power-law distributions arise quite naturally rather than being the result of purposeful seeking

  14. Evaluation of the optimal combinations of modulation factor and pitch for Helical TomoTherapy plans made with TomoEdge using Pareto optimal fronts.

    Science.gov (United States)

    De Kerf, Geert; Van Gestel, Dirk; Mommaerts, Lobke; Van den Weyngaert, Danielle; Verellen, Dirk

    2015-09-17

    Modulation factor (MF) and pitch have an impact on Helical TomoTherapy (HT) plan quality and HT users mostly use vendor-recommended settings. This study analyses the effect of these two parameters on both plan quality and treatment time for plans made with TomoEdge planning software by using the concept of Pareto optimal fronts. More than 450 plans with different combinations of pitch [0.10-0.50] and MF [1.2-3.0] were produced. These HT plans, with a field width (FW) of 5 cm, were created for five head-and-neck patients, and the homogeneity index, conformity index, dose-near-maximum (D2) and dose-near-minimum (D98) were analysed for the planning target volumes, as well as the mean dose and D2 for the most critical organs at risk. For every dose metric, the median value was plotted against treatment time. A Pareto-like method was used in the analysis to show how pitch and MF influence both treatment time and plan quality. For small pitches (≤0.20), MF does not influence treatment time. The contrary is true for larger pitches (≥0.25), where lowering MF decreases both treatment time and plan quality until maximum gantry speed is reached. At that point, treatment time saturates and only plan quality decreases further. The Pareto front analysis showed optimal combinations of pitch [0.23-0.45] and MF > 2.0 for a FW of 5 cm. Outside this range, plans become less optimal. As the vendor-recommended settings fall within this range, the use of these settings is validated.

  15. Gamma processes and peaks-over-threshold distributions for time-dependent reliability

    International Nuclear Information System (INIS)

    Noortwijk, J.M. van; Weide, J.A.M. van der; Kallen, M.J.; Pandey, M.D.

    2007-01-01

    In the evaluation of structural reliability, a failure is defined as the event in which stress exceeds a resistance that is liable to deterioration. This paper presents a method to combine the two stochastic processes of deteriorating resistance and fluctuating load for computing the time-dependent reliability of a structural component. The deterioration process is modelled as a gamma process, which is a stochastic process with independent non-negative increments having a gamma distribution with identical scale parameter. The stochastic process of loads is generated by a Poisson process. The variability of the random loads is modelled by a peaks-over-threshold distribution (such as the generalised Pareto distribution). These stochastic processes of deterioration and load are combined to evaluate the time-dependent reliability
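
A minimal Monte Carlo sketch of the combined model described above, under invented parameter values (r0, a, b, lam, xi, sigma and u are all hypothetical): resistance decays by a stationary gamma process while generalized-Pareto load peaks arrive as a Poisson process, and failure is the first time a load exceeds the remaining resistance.

```python
import numpy as np

rng = np.random.default_rng(1)

def failure_time(horizon=50.0, dt=0.1, r0=100.0,
                 a=2.0, b=0.5,         # gamma process: shape rate, scale
                 lam=0.8,              # load arrivals per year
                 xi=0.1, sigma=5.0, u=60.0):   # GPD peak parameters (xi > 0)
    r, t = r0, 0.0
    while t < horizon:
        t += dt
        r -= rng.gamma(a * dt, b)      # independent gamma increment
        for _ in range(rng.poisson(lam * dt)):
            # inverse-CDF draw from the generalized Pareto above u
            load = u + sigma / xi * ((1 - rng.random()) ** -xi - 1)
            if load > r:
                return t               # stress exceeds resistance
    return np.inf

times = np.array([failure_time() for _ in range(2000)])
print("P(failure within 50 yr) ~", np.mean(np.isfinite(times)))
```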

  16. Building an Ensemble Seismic Hazard Model for the Magnitude Distribution by Using Alternative Bayesian Implementations

    Science.gov (United States)

    Taroni, M.; Selva, J.

    2017-12-01

    In this work we show how we built an ensemble seismic hazard model for the magnitude distribution for the TSUMAPS-NEAM EU project (http://www.tsumaps-neam.eu/). The considered source area includes the whole NEAM region (North East Atlantic, Mediterranean and connected seas). We build our models by using the catalogs (EMEC and ISC), their completeness and the regionalization provided by the project. We developed four alternative implementations of a Bayesian model, considering tapered or truncated Gutenberg-Richter distributions, and fixed or variable b-value. The frequency-size distribution is based on the Weichert formulation. This allows for simultaneously assessing all the frequency-size distribution parameters (a-value, b-value, and corner magnitude), using multiple completeness periods for the different magnitudes. With respect to previous studies, we introduce the tapered Pareto distribution (in addition to the classical truncated Pareto), and we build a novel approach to quantify the prior distribution. For each alternative implementation, we set the prior distributions using the global seismic data grouped according to the different types of tectonic setting, and assigned them to the related regions. The estimation is based on the complete (not declustered) local catalog in each region. Using the complete catalog also allows us to consider foreshocks and aftershocks in the seismic rate computation: the Poissonicity of the tsunami events (and similarly the exceedances of the PGA) will be ensured by Le Cam's theorem. This Bayesian approach provides robust estimates even in zones where few events are available, and also leaves open the possibility of exploring the uncertainty associated with the estimation of the magnitude distribution parameters (e.g. with the classical Metropolis-Hastings Monte Carlo method). Finally we merge all the models with their uncertainty to create the ensemble model that represents our knowledge of the seismicity in the
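
For reference, the tapered Pareto tail used above as an alternative to the truncated Pareto is commonly written through its survival function, S(m) = (m0/m)^beta * exp((m0 - m)/mc) for m >= m0, where mc is the corner (taper) scale. A small sketch with purely illustrative parameter values:

```python
import numpy as np

def tapered_pareto_sf(m, m0=1.0, beta=0.9, mc=50.0):
    """Survival function of the tapered Pareto: a pure power law
    for m << mc, with an exponential roll-off near the corner mc."""
    m = np.asarray(m, dtype=float)
    return (m0 / m) ** beta * np.exp((m0 - m) / mc)

for m in (1.0, 10.0, 100.0):
    print(m, tapered_pareto_sf(m))
```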

  17. Distribution Network Expansion Planning Based on Multi-objective PSO Algorithm

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Wu, Qiuwei

    2013-01-01

    This paper presents a novel approach for electrical distribution network expansion planning using multi-objective particle swarm optimization (PSO). The optimization objectives are: investment and operation cost, energy losses cost, and power congestion cost. A two-phase multi-objective PSO algorithm was proposed to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of the Pareto-optimal front set as well. The feasibility and effectiveness of both the proposed multi-objective planning approach and the improved multi-objective PSO have been verified...

  18. Citation distribution profile in Brazilian journals of general medicine

    Directory of Open Access Journals (Sweden)

    Luiggi Araujo Lustosa

    Full Text Available CONTEXT AND OBJECTIVE: Impact factors are currently the bibliometric index most used for evaluating scientific journals. However, the way in which they are used, for instance concerning the study or journal types analyzed, can markedly interfere with estimate reliability. This study aimed to analyze the citation distribution pattern in three Brazilian journals of general medicine. DESIGN AND SETTING: This was a descriptive study based on numbers of citations of scientific studies published by three Brazilian journals of general medicine. METHODS: The journals analyzed were São Paulo Medical Journal, Clinics and Revista da Associação Médica Brasileira. This survey used data available from the Institute for Scientific Information (ISI) platform, from which the total number of papers published in each journal in 2007-2008 and the number of citations of these papers in 2009 were obtained. From these data, the citation distribution was derived and journal impact factors (average number of citations) were estimated. These factors were then compared with those directly available from the ISI Journal of Citation Reports (JCR). RESULTS: Respectively, 134, 203 and 192 papers were published by these journals during the period analyzed. The observed citation distributions were highly skewed, such that many papers had few citations and a small percentage had many citations. It was not possible to identify any specific pattern for the most cited papers or to exactly reproduce the JCR impact factors. CONCLUSION: Use of measures like "impact factors", which characterize citations through averages, does not adequately represent the citation distribution in the journals analyzed.

  19. An extension of the directed search domain algorithm to bilevel optimization

    Science.gov (United States)

    Wang, Kaiqiang; Utyuzhnikov, Sergey V.

    2017-08-01

    A method is developed for generating a well-distributed Pareto set for the upper level in bilevel multiobjective optimization. The approach is based on the Directed Search Domain (DSD) algorithm, which is a classical approach for generation of a quasi-evenly distributed Pareto set in multiobjective optimization. The approach contains a double-layer optimizer designed in a specific way under the framework of the DSD method. The double-layer optimizer is based on bilevel single-objective optimization and aims to find a unique optimal Pareto solution rather than generate the whole Pareto frontier on the lower level in order to improve the optimization efficiency. The proposed bilevel DSD approach is verified on several test cases, and a relevant comparison against another classical approach is made. It is shown that the approach can generate a quasi-evenly distributed Pareto set for the upper level with relatively low time consumption.

  20. Statistical distribution for generalized ideal gas of fractional-statistics particles

    International Nuclear Information System (INIS)

    Wu, Y.

    1994-01-01

    We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
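
The resulting occupation number (without mutual statistics) can be computed from Wu's implicit relation w^g (1+w)^(1-g) = e^((ε-μ)/kT) with n = 1/(w+g), which recovers Bose-Einstein at g = 0 and Fermi-Dirac at g = 1. A small numerical sketch (the bracketing interval is a pragmatic choice, not part of the theory):

```python
import numpy as np
from scipy.optimize import brentq

def occupation(x: float, g: float) -> float:
    """Mean occupation number for exclusion-statistics parameter g,
    where x = (epsilon - mu) / kT.  Solves w^g (1+w)^(1-g) = e^x
    for w > 0, then returns n = 1 / (w + g)."""
    f = lambda w: g * np.log(w) + (1.0 - g) * np.log1p(w) - x
    w = brentq(f, 1e-12, 1e12)    # pragmatic bracketing interval
    return 1.0 / (w + g)

# sanity check against the familiar limits at x = 1
for g in (0.0, 0.5, 1.0):
    print(f"g = {g}: n = {occupation(1.0, g):.4f}")
print("Bose ref:", 1 / (np.e - 1), " Fermi ref:", 1 / (np.e + 1))
```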

  1. Coordinated Pitch & Torque Control of Large-Scale Wind Turbine Based on Pareto Efficiency Analysis

    DEFF Research Database (Denmark)

    Lin, Zhongwei; Chen, Zhenyu; Wu, Qiuwei

    2018-01-01

    For the existing pitch and torque control of the wind turbine generator system (WTGS), further development on coordinated control is necessary to improve effectiveness for practical applications. In this paper, the WTGS is modeled as a coupling combination of two subsystems: the generator torque control subsystem and the blade pitch control subsystem. Then, the pole positions in each control subsystem are adjusted in a coordinated manner to evaluate the controller participation and used as the objective of optimization. A two-level parameters-controllers coordinated optimization scheme is proposed and applied to optimize the controller coordination based on the Pareto optimization theory. Three solutions are obtained through optimization: the optimal torque solution, the optimal power solution, and a satisfactory solution. Detailed comparisons evaluate the performance of the three selected solutions...

  2. Multi-criteria Ranking Under Pareto Inclusive Criterion of Preference: An Application in Ranking Some Fungi Species with Respect to Their Toxicity

    Directory of Open Access Journals (Sweden)

    Gniadek Agnieszka

    2014-12-01

    Full Text Available This study aims at demonstrating the usefulness of the Pareto inclusive criterion methodology for comparative analyses of fungi toxicity. The toxicity of fungi is usually measured using a scale of several ranks. In practice, the ranks of toxicity are routinely grouped into only four conventional classes of toxicity: from a class of no toxicity, low toxicity, and moderate toxicity, to a class of high toxicity. The illustrative material included the N = 61 fungi samples obtained from three species: A. ochraceus, A. niger and A. flavus. In accordance with the Pareto approach, four partial criteria of the worst toxicity were defined, a single criterion used for each conventional class of toxicity. Finally, the odds ratios (OR) were calculated separately for each partial criterion, and the significance of the hypotheses OR = 1 was estimated. It was stated that A. ochraceus fungi are distinctly more toxic than the two remaining species with respect to all four partial criteria considered, with significance equal to p = 0.04, p = 0.04, p = 0.007 and p = 0.005, respectively. Thus, the suggested method illustrated its utility in the case under study.

  3. Feynman quasi probability distribution for spin-(1/2), and its generalizations

    International Nuclear Information System (INIS)

    Colucci, M.

    1999-01-01

    Feynman's paper Negative probability is examined; in it, after a discussion of the possibility of attributing a real physical meaning to quasi probability distributions, he introduces a new kind of distribution for spin-(1/2), with a possible method of generalization to systems with an arbitrary number of states. The principal aim of this article is to shed light upon the method of construction of these distributions, taking into consideration their application to some experiments, and discussing their positive and negative aspects

  4. Chiral perturbation theory for nucleon generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, M. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Manashov, A. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik]|[Sankt-Petersburg State Univ. (Russian Federation). Dept. of Theoretical Physics; Schaefer, A. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik

    2006-08-15

    We analyze the moments of the isosinglet generalized parton distributions H, E, H̃, Ẽ of the nucleon in one-loop order of heavy-baryon chiral perturbation theory. We discuss in detail the construction of the operators in the effective theory that are required to obtain all corrections to a given order in the chiral power counting. The results will serve to improve the extrapolation of lattice results to the chiral limit. (orig.)

  5. Robust Hydrological Forecasting for High-resolution Distributed Models Using a Unified Data Assimilation Approach

    Science.gov (United States)

    Hernandez, F.; Liang, X.

    2017-12-01

    Reliable real-time hydrological forecasting, to predict important phenomena such as floods, is invaluable to society. However, modern high-resolution distributed models have faced challenges when dealing with uncertainties that are caused by the large number of parameters and initial state estimations involved. Therefore, to rely on these high-resolution models for critical real-time forecast applications, considerable improvements in the parameter and initial state estimation techniques must be made. In this work we present a unified data assimilation algorithm called Optimized PareTo Inverse Modeling through Inverse STochastic Search (OPTIMISTS) to deal with the challenge of having robust flood forecasting for high-resolution distributed models. This new algorithm combines the advantages of particle filters and variational methods in a unique way to overcome their individual weaknesses. The analysis of candidate particles compares model results with observations in a flexible time frame, and a multi-objective approach is proposed which attempts to simultaneously minimize differences with the observations and departures from the background states by using both Bayesian sampling and non-convex evolutionary optimization. Moreover, the resulting Pareto front is given a probabilistic interpretation through kernel density estimation to create a non-Gaussian distribution of the states. OPTIMISTS was tested on a low-resolution distributed land surface model using VIC (Variable Infiltration Capacity) and on a high-resolution distributed hydrological model using the DHSVM (Distributed Hydrology Soil Vegetation Model). In the tests, streamflow observations are assimilated. OPTIMISTS was also compared with a traditional particle filter and a variational method. Results show that our method can reliably produce adequate forecasts and that it is able to outperform those resulting from assimilating the observations using a particle filter or an evolutionary 4D variational method.

  6. A framework to identify Pareto-efficient subdaily environmental flow constraints on hydropower reservoirs using a grid-wide power dispatch model

    Science.gov (United States)

    Olivares, Marcelo A.; Haas, Jannik; Palma-Behnke, Rodrigo; Benavides, Carlos

    2015-05-01

    Hydrologic alteration due to hydropeaking reservoir operations is a main concern worldwide. Subdaily environmental flow constraints (ECs) on operations can be promising alternatives for mitigating negative impacts. However, those constraints reduce the flexibility of hydropower plants, potentially with higher costs for the power system. To study the economic and environmental efficiency of ECs, this work proposes a novel framework comprising four steps: (i) assessment of the current subdaily hydrologic alteration; (ii) formulation and implementation of a short-term, grid-wide hydrothermal coordination model; (iii) design of ECs in the form of maximum ramping rates (MRRs) and minimum flows (MIFs) for selected hydropower reservoirs; and (iv) identification of Pareto-efficient solutions in terms of grid-wide costs and the Richards-Baker flashiness index for subdaily hydrologic alteration (SDHA). The framework was applied to Chile's main power grid, assessing 25 EC cases, involving five MIFs and five MRRs. Each case was run for a dry, normal, and wet water year type. Three Pareto-efficient ECs are found, with a remarkably small cost increase (below 2%) and an SDHA improvement between 28% and 90%. While the case involving the highest MIF worsens the flashiness of another basin, the other two have no negative effect on other basins and can be recommended for implementation.
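
The Richards-Baker flashiness index used above as the SDHA metric is the sum of absolute successive changes in discharge divided by total discharge. A short sketch; the smooth and hydropeaking hourly series are invented for illustration:

```python
import numpy as np

def rb_flashiness(q) -> float:
    """Richards-Baker flashiness index: sum of absolute successive
    changes in discharge divided by total discharge over the period."""
    q = np.asarray(q, dtype=float)
    return float(np.abs(np.diff(q)).sum() / q.sum())

t = np.arange(240)
steady = 50 + 5 * np.sin(2 * np.pi * t / 24)     # smooth release
peaking = np.where(t % 24 < 8, 120.0, 20.0)      # on/off hydropeaking
print(rb_flashiness(steady), rb_flashiness(peaking))
```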

  7. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modeling heteroscedastic residual errors

    Science.gov (United States)

    McInerney, David; Thyer, Mark; Kavetski, Dmitri; Lerat, Julien; Kuczera, George

    2017-03-01

    Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. This study focuses on approaches for representing error heteroscedasticity with respect to simulated streamflow, i.e., the pattern of larger errors in higher streamflow predictions. We evaluate eight common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter λ) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the United States, and two lumped hydrological models. Performance is quantified using predictive reliability, precision, and volumetric bias metrics. We find the choice of heteroscedastic error modeling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with λ of 0.2 and 0.5, and the log scheme (λ = 0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Paradoxically, calibration of λ is often counterproductive: in perennial catchments, it tends to overfit low flows at the expense of abysmal precision in high flows. The log-sinh transformation is dominated by the simpler Pareto optimal schemes listed above. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
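
A hedged sketch of the Box-Cox residual-error schemes compared above: residuals are computed in transformed space, where their spread should be roughly constant across flow magnitudes. The synthetic q_obs/q_sim series and the multiplicative error model are assumptions for the demo, not part of the study.

```python
import numpy as np

def boxcox(q, lam, offset=0.0):
    """Box-Cox transform used in heteroscedastic residual-error
    schemes; lam -> 0 gives the log scheme, lam = 1 is (shifted)
    identity."""
    q = np.asarray(q, dtype=float) + offset
    return np.log(q) if lam == 0 else (q ** lam - 1.0) / lam

rng = np.random.default_rng(7)
q_sim = rng.gamma(2.0, 5.0, size=1000)                   # simulated flow
q_obs = q_sim * np.exp(rng.normal(0, 0.2, size=1000))    # multiplicative error
for lam in (0.0, 0.2, 0.5, 1.0):
    eta = boxcox(q_obs, lam) - boxcox(q_sim, lam)        # transformed residuals
    print(f"lambda = {lam}: residual std = {eta.std():.3f}")
```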

  8. Economic-environmental energy and reserve scheduling of smart distribution systems: A multiobjective mathematical programming approach

    International Nuclear Information System (INIS)

    Zakariazadeh, Alireza; Jadid, Shahram; Siano, Pierluigi

    2014-01-01

    Highlights: • Environmental/economical scheduling of energy and reserve. • Simultaneous participation of loads in both energy and reserve scheduling. • Aggregate wind generation and demand uncertainties in a stochastic model. • Stochastic scheduling of energy and reserve in a distribution system. • Demand response providers’ participation in energy and reserve scheduling. - Abstract: In this paper a stochastic multi-objective economical/environmental operational scheduling method is proposed to schedule energy and reserve in a smart distribution system with high penetration of wind generation. The proposed multi-objective framework, based on augmented ε-constraint method, is used to minimize the total operational costs and emissions and to generate Pareto-optimal solutions for the energy and reserve scheduling problem. Moreover, fuzzy decision making process is employed to extract one of the Pareto-optimal solutions as the best compromise non-dominated solution. The wind power and demand forecast errors are considered in this approach and the reserve can be furnished by the main grid as well as distributed generators and responsive loads. The consumers participate in both energy and reserve markets using various demand response programs. In order to facilitate small and medium loads participation in demand response programs, a Demand Response Provider (DRP) aggregates offers for load reduction. In order to solve the proposed optimization model, the Benders decomposition technique is used to convert the large scale mixed integer non-linear problem into mixed-integer linear programming and non-linear programming problems. The effectiveness of the proposed scheduling approach is verified on a 41-bus distribution test system over a 24-h period

  9. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
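
For the block-maxima side of such analyses, scipy's genextreme can indicate the tail type; note scipy's sign convention, in which shape c < 0 corresponds to the heavy-tailed Fréchet case discussed above. The Pareto-tailed synthetic data are illustrative only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
maxima = rng.pareto(2.5, size=(200, 365)).max(axis=1)   # 200 block maxima
c, loc, scale = stats.genextreme.fit(maxima)
print(f"c = {c:.3f} ->", "Frechet-type tail" if c < 0 else "Gumbel/bounded")
```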

  10. Evaluation of treatment plan quality of IMRT and VMAT with and without flattening filter using Pareto optimal fronts.

    Science.gov (United States)

    Lechner, Wolfgang; Kragl, Gabriele; Georg, Dietmar

    2013-12-01

    To investigate the differences in treatment plan quality of IMRT and VMAT with and without a flattening filter using Pareto optimal fronts, for two treatment sites of different anatomic complexity. Pareto optimal fronts (POFs) were generated for six prostate and head-and-neck cancer patients by stepwise reduction of the constraint (during the optimization process) on the primary organ-at-risk (OAR). 9-static-field IMRT and 360°-single-arc VMAT plans with flattening filter (FF) and without flattening filter (FFF) were compared. The volume receiving 5 Gy or more (V5 Gy) was used to estimate the low-dose exposure. Furthermore, the number of monitor units (MUs) and measurements of the delivery time (T) were used to assess the efficiency of the treatment plans. A significant increase in MUs was found when using FFF-beams, while the treatment plan quality was at least equivalent to that of the FF-beams. T was decreased by 18% for the prostate cases for IMRT with FFF-beams and by 4% for the head-and-neck cases, but increased by 22% and 16% for VMAT. A reduction of up to 5% of V5 Gy was found for IMRT prostate cases with FFF-beams. The evaluation of the POFs showed at least comparable treatment plan quality for FFF-beams compared with FF-beams for both treatment sites and modalities. For smaller targets the advantageous characteristics of FFF-beams could be better exploited. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Generic features of the wealth distribution in ideal-gas-like markets.

    Science.gov (United States)

    Mohanty, P K

    2006-07-01

    We provide an exact solution to the ideal-gas-like models studied in econophysics to understand the microscopic origin of Pareto law. In these classes of models the key ingredient necessary for having a self-organized scale-free steady-state distribution is the trading or collision rule where agents or particles save a definite fraction of their wealth or energy and invest the rest for trading. Using a Gibbs ensemble approach we could obtain the exact distribution of wealth in this model. Moreover we show that in this model (a) good savers are always rich and (b) every agent poor or rich invests the same amount for trading. Nonlinear trading rules could alter the generic scenario observed here.
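
A minimal simulation sketch of this class of ideal-gas-like trading models: each agent saves a fixed fraction lam of wealth and invests the rest in a random pairwise trade. The uniform saving fraction and all sizes are illustrative; the paper's exact analysis is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
N, steps, lam = 1000, 100_000, 0.5
w = np.ones(N)                          # start with equal wealth

for _ in range(steps):
    i, j = rng.integers(0, N, size=2)
    if i == j:
        continue
    pool = (1 - lam) * (w[i] + w[j])    # non-saved wealth put up for trade
    eps = rng.random()                  # random split of the pool
    w[i], w[j] = lam * w[i] + eps * pool, lam * w[j] + (1 - eps) * pool

# total wealth is conserved; the steady state is strongly skewed
print("mean:", w.mean(), " relative spread:", w.std() / w.mean())
```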

  12. Distributed Generation Planning using Peer Enhanced Multi-objective Teaching-Learning based Optimization in Distribution Networks

    Science.gov (United States)

    Selvam, Kayalvizhi; Vinod Kumar, D. M.; Siripuram, Ramakanth

    2017-04-01

    In this paper, an optimization technique called peer enhanced teaching-learning based optimization (PeTLBO) is used in the multi-objective problem domain. The PeTLBO algorithm is parameter-less, which reduces the computational burden. The proposed peer enhanced multi-objective TLBO (PeMOTLBO) algorithm has been utilized to find a set of non-dominated optimal solutions [distributed generation (DG) location and sizing in a distribution network]. The objectives considered are: real power loss and voltage deviation, subject to voltage limits and the maximum penetration level of DG in the distribution network. Since the DG considered is capable of injecting real and reactive power into the distribution network, the power factor is taken as 0.85 leading. The proposed peer enhanced multi-objective optimization technique provides different trade-off solutions; in order to find the best compromise solution, a fuzzy set theory approach has been used. The effectiveness of this proposed PeMOTLBO is tested on the IEEE 33-bus and Indian 85-bus distribution systems. The performance is validated with Pareto fronts and two performance metrics (C-metric and S-metric) by comparing with a robust multi-objective technique called non-dominated sorting genetic algorithm-II and also with the basic TLBO.
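
The fuzzy best-compromise selection mentioned above is commonly implemented with linear membership functions over the Pareto front; a minimal sketch under that assumption (not necessarily the paper's exact formulation):

```python
import numpy as np

def best_compromise(front: np.ndarray) -> int:
    """Pick the best-compromise solution from a Pareto front
    (minimization): each objective gets a linear membership in
    [0, 1] (1 at its best value, 0 at its worst), and the solution
    with the largest total membership is returned."""
    fmin, fmax = front.min(axis=0), front.max(axis=0)
    span = np.where(fmax > fmin, fmax - fmin, 1.0)   # avoid divide-by-zero
    mu = (fmax - front) / span
    return int(np.argmax(mu.sum(axis=1)))

front = np.array([[1.0, 9.0], [3.0, 4.0], [5.0, 3.0], [9.0, 1.0]])
print("best compromise index:", best_compromise(front))
```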

  13. The mathematical formula of the intravaginal ejaculation latency time (IELT) distribution of lifelong premature ejaculation differs from the IELT distribution formula of men in the general male population

    Directory of Open Access Journals (Sweden)

    Paddy K.C. Janssen

    2016-03-01

    Full Text Available Purpose: To find the most accurate mathematical description of the intravaginal ejaculation latency time (IELT) distribution in the general male population. Materials and Methods: We compared the fitness of various well-known mathematical distributions with the IELT distribution of two previously published stopwatch studies of the Caucasian general male population and a stopwatch study of Dutch Caucasian men with lifelong premature ejaculation (PE). The accuracy of fitness is expressed by the Goodness of Fit (GOF). The smaller the GOF, the more accurate is the fitness. Results: The 3 IELT distributions are gamma distributions, but the IELT distribution of lifelong PE is another gamma distribution than the IELT distribution of men in the general male population. The Lognormal distribution of the gamma distributions most accurately fits the IELT distribution of 965 men in the general population, with a GOF of 0.057. The Gumbel Max distribution most accurately fits the IELT distribution of 110 men with lifelong PE with a GOF of 0.179. There are more men with lifelong PE ejaculating within 30 and 60 seconds than can be extrapolated from the probability density curve of the Lognormal IELT distribution of men in the general population. Conclusions: Men with lifelong PE have a distinct IELT distribution, e.g., a Gumbel Max IELT distribution, that can only be retrieved from the general male population Lognormal IELT distribution when thousands of men would participate in a IELT stopwatch study. The mathematical formula of the Lognormal IELT distribution is useful for epidemiological research of the IELT.

  14. The mathematical formula of the intravaginal ejaculation latency time (IELT) distribution of lifelong premature ejaculation differs from the IELT distribution formula of men in the general male population

    Science.gov (United States)

    Janssen, Paddy K.C.

    2016-01-01

    Purpose To find the most accurate mathematical description of the intravaginal ejaculation latency time (IELT) distribution in the general male population. Materials and Methods We compared the fitness of various well-known mathematical distributions with the IELT distribution of two previously published stopwatch studies of the Caucasian general male population and a stopwatch study of Dutch Caucasian men with lifelong premature ejaculation (PE). The accuracy of fitness is expressed by the Goodness of Fit (GOF). The smaller the GOF, the more accurate is the fitness. Results The 3 IELT distributions are gamma distributions, but the IELT distribution of lifelong PE is another gamma distribution than the IELT distribution of men in the general male population. The Lognormal distribution of the gamma distributions most accurately fits the IELT distribution of 965 men in the general population, with a GOF of 0.057. The Gumbel Max distribution most accurately fits the IELT distribution of 110 men with lifelong PE with a GOF of 0.179. There are more men with lifelong PE ejaculating within 30 and 60 seconds than can be extrapolated from the probability density curve of the Lognormal IELT distribution of men in the general population. Conclusions Men with lifelong PE have a distinct IELT distribution, e.g., a Gumbel Max IELT distribution, that can only be retrieved from the general male population Lognormal IELT distribution when thousands of men would participate in a IELT stopwatch study. The mathematical formula of the Lognormal IELT distribution is useful for epidemiological research of the IELT. PMID:26981594
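
A hedged sketch of such a goodness-of-fit comparison using scipy, with the Kolmogorov-Smirnov statistic as a stand-in GOF measure (the abstract's exact GOF metric is not specified here, and the gamma-distributed sample is synthetic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
ielt = rng.gamma(shape=1.8, scale=4.0, size=965)   # stand-in sample, minutes

candidates = {
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
    "gumbel_r": stats.gumbel_r,    # the "Gumbel Max" of the abstract
}
for name, dist in candidates.items():
    params = dist.fit(ielt)
    ks = stats.kstest(ielt, dist.cdf, args=params).statistic
    print(f"{name:9s} KS = {ks:.3f}")   # smaller = better fit
```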

  15. Generalized parton distribution for nonzero skewness

    International Nuclear Information System (INIS)

    Kumar, Narinder; Dahiya, Harleen; Teryaev, Oleg

    2012-01-01

    In the theory of strong interactions the main open question is how the nucleon and other hadrons are built from quarks and gluons, the fundamental degrees of freedom in QCD. An essential tool to investigate hadron structure is the study of deep inelastic scattering processes, where individual quarks and gluons can be resolved. The parton densities extracted from such processes encode the distribution of longitudinal momentum and polarization carried by quarks, antiquarks and gluons within a fast moving hadron. They have provided much to shape the physical picture of hadron structure. In recent years, it has become clear that appropriate exclusive scattering processes may provide such information encoded in the generalized parton distributions (GPDs). Here, we investigate the GPD for deeply virtual Compton scattering (DVCS) for nonzero skewness. The study investigates the GPDs by expressing them in terms of overlaps of light front wave functions (LFWFs). The work represents a spin-1/2 system as a composite of a spin-1/2 fermion and a spin-1 boson with arbitrary masses

  16. Distributed Systems of Generalizing as the Basis of Workplace Learning

    Science.gov (United States)

    Virkkunen, Jaakko; Pihlaja, Juha

    2004-01-01

    This article proposes a new way of conceptualizing workplace learning as distributed systems of appropriation, development and the use of practice-relevant generalizations fixed within mediational artifacts. This article maintains that these systems change historically as technology and increasingly sophisticated forms of production develop.…

  17. Distribution of Problems, Medications and Lab Results in Electronic Health Records: The Pareto Principle at Work.

    Science.gov (United States)

    Wright, Adam; Bates, David W

    2010-01-01

    BACKGROUND: Many natural phenomena demonstrate power-law distributions, where very common items predominate. Problems, medications and lab results represent some of the most important data elements in medicine, but their overall distribution has not been reported. OBJECTIVE: Our objective is to determine whether problems, medications and lab results demonstrate a power-law distribution. METHODS: Retrospective review of electronic medical record data for 100,000 randomly selected patients seen at least twice in 2006 and 2007 at the Brigham and Women's Hospital in Boston and its affiliated medical practices. RESULTS: All three data types exhibited a power-law distribution. The 12.5% most frequently used problems account for 80% of all patient problems, the top 11.8% of medications account for 80% of all medication orders and the top 4.5% of lab result types account for 80% of all lab results. CONCLUSION: These three data elements exhibited power-law distributions with a small number of common items representing a substantial proportion of all orders and observations, which has implications for electronic health record design.
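
The reported pattern can be checked with a few lines of code: sort item frequencies, accumulate until the target share is covered, and report the fraction of distinct items used. A minimal sketch with a synthetic skewed sample:

```python
import random
from collections import Counter

def coverage_fraction(items, target=0.80):
    """Fraction of distinct items needed to cover `target` of all
    occurrences -- a quick check of the Pareto-principle pattern."""
    counts = sorted(Counter(items).values(), reverse=True)
    total, running = sum(counts), 0
    for k, c in enumerate(counts, start=1):
        running += c
        if running >= target * total:
            return k / len(counts)

random.seed(5)
sample = [int(random.paretovariate(1.2)) for _ in range(10_000)]
print(f"{coverage_fraction(sample):.1%} of distinct items cover 80% of uses")
```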

  18. Entropy maximization under the constraints on the generalized Gini index and its application in modeling income distributions

    Science.gov (United States)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2015-11-01

    In economics and the social sciences, inequality measures such as the Gini index, the Pietra index, etc., are commonly used to measure statistical dispersion. There is a generalization of the Gini index which includes it as a special case. In this paper, we use the principle of maximum entropy to approximate the model of income distribution with a given mean and generalized Gini index. Many distributions have been used as descriptive models for the distribution of income. The most widely known of these models are the generalized beta of the second kind and its subclass distributions. The obtained maximum entropy distributions are fitted to the US family total money income in 2009, 2011 and 2013 and their relative performances with respect to the generalized beta of the second kind family are compared.
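
    For readers unfamiliar with the constraint used above, the sketch below computes a generalized (S-type) Gini index on simulated incomes. The paper's exact definition of the generalized Gini may differ; the rank-weighted form, which reduces to the ordinary Gini coefficient at delta = 2, is used here as a representative choice, and the income sample is synthetic.

```python
import numpy as np

def s_gini(income, delta=2.0):
    """Rank-weighted generalized Gini; delta = 2 gives the ordinary Gini."""
    x = np.sort(np.asarray(income, dtype=float))  # ascending incomes
    n = x.size
    i = np.arange(1, n + 1)
    # Survival-rank weights; larger delta stresses the poorest more.
    w = (n - i + 1) ** delta - (n - i) ** delta
    return 1.0 - np.sum(w * x) / (n ** delta * x.mean())

incomes = np.random.default_rng(2).pareto(3.0, 5_000) + 1.0  # synthetic incomes
print(f"Gini (delta=2):   {s_gini(incomes):.3f}")  # ~0.2 for Pareto(alpha=3)
print(f"S-Gini (delta=4): {s_gini(incomes, 4.0):.3f}")
```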

  19. Analysis of generalized negative binomial distributions attached to hyperbolic Landau levels

    International Nuclear Information System (INIS)

    Chhaiba, Hassan; Demni, Nizar; Mouayn, Zouhair

    2016-01-01

    To each hyperbolic Landau level of the Poincaré disc is attached a generalized negative binomial distribution. In this paper, we compute the moment generating function of this distribution and supply its atomic decomposition as a perturbation of the negative binomial distribution by a finitely supported measure. Using the Mandel parameter, we also discuss the nonclassical nature of the associated coherent states. Next, we derive a Lévy-Khintchine-type representation of its characteristic function when the latter does not vanish and deduce that it is quasi-infinitely divisible except for the lowest hyperbolic Landau level corresponding to the negative binomial distribution. By considering the total variation of the obtained quasi-Lévy measure, we introduce a new infinitely divisible distribution for which we derive the characteristic function.

  20. Analysis of generalized negative binomial distributions attached to hyperbolic Landau levels

    Energy Technology Data Exchange (ETDEWEB)

    Chhaiba, Hassan, E-mail: chhaiba.hassan@gmail.com [Department of Mathematics, Faculty of Sciences, Ibn Tofail University, P.O. Box 133, Kénitra (Morocco); Demni, Nizar, E-mail: nizar.demni@univ-rennes1.fr [IRMAR, Université de Rennes 1, Campus de Beaulieu, 35042 Rennes Cedex (France); Mouayn, Zouhair, E-mail: mouayn@fstbm.ac.ma [Department of Mathematics, Faculty of Sciences and Technics (M’Ghila), Sultan Moulay Slimane, P.O. Box 523, Béni Mellal (Morocco)

    2016-07-15

    To each hyperbolic Landau level of the Poincaré disc is attached a generalized negative binomial distribution. In this paper, we compute the moment generating function of this distribution and supply its atomic decomposition as a perturbation of the negative binomial distribution by a finitely supported measure. Using the Mandel parameter, we also discuss the nonclassical nature of the associated coherent states. Next, we derive a Lévy-Khintchine-type representation of its characteristic function when the latter does not vanish and deduce that it is quasi-infinitely divisible except for the lowest hyperbolic Landau level corresponding to the negative binomial distribution. By considering the total variation of the obtained quasi-Lévy measure, we introduce a new infinitely divisible distribution for which we derive the characteristic function.

  1. Exploring the squeezed three-point galaxy correlation function with generalized halo occupation distribution models

    Science.gov (United States)

    Yuan, Sihan; Eisenstein, Daniel J.; Garrison, Lehman H.

    2018-04-01

    We present the GeneRalized ANd Differentiable Halo Occupation Distribution (GRAND-HOD) routine that generalizes the standard 5 parameter halo occupation distribution model (HOD) with various halo-scale physics and assembly bias. We describe the methodology of 4 different generalizations: satellite distribution generalization, velocity bias, closest approach distance generalization, and assembly bias. We showcase the signatures of these generalizations in the 2-point correlation function (2PCF) and the squeezed 3-point correlation function (squeezed 3PCF). We identify generalized HOD prescriptions that are nearly degenerate in the projected 2PCF and demonstrate that these degeneracies are broken in the redshift-space anisotropic 2PCF and the squeezed 3PCF. We also discuss the possibility of identifying degeneracies in the anisotropic 2PCF and further demonstrate the extra constraining power of the squeezed 3PCF on galaxy-halo connection models. We find that within our current HOD framework, the anisotropic 2PCF can predict the squeezed 3PCF better than its statistical error. This implies that a discordant squeezed 3PCF measurement could falsify the particular HOD model space. Alternatively, it is possible that further generalizations of the HOD model would open opportunities for the squeezed 3PCF to provide novel parameter measurements. The GRAND-HOD Python package is publicly available at https://github.com/SandyYuan/GRAND-HOD.

  2. Multi-objective optimization of cooling air distributions of grate cooler with different clinker particles diameters and air chambers by genetic algorithm

    International Nuclear Information System (INIS)

    Shao, Wei; Cui, Zheng; Cheng, Lin

    2017-01-01

    Highlights: • A multi-objective optimization model of air distributions of a grate cooler by genetic algorithm is proposed. • Optimal air distributions for different conditions are obtained and validated by measurements. • The most economic average diameter of clinker particles is 0.02 m. • The most economic number of air chambers is 9. - Abstract: The paper proposes a multi-objective optimization model of cooling air distributions of a grate cooler in a cement plant based on the convective heat transfer principle and entropy generation minimization analysis. The heat transfer and flow models of the clinker cooling process are presented first. Then the modified entropy generation numbers caused by heat transfer and viscous dissipation are taken as objective functions, which are optimized by a genetic algorithm simultaneously. The design variables are the superficial velocities of the air chambers and the thicknesses of the clinker layer on different grate plates. The model is verified by a set of Pareto optimal solutions and scattered distributions of the design variables. Sensitivity analyses of the average diameter of clinker particles and the number of air chambers are carried out based on the optimization model. The optimal cooling air distributions are compared by heat recovered, energy consumption of the cooling fans, and heat efficiency of the grate cooler, all of them selected from the Pareto optimal solutions based on minimization of the energy consumption of the cooling fans. The results show that the most effective and economic average diameter of clinker particles is 0.02 m and the number of air chambers is 9.

  3. Applicability of Markets to Global Scheduling in Grids: Critical Examination of General Equilibrium Theory and Market Folklore

    Science.gov (United States)

    Nakai, Junko; VanDerWijngaart, Rob F.

    2003-01-01

    Markets are often considered superior to other global scheduling mechanisms for distributed computing systems. This claim is supported by: a casual observation from our every-day life that markets successfully equilibrate supply and demand, and the features of markets which originate in the general equilibrium theory, e.g., efficiency and the lack of necessity of a central controller. This paper describes why such beliefs in markets are not warranted. It does so by examining the general equilibrium theory, in terms of scope, abstraction, and interpretation. Not only does the general equilibrium theory fail to provide a satisfactory explanation of actual economies, including a computing-resource economy, it also falls short of supplying theoretical foundations for commonly held views of market desirability. This paper also points out that the argument for the desirability of markets involves circular reasoning and that the desirability can be established only vis-a-vis a scheduling goal. Finally, recasting the conclusion of Arrow's Impossibility Theorem as that for global scheduling, we conclude that there exists no market-based scheduler that is rational (in the sense defined in microeconomic theory), takes into account utility of more than one user, and yet yields a Pareto-optimal outcome for arbitrary user utility functions.

  4. Effect of a generalized particle momentum distribution on plasma nuclear fusion rates

    International Nuclear Information System (INIS)

    Kim, Yeong E.; Zubarev, Alexander L.

    2006-01-01

    We investigate the effect of a generalized particle momentum distribution derived by Galitskii and Yakimets (GY) on nuclear reaction rates in plasma. We derive an approximate semi-analytical formula for nuclear fusion reaction rate between nuclei in a plasma (quantum plasma nuclear fusion; or QPNF). The QPNF formula is applied to calculate deuteron-deuteron fusion rate in a plasma, and the results are compared with the results calculated with the conventional Maxwell-Boltzmann velocity distribution. As an application, we investigate the deuteron-deuteron fusion rate for mobile deuterons in a deuterated metal/alloy. The calculated deuteron-deuteron fusion rates at low energies are enormously enhanced due to the modified tail of the GY's generalized momentum distribution. Our preliminary estimates indicate also that the deuteron-lithium (D+Li) fusion rate and the proton-lithium (p+Li) fusion rate in a metal/alloy at ambient temperatures are also substantially enhanced. (author)

  5. Empirical Estimates in Stochastic Optimization via Distribution Tails

    Czech Academy of Sciences Publication Activity Database

    Kaňková, Vlasta

    2010-01-01

    Roč. 46, č. 3 (2010), s. 459-471 ISSN 0023-5954. [International Conference on Mathematical Methods in Economy and Industry. České Budějovice, 15.06.2009-18.06.2009] R&D Projects: GA ČR GA402/07/1113; GA ČR(CZ) GA402/08/0107; GA MŠk(CZ) LC06075 Institutional research plan: CEZ:AV0Z10750506 Keywords : Stochastic programming problems * Stability * Wasserstein metric * L_1 norm * Lipschitz property * Empirical estimates * Convergence rate * Exponential tails * Heavy tails * Pareto distribution * Risk functional * Empirical quantiles Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.461, year: 2010

  6. Helicity-dependent generalized parton distributions for nonzero skewness

    Energy Technology Data Exchange (ETDEWEB)

    Mondal, Chandan [Chinese Academy of Sciences, Institute of Modern Physics, Lanzhou (China)

    2017-09-15

    We investigate the helicity-dependent generalized parton distributions (GPDs) in momentum as well as transverse position (impact) spaces for the u and d quarks in a proton when the momentum transfer in both the transverse and the longitudinal directions are nonzero. The GPDs are evaluated using the light-front wave functions of a quark-diquark model for nucleon where the wave functions are constructed by the soft-wall AdS/QCD correspondence. We also express the GPDs in the boost-invariant longitudinal position space. (orig.)

  7. On the use of biomass size spectra linear adjustments to design ecosystem indicators

    Directory of Open Access Journals (Sweden)

    Paúl Gómez-Canchong

    2013-06-01

    Biomass size spectra describe the structure of aquatic communities ataxonomically. The slope (b) of the normalized biomass size spectrum (NBSS) is often used as an indicator of the impact of perturbations, such as pollution or overfishing. The NBSS intercept (a) has generally been ignored on the basis of a correlation between the NBSS slope and intercept, although this correlation has not been shown to be universal. We assessed whether the NBSS parameters are correlated using: (i) theoretical analysis, (ii) virtual communities randomly generated based only on statistical considerations, and (iii) virtual food webs changing over time following a dynamic bioenergetic model. We also analyzed whether the parameters of the Pareto distribution are correlated or not, using approaches (i) and (ii). We found that when communities change over time there is no single relationship between the two NBSS parameters, due to a dependence on the variation in total community abundance (N). We conclude that to characterize any aquatic system at least two parameters are necessary from the NBSS triad N, a, b. In the case of the Pareto distribution, both N_Pareto and b_Pareto are necessary.

  8. Author Details

    African Journals Online (AJOL)

    Necir, Abdelhakim. Vol 11, No 1 (2016) - Articles Statistical estimate of the proportional hazard premium of loss under random censoring. Abstract PDF · Vol 11, No 2 (2016) - Articles Robust bayesian inference of generalized Pareto distribution. Abstract PDF · Vol 12, No 1 (2017) - Articles A Lynden-Bell integral estimator for ...

  9. Testing the Pareto against the lognormal distributions with the uniformly most powerful unbiased test applied to the distribution of cities.

    Science.gov (United States)

    Malevergne, Yannick; Pisarenko, Vladilen; Sornette, Didier

    2011-03-01

    Fat-tail distributions of sizes abound in natural, physical, economic, and social systems. The lognormal and the power laws have historically competed for recognition with sometimes closely related generating processes and hard-to-distinguish tail properties. This state of affairs is illustrated with the debate between Eeckhout [Amer. Econ. Rev. 94, 1429 (2004)] and Levy [Amer. Econ. Rev. 99, 1672 (2009)] on the validity of Zipf's law for US city sizes. By using a uniformly most powerful unbiased (UMPU) test between the lognormal and the power laws, we show that conclusive results can be achieved to end this debate. We advocate the UMPU test as a systematic tool to address similar controversies in the literature of many disciplines involving power laws, scaling, "fat" or "heavy" tails. In order to demonstrate that our procedure works for data sets other than the US city size distribution, we also briefly present the results obtained for the power-law tail of the distribution of personal identity (ID) losses, which constitute one of the major emergent risks at the interface between cyberspace and reality.
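
    The following sketch illustrates the model-comparison question only, not the UMPU construction itself: it fits Pareto and lognormal models to the upper tail of a synthetic size sample and compares their maximized log-likelihoods. The data, the threshold choice, and the use of a plain likelihood comparison in place of the UMPU test are all assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sizes = (rng.pareto(1.1, 2_000) + 1.0) * 1e4   # hypothetical city sizes

xmin = np.quantile(sizes, 0.5)                 # analyze the upper tail only
tail = sizes[sizes >= xmin]

# Pareto with fixed lower bound vs unrestricted lognormal, both by ML.
p_params = stats.pareto.fit(tail, floc=0, fscale=xmin)
l_params = stats.lognorm.fit(tail)

ll_pareto = stats.pareto.logpdf(tail, *p_params).sum()
ll_lognorm = stats.lognorm.logpdf(tail, *l_params).sum()
print(f"log-likelihood Pareto:    {ll_pareto:.1f}")
print(f"log-likelihood lognormal: {ll_lognorm:.1f}")
```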

  10. The Applicability of Confidence Intervals of Quantiles for the Generalized Logistic Distribution

    Science.gov (United States)

    Shin, H.; Heo, J.; Kim, T.; Jung, Y.

    2007-12-01

    The generalized logistic (GL) distribution has been widely used for frequency analysis. However, there are few studies of the confidence intervals, which indicate the prediction accuracy of the distribution, for the GL distribution. In this paper, the estimation of the confidence intervals of quantiles for the GL distribution is presented based on the method of moments (MOM), maximum likelihood (ML), and probability weighted moments (PWM), and the asymptotic variances of each quantile estimator are derived as functions of the sample sizes, return periods, and parameters. Monte Carlo simulation experiments are also performed to verify the applicability of the derived confidence intervals of quantiles. The results show that the relative bias (RBIAS) and relative root mean square error (RRMSE) of the confidence intervals generally increase as the return period increases and decrease as the sample size increases. PWM performs better than the other methods in terms of RRMSE when the data are almost symmetric, while ML shows the smallest RBIAS and RRMSE when the data are more skewed and the sample size is moderately large. The GL model was applied to fit the distribution of annual maximum rainfall data. The results show little difference in the estimated quantiles between ML and PWM, while MOM shows distinct differences.
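
    A minimal Monte Carlo sketch in the spirit of the experiments described above: estimate a high quantile of generalized logistic samples by maximum likelihood and check the empirical coverage of a confidence interval. scipy's genlogistic parameterization and the bootstrap interval (in place of the paper's asymptotic variances) are stand-in assumptions, and the replication counts are kept small for speed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
c_true, T = 1.5, 100                 # GL shape; 100-year return period
p = 1 - 1 / T                        # non-exceedance probability
q_true = stats.genlogistic.ppf(p, c_true)

hits, n_rep, n_boot = 0, 100, 100    # small counts; increase for stability
for _ in range(n_rep):
    x = stats.genlogistic.rvs(c_true, size=50, random_state=rng)
    boot = []
    for _ in range(n_boot):          # bootstrap the ML quantile estimator
        xb = rng.choice(x, size=x.size, replace=True)
        boot.append(stats.genlogistic.ppf(p, *stats.genlogistic.fit(xb)))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    hits += lo <= q_true <= hi
print(f"Empirical coverage of nominal 95% CI: {hits / n_rep:.2f}")
```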

  11. PAPR-Constrained Pareto-Optimal Waveform Design for OFDM-STAP Radar

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Satyabrata [ORNL

    2014-01-01

    We propose a peak-to-average power ratio (PAPR) constrained Pareto-optimal waveform design approach for an orthogonal frequency division multiplexing (OFDM) radar signal to detect a target using the space-time adaptive processing (STAP) technique. The use of an OFDM signal not only increases the frequency diversity of our system, but also enables us to adaptively design the OFDM coefficients in order to further improve the system performance. First, we develop a parametric OFDM-STAP measurement model by considering the effects of signal-dependent clutter and colored noise. Then, we observe that the resulting STAP performance can be improved by maximizing the output signal-to-interference-plus-noise ratio (SINR) with respect to the signal parameters. However, in practical scenarios, the computation of output SINR depends on the estimated values of the spatial and temporal frequencies and target scattering responses. Therefore, we formulate a PAPR-constrained multi-objective optimization (MOO) problem to design the OFDM spectral parameters by simultaneously optimizing four objective functions: maximizing the output SINR, minimizing two separate Cramer-Rao bounds (CRBs) on the normalized spatial and temporal frequencies, and minimizing the trace of the CRB matrix on the target scattering coefficient estimates. We present several numerical examples to demonstrate the achieved performance improvement due to the adaptive waveform design.

  12. Modelling of extreme rainfall events in Peninsular Malaysia based on annual maximum and partial duration series

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Shinyie, Wendy Ling; Jemain, Abdul Aziz

    2015-02-01

    In this study, two series of data for extreme rainfall events are generated based on the Annual Maximum and Partial Duration methods, derived from 102 rain-gauge stations in Peninsular Malaysia from 1982-2012. To determine the optimal threshold for each station, several requirements must be satisfied and the adapted Hill estimator is employed for this purpose. A semi-parametric bootstrap is then used to estimate the mean square error (MSE) of the estimator at each threshold and the optimal threshold is selected based on the smallest MSE. The mean annual frequency is also checked to ensure that it lies in the range of one to five and the resulting data are also de-clustered to ensure independence. The two data series are then fitted to the Generalized Extreme Value and Generalized Pareto distributions for the annual maximum and partial duration series, respectively. The parameter estimation methods used are the Maximum Likelihood and the L-moment methods. Two goodness of fit tests are then used to evaluate the best-fitted distribution. The results showed that the Partial Duration series with the Generalized Pareto distribution and Maximum Likelihood parameter estimation provides the best representation of extreme rainfall events in Peninsular Malaysia for the majority of the stations studied. Based on these findings, several return values are also derived and spatial maps are constructed to identify the distribution characteristics of extreme rainfall in Peninsular Malaysia.
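
    The two fitting routes described above can be sketched with scipy on synthetic daily rainfall: annual maxima fitted by a GEV distribution and threshold excesses by a GPD, each yielding a 100-year return level. The data, the threshold choice, and the use of plain ML fitting are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
daily = rng.gamma(shape=0.4, scale=12.0, size=30 * 365)  # 30 synthetic years

# Annual maximum series -> generalized extreme value distribution.
am = daily.reshape(30, 365).max(axis=1)
gev = stats.genextreme.fit(am)

# Partial duration series -> generalized Pareto on threshold excesses.
u = np.quantile(daily, 0.99)            # illustrative threshold choice
excess = daily[daily > u] - u
gpd = stats.genpareto.fit(excess, floc=0)

# 100-year return levels; lam is the mean number of exceedances per year.
lam = excess.size / 30
rl_gev = stats.genextreme.ppf(1 - 1 / 100, *gev)
rl_gpd = u + stats.genpareto.ppf(1 - 1 / (100 * lam), *gpd)
print(f"GEV 100-yr level: {rl_gev:.1f}   GPD 100-yr level: {rl_gpd:.1f}")
```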

  13. Wealth distribution, Pareto law, and stretched exponential decay of money: Computer simulations analysis of agent-based models

    Science.gov (United States)

    Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf

    2018-01-01

    We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on saving propensities. The system relaxation for fixed and distributed saving schemes are found to be different.
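
    A compact version of the kinetic exchange dynamics studied above can be simulated directly; the sketch below implements a fixed-saving-propensity exchange rule (two random agents pool the non-saved fraction of their wealth and split it randomly), omitting the paper's trap agents and distributed savings for brevity. All parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n_agents, n_steps, lam = 1_000, 200_000, 0.5   # lam = saving propensity
w = np.ones(n_agents)                          # equal initial wealth

for _ in range(n_steps):
    i, j = rng.integers(n_agents, size=2)
    if i == j:
        continue
    eps = rng.random()
    pool = (1 - lam) * (w[i] + w[j])           # non-saved wealth on the table
    w[i], w[j] = lam * w[i] + eps * pool, lam * w[j] + (1 - eps) * pool

# Total wealth is conserved; the stationary shape depends on lam.
print(f"mean {w.mean():.3f}   max {w.max():.2f}   std {w.std():.2f}")
```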

  14. Universality of Generalized Parton Distributions in Light-Front Holographic QCD

    Science.gov (United States)

    de Téramond, Guy F.; Liu, Tianbo; Sufian, Raza Sabbir; Dosch, Hans Günter; Brodsky, Stanley J.; Deur, Alexandre; Hlfhs Collaboration

    2018-05-01

    The structure of generalized parton distributions is determined from light-front holographic QCD up to a universal reparametrization function w(x) which incorporates Regge behavior at small x and inclusive counting rules at x → 1. A simple ansatz for w(x) that fulfills these physics constraints with a single parameter results in precise descriptions of both the nucleon and the pion quark distribution functions in comparison with global fits. The analytic structure of the amplitudes leads to a connection with the Veneziano model and hence to a nontrivial connection with Regge theory and the hadron spectrum.

  15. Estimating the parameters of a generalized lambda distribution

    International Nuclear Information System (INIS)

    Fournier, B.; Rupin, N.; Najjar, D.; Iost, A.; Bigerelle, M.; Wilcox, R.

    2007-01-01

    The method of moments is a popular technique for estimating the parameters of a generalized lambda distribution (GLD), but published results suggest that the percentile method gives superior results. However, the percentile method cannot be implemented in an automatic fashion, and automatic methods, like the starship method, can lead to prohibitive execution time with large sample sizes. A new estimation method is proposed that is automatic (it does not require the use of special tables or graphs), and it reduces the computational time. Based partly on the usual percentile method, this new method also requires choosing which quantile u to use when fitting a GLD to data. The choice for u is studied and it is found that the best choice depends on the final goal of the modeling process. The sampling distribution of the new estimator is studied and compared to the sampling distribution of estimators that have been proposed. Naturally, all estimators are biased and here it is found that the bias becomes negligible with sample sizes n ≥ 2×10^3. The 0.025 and 0.975 quantiles of the sampling distribution are investigated, and the difference between these quantiles is found to decrease proportionally to 1/√n. The same results hold for the moment and percentile estimates. Finally, the influence of the sample size is studied when a normal distribution is modeled by a GLD. Both bounded and unbounded GLDs are used and the bounded GLD turns out to be the most accurate. Indeed it is shown that, up to n = 10^6, bounded GLD modeling cannot be rejected by usual goodness-of-fit tests. (authors)

  16. The class of L ∩ D and its application to renewal reward process

    Science.gov (United States)

    Kamışlık, Aslı Bektaş; Kesemen, Tülay; Khaniyev, Tahir

    2018-01-01

    The class of L ∩ D is generated by the intersection of two important subclasses of heavy-tailed distributions: the long-tailed distributions and the dominated varying distributions. This class is itself an important member of the heavy-tailed distributions and has principal application areas in renewal, renewal reward and random walk processes. The aim of this study is to review some well-known and less-known results on renewal functions generated by the class of L ∩ D and to apply them to a special renewal reward process, known in the literature as a semi-Markovian inventory model of type (s, S). In particular, we focus on the Pareto distribution, which belongs to the L ∩ D subclass of heavy-tailed distributions. As a first step, we obtain asymptotic results for the renewal function generated by a Pareto distribution from the class of L ∩ D, using some well-known results by Embrechts and Omey [1]. We then apply the results obtained for the Pareto distribution to renewal reward processes. As an application, we investigate the inventory model of type (s, S) when demands have a Pareto distribution from the class of L ∩ D. We obtain an asymptotic expansion for the ergodic distribution function and, finally, an asymptotic expansion for the nth order moments of the distribution of this process.
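
    As a numerical companion to the asymptotics above, the sketch below estimates the renewal function U(t) = E[N(t)] by Monte Carlo when the inter-renewal times are Pareto distributed (a member of L ∩ D). The tail index, horizon, and path count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
alpha, t_max, n_paths = 1.5, 50.0, 2_000   # tail index, horizon, replications

counts = np.zeros(n_paths)
for k in range(n_paths):
    t, n = 0.0, 0
    while True:
        t += rng.pareto(alpha) + 1.0       # Pareto interarrival, minimum 1
        if t > t_max:
            break
        n += 1
    counts[k] = n

print(f"Monte Carlo estimate of U({t_max:.0f}) = {counts.mean():.2f}")
```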

  17. The Incompatibility of Pareto Optimality and Dominant-Strategy Incentive Compatibility in Sufficiently-Anonymous Budget-Constrained Quasilinear Settings

    Directory of Open Access Journals (Sweden)

    Rica Gonen

    2013-11-01

    We analyze the space of deterministic, dominant-strategy incentive compatible, individually rational and Pareto optimal combinatorial auctions. We examine a model with multidimensional types, nonidentical items, private values and quasilinear preferences for the players with one relaxation; the players are subject to publicly-known budget constraints. We show that the space includes dictatorial mechanisms and that if dictatorial mechanisms are ruled out by a natural anonymity property, then an impossibility of design is revealed. The same impossibility naturally extends to other abstract mechanisms with an arbitrary outcome set if one maintains the original assumptions of players with quasilinear utilities, public budgets and nonnegative prices.

  18. On chiral-odd Generalized Parton Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Wallon, Samuel [Laboratoire de Physique Theorique d' Orsay - LPT, Bat. 210, Univ. Paris-Sud 11, 91405 Orsay Cedex (France); UPMC Univ. Paris 6, Paris (France); Pire, Bernard [Centre de Physique Theorique - CPHT, UMR 7644, Ecole Polytechnique, Bat. 6, RDC, F91128 Palaiseau Cedex (France); Szymanowski, Lech [Soltan Institute for Nuclear Studies, Hoza 69, 00691, Warsaw (Poland)

    2010-07-01

    The chiral-odd transversity generalized parton distributions of the nucleon can be accessed experimentally through the exclusive photoproduction process γ + N → π + ρ + N′, in the kinematics where the meson pair has a large invariant mass and the final nucleon has a small transverse momentum, provided the vector meson is produced in a transversely polarized state. Estimated counting rates show that the experiment is feasible with the real or quasi-real photon beams expected at JLab at 12 GeV and in the COMPASS experiment (Phys. Lett. B 688, 154, 2010). In addition, a consistent classification of the chiral-odd pion GPDs beyond leading twist 2 is presented. Based on QCD equations of motion and on the invariance under rotation on the light-cone of any scattering amplitude involving such GPDs, we reduce the basis of these chiral-odd GPDs to a minimal set. (author)

  19. Topology Identification of General Dynamical Network with Distributed Time Delays

    International Nuclear Information System (INIS)

    Zhao-Yan, Wu; Xin-Chu, Fu

    2009-01-01

    General dynamical networks with distributed time delays are studied. The topology of the networks are viewed as unknown parameters, which need to be identified. Some auxiliary systems (also called the network estimators) are designed to achieve this goal. Both linear feedback control and adaptive strategy are applied in designing these network estimators. Based on linear matrix inequalities and the Lyapunov function method, the sufficient condition for the achievement of topology identification is obtained. This method can also better monitor the switching topology of dynamical networks. Illustrative examples are provided to show the effectiveness of this method. (general)

  20. The utilization of copula in hydrology

    OpenAIRE

    Trandafir, Romica; Ciuiu, Daniel; Drobot, Radu

    2010-01-01

    In this paper the parameters of the generalized Pareto cumulative distribution functions of the marginals and the parameter θ of the connecting copula for the maximum water discharges and water volumes are obtained. The isolines for C(F(x), G(y)) = 1 − ε and for C*(F(x), G(y)) = ε will be drawn.
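
    The workflow in this record can be sketched as: fit generalized Pareto marginals, transform the data to pseudo-uniforms, and estimate the copula parameter θ. Since the record does not name the copula family, the sketch below assumes a Gumbel copula, estimating θ from Kendall's tau via θ = 1/(1 − τ); the bivariate data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# Synthetic dependent pair (discharge, volume) with GPD-like marginals.
z = rng.normal(size=(500, 2)) @ np.array([[1.0, 0.6], [0.0, 0.8]])
discharge = stats.genpareto.ppf(stats.norm.cdf(z[:, 0]), 0.2)
volume = stats.genpareto.ppf(stats.norm.cdf(z[:, 1]), 0.1)

# Fit GPD marginals, then probability-integral transform to uniforms.
pd_ = stats.genpareto.fit(discharge, floc=0)
pv_ = stats.genpareto.fit(volume, floc=0)
u = stats.genpareto.cdf(discharge, *pd_)
v = stats.genpareto.cdf(volume, *pv_)

tau, _ = stats.kendalltau(u, v)
theta = 1.0 / (1.0 - tau)        # Gumbel copula, moment-type estimator
print(f"Kendall tau = {tau:.2f}  ->  Gumbel theta = {theta:.2f}")
```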

  1. Author Details

    African Journals Online (AJOL)

    Mokrani, Fatiha. Vol 11, No 2 (2016) - Articles Robust bayesian inference of generalized Pareto distribution. Abstract PDF.

  2. Development of particle multiplicity distributions using a general form of the grand canonical partition function

    International Nuclear Information System (INIS)

    Lee, S.J.; Mekjian, A.Z.

    2004-01-01

    Various phenomenological models of particle multiplicity distributions are discussed using a general form of a unified model which is based on the grand canonical partition function and Feynman's path integral approach to statistical processes. These models can be written as special cases of a more general distribution which has three control parameters, a, x, z. The relation of these parameters to various physical quantities is discussed. A connection of the parameter a with Fisher's critical exponent τ is developed. Using this grand canonical approach, moments, cumulants and combinants are discussed; a physical interpretation of the combinants is given and their behavior is connected to the critical exponent τ. Various physical phenomena such as hierarchical structure, void scaling relations, Koba-Nielsen-Olesen or KNO scaling features, clan variables, and branching laws are shown in terms of this general approach. Several of these features which were previously developed in terms of the negative binomial distribution are found to be more general. Both hierarchical structure and void scaling relations depend on the Fisher exponent τ. Applications of our approach to the charged particle multiplicity distribution in jets of L3 and H1 data are given.

  3. Nucleon-generalized parton distributions in the light-front quark model

    Indian Academy of Sciences (India)

    2016-01-12

    Jan 12, 2016 ... 1. Introduction. Generalized parton distributions (GPDs) are the important set of parameters that give us ... The AdS/CFT is the correspondence between the string theory on a higher-dimensional anti-de Sitter ... matching the soft-wall model of AdS/QCD and light-front QCD for EFFs of hadrons with arbitrary ...

  4. Analysis of nonlocal neural fields for both general and gamma-distributed connectivities

    Science.gov (United States)

    Hutt, Axel; Atay, Fatihcan M.

    2005-04-01

    This work studies the stability of equilibria in spatially extended neuronal ensembles. We first derive the model equation from statistical properties of the neuron population. The obtained integro-differential equation includes synaptic and space-dependent transmission delay for both general and gamma-distributed synaptic connectivities. The latter connectivity type reveals infinite, finite, and vanishing self-connectivities. The work derives conditions for stationary and nonstationary instabilities for both kernel types. In addition, a nonlinear analysis for general kernels yields the order parameter equation of the Turing instability. To compare the results to findings for partial differential equations (PDEs), two typical PDE-types are derived from the examined model equation, namely the general reaction-diffusion equation and the Swift-Hohenberg equation. Hence, the discussed integro-differential equation generalizes these PDEs. In the case of the gamma-distributed kernels, the stability conditions are formulated in terms of the mean excitatory and inhibitory interaction ranges. As a novel finding, we obtain Turing instabilities in fields with local inhibition-lateral excitation, while wave instabilities occur in fields with local excitation and lateral inhibition. Numerical simulations support the analytical results.

  5. Introducing a rainfall compound distribution model based on weather patterns sub-sampling

    Directory of Open Access Journals (Sweden)

    F. Garavaglia

    2010-06-01

    This paper presents a probabilistic model for daily rainfall, using sub-sampling based on meteorological circulation. We classified eight typical but contrasted synoptic situations (weather patterns) for France and surrounding areas, using a "bottom-up" approach, i.e. from the shape of the rain field to the synoptic situations described by geopotential fields. These weather patterns (WP) provide a discriminating variable that is consistent with French climatology, and allows seasonal rainfall records to be split into more homogeneous sub-samples, in terms of meteorological genesis.

    First results show how the combination of seasonal and WP sub-sampling strongly influences the identification of the asymptotic behaviour of rainfall probabilistic models. Furthermore, with this level of stratification, an asymptotic exponential behaviour of each sub-sample appears as a reasonable hypothesis. This first part is illustrated with two daily rainfall records from the SE of France.

    The distribution of the multi-exponential weather patterns (MEWP) is then defined as the composition, for a given season, of all WP sub-sample marginal distributions, weighted by the relative frequency of occurrence of each WP. This model is finally compared to the Exponential and Generalized Pareto distributions, showing good features in terms of robustness and accuracy. These final statistical results are computed from a wide dataset of 478 rainfall records spread over the southern half of France. All these data cover the 1953–2005 period.
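
    The MEWP composition has a direct numerical form: the seasonal cdf is the frequency-weighted mixture of the per-weather-pattern exponential cdfs, F(x) = Σᵢ pᵢ (1 − exp(−x/λᵢ)). The sketch below evaluates it with invented weights and scales for five hypothetical weather patterns.

```python
import numpy as np

wp_freq = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # P(WP_i), sums to 1
wp_scale = np.array([8.0, 12.0, 20.0, 35.0, 60.0])   # per-WP exponential scale (mm)

def mewp_cdf(x):
    """Seasonal cdf as the WP-frequency-weighted mixture of exponentials."""
    x = np.atleast_1d(x)[:, None]
    return (wp_freq * (1.0 - np.exp(-x / wp_scale))).sum(axis=1)

for x in (50.0, 100.0, 200.0):
    print(f"P(daily rainfall <= {x:5.0f} mm) = {mewp_cdf(x)[0]:.4f}")
```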

  6. Sci-Thur AM: Planning - 04: Evaluation of the fluence complexity, solution quality, and run efficiency produced by five fluence parameterizations implemented in PARETO multiobjective radiotherapy treatment planning software.

    Science.gov (United States)

    Champion, H; Fiege, J; McCurdy, B; Potrebko, P; Cull, A

    2012-07-01

    PARETO (Pareto-Aware Radiotherapy Evolutionary Treatment Optimization) is a novel multiobjective treatment planning system that performs beam orientation and fluence optimization simultaneously using an advanced evolutionary algorithm. In order to reduce the number of parameters involved in this enormous search space, we present several methods for modeling the beam fluence. The parameterizations are compared using innovative tools that evaluate fluence complexity, solution quality, and run efficiency. A PARETO run is performed using the basic weight (BW), linear gradient (LG), cosine transform (CT), beam group (BG), and isodose-projection (IP) methods for applying fluence modulation over the projection of the Planning Target Volume in the beam's-eye-view plane. The solutions of each run are non-dominated with respect to other trial solutions encountered during the run. However, to compare the solution quality of independent runs, each run competes against every other run in a round robin fashion. Score is assigned based on the fraction of solutions that survive when a tournament selection operator is applied to the solutions of the two competitors. To compare fluence complexity, a modulation index, fractal dimension, and image gradient entropy are calculated for the fluence maps of each optimal plan. We have found that the LG method results in superior solution quality for a spine phantom, lung patient, and cauda equina patient. The BG method produces solutions with the highest degree of fluence complexity. Most methods result in comparable run times. The LG method produces superior solution quality using a moderate degree of fluence modulation. © 2012 American Association of Physicists in Medicine.

  7. Novel formulation of the ℳ model through the Generalized-K distribution for atmospheric optical channels.

    Science.gov (United States)

    Garrido-Balsells, José María; Jurado-Navas, Antonio; Paris, José Francisco; Castillo-Vazquez, Miguel; Puerta-Notario, Antonio

    2015-03-09

    In this paper, a novel and deeper physical interpretation of the recently published Málaga or ℳ statistical distribution is provided. This distribution, which has gained wide acceptance in the scientific community, models the optical irradiance scintillation induced by atmospheric turbulence. Here, the analytical expressions previously published are modified in order to express them as a mixture of the known Generalized-K and discrete Binomial and Negative Binomial distributions. In particular, the probability density function (pdf) of the ℳ model is now obtained as a linear combination of Generalized-K pdfs, in which the coefficients depend directly on the parameters of the ℳ distribution. In this way, the Málaga model can be physically interpreted as a superposition of different optical sub-channels, each of them described by the corresponding Generalized-K fading model and weighted by the ℳ-dependent coefficients. The expressions proposed here are simpler than the equations of the original ℳ model and are validated by means of numerical simulations, generating ℳ-distributed random sequences and their associated histograms. This novel interpretation of the Málaga statistical distribution provides a valuable tool for analyzing the performance of atmospheric optical channels under every turbulence condition.

  8. Power Laws are Disguised Boltzmann Laws

    Science.gov (United States)

    Richmond, Peter; Solomon, Sorin

    Using a previously introduced model on generalized Lotka-Volterra dynamics together with some recent results for the solution of generalized Langevin equations, we derive analytically the equilibrium mean field solution for the probability distribution of wealth and show that it has two characteristic regimes. For large values of wealth, it takes the form of a Pareto style power law. For small values of wealth, w, the behavior departs from the power law, reflecting the generalized Lotka-Volterra type of stochastic dynamics. The power law that arises in the distribution function is identified with new additional logarithmic terms in the familiar Boltzmann distribution function for the system. These are a direct consequence of the multiplicative stochastic dynamics and are absent for the usual additive stochastic processes.

  9. Multivariate data analysis as a semi-quantitative tool for interpretive evaluation of comparability or equivalence of aerodynamic particle size distribution profiles.

    Science.gov (United States)

    Shi, Shuai; Hickey, Anthony J

    2009-01-01

    The purpose of this article is to investigate the performance of multivariate data analysis, especially orthogonal partial least square (OPLS) analysis, as a semi-quantitative tool to evaluate the comparability or equivalence of aerodynamic particle size distribution (APSD) profiles of orally inhaled and nasal drug products (OINDP). Monte Carlo simulation was employed to reconstitute APSD profiles based on 55 realistic scenarios proposed by the Product Quality Research Institute (PQRI) working group. OPLS analyses with different data pretreatment methods were performed on each of the reconstituted profiles. Compared to unit-variance scaling, equivalence determined based on OPLS analysis with Pareto scaling was shown to be more consistent with the working group assessment. Chi-square statistics was employed to compare the performance of OPLS analysis (Pareto scaling) with that of the combination test (i.e., chi-square ratio statistics and population bioequivalence test for impactor-sized mass) in terms of achieving greater consistency with the working group evaluation. A p value of 0.036 suggested that OPLS analysis with Pareto scaling may be more predictive than the combination test with respect to consistency. Furthermore, OPLS analysis may also be employed to analyze part of the APSD profiles that contribute to the calculation of the mass median aerodynamic diameter. Our results show that OPLS analysis performed on partial deposition sites do not interfere with the performance on all deposition sites.
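
    The pretreatment step that drives the comparison above is simple to state: unit-variance scaling divides each mean-centered column by its standard deviation, while Pareto scaling divides by the square root of the standard deviation, damping dominant variables less aggressively. The sketch below contrasts the two on mock deposition data; the matrix is an invented stand-in for APSD profiles.

```python
import numpy as np

# Mock APSD matrix: rows = profiles, columns = deposition sites.
X = np.random.default_rng(9).gamma(2.0, 1.0, size=(30, 8))

Xc = X - X.mean(axis=0)                          # mean-center columns
X_uv = Xc / X.std(axis=0, ddof=1)                # unit-variance (auto) scaling
X_par = Xc / np.sqrt(X.std(axis=0, ddof=1))      # Pareto scaling

print("column variances, UV:    ", np.round(X_uv.var(axis=0, ddof=1), 2))
print("column variances, Pareto:", np.round(X_par.var(axis=0, ddof=1), 2))
```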

  10. Data envelopment analysis and Pareto genetic algorithm applied to robust design in multiresponse systems

    Directory of Open Access Journals (Sweden)

    Enrique Carlos Canessa-Terrazas

    2016-01-01

    The use of Data Envelopment Analysis (DEA) is presented to prioritize and select solutions found by a Pareto Genetic Algorithm (PGA) for robust design problems in multiresponse systems with many control and noise factors. The efficiency analysis of the solutions with DEA shows that the PGA finds a good approximation to the efficient frontier. In addition, DEA is used to determine the combination of adjustment levels of the mean and variation of the system responses that minimizes the economic cost of achieving those objectives. By combining that cost with other technical and/or economic considerations, the solution that best matches a predetermined quality level can be selected more appropriately.

  11. Moments of generalized Husimi distributions and complexity of many-body quantum states

    International Nuclear Information System (INIS)

    Sugita, Ayumu

    2003-01-01

    We consider generalized Husimi distributions for many-body systems, and show that their moments are good measures of complexity of many-body quantum states. Our construction of the Husimi distribution is based on the coherent state of the single-particle transformation group. Then the coherent states are independent-particle states, and, at the same time, the most localized states in the Husimi representation. Therefore delocalization of the Husimi distribution, which can be measured by the moments, is a sign of many-body correlation (entanglement). Since the delocalization of the Husimi distribution is also related to chaoticity of the dynamics, it suggests a relation between entanglement and chaos. Our definition of the Husimi distribution can be applied not only to systems of distinguishable particles, but also to those of identical particles, i.e., fermions and bosons. We derive an algebraic formula to evaluate the moments of the Husimi distribution

  12. [Method for optimal sensor placement in water distribution systems with nodal demand uncertainties].

    Science.gov (United States)

    Liu, Shu-Ming; Wu, Xue; Ouyang, Le-Yan

    2013-08-01

    The notion of identification fitness was proposed for optimizing sensor placement in water distribution systems. Nondominated Sorting Genetic Algorithm II was used to find the Pareto front between minimum overlap of possible detection times of two events and the best probability of detection, taking nodal demand uncertainties into account. This methodology was applied to an example network. The solutions show that the probability of detection and the number of possible locations are not remarkably affected by nodal demand uncertainties, but the sources identification accuracy declines with nodal demand uncertainties.

  13. Towards a General Theory of Extremes for Observables of Chaotic Dynamical Systems.

    Science.gov (United States)

    Lucarini, Valerio; Faranda, Davide; Wouters, Jeroen; Kuna, Tobias

    2014-01-01

    In this paper we provide a connection between the geometrical properties of the attractor of a chaotic dynamical system and the distribution of extreme values. We show that the extremes of so-called physical observables are distributed according to the classical generalised Pareto distribution and derive explicit expressions for the scaling and the shape parameter. In particular, we derive that the shape parameter does not depend on the chosen observables, but only on the partial dimensions of the invariant measure on the stable, unstable, and neutral manifolds. The shape parameter is negative and is close to zero when high-dimensional systems are considered. This result agrees with what was derived recently using the generalized extreme value approach. Combining the results obtained using such physical observables and the properties of the extremes of distance observables, it is possible to derive estimates of the partial dimensions of the attractor along the stable and the unstable directions of the flow. Moreover, by writing the shape parameter in terms of moments of the extremes of the considered observable and by using linear response theory, we relate the sensitivity to perturbations of the shape parameter to the sensitivity of the moments, of the partial dimensions, and of the Kaplan-Yorke dimension of the attractor. Preliminary numerical investigations provide encouraging results on the applicability of the theory presented here. The results presented here do not apply for all combinations of Axiom A systems and observables, but the breakdown seems to be related to very special geometrical configurations.

  14. Towards a General Theory of Extremes for Observables of Chaotic Dynamical Systems

    Science.gov (United States)

    Lucarini, Valerio; Faranda, Davide; Wouters, Jeroen; Kuna, Tobias

    2014-02-01

    In this paper we provide a connection between the geometrical properties of the attractor of a chaotic dynamical system and the distribution of extreme values. We show that the extremes of so-called physical observables are distributed according to the classical generalised Pareto distribution and derive explicit expressions for the scaling and the shape parameter. In particular, we derive that the shape parameter does not depend on the chosen observables, but only on the partial dimensions of the invariant measure on the stable, unstable, and neutral manifolds. The shape parameter is negative and is close to zero when high-dimensional systems are considered. This result agrees with what was derived recently using the generalized extreme value approach. Combining the results obtained using such physical observables and the properties of the extremes of distance observables, it is possible to derive estimates of the partial dimensions of the attractor along the stable and the unstable directions of the flow. Moreover, by writing the shape parameter in terms of moments of the extremes of the considered observable and by using linear response theory, we relate the sensitivity to perturbations of the shape parameter to the sensitivity of the moments, of the partial dimensions, and of the Kaplan-Yorke dimension of the attractor. Preliminary numerical investigations provide encouraging results on the applicability of the theory presented here. The results presented here do not apply for all combinations of Axiom A systems and observables, but the breakdown seems to be related to very special geometrical configurations.

  15. A general approach to double-moment normalization of drop size distributions

    NARCIS (Netherlands)

    Lee, G.W.; Zawadzki, I.; Szyrmer, W.; Sempere Torres, D.; Uijlenhoet, R.

    2004-01-01

    Normalization of drop size distributions (DSDs) is reexamined here. First, an extension of the scaling normalization that uses one moment of the DSD as a scaling parameter to a more general scaling normalization that uses two moments as scaling parameters of the normalization is presented. In

  16. TreePOD: Sensitivity-Aware Selection of Pareto-Optimal Decision Trees.

    Science.gov (United States)

    Muhlbacher, Thomas; Linhardt, Lorenz; Moller, Torsten; Piringer, Harald

    2018-01-01

    Balancing accuracy gains with other objectives such as interpretability is a key challenge when building decision trees. However, this process is difficult to automate because it involves know-how about the domain as well as the purpose of the model. This paper presents TreePOD, a new approach for sensitivity-aware model selection along trade-offs. TreePOD is based on exploring a large set of candidate trees generated by sampling the parameters of tree construction algorithms. Based on this set, visualizations of quantitative and qualitative tree aspects provide a comprehensive overview of possible tree characteristics. Along trade-offs between two objectives, TreePOD provides efficient selection guidance by focusing on Pareto-optimal tree candidates. TreePOD also conveys the sensitivities of tree characteristics on variations of selected parameters by extending the tree generation process with a full-factorial sampling. We demonstrate how TreePOD supports a variety of tasks involved in decision tree selection and describe its integration in a holistic workflow for building and selecting decision trees. For evaluation, we illustrate a case study for predicting critical power grid states, and we report qualitative feedback from domain experts in the energy sector. This feedback suggests that TreePOD enables users with and without a statistical background to identify suitable decision trees confidently and efficiently.
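
    The core TreePOD loop can be sketched independently of the visualization layer: sample trees across construction parameters, score each on two objectives, and keep the Pareto-optimal candidates. The sketch below, which is not the authors' implementation, does this with scikit-learn trees on a stock dataset, trading test accuracy against tree size.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Sample candidate trees across construction parameters.
cands = []
for depth in range(1, 11):
    for leaf in (1, 5, 10, 25):
        t = DecisionTreeClassifier(max_depth=depth, min_samples_leaf=leaf,
                                   random_state=0).fit(Xtr, ytr)
        cands.append((t.score(Xte, yte), t.get_n_leaves()))

# Keep the Pareto front: no other tree is at least as accurate AND smaller.
front = [c for c in cands
         if not any((o[0] >= c[0] and o[1] < c[1]) or
                    (o[0] > c[0] and o[1] <= c[1]) for o in cands)]
for acc, n_leaves in sorted(set(front), key=lambda c: c[1]):
    print(f"leaves = {n_leaves:3d}   test accuracy = {acc:.3f}")
```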

  17. A novel generalized normal distribution for human longevity and other negatively skewed data.

    Science.gov (United States)

    Robertson, Henry T; Allison, David B

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by 1. An intuitively straightforward genesis; 2. Closed forms for the pdf, cdf, mode, quantile, and hazard functions; and 3. Accessibility to non-statisticians, based on its close relationship to the normal distribution.

  18. Pareto-efficient BOT contracts for road franchising with government subsidy

    Institute of Scientific and Technical Information of China (English)

    谭志加; 杨海; 陈琼

    2013-01-01

    Private-sector participation in road construction and operations has the advantages of efficiency gains, private financing, and better identification of attractive investment projects. Such participation is generally implemented through a build-operate-transfer (BOT) contract, under which a private firm builds and operates roads in a road network at its own expense, and in return receives the revenue from road tolls for a number of years, after which these roads are transferred to the government. In a BOT toll road project, the public and private sectors have different objectives: the former cares about social welfare and the latter wants to make more money from the project. Based on the different objectives of the two sectors, this paper analyzes the Pareto efficiency of the capacity, toll and subsidy size by adopting a bi-objective mathematical programming problem. The definition of the Pareto-efficient BOT contract is introduced for the bi-objective programming problem, its properties are studied theoretically, and two necessary conditions are established to identify the Pareto efficiency of a BOT contract. This paper extends the current research on BOT toll road schemes and provides practical guidance for the public sector in formulating subsidy policies for toll road projects.

  19. Design of a Circularly Polarized Galileo E6-Band Textile Antenna by Dedicated Multiobjective Constrained Pareto Optimization

    Directory of Open Access Journals (Sweden)

    Arnaut Dierck

    2015-01-01

    Designing textile antennas for real-life applications requires a design strategy that is able to produce antennas that are optimized over a wide bandwidth for often conflicting characteristics, such as impedance matching, axial ratio, efficiency, and gain, and, moreover, that is able to account for the variations that apply for the characteristics of the unconventional materials used in smart textile systems. In this paper, such a strategy, incorporating a multiobjective constrained Pareto optimization, is presented and applied to the design of a Galileo E6-band antenna with optimal return loss and wide-band axial ratio characteristics. Subsequently, different prototypes of the optimized antenna are fabricated and measured to validate the proposed design strategy.

  20. A Dirichlet process mixture of generalized Dirichlet distributions for proportional data modeling.

    Science.gov (United States)

    Bouguila, Nizar; Ziou, Djemel

    2010-01-01

    In this paper, we propose a clustering algorithm based on both Dirichlet processes and generalized Dirichlet distribution which has been shown to be very flexible for proportional data modeling. Our approach can be viewed as an extension of the finite generalized Dirichlet mixture model to the infinite case. The extension is based on nonparametric Bayesian analysis. This clustering algorithm does not require the specification of the number of mixture components to be given in advance and estimates it in a principled manner. Our approach is Bayesian and relies on the estimation of the posterior distribution of clusterings using Gibbs sampler. Through some applications involving real-data classification and image databases categorization using visual words, we show that clustering via infinite mixture models offers a more powerful and robust performance than classic finite mixtures.
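
    One nonparametric ingredient of this record can be shown in a few lines: stick-breaking draws of Dirichlet-process mixture weights, which make the effective number of components a random quantity to be inferred rather than fixed. The sketch below illustrates the weights only, not the full generalized Dirichlet mixture sampler of the paper, and the concentration value is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(10)
alpha, K = 2.0, 50                  # concentration; truncation for display
beta = rng.beta(1.0, alpha, size=K)
# w_k = beta_k * prod_{j<k} (1 - beta_j)  (stick-breaking construction)
w = beta * np.concatenate(([1.0], np.cumprod(1.0 - beta)[:-1]))

w_sorted = np.sort(w)[::-1]
print("largest mixture weights:", np.round(w_sorted[:5], 3))
print("components covering 99% mass:",
      int(np.searchsorted(np.cumsum(w_sorted), 0.99)) + 1)
```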