WorldWideScience

Sample records for generalized pareto distribution

  1. The exponentiated generalized Pareto distribution | Adeyemi | Ife ...

    African Journals Online (AJOL)

    Recently Gupta et al. (1998) introduced the exponentiated exponential distribution as a generalization of the standard exponential distribution. In this paper, we introduce a three-parameter generalized Pareto distribution, the exponentiated generalized Pareto distribution (EGP). We present a comprehensive treatment of the ...

  2. Robust Bayesian inference of generalized Pareto distribution ...

    African Journals Online (AJOL)

    Abstract. In this work, robust Bayesian estimation of the generalized Pareto distribution is proposed. The methodology is presented in terms of oscillation of posterior risks of the Bayesian estimators. By using a Monte Carlo simulation study, we show that, under a suitable generalized loss function, we can obtain a robust ...

  3. An Extended Pareto Distribution

    OpenAIRE

    Mohamad Mead

    2014-01-01

    For the first time, a new continuous distribution, called the generalized beta exponentiated Pareto type I (GBEP) [McDonald exponentiated Pareto] distribution, is defined and investigated. The new distribution contains as special sub-models several well-known and new distributions, such as the generalized beta Pareto (GBP) [McDonald Pareto], the Kumaraswamy exponentiated Pareto (KEP), Kumaraswamy Pareto (KP), beta exponentiated Pareto (BEP), beta Pareto (BP), exponentiated Pareto (EP) and ...

  4. Estimation of the shape parameter of a generalized Pareto distribution based on a transformation to Pareto distributed variables

    OpenAIRE

    van Zyl, J. Martin

    2012-01-01

    Random variables of the generalized Pareto distribution can be transformed to Pareto distributed variables. Explicit expressions exist for the maximum likelihood estimators of the parameters of the Pareto distribution. The performance of estimating the shape parameter of the generalized Pareto distribution from transformed observations, based on the probability weighted moment method, is tested. It was found to improve the performance of the probability weighted estimator and performs well wit...
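
    As a minimal illustration (not the paper's code), the sketch below assumes the GPD shape and scale are known for the transform (in practice they would be replaced by initial estimates), maps GPD samples to Pareto variables, and applies the explicit Pareto maximum likelihood estimator; Python with scipy is an assumed choice, not the paper's setup:

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(42)
        xi, sigma = 0.3, 2.0   # true GPD shape and scale (assumed known for the transform)
        x = genpareto.rvs(xi, scale=sigma, size=5000, random_state=rng)

        # If F(x) = 1 - (1 + xi*x/sigma)**(-1/xi), then Z = 1 + xi*X/sigma is
        # Pareto distributed with minimum 1 and tail index alpha = 1/xi.
        z = 1.0 + xi * x / sigma

        # Explicit Pareto MLE for the tail index: alpha_hat = n / sum(log z_i)
        alpha_hat = len(z) / np.log(z).sum()
        print("true 1/xi =", 1.0 / xi, " estimated alpha =", alpha_hat)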

  5. An Extended Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Mohamad Mead

    2014-10-01

    Full Text Available For the first time, a new continuous distribution, called the generalized beta exponentiated Pareto type I (GBEP) [McDonald exponentiated Pareto] distribution, is defined and investigated. The new distribution contains as special sub-models several well-known and new distributions, such as the generalized beta Pareto (GBP) [McDonald Pareto], the Kumaraswamy exponentiated Pareto (KEP), Kumaraswamy Pareto (KP), beta exponentiated Pareto (BEP), beta Pareto (BP), exponentiated Pareto (EP) and Pareto, among several others. Various structural properties of the new distribution are derived, including explicit expressions for the moments, moment generating function, incomplete moments, quantile function, mean deviations and Rényi entropy. Lorenz, Bonferroni and Zenga curves are derived. The method of maximum likelihood is proposed for estimating the model parameters. We obtain the observed information matrix. The usefulness of the new model is illustrated by means of two real data sets. We hope that this generalization may attract wider applications in reliability, biology and lifetime data analysis.

  6. Scaling of Precipitation Extremes Modelled by Generalized Pareto Distribution

    Science.gov (United States)

    Rajulapati, C. R.; Mujumdar, P. P.

    2017-12-01

    Precipitation extremes are often modelled with data from annual maximum series or peaks over threshold series. The Generalized Pareto Distribution (GPD) is commonly used to fit the peaks over threshold series. Scaling of precipitation extremes from larger time scales to smaller time scales when the extremes are modelled with the GPD is burdened with difficulties arising from varying thresholds for different durations. In this study, the scale invariance theory is used to develop a disaggregation model for precipitation extremes exceeding specified thresholds. A scaling relationship is developed for a range of thresholds obtained from a set of quantiles of non-zero precipitation of different durations. The GPD parameters and exceedance rate parameters are modelled by the Bayesian approach and the uncertainty in scaling exponent is quantified. A quantile based modification in the scaling relationship is proposed for obtaining the varying thresholds and exceedance rate parameters for shorter durations. The disaggregation model is applied to precipitation datasets of Berlin City, Germany and Bangalore City, India. From both the applications, it is observed that the uncertainty in the scaling exponent has a considerable effect on uncertainty in scaled parameters and return levels of shorter durations.
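
    For readers unfamiliar with the peaks-over-threshold setup this record relies on, here is a minimal Python sketch (synthetic data and an assumed 95% non-zero quantile threshold, not the study's datasets or Bayesian machinery):

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(0)
        years = 20
        rain = rng.gamma(0.4, 8.0, size=years * 365)   # synthetic daily precipitation (mm)

        u = np.quantile(rain[rain > 0], 0.95)          # threshold from a non-zero quantile
        exc = rain[rain > u] - u                       # peaks-over-threshold exceedances

        # Fit the GPD to the exceedances; loc is fixed at 0 because they start at 0.
        xi, _, sigma = genpareto.fit(exc, floc=0)
        rate = len(exc) / years                        # mean exceedances per year
        print(f"u={u:.1f} mm, shape={xi:.3f}, scale={sigma:.2f}, rate={rate:.1f}/yr")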

  7. Prediction in Partial Duration Series With Generalized Pareto-Distributed Exceedances

    DEFF Research Database (Denmark)

    Rosbjerg, Dan; Madsen, Henrik; Rasmussen, Peter Funder

    1992-01-01

    As a generalization of the common assumption of exponentially distributed exceedances in partial duration series, the generalized Pareto distribution has been adopted. Estimators for the parameters are presented using both the method of moments and probability-weighted moments. Maintaining the generalized Pareto distribution as the parent exceedance distribution, the T-year event is estimated assuming the exceedances to be exponentially distributed. For moderately long-tailed exceedance distributions and small to moderate sample sizes it is found, by comparing mean square errors of the T-year event estimators, that the exponential distribution is preferable to the correct generalized Pareto distribution despite the introduced model error and despite a possible rejection of the exponential hypothesis by a test of significance. For moderately short-tailed exceedance...
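
    The comparison rests on the standard T-year event formulas for a partial duration series with Poisson arrivals at rate λ exceedances per year: x_T = u + σ ln(λT) for exponential exceedances, and x_T = u + (σ/ξ)((λT)^ξ − 1) for GPD exceedances. A small sketch with illustrative values (not the paper's data):

        import numpy as np

        def t_year_event(u, sigma, xi, lam, T):
            # GPD exceedances with Poisson arrival rate lam (events/year);
            # reduces to the exponential form u + sigma*log(lam*T) as xi -> 0.
            if abs(xi) < 1e-8:
                return u + sigma * np.log(lam * T)
            return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

        u, sigma, lam, T = 50.0, 12.0, 3.0, 100.0      # illustrative values only
        print("exponential model:", t_year_event(u, sigma, 0.0, lam, T))
        print("GPD, xi = 0.15:  ", t_year_event(u, sigma, 0.15, lam, T))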

  8. A New Generalization of the Pareto Distribution and Its Application to Insurance Data

    Directory of Open Access Journals (Sweden)

    Mohamed E. Ghitany

    2018-02-01

    Full Text Available The classical Pareto distribution is one of the most attractive in statistics, particularly in the scenario of actuarial statistics and finance. For example, it is widely used when calculating reinsurance premiums. In recent years, many alternative distributions have been proposed to obtain better fits, especially when the tail of the empirical distribution of the data is very long. In this work, an alternative generalization of the Pareto distribution is proposed and its properties are studied. Finally, an application of the proposed model to an earthquake insurance data set is presented.

  9. GENERALIZED DOUBLE PARETO SHRINKAGE.

    Science.gov (United States)

    Armagan, Artin; Dunson, David B; Lee, Jaeyong

    2013-01-01

    We propose a generalized double Pareto prior for Bayesian shrinkage estimation and inferences in linear models. The prior can be obtained via a scale mixture of Laplace or normal distributions, forming a bridge between the Laplace and Normal-Jeffreys' priors. While it has a spike at zero like the Laplace density, it also has a Student's t-like tail behavior. Bayesian computation is straightforward via a simple Gibbs sampling algorithm. We investigate the properties of the maximum a posteriori estimator, as sparse estimation plays an important role in many problems, reveal connections with some well-established regularization procedures, and show some asymptotic results. The performance of the prior is tested through simulations and an application.

  10. MATLAB implementation of satellite positioning error overbounding by generalized Pareto distribution

    Science.gov (United States)

    Ahmad, Khairol Amali; Ahmad, Shahril; Hashim, Fakroul Ridzuan

    2018-02-01

    In the satellite navigation community, error overbounding has been implemented in the process of integrity monitoring. In this work, MATLAB programming is used to implement the overbounding of the satellite positioning error CDF. Using a reference trajectory, the horizontal position errors (HPE) are computed and their non-parametric distribution function is given by the empirical Cumulative Distribution Function (ECDF). According to the results, these errors have a heavy-tailed distribution. Since the ECDF of the HPE in an urban environment is not Gaussian distributed, the ECDF is overbounded with the CDF of the generalized Pareto distribution (GPD).
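
    A rough Python analogue of the described procedure (the record's implementation is in MATLAB; the synthetic heavy-tailed errors, the 90% body/tail split and the bound check below are assumptions for illustration):

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(1)
        hpe = np.abs(rng.standard_t(df=3, size=4000)) * 1.5   # heavy-tailed stand-in for HPE (m)

        x = np.sort(hpe)
        ecdf = np.arange(1, len(x) + 1) / len(x)              # empirical CDF

        # Candidate overbound: GPD fitted to the tail above the 90% quantile.
        u = np.quantile(hpe, 0.90)
        xi, _, sigma = genpareto.fit(hpe[hpe > u] - u, floc=0)
        tail = x > u
        bound_cdf = 0.90 + 0.10 * genpareto.cdf(x[tail] - u, xi, scale=sigma)

        # Overbounding requires the bound's CDF to sit below the ECDF in the tail,
        # so that predicted exceedance probabilities are conservative.
        print("overbounds the tail:", np.all(bound_cdf <= ecdf[tail]))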

  11. Bivariate generalized Pareto distribution for extreme atmospheric particulate matter

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-02-01

    High particulate matter (PM10) levels are a prominent issue, causing various impacts on human health and seriously affecting the economy. The asymptotic theory of extreme values is applied to analyze the relation between extreme PM10 data from two nearby air quality monitoring stations. The series of daily maximum PM10 for the Johor Bahru and Pasir Gudang stations are considered for the years 2001 to 2010. The 85% and 95% marginal quantiles are applied to determine the threshold values and hence construct the series of exceedances over the chosen threshold. The logistic, asymmetric logistic, negative logistic and asymmetric negative logistic models are considered as the dependence function for the joint distribution of a bivariate observation. Maximum likelihood estimation is employed for parameter estimation. The best fitted model is chosen based on the Akaike Information Criterion and the quantile plots. It is found that the asymmetric logistic model gives the best fit for bivariate extreme PM10 data and shows weak dependence between the two stations.

  12. Group Acceptance Sampling Plan for Lifetime Data Using Generalized Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam

    2010-02-01

    Full Text Available In this paper, a group acceptance sampling plan (GASP) is introduced for situations when the lifetime of the items follows the generalized Pareto distribution. The design parameters such as minimum group size and acceptance number are determined when the consumer's risk and the test termination time are specified. The proposed sampling plan is compared with the existing sampling plan. It is concluded that the proposed sampling plan performs better than the existing plan in terms of the minimum sample size required to reach the same decision.

  13. Rayleigh Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Kareema Abed Al-Kadim

    2017-12-01

    Full Text Available In this paper the Rayleigh Pareto distribution, denoted R_PD, is introduced. We state some useful functions and give some of its properties, such as the entropy function, mean, mode, median, variance, the r-th moment about the mean, the r-th moment about the origin, the reliability and hazard functions, and the coefficients of variation, skewness and kurtosis. Finally, we estimate the parameters; the aim of this work is to introduce a new distribution.

  14. Higher moments method for generalized Pareto distribution in flood frequency analysis

    Science.gov (United States)

    Zhou, C. R.; Chen, Y. F.; Huang, Q.; Gu, S. H.

    2017-08-01

    The generalized Pareto distribution (GPD) has proven to be an ideal distribution for fitting peaks-over-threshold series in flood frequency analysis. Several moments-based estimators are applied to estimating the parameters of the GPD. Higher linear moments (LH moments) and higher probability weighted moments (HPWM) are linear combinations of Probability Weighted Moments (PWM). In this study, the relationship between them is explored. A series of statistical experiments and a case study are used to compare their performances. The results show that if the same PWM are used in the LH moments and HPWM methods, the parameters estimated by these two methods are unbiased. In particular, when the same PWM are used, the PWM method (or the HPWM method when the order equals 0) gives identical parameter estimates to the linear moments (L-moments) method. This phenomenon is significant when r ≥ 1 and the same-order PWM are used in the HPWM and LH moments methods.
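
    A compact sketch of the PWM estimators for the GPD in the Hosking-Wallis parameterization, where the shape k equals −ξ in scipy's convention; the plotting-position estimator of E[X F(X)] used below is one standard choice and an assumption, not necessarily the study's:

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(7)
        k_true, alpha_true = -0.2, 1.0     # Hosking-Wallis shape k = -xi, scale alpha
        x = np.sort(genpareto.rvs(-k_true, scale=alpha_true, size=10000, random_state=rng))

        n = len(x)
        b0 = x.mean()                                 # beta_0 = E[X]
        b1 = np.mean(np.arange(n) / (n - 1) * x)      # estimator of beta_1 = E[X F(X)]
        a0, a1 = b0, b0 - b1                          # a_s = E[X (1-F)^s]; a1 = E[X] - beta_1

        # Hosking & Wallis (1987) PWM estimators for the GPD:
        k_hat = a0 / (a0 - 2 * a1) - 2
        alpha_hat = 2 * a0 * a1 / (a0 - 2 * a1)
        print(f"k_hat={k_hat:.3f} (true {k_true}), alpha_hat={alpha_hat:.3f} (true {alpha_true})")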

  15. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    Science.gov (United States)

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). For that, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods for their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that these probabilistic models can be useful for describing the road accident blackspot datasets analyzed.

  16. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.
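
    A heuristic sketch of an AIC-guided threshold scan in Python (a generic screen under assumed quantile candidates, not the authors' exact procedure; comparing AIC values computed on different exceedance subsets is only a heuristic):

        import numpy as np
        from scipy.stats import genpareto

        def aic_threshold_scan(samples, quantiles):
            """For each candidate threshold, fit a GPD to the exceedances and
            report the fit's AIC (2 fitted parameters). A heuristic screen only."""
            out = []
            for q in quantiles:
                u = np.quantile(samples, q)
                exc = samples[samples > u] - u
                xi, _, sigma = genpareto.fit(exc, floc=0)
                loglik = genpareto.logpdf(exc, xi, scale=sigma).sum()
                out.append((u, 2 * 2 - 2 * loglik))
            return out

        rng = np.random.default_rng(3)
        data = rng.lognormal(0.0, 1.0, size=20000)
        for u, aic in aic_threshold_scan(data, np.arange(0.80, 0.99, 0.03)):
            print(f"u={u:7.2f}  AIC={aic:10.1f}")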

  17. Critical review and hydrologic application of threshold detection methods for the generalized Pareto (GP) distribution

    Science.gov (United States)

    Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto

    2016-04-01

    Estimation of extreme rainfall from data constitutes one of the most important issues in statistical hydrology, as it is associated with the design of hydraulic structures and flood water management. To that extent, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing methods to fit a generalized Pareto (GP) distribution model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data, graphical methods where one studies the dependence of GP distribution parameters (or related metrics) on the threshold level u, and Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u that a GP distribution model is applicable. In this work, we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 daily rainfall records from the NOAA-NCDC open-access database, with more than 110 years of data. We find that non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while methods that are based on asymptotic properties of the upper distribution tail lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low; i.e. on the order of 0.1 ÷ 0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on pre-asymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2÷12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the
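
    One of the graphical methods referred to above is the mean residual life (mean excess) plot: for a GPD the mean excess e(u) = E[X − u | X > u] = (σ + ξu)/(1 − ξ) is linear in u, so an approximately linear region of the plot suggests a workable threshold range. A minimal sketch on synthetic data:

        import numpy as np

        def mean_excess(samples, thresholds):
            # e(u) = mean of (X - u) over the observations exceeding u
            return [samples[samples > u].mean() - u for u in thresholds]

        rng = np.random.default_rng(5)
        x = rng.pareto(3.0, size=50000) + 1.0          # heavy-tailed synthetic sample
        us = np.quantile(x, np.linspace(0.5, 0.99, 25))
        for u, e in zip(us, mean_excess(x, us)):
            print(f"u={u:6.2f}  mean excess={e:6.2f}")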

  18. On the Truncated Pareto Distribution with applications

    OpenAIRE

    Zaninetti, Lorenzo; Ferraro, Mario

    2008-01-01

    The Pareto probability distribution is widely applied in different fields such as finance, physics, hydrology, geology and astronomy. This note deals with an application of the Pareto distribution to astrophysics, and more precisely to the statistical analysis of the masses of stars and of the diameters of asteroids. In particular, a comparison between the usual Pareto distribution and its truncated version is presented. Finally a possible physical mechanism that produces Pareto tails for the distributio...
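
    For concreteness, a small sketch of inverse-CDF sampling from an upper-truncated Pareto, using the standard form F(x) = (1 − (L/x)^α) / (1 − (L/H)^α) on [L, H]; the parameter values are illustrative, not those fitted in the note:

        import numpy as np

        def rtrunc_pareto(n, alpha, lo, hi, rng):
            """Inverse-CDF sampling from a Pareto truncated to [lo, hi]."""
            u = rng.random(n)
            c = 1.0 - (lo / hi) ** alpha               # total mass kept by truncation
            return lo * (1.0 - u * c) ** (-1.0 / alpha)

        rng = np.random.default_rng(11)
        masses = rtrunc_pareto(10000, alpha=1.35, lo=0.1, hi=100.0, rng=rng)
        print(masses.min(), masses.max())              # all samples stay inside [lo, hi]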

  19. Record Values of a Pareto Distribution.

    Science.gov (United States)

    Ahsanullah, M.

    The record values of the Pareto distribution, labelled Pareto (II) (alpha, beta, nu), are reviewed. The best linear unbiased estimates of the parameters in terms of the record values are provided. The prediction of the sth record value based on the first m (m<s) record values is obtained. A classical Pareto distribution provides reasonably…

  20. Uniform-Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Kareema Abid Al Kadhim

    2017-11-01

    Full Text Available We introduce the uniform-Pareto distribution (U-PD) and discuss some of its properties: the distribution function, probability density, reliability function, hazard and reversed hazard functions, moments, mode, median and its order statistics. Furthermore, the study estimates the shape parameter. We also present a simulation study of the estimation of the parameter and the survival function, and an application using data on spina bifida, the most common birth defect, in Babylon province.

  1. Coherent Multilook Radar Detection for Targets in Pareto Distributed Clutter

    Science.gov (United States)

    2012-01-01

    …a generalised Pareto distribution. Consequently, CFAR processes are derived for targets in Pareto distributed clutter. However, the resultant… constructing Pareto CFAR detectors is outlined in [15], but the resultant threshold/false alarm probability relationship is not amenable to numerical methods… inverse gamma texture. It is shown that the CFAR property does not hold in general for the coherent detection schemes considered. However, this is…

  2. Seasonal and Non-Seasonal Generalized Pareto Distribution to Estimate Extreme Significant Wave Height in The Banda Sea

    Science.gov (United States)

    Nursamsiah; Nugroho Sugianto, Denny; Suprijanto, Jusup; Munasik; Yulianto, Bambang

    2018-02-01

    Information on extreme wave height return levels is required for maritime planning and management. A recommended approach for analyzing extreme waves is the Generalized Pareto Distribution (GPD). Seasonal variation is often considered in extreme wave models. This research aims to identify the best GPD model by considering seasonal variation of the extreme waves. Using the 95th percentile as the threshold for extreme significant wave height, seasonal and non-seasonal GPDs are fitted. The Kolmogorov-Smirnov test is applied to assess the goodness of fit of the GPD models. The return values from the seasonal and non-seasonal GPDs are compared, with the definition of the return value as the criterion. The Kolmogorov-Smirnov test shows that the GPD fits the data very well in both the seasonal and non-seasonal models. The seasonal return value gives better information about the wave height characteristics.
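
    A minimal Python sketch of the workflow this record describes: a 95th percentile threshold, a GPD fit to the exceedances, and a Kolmogorov-Smirnov goodness-of-fit check (the synthetic wave heights and the seasonal split are placeholders, not the Banda Sea data):

        import numpy as np
        from scipy.stats import genpareto, kstest

        def fit_gpd_ks(hs):
            u = np.quantile(hs, 0.95)                  # 95th percentile threshold
            exc = hs[hs > u] - u
            xi, _, sigma = genpareto.fit(exc, floc=0)
            ks = kstest(exc, 'genpareto', args=(xi, 0, sigma))
            return u, xi, sigma, ks.pvalue

        rng = np.random.default_rng(9)
        hs = rng.weibull(1.6, size=8 * 365) * 1.8      # synthetic Hs record (m)
        season = hs[:4 * 365]                          # stand-in "seasonal" subset
        print("non-seasonal:", fit_gpd_ks(hs))
        print("seasonal    :", fit_gpd_ks(season))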

  3. Pareto law and Pareto index in the income distribution of Japanese companies

    OpenAIRE

    Ishikawa, Atushi

    2004-01-01

    In order to study in detail the phenomenon that income distributions follow Pareto's law, we analyze a database of high-income companies in Japan. We find a quantitative relation between the average capital of the companies and the Pareto index. The larger the average capital becomes, the smaller the Pareto index becomes. From this relation, we can possibly explain why the Pareto index of the company income distribution hardly changes, while the Pareto index of the personal income distribution chang...

  4. On the Pareto Type III distribution

    OpenAIRE

    Bottazzi, Giulio

    2009-01-01

    This short note analyzes the distributional properties of Pareto Type III random variables. We introduce a three-parameter version of the original two-parameter distribution proposed by Pareto and derive both the density and the characteristic function. The analytic expression of the inverse distribution function is also obtained, together with a simple series expansion of its moments of any order. Finally, we propose a simple statistical exercise designed to show the increased reliability o...

  5. Comparison of Threshold Detection Methods for the Generalized Pareto Distribution (GPD): Application to the NOAA-NCDC Daily Rainfall Dataset

    Science.gov (United States)

    Deidda, Roberto; Mamalakis, Antonis; Langousis, Andreas

    2015-04-01

    One of the most crucial issues in statistical hydrology is the estimation of extreme rainfall from data. To that extent, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing methods to fit a Generalized Pareto Distribution (GPD) model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches that can be grouped into three basic classes: a) non-parametric methods that locate the changing point between extreme and non-extreme regions of the data, b) graphical methods where one studies the dependence of the GPD parameters (or related metrics) on the threshold level u, and c) Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u at which a GPD model is applicable. In this work, we review representative methods for GPD threshold detection, discuss fundamental differences in their theoretical bases, and apply them to daily rainfall records from the NOAA-NCDC open-access database (http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/). We find that non-parametric methods that locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while graphical methods and GoF metrics that rely on limiting arguments for the upper distribution tail lead to unrealistically high thresholds u. The latter is expected, since one checks the validity of the limiting arguments rather than the applicability of a GPD distribution model. Better performance is demonstrated by graphical methods and GoF metrics that rely on GPD properties. Finally, we discuss the effects of data quantization (common in hydrologic applications) on the estimated thresholds. Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General

  6. The exponential age distribution and the Pareto firm size distribution

    OpenAIRE

    Coad, Alex

    2008-01-01

    Recent work drawing on data for large and small firms has shown a Pareto distribution of firm size. We mix a Gibrat-type growth process among incumbents with an exponential distribution of firms' ages to obtain the empirical Pareto distribution.

  7. Pareto law and Pareto index in the income distribution of Japanese companies

    Science.gov (United States)

    Ishikawa, Atushi

    2005-04-01

    In order to study in detail the phenomenon that income distributions follow Pareto's law, we analyze a database of high-income companies in Japan. We find a quantitative relation between the average capital of the companies and the Pareto index. The larger the average capital becomes, the smaller the Pareto index becomes. From this relation, we can possibly explain, from the viewpoint of capital (or means), why the Pareto index of the company income distribution hardly changes while the Pareto index of the personal income distribution changes sharply. We also find a quantitative relation between the lower bound of capital and the typical scale at which Pareto's law breaks. The larger the lower bound of capital becomes, the larger the typical scale becomes. From this result, the reason there is (or is not) a typical scale at which Pareto's law breaks in the income distribution can be understood through the presence (or absence) of a constraint, such as the lower bound of capital or means of companies, in the financial system.

  8. Word frequencies: A comparison of Pareto type distributions

    Science.gov (United States)

    Wiegand, Martin; Nadarajah, Saralees; Si, Yuancheng

    2018-03-01

    Mehri and Jamaati (2017) [18] used Zipf's law to model word frequencies in Holy Bible translations for one hundred live languages. We compare the fit of Zipf's law to a number of Pareto type distributions. The latter distributions are shown to provide the best fit, as judged by a number of comparative plots and error measures. The fit of Zipf's law appears generally poor.

  9. Wild Fluctuations of Random Functions with the Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Ming Li

    2013-01-01

    Full Text Available This paper provides a fluctuation analysis of random functions with the Pareto distribution. Via the introduced concept of wild fluctuations, we give an alternative way to distinguish these fluctuations from those with light-tailed distributions. Moreover, the suggested term wildest fluctuation may be used to distinguish random functions with infinite variance from those with finite variance.

  10. A Pareto upper tail for capital income distribution

    Science.gov (United States)

    Oancea, Bogdan; Pirjol, Dan; Andrei, Tudorel

    2018-02-01

    We present a study of the capital income distribution and of its contribution to the total income (the capital income share) using individual tax income data in Romania, for 2013 and 2014. Using a parametric representation we show that the capital income is Pareto distributed in the upper tail, with a Pareto coefficient α ∼ 1.44, which is much smaller than the corresponding coefficient for wage and non-wage income (excluding capital income), α ∼ 2.53. Including the capital income contribution has the effect of increasing the overall inequality measures.
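
    The upper-tail Pareto coefficient in studies like this is often estimated with the Hill estimator; a short sketch on synthetic data (the estimator choice, the number of order statistics k, and the figures are illustrative and may differ from the paper's parametric approach):

        import numpy as np

        def hill_alpha(x, k):
            # Hill estimator from the k largest observations:
            # alpha_hat = k / sum_{i=1..k} log(x_(n-i+1) / x_(n-k))
            xs = np.sort(x)
            return k / np.log(xs[-k:] / xs[-k - 1]).sum()

        rng = np.random.default_rng(2)
        income = (rng.pareto(1.44, size=100000) + 1.0) * 1e4   # alpha = 1.44 tail
        print("alpha_hat =", hill_alpha(income, k=2000))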

  11. Pareto Distribution of Firm Size and Knowledge Spillover Process as a Network

    OpenAIRE

    Tomohiko Konno

    2013-01-01

    The firm size distribution is considered to be a Pareto distribution. In the present paper, we show that the Pareto distribution of firm size results from the spillover network model introduced in Konno (2010).

  12. The Burr X Pareto Distribution: Properties, Applications and VaR Estimation

    Directory of Open Access Journals (Sweden)

    Mustafa Ç. Korkmaz

    2017-12-01

    Full Text Available In this paper, a new three-parameter Pareto distribution is introduced and studied. We discuss various mathematical and statistical properties of the new model. Some estimation methods for the model parameters are performed. Moreover, the peaks-over-threshold method is used to estimate Value-at-Risk (VaR) by means of the proposed distribution. We compare the distribution with a few other models to show its versatility in modelling data with heavy tails. VaR estimation with the Burr X Pareto distribution is presented using time series data, and the new model could be considered as an alternative VaR model against the generalized Pareto model for financial institutions.

  13. Tsallis-Pareto like distributions in hadron-hadron collisions

    International Nuclear Information System (INIS)

    Barnafoeldi, G G; Uermoessy, K; Biro, T S

    2011-01-01

    Non-extensive thermodynamics is a novel approach in high energy physics. In high-energy heavy-ion, and especially in proton-proton collisions, we are far from a canonical thermal state described by the Boltzmann-Gibbs statistics. In these reactions low and intermediate transverse momentum spectra are extremely well reproduced by the Tsallis-Pareto distribution, but the physical origin of the Tsallis parameters is still an unsettled question. Here, we analyze whether the Tsallis-Pareto energy distribution overlaps with hadron spectra at high pT. We fitted data measured in proton-proton (proton-antiproton) collisions in a wide center-of-mass energy range, from 200 GeV RHIC up to 7 TeV LHC energies. Furthermore, our test is extended to an investigation of a possible √s-dependence of the power in the Tsallis-Pareto distribution, motivated by QCD evolution equations. We found that Tsallis-Pareto distributions fit high-pT data well in the wide center-of-mass energy range. Deviations from the fits appear at pT > 20-30 GeV/c, especially in CDF data. Introducing a pT-scaling ansatz, the fits at low and intermediate transverse momenta still remain good, and the deviations tend to disappear at the highest pT values.

  14. [Origination of Pareto distribution in complex dynamic systems].

    Science.gov (United States)

    Chernavskiĭ, D S; Nikitin, A P; Chernavskaia, O D

    2008-01-01

    The Pareto distribution, whose probability density function can be approximated at sufficiently large x as ρ(x) ∝ x^(-α), where α ≥ 2, is of crucial importance from both the theoretical and practical point of view. The main reason is its qualitative distinction from the normal (Gaussian) distribution: the probability of large deviations is significantly higher. The notion of the universal applicability of the Gauss law remains widespread despite the lack of objective confirmation in a variety of application areas. The origin of the Pareto distribution in dynamic systems located in a Gaussian noise field is considered. A simple one-dimensional model is discussed where the system response in a rather wide interval of the variable can be quite precisely approximated by this distribution.

  15. Social Free Energy of a Pareto-Like Resource Distribution

    Directory of Open Access Journals (Sweden)

    Vinko Zlatić

    2007-02-01

    Full Text Available For an organisation with a Pareto-like distribution of the relevant resources, we determine the social free energy and related social quantities using the thermodynamical formalism. Macroscopic dynamics of the organisation is linked with changes in the attributed thermodynamical quantities through changes in the resource distribution function. It is argued that quantities of thermodynamical origin form an optimised set of the organisation's state indicators, which is a reliable expression of the micro-dynamics.

  16. Income dynamics with a stationary double Pareto distribution.

    Science.gov (United States)

    Toda, Alexis Akira

    2011-04-01

    Once controlled for the trend, the distribution of personal income appears to be double Pareto, a distribution that obeys the power law exactly in both the upper and the lower tails. I propose a model of income dynamics with a stationary distribution that is consistent with this fact. Using US male wage data for 1970-1993, I estimate the power law exponent in two ways: (i) from each cross section, assuming that the distribution has converged to the stationary distribution, and (ii) from a panel, directly estimating the parameters of the income dynamics model. Both approaches yield the same value of 8.4.

  17. An EM Algorithm for Double-Pareto-Lognormal Generalized Linear Model Applied to Heavy-Tailed Insurance Claims

    Directory of Open Access Journals (Sweden)

    Enrique Calderín-Ojeda

    2017-11-01

    Full Text Available Generalized linear models might not be appropriate when the probability of extreme events is higher than that implied by the normal distribution. Extending the method for estimating the parameters of a double Pareto lognormal distribution (DPLN) in Reed and Jorgensen (2004), we develop an EM algorithm for the heavy-tailed double-Pareto-lognormal generalized linear model. The DPLN distribution is obtained as a mixture of a lognormal distribution with a double Pareto distribution. In this paper the associated generalized linear model has the location parameter equal to a linear predictor, which is used to model insurance claim amounts for various data sets. The performance is compared with those of the generalized beta (of the second kind) and lognormal distributions.

  18. Generalized Pareto optimum and semi-classical spinors

    Science.gov (United States)

    Rouleux, M.

    2018-02-01

    In 1971, S. Smale presented a generalization of the Pareto optimum he called the critical Pareto set. The underlying motivation was to extend Morse theory to several functions, i.e. to find a Morse theory for m differentiable functions defined on a manifold M of dimension ℓ. We use this framework to take a 2 × 2 Hamiltonian ℋ = ℋ(p) ∈ C∞(T*R²) to its normal form near a singular point of the Fresnel surface. Namely, we say that ℋ has the Pareto property if it decomposes, locally, up to a conjugation with regular matrices, as ℋ(p) = u′(p)C(p)(u′(p))*, where u : R² → R² has singularities of codimension 1 or 2, and C(p) is a regular Hermitian matrix ("integrating factor"). In particular this applies in certain cases to the matrix Hamiltonian of elasticity theory and its (relative) perturbations of order 3 in momentum at the origin.

  19. Wireless cellular networks with Pareto-distributed call holding times

    Science.gov (United States)

    Rodriguez-Dagnino, Ramon M.; Takagi, Hideaki

    2001-07-01

    Nowadays, there is a growing interest in providing Internet access to mobile users. For instance, NTT DoCoMo in Japan deploys a major mobile phone network that offers the Internet service named 'i-mode' to more than 17 million subscribers. Internet traffic measurements show that session durations, or Call Holding Times (CHT), have heavy-tailed probability distributions, which tells us that they depart significantly from the traffic statistics of traditional voice services. In this environment, it is particularly important for a network designer to know the number of handovers during a call in order to make an appropriate dimensioning of virtual circuits for a wireless cell. The handover traffic has a direct impact on the Quality of Service (QoS); e.g. the service disruption due to handover failure may significantly degrade the specified QoS of time-constrained services. In this paper, we first study the random behavior of the number of handovers during a call, where we assume that the CHT are Pareto distributed (a heavy-tail distribution) and the Cell Residence Times (CRT) are exponentially distributed. Our approach is based on renewal theory arguments. We present closed-form formulae for the probability mass function (pmf) of the number of handovers during a Pareto distributed CHT, and obtain the probability of call completion as well as handover rates. Most of the formulae are expressed in terms of Whittaker's function. We compare the Pareto case with the cases of k-Erlang and hyperexponential distributions for the CHT.
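
    The key renewal-theory fact is that exponential CRT are memoryless, so cell-boundary crossings form a Poisson process and the handover count during a call of length t is Poisson(t / mean CRT); mixing over a Pareto distributed CHT then yields the heavy-tailed handover pmf. A Monte Carlo sketch with illustrative parameters (not the closed-form Whittaker-function results of the paper):

        import numpy as np

        rng = np.random.default_rng(4)
        n_calls = 100_000
        alpha, t_min = 1.8, 30.0    # Pareto CHT: P(T > t) = (t_min/t)**alpha, t >= t_min
        mean_crt = 120.0            # mean exponential cell residence time (seconds)

        cht = t_min * (1.0 + rng.pareto(alpha, n_calls))    # Pareto call holding times
        # Memoryless CRT => handovers during a call of length t are Poisson(t/mean_crt).
        handovers = rng.poisson(cht / mean_crt)

        print("mean handovers per call:", handovers.mean())
        print("P(no handover)         :", np.mean(handovers == 0))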

  20. Comparison of Two Methods Used to Model Shape Parameters of Pareto Distributions

    Science.gov (United States)

    Liu, C.; Charpentier, R.R.; Su, J.

    2011-01-01

    Two methods are compared for estimating the shape parameters of Pareto field-size (or pool-size) distributions for petroleum resource assessment. Both methods assume mature exploration in which most of the larger fields have been discovered. Both methods use the sizes of larger discovered fields to estimate the numbers and sizes of smaller fields: (1) the tail-truncated method uses a plot of field size versus size rank, and (2) the log-geometric method uses data binned in field-size classes and the ratios of adjacent bin counts. Simulation experiments were conducted using discovered oil and gas pool-size distributions from four petroleum systems in Alberta, Canada and using Pareto distributions generated by Monte Carlo simulation. The estimates of the shape parameters of the Pareto distributions, calculated by both the tail-truncated and log-geometric methods, generally stabilize where discovered pool numbers are greater than 100. However, with fewer than 100 discoveries, these estimates can vary greatly with each new discovery. The estimated shape parameters of the tail-truncated method are more stable and larger than those of the log-geometric method where the number of discovered pools is more than 100. Both methods, however, tend to underestimate the shape parameter. Monte Carlo simulation was also used to create sequences of discovered pool sizes by sampling from a Pareto distribution with a discovery process model using a defined exploration efficiency (in order to show how biased the sampling was in favor of larger fields being discovered first). A higher (more biased) exploration efficiency gives better estimates of the Pareto shape parameters.

  1. Structure of Pareto Solutions of Generalized Polyhedral-Valued Vector Optimization Problems in Banach Spaces

    Directory of Open Access Journals (Sweden)

    Qinghai He

    2013-01-01

    Full Text Available In general Banach spaces, we consider a vector optimization problem (SVOP) in which the objective is a set-valued mapping whose graph is the union of finitely many polyhedra or the union of finitely many generalized polyhedra. Dropping the compactness assumption, we establish some results on the structure of the weak Pareto solution set, Pareto solution set, weak Pareto optimal value set, and Pareto optimal value set of (SVOP), and on the connectedness of the Pareto solution set and Pareto optimal value set of (SVOP). In particular, we improve and generalize Arrow, Barankin, and Blackwell's classical results in Euclidean spaces and Zheng and Yang's results in general Banach spaces.

  2. On the size distribution of cities: an economic interpretation of the Pareto coefficient.

    Science.gov (United States)

    Suh, S H

    1987-01-01

    "Both the hierarchy and the stochastic models of size distribution of cities are analyzed in order to explain the Pareto coefficient by economic variables. In hierarchy models, it is found that the rate of variation in the productivity of cities and that in the probability of emergence of cities can explain the Pareto coefficient. In stochastic models, the productivity of cities is found to explain the Pareto coefficient. New city-size distribution functions, in which the Pareto coefficient is decomposed by economic variables, are estimated." excerpt

  3. Origin of Pareto-like spatial distributions in ecosystems.

    Science.gov (United States)

    Manor, Alon; Shnerb, Nadav M

    2008-12-31

    Recent studies of cluster distribution in various ecosystems revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that these patch statistics are a manifestation of the law of proportionate effect. Mapping the stochastic model to a Markov birth-death process, the transition rates are shown to scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (such as desertification) manifest themselves in a drastic change of the stability properties of spatial colonies.

  4. Zipf's law and influential factors of the Pareto exponent of the city size distribution: Evidence from China

    OpenAIRE

    GAO Hongying; WU Kangping

    2007-01-01

    This paper estimates the Pareto exponent of the city size (population size and economy size) distribution for all provinces and three regions in China in 1997, 2000 and 2003 by OLS, comparatively analyzes the Pareto exponent across sections and over time, and empirically analyzes the factors which impact the Pareto exponents of provinces. Our analyses show that the size distributions of cities in China follow the Pareto distribution and have structural features. Variations in the value of the P...

  5. Some properties of truncated lognormal and the pareto distributions mixture

    OpenAIRE

    Žuklijaitė, Viktorija

    2008-01-01

    In insurance mathematics, the two-parameter lognormal and Pareto distributions are often used to model claims. The lognormal distribution is applied to describe small claims with high frequency, and the Pareto distribution to describe large claims with low frequency. To obtain a distribution that describes claims of all types equally well, a mixture of the truncated lognormal and Pareto distributions with three free parameters is constructed. This work is based on the article by Kahadawala Cooray and Malwane M. A. Ananda, "Modeling actuarial data with a comp...

  6. Generalized Extreme Value (GEV) and Generalized Pareto (GP) Distributions for Estimating Extreme Rainfall in the DKI Jakarta Region

    Directory of Open Access Journals (Sweden)

    Achi Rinaldi

    2016-06-01

    Full Text Available Extreme events such as extreme rainfall have been analyzed and are a major concern for countries all around the world. There are two common distributions for extreme values: the Generalized Extreme Value (GEV) distribution and the Generalized Pareto (GP) distribution. These two distributions have shown good performance in estimating the parameters of extreme values. This research aims to estimate the parameters of extreme values using the GEV and GP distributions, and also to characterize the effects of extreme events such as floods. The rainfall data were taken from BMKG for 5 locations in DKI Jakarta. Both distributions showed good performance. The results show that the Tanjung Priok station has the largest location parameter for the GEV and also the largest scale parameter for the GP, meaning it has the highest probability of flooding as an effect of extreme rainfall.

  7. Use of the truncated shifted Pareto distribution in assessing size distribution of oil and gas fields

    Science.gov (United States)

    Houghton, J.C.

    1988-01-01

    The truncated shifted Pareto (TSP) distribution, a variant of the two-parameter Pareto distribution, in which one parameter is added to shift the distribution right and left and the right-hand side is truncated, is used to model size distributions of oil and gas fields for resource assessment. Assumptions about limits to the left-hand and right-hand side reduce the number of parameters to two. The TSP distribution has advantages over the more customary lognormal distribution because it has a simple analytic expression, allowing exact computation of several statistics of interest, has a "J-shape," and has more flexibility in the thickness of the right-hand tail. Oil field sizes from the Minnelusa play in the Powder River Basin, Wyoming and Montana, are used as a case study. Probability plotting procedures allow easy visualization of the fit and help the assessment.

  8. Estimating extreme dry-spell risk in the Middle Ebro valley (Northeastern Spain). a comparative analysis of partial duration series with a General Pareto distribution and annual maxima series with a Gumbel distribution

    NARCIS (Netherlands)

    Vicente-Serrano, S.; Beguería, S.

    2003-01-01

    This paper analyses fifty-year time series of daily precipitation in a region of the middle Ebro valley (northern Spain) in order to predict extreme dry-spell risk. A comparison of observed and estimated maximum dry spells (50-year return period) showed that the Generalised Pareto (GP)

  9. Multiobjective Aerodynamic Shape Optimization Using Pareto Differential Evolution and Generalized Response Surface Metamodels

    Science.gov (United States)

    Madavan, Nateri K.

    2004-01-01

    Differential Evolution (DE) is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. The DE algorithm has recently been extended to multiobjective optimization problems by using a Pareto-based approach. In this paper, a Pareto DE algorithm is applied to multiobjective aerodynamic shape optimization problems that are characterized by computationally expensive objective function evaluations. To reduce the computational expense, the algorithm is coupled with generalized response surface meta-models based on artificial neural networks. Results are presented for some test optimization problems from the literature to demonstrate the capabilities of the method.

  10. Improved Shape Parameter Estimation in Pareto Distributed Clutter with Neural Networks

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2016-12-01

    Full Text Available The main problem faced by naval radars is the elimination of clutter, a distortion signal that appears mixed with target reflections. Recently, the Pareto distribution has been related to sea clutter measurements, suggesting that it may provide a better fit than other traditional distributions. The authors propose a new method for estimating the Pareto shape parameter based on artificial neural networks. The solution achieves a precise estimation of the parameter, has a low computational cost, and outperforms the classic method based on Maximum Likelihood Estimates (MLE). The presented scheme contributes to the development of the NATE detector for Pareto clutter, which uses knowledge of the clutter statistics to improve the stability of the detection, among other applications.

  11. Distributed approximation of Pareto surfaces in multicriteria radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Bokrantz, Rasmus

    2013-01-01

    We consider multicriteria radiation therapy treatment planning by navigation over the Pareto surface, implemented by interpolation between discrete treatment plans. Current state of the art for calculation of a discrete representation of the Pareto surface is to sandwich this set between inner and outer approximations that are updated one point at a time. In this paper, we generalize this sequential method to an algorithm that permits parallelization. The principle of the generalization is to apply the sequential method to an approximation of an inexpensive model of the Pareto surface. The information gathered from the model is subsequently used for the calculation of points from the exact Pareto surface, which are processed in parallel. The model is constructed according to the current inner and outer approximations, and given a shape that is difficult to approximate, in order to avoid that parts of the Pareto surface are incorrectly disregarded. Approximations of comparable quality to those generated by the sequential method are demonstrated when the degree of parallelization is up to twice the number of dimensions of the objective space. For practical applications, the number of dimensions is typically at least five, so that a speed-up of one order of magnitude is obtained. (paper)

  12. Distributed approximation of Pareto surfaces in multicriteria radiation therapy treatment planning.

    Science.gov (United States)

    Bokrantz, Rasmus

    2013-06-07

    We consider multicriteria radiation therapy treatment planning by navigation over the Pareto surface, implemented by interpolation between discrete treatment plans. Current state of the art for calculation of a discrete representation of the Pareto surface is to sandwich this set between inner and outer approximations that are updated one point at a time. In this paper, we generalize this sequential method to an algorithm that permits parallelization. The principle of the generalization is to apply the sequential method to an approximation of an inexpensive model of the Pareto surface. The information gathered from the model is subsequently used for the calculation of points from the exact Pareto surface, which are processed in parallel. The model is constructed according to the current inner and outer approximations, and given a shape that is difficult to approximate, in order to avoid that parts of the Pareto surface are incorrectly disregarded. Approximations of comparable quality to those generated by the sequential method are demonstrated when the degree of parallelization is up to twice the number of dimensions of the objective space. For practical applications, the number of dimensions is typically at least five, so that a speed-up of one order of magnitude is obtained.

  13. Computing the Moments of Order Statistics from Truncated Pareto Distributions Based on the Conditional Expectation

    Directory of Open Access Journals (Sweden)

    Gökhan Gökdere

    2014-05-01

    Full Text Available In this paper, closed-form expressions for the moments of truncated Pareto order statistics are obtained by using the conditional distribution. We also derive some results for the moments which will be useful for moment computations based on ordered data.

  14. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.

    Science.gov (United States)

    Bertrand, Sophie; Joo, Rocío; Fablet, Ronan

    2015-01-01

    How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.
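
    A minimal sketch of letting the fitted GPD shape parameter discriminate between candidate walks, as described above (scipy's genpareto parameterization and the synthetic step lengths are assumptions, not the tracked seabird or VMS data):

        import numpy as np
        from scipy.stats import genpareto

        def movement_signature(steps):
            # Fit a GPD to step lengths; roughly, shape ~ 0 suggests exponential
            # steps (Poisson-like walk), shape > 0 a power-law tail (Levy-like),
            # shape < 0 a bounded, short-tailed step distribution.
            xi, _, sigma = genpareto.fit(steps, floc=0)
            return xi, sigma

        rng = np.random.default_rng(8)
        levy_steps = rng.pareto(1.5, size=5000) + 1.0      # heavy-tailed steps
        poisson_steps = rng.exponential(2.0, size=5000)    # exponential steps
        print("Levy-like   :", movement_signature(levy_steps))
        print("Poisson-like:", movement_signature(poisson_steps))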

  15. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms’ Movements

    Science.gov (United States)

    Bertrand, Sophie; Joo, Rocío; Fablet, Ronan

    2015-01-01

    How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern–oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities. PMID:26172045

  16. Modeling air quality in main cities of Peninsular Malaysia by using a generalized Pareto model.

    Science.gov (United States)

    Masseran, Nurulkamal; Razali, Ahmad Mahir; Ibrahim, Kamarulzaman; Latif, Mohd Talib

    2016-01-01

    The air pollution index (API) is an important figure used for measuring the quality of air in the environment. The API is determined based on the highest average value of the individual indices for all the variables, which include sulfur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), ozone (O3), and suspended particulate matter (PM10), at a particular hour. API values that exceed the limit of 100 units indicate an unhealthy status for the exposed environment. This study investigates the risk of occurrences of API values greater than 100 units for eight urban areas in Peninsular Malaysia for the period of January 2004 to December 2014. An extreme value model, known as the generalized Pareto distribution (GPD), has been fitted to the API values found. Based on the fitted model, return periods describing the occurrences of API exceeding 100 in the different cities have been computed as indicators of risk. The results obtained indicate that most of the urban areas considered have a very small risk of occurrence of unhealthy events, except for Kuala Lumpur, Malacca, and Klang. However, among these three cities, it is found that Klang has the highest risk. Based on all the results obtained, the air quality in urban areas of Peninsular Malaysia falls within limits healthy for human beings.

  17. A hybrid pareto mixture for conditional asymmetric fat-tailed distributions.

    Science.gov (United States)

    Carreau, Julie; Bengio, Yoshua

    2009-07-01

    In many cases, we observe some variables X that contain predictive information over a scalar variable of interest Y, with (X,Y) pairs observed in a training set. We can take advantage of this information to estimate the conditional density p(Y|X = x). In this paper, we propose a conditional mixture model with hybrid Pareto components to estimate p(Y|X = x). The hybrid Pareto is a Gaussian whose upper tail has been replaced by a generalized Pareto tail. A third parameter, in addition to the location and spread parameters of the Gaussian, controls the heaviness of the upper tail. Using the hybrid Pareto in a mixture model results in a nonparametric estimator that can adapt to multimodality, asymmetry, and heavy tails. A conditional density estimator is built by modeling the parameters of the mixture estimator as functions of X. We use a neural network to implement these functions. Such conditional density estimators have important applications in many domains such as finance and insurance. We show experimentally that this novel approach better models the conditional density in terms of likelihood, compared to competing algorithms: conditional mixture models with other types of components and a classical kernel-based nonparametric model.
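
    A density-continuous toy version of the hybrid Pareto idea (Gaussian body below a junction point, GPD upper tail above it, renormalized to integrate to one); the original model additionally matches the density's derivative at the junction and learns all parameters, which this sketch omits:

        import numpy as np
        from scipy.stats import norm, genpareto

        def hybrid_pareto_pdf(x, mu, s, junction, xi, sigma):
            """Gaussian body for x <= junction, GPD tail above, scaled so the
            two pieces meet at the junction and the total mass is one."""
            x = np.asarray(x, dtype=float)
            phi_u = norm.pdf(junction, mu, s)
            # The GPD pdf at 0 equals 1/sigma, so scaling the tail by phi_u*sigma
            # makes it equal phi_u at the junction; its total mass is phi_u*sigma.
            c = 1.0 / (norm.cdf(junction, mu, s) + phi_u * sigma)
            body = norm.pdf(x, mu, s)
            tail = phi_u * sigma * genpareto.pdf(x - junction, xi, scale=sigma)
            return c * np.where(x <= junction, body, tail)

        xs = np.linspace(-3.0, 10.0, 7)
        print(hybrid_pareto_pdf(xs, mu=0.0, s=1.0, junction=1.5, xi=0.3, sigma=1.0))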

  18. A Note on Parameter Estimation in the Composite Weibull–Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Enrique Calderín-Ojeda

    2018-02-01

Full Text Available Composite models have received much attention in the recent actuarial literature to describe heavy-tailed insurance loss data. One of the models that performs well in describing this kind of data is the composite Weibull–Pareto (CWL) distribution. In this note, this distribution is revisited to carry out estimation of parameters via the mle and mle2 optimization functions in R. The results are compared with those obtained in a previous paper by using the nlm function, in terms of analytical and graphical methods of model selection. In addition, the consistency of the parameter estimation is examined via a simulation study.

  19. Computing the Distribution of Pareto Sums Using Laplace Transformation and Stehfest Inversion

    Science.gov (United States)

    Harris, C. K.; Bourne, S. J.

    2017-05-01

    In statistical seismology, the properties of distributions of total seismic moment are important for constraining seismological models, such as the strain partitioning model (Bourne et al. J Geophys Res Solid Earth 119(12): 8991-9015, 2014). This work was motivated by the need to develop appropriate seismological models for the Groningen gas field in the northeastern Netherlands, in order to address the issue of production-induced seismicity. The total seismic moment is the sum of the moments of individual seismic events, which in common with many other natural processes, are governed by Pareto or "power law" distributions. The maximum possible moment for an induced seismic event can be constrained by geomechanical considerations, but rather poorly, and for Groningen it cannot be reliably inferred from the frequency distribution of moment magnitude pertaining to the catalogue of observed events. In such cases it is usual to work with the simplest form of the Pareto distribution without an upper bound, and we follow the same approach here. In the case of seismicity, the exponent β appearing in the power-law relation is small enough for the variance of the unbounded Pareto distribution to be infinite, which renders standard statistical methods concerning sums of statistical variables, based on the central limit theorem, inapplicable. Determinations of the properties of sums of moderate to large numbers of Pareto-distributed variables with infinite variance have traditionally been addressed using intensive Monte Carlo simulations. This paper presents a novel method for accurate determination of the properties of such sums that is accurate, fast and easily implemented, and is applicable to Pareto-distributed variables for which the power-law exponent β lies within the interval [0, 1]. It is based on shifting the original variables so that a non-zero density is obtained exclusively for non-negative values of the parameter and is identically zero elsewhere, a property
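The numerical inversion step can be illustrated with the Gaver-Stehfest algorithm; the sketch below applies it to a known transform (that of the exponential density) rather than to the paper's Pareto-sum transform, and the choice N = 12 is arbitrary:

```python
from math import exp, factorial, log

def stehfest_coeffs(N):  # N must be even
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * factorial(2 * j) /
                  (factorial(N // 2 - j) * factorial(j) * factorial(j - 1)
                   * factorial(k - j) * factorial(2 * j - k)))
        V.append((-1) ** (k + N // 2) * s)
    return V

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s)."""
    ln2 = log(2.0)
    V = stehfest_coeffs(N)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

# Sanity check: F(s) = 1/(s+1) is the Laplace transform of f(t) = exp(-t).
print(stehfest_invert(lambda s: 1.0 / (s + 1.0), 1.0), exp(-1.0))
```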

20. Random phenotypic variation of yeast (Saccharomyces cerevisiae) single-gene knockouts fits a double pareto-lognormal distribution.

    Directory of Open Access Journals (Sweden)

    John H Graham

Full Text Available Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails. If our assumptions are true, the DPLN distribution should provide a better fit to random phenotypic variation in a large series of single-gene knockout lines than other skewed or symmetrical distributions. We fit a large published data set of single-gene knockout lines in Saccharomyces cerevisiae to seven different probability distributions: DPLN, right Pareto-lognormal (RPLN), left Pareto-lognormal (LPLN), normal, lognormal, exponential, and Pareto. The best model was judged by the Akaike Information Criterion (AIC). Phenotypic variation among gene knockouts in S. cerevisiae fits a double Pareto-lognormal (DPLN) distribution better than any of the alternative distributions, including the right Pareto-lognormal and lognormal distributions. A DPLN distribution is consistent with the hypothesis that developmental stability is mediated, in part, by distributed robustness, the resilience of gene regulatory, metabolic, and protein-protein interaction networks. Alternatively, multiplicative cell growth, and the

  1. Random phenotypic variation of yeast (Saccharomyces cerevisiae) single-gene knockouts fits a double pareto-lognormal distribution.

    Science.gov (United States)

    Graham, John H; Robb, Daniel T; Poe, Amy R

    2012-01-01

    Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails. If our assumptions are true, the DPLN distribution should provide a better fit to random phenotypic variation in a large series of single-gene knockout lines than other skewed or symmetrical distributions. We fit a large published data set of single-gene knockout lines in Saccharomyces cerevisiae to seven different probability distributions: DPLN, right Pareto-lognormal (RPLN), left Pareto-lognormal (LPLN), normal, lognormal, exponential, and Pareto. The best model was judged by the Akaike Information Criterion (AIC). Phenotypic variation among gene knockouts in S. cerevisiae fits a double Pareto-lognormal (DPLN) distribution better than any of the alternative distributions, including the right Pareto-lognormal and lognormal distributions. A DPLN distribution is consistent with the hypothesis that developmental stability is mediated, in part, by distributed robustness, the resilience of gene regulatory, metabolic, and protein-protein interaction networks. Alternatively, multiplicative cell growth, and the mixing of

  2. Pareto oggi

    OpenAIRE

    Busino, Giovanni

    2013-01-01

A harsh, disorganized writer and a haughty, scornful man and terrible polemicist, Pareto used and abused light-hearted irony and above all sarcasm to ridicule whatever displeased him. His work certainly arouses passions, anger, hostility and astonished curiosity, yet it is by no means ignored; on the contrary, in recent decades it has intrigued even writers of the stature of Gadda, Noventa, Orelli and Pontiggia. In general, however, readings of this work range from demolition to apologia...

  3. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    Science.gov (United States)

    Abu-Zinadah, Hanaa H.

    2017-08-01

In several industrial fields the product comes from more than one production line, so comparative life tests are required. This calls for sampling from the different production lines, from which the joint censoring scheme arises. In this article we consider the Pareto lifetime distribution under a jointly type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals, of the model parameters are obtained. Bayesian point and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes. Monte Carlo results from simulation studies are presented to assess the performance of our proposed method.

  4. Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.

    Science.gov (United States)

    Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel

    2014-01-01

Novelty detection involves the construction of a "model of normality", and the classification of test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
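In the univariate case that the paper generalizes, the GPD tail model yields a novelty boundary as an extreme quantile; in the sketch below the synthetic scores, the 95% tail threshold and the 0.999 level are all illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
normal_scores = rng.normal(0.0, 1.0, 20000)   # hypothetical "normal" model scores

u = np.quantile(normal_scores, 0.95)          # tail threshold
exc = normal_scores[normal_scores > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)

# Decision boundary at the 0.999 quantile via the peaks-over-threshold formula
# P(X > x) = P(X > u) * GPD_sf(x - u), so the conditional tail probability is:
p_tail = (1 - 0.999) / 0.05
boundary = u + stats.genpareto.isf(p_tail, xi, loc=0.0, scale=sigma)
print("novelty boundary:", boundary)
```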

  5. Entropies of negative incomes, Pareto-distributed loss, and financial crises.

    Science.gov (United States)

    Gao, Jianbo; Hu, Jing; Mao, Xiang; Zhou, Mi; Gurbaxani, Brian; Lin, Johnny

    2011-01-01

Health monitoring of the world economy is an important issue, especially in a time of profound economic difficulty worldwide. The most important aspect of health monitoring is to accurately predict economic downturns. To gain insights into how economic crises develop, we present two metrics, positive and negative income entropy and distribution analysis, to analyze the collective "spatial" and temporal dynamics of companies in nine sectors of the world economy over a 19-year period from 1990 to 2008. These metrics provide accurate predictive skill with a very low false-positive rate in predicting downturns. The new metrics also provide evidence of phase transition-like behavior prior to the onset of recessions. Such a transition occurs when negative pretax incomes prior to or during economic recessions transition from a thin-tailed exponential distribution to the higher entropy Pareto distribution, and develop even heavier tails than those of the positive pretax incomes. These features propagate from the crisis-initiating sector of the economy to other sectors.

  6. Lettere di Vilfredo Pareto all’amico Roberto Michels: confini e confine nel Trattato di Sociologia Generale del 1916

    Directory of Open Access Journals (Sweden)

    Raffaele Federici

    2017-08-01

Full Text Available In this search for meaning between the end of an era and a new vision of the world there is, in the two authors, something that might be called a betweenness: Pareto, almost Franco-Italian, and Michels, an Italian-German, indeed more than Italian. Along the fault line represented by the First World War, the two sociologists stand in a double inner relation, Franco-Italian for Pareto and Italian-German for Michels, and in an outer relation between the world of yesterday and the world that followed the cataclysm of the First World War, when four colossal empires were dismembered (the Russian Empire, the German Empire, the Austro-Hungarian Empire and the Ottoman Empire), at the same time as Emile Durkheim looked with unease at the disintegration of the old traditional communities, where the sense of the crisis of the time invests not only people and behaviours but the logical world itself. The correspondence takes place in the same land: Pareto at Celigny, on Lake Geneva, and Michels at Basel, on the banks of the Rhine. Between the two sociologists there is a deep respect, which would lead Robert Michels to dedicate to the "scientist and friend Vilfredo Pareto, with veneration" an important work such as "Problemi di sociologia applicata" ("Problems of Applied Sociology"), published only three years after the Master's Trattato di Sociologia Generale. In this anthology of essays, probably composed between 1914 and 1917, in the years of the great cataclysm, indeed conceived before "the installation of that terrible supreme court of cassation of all our ideologies which is the war", and therefore contemporary with the Trattato, the Master is cited three times, like Max Weber, but de facto Pareto's presence is continuous. In particular, the reference to the Master follows two lines of research: on the one hand the reality of sociological research and its very broad spectrum of analysis, and on the other the theory of the circulation of elites. It is precisely

  7. Pareto printsiip

    Index Scriptorium Estoniae

    2011-01-01

On how the Italian economist Vilfredo Pareto arrived at his famous principle, and on the influence of this principle on present-day management. According to the Pareto principle, the greater part of our activity does not help us reach the result, but is a waste of time. Diagram

  8. Pareto utility

    NARCIS (Netherlands)

    Ikefuji, M.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.H.M.

    2013-01-01

    In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility

  9. The gerber-shiu discounted penalty function for pareto distributed claims

    OpenAIRE

    Asanavičiūtė, Rasa

    2006-01-01

The paper derives the asymptotics of the Gerber-Shiu discounted penalty function when claims are distributed according to the Pareto law and the initial capital x tends to infinity. The main expression of the Gerber-Shiu discounted penalty function is decomposed into two cases, with the interest rate different from zero and equal to zero. Graphs presented in the paper show the dependence of the discounted penalty function on various parameters of the Poisson model.

  10. Agent-Based Modelling of the Evolution of the Russian Party System Based on Pareto and Hotelling Distributions. Part II

    Directory of Open Access Journals (Sweden)

    Владимир Геннадьевич Иванов

    2015-12-01

Full Text Available The given article presents research on the evolution of the Russian party system. The chosen methodology is based on the heuristic potential of agent-based modelling. The author analyzes various scenarios of parties' competition (applying the Pareto distribution) in connection with the recent increase in the number of political parties. In addition, the author predicts the level of ideological diversity of the parties' platforms (applying the principles of the Hotelling distribution) in order to evaluate their potential competitiveness in the struggle for voters.

  11. Il problema della costruzione di senso nel Trattato di Sociologia Generale di Vilfredo Pareto

    Directory of Open Access Journals (Sweden)

    Andrea Millefiorini

    2017-08-01

Full Text Available Pareto explains how residues lie at the centre of the complex social order that arises from their combination with interests, the heterogeneity of society and derivations. For the construction of meaning there is one class of residues, the one defined as the "need for logical developments", which includes "most of the residues that determine the derivations". It is then the latter which, coming so to speak to "live a life of their own", delimit, define, determine and confer the individual and collective meanings on which everyday interaction among men founds the main fabric of its routines, its practices and its conducts within spheres of coexistence, institutions and national communities. There have been those, like Norberto Bobbio, who have drawn from this undeniable conceptual arrangement of Pareto's sociological theory consequences and deductions that present Pareto's thought as a socio-psychological version of the Marxist theory of "false consciousness". In essence, Bobbio writes, "to Marx's historicist conception of ideologies, Pareto opposes a naturalistic conception of man as an ideological animal". Yet one must be careful. It is certainly true that the ideologies of the twentieth century can be explained by following Pareto's approach, but his sociology is not resolved and does not exhaust itself in a simple theory of ideologies. It is something far broader and far more complex, embracing the whole historical arc of human civilizations, and it thus stands as one of the most ambitious attempts yet conceived by the social sciences to explain that extremely complicated social process that goes under the name of the "construction of meaning".

  12. Active learning of Pareto fronts.

    Science.gov (United States)

    Campigotto, Paolo; Passerini, Andrea; Battiti, Roberto

    2014-03-01

    This paper introduces the active learning of Pareto fronts (ALP) algorithm, a novel approach to recover the Pareto front of a multiobjective optimization problem. ALP casts the identification of the Pareto front into a supervised machine learning task. This approach enables an analytical model of the Pareto front to be built. The computational effort in generating the supervised information is reduced by an active learning strategy. In particular, the model is learned from a set of informative training objective vectors. The training objective vectors are approximated Pareto-optimal vectors obtained by solving different scalarized problem instances. The experimental results show that ALP achieves an accurate Pareto front approximation with a lower computational effort than state-of-the-art estimation of distribution algorithms and widely known genetic techniques.
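A much-simplified sketch of the ingredients named above: approximated Pareto-optimal training vectors are collected from scalarized subproblems and an analytical model is then fitted to the front. The toy biobjective problem and the polynomial model are assumptions; ALP's actual surrogate model and active selection strategy are not reproduced here:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def f1(x): return x ** 2            # toy objective 1
def f2(x): return (x - 2.0) ** 2    # toy objective 2 (conflicting)

points = []
for w in np.linspace(0.02, 0.98, 15):  # weighted-sum scalarizations
    res = minimize_scalar(lambda x: w * f1(x) + (1 - w) * f2(x),
                          bounds=(0.0, 2.0), method="bounded")
    points.append((f1(res.x), f2(res.x)))

points = np.array(sorted(points))
front_model = np.polynomial.Polynomial.fit(points[:, 0], points[:, 1], deg=3)
print("predicted f2 on the front at f1 = 1:", front_model(1.0))
```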

  13. Bayesian Estimation of Inequality and Poverty Indices in Case of Pareto Distribution Using Different Priors under LINEX Loss Function

    Directory of Open Access Journals (Sweden)

    Kamaljit Kaur

    2015-01-01

Full Text Available Bayesian estimators of the Gini index and a poverty measure are obtained for the Pareto distribution under censored and complete setups. The said estimators are obtained using two noninformative priors, namely the uniform prior and Jeffreys' prior, and one conjugate prior, under the assumption of the Linear Exponential (LINEX) loss function. Using simulation techniques, the relative efficiency of the proposed estimators using different priors and loss functions is obtained. The performances of the proposed estimators have been compared on the basis of their simulated risks obtained under the LINEX loss function.

  14. Coordinated Voltage Control in Distribution Network with the Presence of DGs and Variable Loads Using Pareto and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    José Raúl Castro

    2016-02-01

Full Text Available This paper presents an efficient algorithm to solve the multi-objective (MO) voltage control problem in distribution networks. The proposed algorithm minimizes the following three objectives: voltage variation on pilot buses, reactive power production ratio deviation, and generator voltage deviation. This work leverages two optimization techniques: fuzzy logic to find the optimum value of the reactive power of the distributed generation (DG) and Pareto optimization to find the optimal value of the pilot bus voltage so that this produces lower losses under the constraints that the voltage remains within established limits. Variable loads and DGs are taken into account in this paper. The algorithm is tested on an IEEE 13-node test feeder and the results show the effectiveness of the proposed model.

  15. Market Ecology, Pareto Wealth Distribution and Leptokurtic Returns in Microscopic Simulation of the LLS Stock Market Model

    Science.gov (United States)

    Solomon, Sorin; Levy, Moshe

    2001-06-01

The LLS stock market model (see Levy, Levy and Solomon, "Microscopic Simulation of Financial Markets: From Investor Behavior to Market Phenomena", Academic Press 2000, for a review) is a model of heterogeneous quasi-rational investors operating in a complex environment about which they have incomplete information. We review the main features of this model and several of its extensions. We study the effects of investor heterogeneity and show that predation, competition, or symbiosis may occur between different investor populations. The dynamics of the LLS model lead to the empirically observed Pareto wealth distribution. Many properties observed in actual markets appear as natural consequences of the LLS dynamics: a truncated Levy distribution of short-term returns, excess volatility, a return autocorrelation "U-shape" pattern, and a positive correlation between volume and absolute returns.

  16. Diphoton generalized distribution amplitudes

    International Nuclear Information System (INIS)

    El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.

    2008-01-01

We calculate the leading order diphoton generalized distribution amplitudes by calculating the amplitude of the process γ*γ→γγ in the low energy and high photon virtuality region at the Born order and in the leading logarithmic approximation. As in the case of the anomalous photon structure functions, the γγ generalized distribution amplitudes exhibit a characteristic ln Q^2 behavior and obey inhomogeneous QCD evolution equations.

  17. Studies on generalized kinetic model and Pareto optimization of a product-driven self-cycling bioprocess.

    Science.gov (United States)

    Sun, Kaibiao; Kasperski, Andrzej; Tian, Yuan

    2014-10-01

    The aim of this study is the optimization of a product-driven self-cycling bioprocess and presentation of a way to determine the best possible decision variables out of a set of alternatives based on the designed model. Initially, a product-driven generalized kinetic model, which allows a flexible choice of the most appropriate kinetics is designed and analysed. The optimization problem is given as the bi-objective one, where maximization of biomass productivity and minimization of unproductive loss of substrate are the objective functions. Then, the Pareto fronts are calculated for exemplary kinetics. It is found that in the designed bioprocess, a decrease of emptying/refilling fraction and an increase of substrate feeding concentration cause an increase of the biomass productivity. An increase of emptying/refilling fraction and a decrease of substrate feeding concentration cause a decrease of unproductive loss of substrate. The preferred solutions are calculated using the minimum distance from an ideal solution method, while giving proposals of their modifications derived from a decision maker's reactions to the generated solutions.
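The compromise-selection step (minimum distance from an ideal solution) can be sketched as follows; the two objective surrogates below merely stand in for biomass productivity and unproductive substrate loss and are not the paper's kinetic model:

```python
import numpy as np

x = np.linspace(0.05, 0.95, 200)       # e.g. emptying/refilling fraction
productivity = 4.0 * x * (1.0 - x)     # toy surrogate, to be maximized
loss = 0.2 + x ** 2                    # toy surrogate, to be minimized

F = np.column_stack([-productivity, loss])  # recast both as minimization
dominated = [np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1)) for f in F]
front = F[~np.array(dominated)]             # discrete Pareto front

ideal = front.min(axis=0)                   # ideal (utopia) point
best = front[np.argmin(np.linalg.norm(front - ideal, axis=1))]
print("compromise: productivity =", -best[0], ", loss =", best[1])
```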

  18. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    Science.gov (United States)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of leading or strongest publicly-traded two thousand companies in the world (G-2000) based on four independent metrics: sales or revenues, profits, assets and market value. Every one of these wealth metrics yields particular information on the corporate size or wealth size of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power-law in the higher part. These two-class structure per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto is about 49% in sales per employee, and 33% after averaging on the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be adjusted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms in 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, then the 82 points of the aggregate wealth distribution by industry per employee can be well adjusted by quasi-exponential curves for the four metrics.

  19. The Impact Crater Size-Frequency Distribution on Pluto Follows a Truncated Pareto Distribution: Results from a First Data Set Based on the Recent New Horizons' Flyby

    Directory of Open Access Journals (Sweden)

    Zaninetti L.

    2016-01-01

Full Text Available Recently it could be shown (Scholkmann, Prog. in Phys., 2016, v. 12(1), 26-29) that the impact crater size-frequency distribution of Pluto (based on an analysis of first images obtained by the recent New Horizons' flyby) follows a power law (α = 2.4926 ± 0.3309) in the interval of diameter (D) values ranging from 3.75 ± 1.14 km to the largest determined value of 37.77 km. A reanalysis of this data set revealed that the whole crater SFD (i.e., with values in the interval of 1.2–37.7 km) can be described by a truncated Pareto distribution.
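A hedged sketch of fitting a truncated Pareto by maximum likelihood, taking the truncation bounds from the observed range; the synthetic diameters below are placeholders for the crater measurements:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
d = rng.pareto(1.5, 500) * 3.75 + 3.75   # hypothetical crater diameters (km)
d = d[d <= 37.7]
a, b = d.min(), d.max()                  # truncation bounds from the data

def nll(alpha):                          # truncated-Pareto negative log-likelihood
    return -(np.log(alpha) + alpha * np.log(a) - (alpha + 1.0) * np.log(d)
             - np.log(1.0 - (a / b) ** alpha)).sum()

res = minimize_scalar(nll, bounds=(0.1, 10.0), method="bounded")
print("truncated-Pareto exponent:", res.x)
```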

  20. Distribution of Problems, Medications and Lab Results in Electronic Health Records: The Pareto Principle at Work.

    Science.gov (United States)

    Wright, Adam; Bates, David W

    2010-01-01

BACKGROUND: Many natural phenomena demonstrate power-law distributions, where very common items predominate. Problems, medications and lab results represent some of the most important data elements in medicine, but their overall distribution has not been reported. OBJECTIVE: Our objective is to determine whether problems, medications and lab results demonstrate a power law distribution. METHODS: Retrospective review of electronic medical record data for 100,000 randomly selected patients seen at least twice in 2006 and 2007 at the Brigham and Women's Hospital in Boston and its affiliated medical practices. RESULTS: All three data types exhibited a power law distribution. The 12.5% most frequently used problems account for 80% of all patient problems, the top 11.8% of medications account for 80% of all medication orders and the top 4.5% of lab result types account for 80% of all lab results. CONCLUSION: These three data elements exhibited power law distributions with a small number of common items representing a substantial proportion of all orders and observations, which has implications for electronic health record design.
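The 80% figures above amount to a cumulative-share computation, sketched here on synthetic power-law usage counts (the Zipf exponent and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
counts = np.sort(rng.zipf(1.8, 5000))[::-1].astype(float)  # hypothetical usage counts

cum_share = np.cumsum(counts) / counts.sum()
k = np.searchsorted(cum_share, 0.80) + 1   # number of item types covering 80%
print(f"top {100.0 * k / counts.size:.1f}% of item types cover 80% of occurrences")
```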

  1. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when (X1, X2) follows a general bivariate distribution. Such a distribution includes the bivariate compound Weibull, bivariate compound Gompertz and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.

  2. Testing the Pareto against the lognormal distributions with the uniformly most powerful unbiased test applied to the distribution of cities.

    Science.gov (United States)

    Malevergne, Yannick; Pisarenko, Vladilen; Sornette, Didier

    2011-03-01

Fat-tail distributions of sizes abound in natural, physical, economic, and social systems. The lognormal and the power laws have historically competed for recognition with sometimes closely related generating processes and hard-to-distinguish tail properties. This state of affairs is illustrated with the debate between Eeckhout [Amer. Econ. Rev. 94, 1429 (2004)] and Levy [Amer. Econ. Rev. 99, 1672 (2009)] on the validity of Zipf's law for US city sizes. By using a uniformly most powerful unbiased (UMPU) test between the lognormal and the power laws, we show that conclusive results can be achieved to end this debate. We advocate the UMPU test as a systematic tool to address similar controversies in the literature of many disciplines involving power laws, scaling, "fat" or "heavy" tails. In order to demonstrate that our procedure works for data sets other than the US city size distribution, we also briefly present the results obtained for the power-law tail of the distribution of personal identity (ID) losses, which constitute one of the major emergent risks at the interface between cyberspace and reality.

  3. A generalization of the power law distribution with nonlinear exponent

    Science.gov (United States)

    Prieto, Faustino; Sarabia, José María

    2017-01-01

    The power law distribution is usually used to fit data in the upper tail of the distribution. However, commonly it is not valid to model data in all the range. In this paper, we present a new family of distributions, the so-called Generalized Power Law (GPL), which can be useful for modeling data in all the range and possess power law tails. To do that, we model the exponent of the power law using a non-linear function which depends on data and two parameters. Then, we provide some basic properties and some specific models of that new family of distributions. After that, we study a relevant model of the family, with special emphasis on the quantile and hazard functions, and the corresponding estimation and testing methods. Finally, as an empirical evidence, we study how the debt is distributed across municipalities in Spain. We check that power law model is only valid in the upper tail; we show analytically and graphically the competence of the new model with municipal debt data in the whole range; and we compare the new distribution with other well-known distributions including the Lognormal, the Generalized Pareto, the Fisk, the Burr type XII and the Dagum models.

  4. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  5. Uncertainties of the 50-year wind from short time series using generalized extreme value distribution and generalized Pareto distribution

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Mann, Jakob; Rathmann, Ole

    2015-01-01

    as a guideline for applying GEVD and GPD to wind time series of limited length. The data analysis shows that, with reasonable choice of relevant parameters, GEVD and GPD give consistent estimates of the return winds. For GEVD, the base period should be chosen in accordance with the occurrence of the extreme wind...... events of the same mechanism. For GPD, the choices of the threshold, the definition of independent samples and the shape factor are interrelated. It is demonstrated that the lack of climatological representativity is a major source of uncertainty to the use of both GEVD and GPD; the information...

  6. A Pareto scale-inflated outlier model and its Bayesian analysis

    OpenAIRE

    Scollnik, David P. M.

    2016-01-01

    This paper develops a Pareto scale-inflated outlier model. This model is intended for use when data from some standard Pareto distribution of interest is suspected to have been contaminated with a relatively small number of outliers from a Pareto distribution with the same shape parameter but with an inflated scale parameter. The Bayesian analysis of this Pareto scale-inflated outlier model is considered and its implementation using the Gibbs sampler is discussed. The paper contains three wor...

  7. Minimizing Harmonic Distortion Impact at Distribution System with Considering Large-Scale EV Load Behaviour Using Modified Lightning Search Algorithm and Pareto-Fuzzy Approach

    Directory of Open Access Journals (Sweden)

    S. N. Syed Nasir

    2018-01-01

    Full Text Available This research is focusing on optimal placement and sizing of multiple variable passive filter (VPF to mitigate harmonic distortion due to charging station (CS at 449 bus distribution network. There are 132 units of CS which are scheduled based on user behaviour within 24 hours, with the interval of 15 minutes. By considering the varying of CS patterns and harmonic impact, Modified Lightning Search Algorithm (MLSA is used to find 22 units of VPF coordination, so that less harmonics will be injected from 415 V bus to the medium voltage network and power loss is also reduced. Power system harmonic flow, VPF, CS, battery, and the analysis will be modelled in MATLAB/m-file platform. High Performance Computing (HPC is used to make simulation faster. Pareto-Fuzzy technique is used to obtain sizing of VPF from all nondominated solutions. From the result, the optimal placements and sizes of VPF are able to reduce the maximum THD for voltage and current and also the total apparent losses up to 39.14%, 52.5%, and 2.96%, respectively. Therefore, it can be concluded that the MLSA is suitable method to mitigate harmonic and it is beneficial in minimizing the impact of aggressive CS installation at distribution network.

  8. Wealth distribution, Pareto law, and stretched exponential decay of money: Computer simulations analysis of agent-based models

    Science.gov (United States)

    Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf

    2018-01-01

    We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on saving propensities. The system relaxation for fixed and distributed saving schemes are found to be different.
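A compact Monte Carlo sketch of a kinetic exchange trading model with a fixed saving propensity, in the spirit of the model studied above (distributed savings and trap agents are omitted, and all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
N, T, lam = 500, 100000, 0.5
w = np.ones(N)                           # equal initial wealth

for _ in range(T):
    i, j = rng.integers(0, N, size=2)    # random trading pair
    if i == j:
        continue
    pot = (1.0 - lam) * (w[i] + w[j])    # non-saved wealth is repartitioned
    eps = rng.random()
    w[i], w[j] = lam * w[i] + eps * pot, lam * w[j] + (1.0 - eps) * pot

print("mean wealth:", w.mean(), " max/mean:", w.max() / w.mean())
```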

  9. A heavy-traffic theorem for the GI/G/1 queue with a Pareto-type service time distribution

    NARCIS (Netherlands)

    J.W. Cohen

    1997-01-01

For the $GI/G/1$ queueing model with traffic load $a<1$, service time distribution $B(t)$ and interarrival time distribution $A(t)$, it holds, whenever for $t \rightarrow \infty$: $$1-B(t) \sim \frac{c}{(t/\beta)^{\nu}} + \mathrm{O}(\mathrm{e}^{-\delta t}), \quad c>0, \quad 1<\nu<2, \quad \delta>0$$

  10. Pareto-Lognormal Modeling of Known and Unknown Metal Resources. II. Method Refinement and Further Applications

    Energy Technology Data Exchange (ETDEWEB)

    Agterberg, Frits, E-mail: agterber@nrcan.gc.ca [Geological Survey of Canada (Canada)

    2017-07-01

    Pareto-lognormal modeling of worldwide metal deposit size–frequency distributions was proposed in an earlier paper (Agterberg in Nat Resour 26:3–20, 2017). In the current paper, the approach is applied to four metals (Cu, Zn, Au and Ag) and a number of model improvements are described and illustrated in detail for copper and gold. The new approach has become possible because of the very large inventory of worldwide metal deposit data recently published by Patiño Douce (Nat Resour 25:97–124, 2016c). Worldwide metal deposits for Cu, Zn and Ag follow basic lognormal size–frequency distributions that form straight lines on lognormal Q–Q plots. Au deposits show a departure from the straight-line model in the vicinity of their median size. Both largest and smallest deposits for the four metals taken as examples exhibit hyperbolic size–frequency relations and their Pareto coefficients are determined by fitting straight lines on log rank–log size plots. As originally pointed out by Patiño Douce (Nat Resour Res 25:365–387, 2016d), the upper Pareto tail cannot be distinguished clearly from the tail of what would be a secondary lognormal distribution. The method previously used in Agterberg (2017) for fitting the bridge function separating the largest deposit size–frequency Pareto tail from the basic lognormal is significantly improved in this paper. A new method is presented for estimating the approximate deposit size value at which the upper tail Pareto comes into effect. Although a theoretical explanation of the proposed Pareto-lognormal distribution model is not a required condition for its applicability, it is shown that existing double Pareto-lognormal models based on Brownian motion generalizations of the multiplicative central limit theorem are not applicable to worldwide metal deposits. Neither are various upper tail frequency amplification models in their present form. Although a physicochemical explanation remains possible, it is argued that

  11. Pareto-Lognormal Modeling of Known and Unknown Metal Resources. II. Method Refinement and Further Applications

    International Nuclear Information System (INIS)

    Agterberg, Frits

    2017-01-01

    Pareto-lognormal modeling of worldwide metal deposit size–frequency distributions was proposed in an earlier paper (Agterberg in Nat Resour 26:3–20, 2017). In the current paper, the approach is applied to four metals (Cu, Zn, Au and Ag) and a number of model improvements are described and illustrated in detail for copper and gold. The new approach has become possible because of the very large inventory of worldwide metal deposit data recently published by Patiño Douce (Nat Resour 25:97–124, 2016c). Worldwide metal deposits for Cu, Zn and Ag follow basic lognormal size–frequency distributions that form straight lines on lognormal Q–Q plots. Au deposits show a departure from the straight-line model in the vicinity of their median size. Both largest and smallest deposits for the four metals taken as examples exhibit hyperbolic size–frequency relations and their Pareto coefficients are determined by fitting straight lines on log rank–log size plots. As originally pointed out by Patiño Douce (Nat Resour Res 25:365–387, 2016d), the upper Pareto tail cannot be distinguished clearly from the tail of what would be a secondary lognormal distribution. The method previously used in Agterberg (2017) for fitting the bridge function separating the largest deposit size–frequency Pareto tail from the basic lognormal is significantly improved in this paper. A new method is presented for estimating the approximate deposit size value at which the upper tail Pareto comes into effect. Although a theoretical explanation of the proposed Pareto-lognormal distribution model is not a required condition for its applicability, it is shown that existing double Pareto-lognormal models based on Brownian motion generalizations of the multiplicative central limit theorem are not applicable to worldwide metal deposits. Neither are various upper tail frequency amplification models in their present form. Although a physicochemical explanation remains possible, it is argued that

  12. Comprehensive preference optimization of an irreversible thermal engine using pareto based mutable smart bee algorithm and generalized regression neural network

    DEFF Research Database (Denmark)

    Mozaffari, Ahmad; Gorji-Bandpy, Mofid; Samadian, Pendar

    2013-01-01

    Optimizing and controlling of complex engineering systems is a phenomenon that has attracted an incremental interest of numerous scientists. Until now, a variety of intelligent optimizing and controlling techniques such as neural networks, fuzzy logic, game theory, support vector machines...... performance of the proposed method. In order to find the maximum exploration potentials, these techniques are equipped with an external archive. These archives aid the methods to record all of the non-dominated solutions. Eventually, the proposed method and generalized regression neural network (GRNN......) are simultaneously used to optimize the major parameters of an irreversible thermal engine. In order to direct the PBMSB to explore deliberate spaces within the solution domain, a reference point obtained from finite time thermodynamic (FTT) approach, is utilized in the optimization. The outcome results show...

  13. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X are available. We…

  14. Axiomatizations of Pareto Equilibria in Multicriteria Games

    NARCIS (Netherlands)

    Voorneveld, M.; Vermeulen, D.; Borm, P.E.M.

    1997-01-01

We focus on axiomatizations of the Pareto equilibrium concept in multicriteria games based on consistency. Axiomatizations of the Nash equilibrium concept by Peleg and Tijs (1996) and Peleg, Potters, and Tijs (1996) have immediate generalizations. The axiomatization of Norde et al. (1996) cannot be

  15. Pareto-optimal alloys

    DEFF Research Database (Denmark)

    Bligaard, Thomas; Johannesson, Gisli Holmar; Ruban, Andrei

    2003-01-01

    Large databases that can be used in the search for new materials with specific properties remain an elusive goal in materials science. The problem is complicated by the fact that the optimal material for a given application is usually a compromise between a number of materials properties and the ......, the Pareto-optimal set, to determine optimal alloy solutions for the compromise between low compressibility, high stability, and cost....

  16. Kullback-Leibler divergence and the Pareto-Exponential approximation.

    Science.gov (United States)

    Weinberg, G V

    2016-01-01

    Recent radar research interests in the Pareto distribution as a model for X-band maritime surveillance radar clutter returns have resulted in analysis of the asymptotic behaviour of this clutter model. In particular, it is of interest to understand when the Pareto distribution is well approximated by an Exponential distribution. The justification for this is that under the latter clutter model assumption, simpler radar detection schemes can be applied. An information theory approach is introduced to investigate the Pareto-Exponential approximation. By analysing the Kullback-Leibler divergence between the two distributions it is possible to not only assess when the approximation is valid, but to determine, for a given Pareto model, the optimal Exponential approximation.
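The divergence analysis can be reproduced numerically: the sketch below computes the Kullback-Leibler divergence from a Pareto type II (Lomax) clutter model to a moment-matched Exponential by quadrature. The parameter values are assumptions, and the paper's exact Pareto parametrization may differ:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

a, scale = 15.0, 14.0                     # hypothetical Lomax shape and scale
p = stats.lomax(a, scale=scale)
q = stats.expon(scale=p.mean())           # moment-matched Exponential

kl, _ = quad(lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x)), 0.0, np.inf)
print("KL(Pareto || Exponential) =", kl)  # small KL -> approximation is reasonable
```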

  17. Identification of Climate Change with Generalized Extreme Value (GEV) Distribution Approach

    International Nuclear Information System (INIS)

    Rahayu, Anita

    2013-01-01

Extreme weather and climate change are events that are difficult to avoid and that exert considerable influence on humans and the environment. Many problems require knowledge about the behavior of extreme values, and one of the methods used is Extreme Value Theory (EVT). EVT is used to design systems that remain reliable under a variety of conditions, so as to minimize the risk of a major disaster. There are two methods for identifying extreme values: Block Maxima, with the Generalized Extreme Value (GEV) distribution approach, and Peaks over Threshold (POT), with the Generalized Pareto Distribution (GPD) approach. This research concerns Indramayu over the period January 1961-December 2003; the method used is Block Maxima with the GEV distribution approach. The results showed that there is no climate change in Indramayu over the period January 1961-December 2003.
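A minimal block-maxima sketch of the GEV step described above, on a synthetic daily series standing in for the Indramayu record (43 years, 365-day blocks):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
daily = rng.gamma(2.0, 8.0, size=43 * 365)        # hypothetical daily rainfall
annual_max = daily.reshape(43, 365).max(axis=1)   # one block maximum per year

c, loc, scale = stats.genextreme.fit(annual_max)  # note: SciPy's c = -xi
print(f"GEV fit: shape xi = {-c:.3f}, loc = {loc:.2f}, scale = {scale:.2f}")
print("100-year return level:", stats.genextreme.isf(0.01, c, loc=loc, scale=scale))
```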

  18. Multi-choice stochastic transportation problem involving general form of distributions.

    Science.gov (United States)

    Quddoos, Abdul; Ull Hasan, Md Gulzar; Khalid, Mohammad Masood

    2014-01-01

Many authors have presented studies of the multi-choice stochastic transportation problem (MCSTP) where availability and demand parameters follow a particular probability distribution (such as exponential, Weibull, Cauchy or extreme value). In this paper an MCSTP is considered where availability and demand parameters follow a general form of distribution, and a generalized equivalent deterministic model (GMCSTP) of the MCSTP is obtained. It is also shown that all previous models obtained by different authors can be deduced with the help of the GMCSTP. MCSTPs with Pareto, power function or Burr-XII distributions are also considered and equivalent deterministic models are obtained. To illustrate the proposed model two numerical examples are presented and solved using the LINGO 13.0 software package.

  19. Pareto optimality in organelle energy metabolism analysis.

    Science.gov (United States)

    Angione, Claudio; Carapezza, Giovanni; Costanza, Jole; Lió, Pietro; Nicosia, Giuseppe

    2013-01-01

    In low and high eukaryotes, energy is collected or transformed in compartments, the organelles. The rich variety of size, characteristics, and density of the organelles makes it difficult to build a general picture. In this paper, we make use of the Pareto-front analysis to investigate the optimization of energy metabolism in mitochondria and chloroplasts. Using the Pareto optimality principle, we compare models of organelle metabolism on the basis of single- and multiobjective optimization, approximation techniques (the Bayesian Automatic Relevance Determination), robustness, and pathway sensitivity analysis. Finally, we report the first analysis of the metabolic model for the hydrogenosome of Trichomonas vaginalis, which is found in several protozoan parasites. Our analysis has shown the importance of the Pareto optimality for such comparison and for insights into the evolution of the metabolism from cytoplasmic to organelle bound, involving a model order reduction. We report that Pareto fronts represent an asymptotic analysis useful to describe the metabolism of an organism aimed at maximizing concurrently two or more metabolite concentrations.

  20. Approximating convex Pareto surfaces in multiobjective radiotherapy planning

    International Nuclear Information System (INIS)

    Craft, David L.; Halabi, Tarek F.; Shih, Helen A.; Bortfeld, Thomas R.

    2006-01-01

    Radiotherapy planning involves inherent tradeoffs: the primary mission, to treat the tumor with a high, uniform dose, is in conflict with normal tissue sparing. We seek to understand these tradeoffs on a case-to-case basis, by computing for each patient a database of Pareto optimal plans. A treatment plan is Pareto optimal if there does not exist another plan which is better in every measurable dimension. The set of all such plans is called the Pareto optimal surface. This article presents an algorithm for computing well distributed points on the (convex) Pareto optimal surface of a multiobjective programming problem. The algorithm is applied to intensity-modulated radiation therapy inverse planning problems, and results of a prostate case and a skull base case are presented, in three and four dimensions, investigating tradeoffs between tumor coverage and critical organ sparing

  1. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    Science.gov (United States)

    Otero-Muras, Irene; Banga, Julio R

    2017-07-21

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.

  2. An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index

    DEFF Research Database (Denmark)

    Dierckx, Goedele; Goegebeur, Yuri; Guillou, Armelle

    2013-01-01

    We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency...

  3. Strong Convergence Bound of the Pareto Index Estimator under Right Censoring

    Directory of Open Access Journals (Sweden)

    Peng Zuoxiang

    2010-01-01

    Full Text Available Let be a sequence of positive independent and identically distributed random variables with common Pareto-type distribution function as , where represents a slowly varying function at infinity. In this note we study the strong convergence bound of a kind of right censored Pareto index estimator under second-order regularly varying conditions.

  4. Pareto joint inversion of 2D magnetotelluric and gravity data

    Science.gov (United States)

    Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek

    2015-04-01

In this contribution, the first results of the "Innovative technology of petrophysical parameters estimation of geological media using joint inversion algorithms" project are described. At this stage of development, a Pareto joint inversion scheme for 2D MT and gravity data was used. Additionally, seismic data were provided to set some constraints for the inversion. A Sharp Boundary Interface (SBI) approach and a model description with a set of polygons were used to limit the dimensionality of the solution space. The main engine was based on modified Particle Swarm Optimization (PSO). This algorithm was adapted to handle two or more target functions at once. An additional algorithm was used to eliminate unrealistic solution proposals. Because PSO is a method of stochastic global optimization, it requires many proposals to be evaluated to find a single Pareto solution and then compose a Pareto front. To optimize this stage, parallel computing was used for both the inversion engine and the 2D MT forward solver. There are many advantages of the proposed solution of joint inversion problems. First of all, the Pareto scheme eliminates cumbersome rescaling of the target functions, which can strongly affect the final solution. Secondly, the whole set of solutions is created in one optimization run, providing a choice of the final solution. This choice can be based on qualitative data, which are usually very hard to incorporate into a regular inversion scheme. SBI parameterisation not only limits the problem of dimensionality, but also makes constraining the solution easier. At this stage of work, the decision was made to test the approach using MT and gravity data, because this combination is often used in practice. It is important to mention that the general solution is not limited to these two methods and is flexible enough to be used with more than two sources of data. The presented results were obtained for synthetic models imitating real geological conditions, where

  5. Pareto optimal pairwise sequence alignment.

    Science.gov (United States)

    DeRonne, Kevin W; Karypis, George

    2013-01-01

    Sequence alignment using evolutionary profiles is a commonly employed tool when investigating a protein. Many profile-profile scoring functions have been developed for use in such alignments, but there has not yet been a comprehensive study of Pareto optimal pairwise alignments for combining multiple such functions. We show that the problem of generating Pareto optimal pairwise alignments has an optimal substructure property, and develop an efficient algorithm for generating Pareto optimal frontiers of pairwise alignments. All possible sets of two, three, and four profile scoring functions are used from a pool of 11 functions and applied to 588 pairs of proteins in the ce_ref data set. The performance of the best objective combinations on ce_ref is also evaluated on an independent set of 913 protein pairs extracted from the BAliBASE RV11 data set. Our dynamic-programming-based heuristic approach produces approximated Pareto optimal frontiers of pairwise alignments that contain comparable alignments to those on the exact frontier, but on average in less than 1/58th the time in the case of four objectives. Our results show that the Pareto frontiers contain alignments whose quality is better than the alignments obtained by single objectives. However, the task of identifying a single high-quality alignment among those in the Pareto frontier remains challenging.

  6. The application of analytical methods to the study of Pareto - optimal control systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2014-01-01

Full Text Available The subject of this article is methods of multicriteria optimization and their application to the parametric synthesis of dual-loop control systems when the individual criteria are in conflict. The basis for solving multicriteria problems is the fundamental principle of multicriteria choice, the Edgeworth-Pareto principle. Obtaining Pareto-optimal variants under conflicting individual criteria does not amount to reaching a final decision; this set is only offered to the designer (the decision maker, DM). An important issue with traditional numerical methods is their computational cost. An example is the use of methods probing the parameter space, including those using uniform grids and uniformly distributed sequences. Computer approximation of the Pareto bounds is a very demanding computational task. The purpose of this work is the development of fairly simple search methods for Pareto-optimal solutions for the case of criteria given in analytical form. The proposed solution is based on a study of the properties of the analytical dependences of the criteria. The case considered, namely a problem topology in which the indifference curves (level lines) do not touch, has not so far been covered in the literature. It is shown that compromise solutions can be identified for such problems. It is proposed to use the angular position of the antigradient to the indifference curves in the parameter space relative to the coordinate axes. Propositions are formulated on the comonotonicity and contramonotonicity characteristics and on the angular characteristics of the antigradient for determining Pareto-optimal solutions. A general computational algorithm is considered: determine the region of admissible parameter values; investigate the comonotonicity and contramonotonicity properties; build the level (indifference) curves; determine the touch type: one-sided (the problem is not strictly multicriteria) or two-sided (the objective relates to the Pareto

  7. Unraveling hadron structure with generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Andrei Belitsky; Anatoly Radyushkin

    2004-10-01

    The recently introduced generalized parton distributions have emerged as a universal tool to describe hadrons in terms of quark and gluonic degrees of freedom. They combine the features of form factors, parton densities and distribution amplitudes, the functions used for a long time in studies of hadronic structure. Generalized parton distributions are analogous to the phase-space Wigner quasi-probability function of non-relativistic quantum mechanics, which encodes full information on a quantum-mechanical system. We give an extensive review of the main achievements in the development of this formalism. We discuss the physical interpretation and basic properties of generalized parton distributions, their modeling and QCD evolution in the leading and next-to-leading orders. We describe how these functions enter a wide class of exclusive reactions, such as electro- and photo-production of photons, lepton pairs, or mesons.

  8. Transmuted New Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Shuaib Khan

    2017-06-01

    Full Text Available This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are formulated, such as the quantile function, Rényi entropy, mean deviations, moments, moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution by using reliability data.
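
    For reference, the quadratic rank transmutation map that this construction relies on takes a baseline CDF $G(x)$ to

        $$ F(x) = (1+\lambda)\,G(x) - \lambda\,G(x)^{2}, \qquad |\lambda| \le 1, $$

    so that $\lambda = 0$ recovers the baseline distribution; the sub-models mentioned in the abstract arise from particular parameter choices.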

  9. Determining the distribution of fitness effects using a generalized Beta-Burr distribution.

    Science.gov (United States)

    Joyce, Paul; Abdo, Zaid

    2017-07-12

    In Beisel et al. (2007), a likelihood framework, based on extreme value theory (EVT), was developed for determining the distribution of fitness effects for adaptive mutations. In this paper we extend this framework beyond the extreme distributions and develop a likelihood framework for testing whether or not extreme value theory applies. By making two simple adjustments to the Generalized Pareto Distribution (GPD) we introduce a new simple five-parameter probability density function that incorporates nearly every common (continuous) probability model ever used. This means that all of the common models are nested, which has important implications for model selection beyond determining the distribution of fitness effects. We demonstrate the use of this distribution via likelihood ratio testing to evaluate alternative distributions to the Gumbel and Weibull domains of attraction of fitness effects. We use a bootstrap strategy, utilizing importance sampling, to determine where in the parameter space the test is most powerful in detecting deviations from these domains and at what sample size, with a focus on small sample sizes (n<20). Our results indicate that the likelihood ratio test is most powerful in detecting deviation from the Gumbel domain when the shape parameters of the model are small, while the test is more powerful in detecting deviations from the Weibull domain when these parameters are large. As expected, an increase in sample size improves the power of the test. This improvement is observed to occur quickly with sample size, n≥10 in tests related to the Gumbel domain and n≥15 in the case of the Weibull domain. This manuscript is in tribute to the contributions of Dr. Paul Joyce to the areas of Population Genetics, Probability Theory and Mathematical Statistics. A Tribute section is provided at the end that includes Paul's original writing in the first iterations of this manuscript. The Introduction and Alternatives to the GPD sections

  10. TOPICS IN THEORY OF GENERALIZED PARTON DISTRIBUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Radyushkin, Anatoly V. [JLAB, Old Dominion U.

    2013-05-01

    Several topics in the theory of generalized parton distributions (GPDs) are reviewed. First, we give a brief overview of the basics of the theory of generalized parton distributions and their relationship with simpler phenomenological functions, viz. form factors, parton densities and distribution amplitudes. Then, we discuss recent developments in building models for GPDs that are based on the formalism of double distributions (DDs). Special attention is given to a careful analysis of the singularity structure of DDs. The DD formalism is applied to the construction of model GPDs with a singular Regge behavior. Within the developed DD-based approach, we discuss the structure of GPD sum rules. It is shown that separation of DDs into the so-called ``plus'' part and the $D$-term part may be treated as a renormalization procedure for the GPD sum rules. This approach is compared with an alternative prescription based on analytic regularization.

  11. An introduction to the Generalized Parton Distributions

    International Nuclear Information System (INIS)

    Michel Garcon

    2002-01-01

    The concepts of Generalized Parton Distributions (GPDs) are reviewed in an introductory and phenomenological fashion. These distributions provide a rich and unifying picture of the nucleon structure. Their physical meaning is discussed. The GPDs are in principle measurable through exclusive deeply virtual production of photons (DVCS) or of mesons (DVMP). Experiments are starting to test the validity of these concepts. First results are discussed and new experimental projects presented, with an emphasis on this program at Jefferson Lab.

  12. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of a combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then DSM's linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
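
    The linear-combination assumption behind DSM admits a very small sketch. The following hypothetical Python fragment (invented names; DSM itself also estimates the mixing coefficient rather than taking it as given) recovers a target distribution from a discrete mixture and a known seed distribution:

        import numpy as np

        def separate(mixture, seed, lam):
            # if mixture = lam * seed + (1 - lam) * target, invert the
            # linear combination, then clip and renormalize to a distribution
            target = (np.asarray(mixture) - lam * np.asarray(seed)) / (1.0 - lam)
            target = np.clip(target, 0.0, None)
            return target / target.sum()

        mixture = np.array([0.40, 0.35, 0.25])
        seed    = np.array([0.70, 0.20, 0.10])   # irrelevance distribution
        print(separate(mixture, seed, lam=0.3))  # estimated relevance distribution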

  13. The generalized double Lomax distribution with applications

    Directory of Open Access Journals (Sweden)

    Abu Seif Mohammad Fares

    2016-12-01

    Full Text Available A new probability distribution from the polynomial family has been proposed for modeling heavy-tailed data that are continuous on the whole real line. We have derived some general properties of this distribution and applied it to several data sets of U.S. stock market daily returns. The introduced model is symmetric and leptokurtic; it outperforms the peer distributions used for the given data from the perspective of information criteria, suggesting a new potential candidate for modeling data exhibiting heavy tails.

  14. Fitting statistical distributions: the generalized lambda distribution and generalized bootstrap methods

    CERN Document Server

    Karian, Zaven A

    2000-01-01

    Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from, all with their own formulas, tables, diagrams, and general properties, continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...

  15. Designing Pareto-superior demand-response rate options

    International Nuclear Information System (INIS)

    Horowitz, I.; Woo, C.K.

    2006-01-01

    We explore three voluntary service options (real-time pricing, time-of-use pricing, and curtailable/interruptible service) that a local distribution company might offer its customers in order to encourage them to alter their electricity usage in response to changes in the electricity spot market price. These options are simple and practical, and make minimal information demands. We show that each of the options is Pareto-superior ex ante, in that it benefits both the participants and the company offering it, while not affecting the non-participants. The options are shown to be Pareto-superior ex post as well, except under certain exceptional circumstances. (author)

  16. COMPROMISE, OPTIMAL AND TRACTIONAL ACCOUNTS ON PARETO SET

    Directory of Open Access Journals (Sweden)

    V. V. Lahuta

    2010-11-01

    Full Text Available The problem of optimum traction calculations is considered as a problem of optimum distribution of a resource. The dynamic programming solution is based on a step-by-step calculation of the set of Pareto-optimal values of a criterion function (energy expenses) and a resource (time).

  17. Improving Patient Schedules by Multi-agent Pareto Appointment Exchanging

    NARCIS (Netherlands)

    I.B. Vermeulen (Ivan); S.M. Bohte (Sander); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2006-01-01

    We present a dynamic and distributed approach to the hospital patient scheduling problem: the multi-agent Pareto-improvement appointment exchanging algorithm, MPAEX. It respects the decentralization of scheduling authorities and is capable of continuously adjusting the different patient

  18. Multi-agent Pareto appointment exchanging in hospital patient scheduling

    NARCIS (Netherlands)

    I.B. Vermeulen (Ivan); S.M. Bohte (Sander); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2007-01-01

    We present a dynamic and distributed approach to the hospital patient scheduling problem, in which patients can have multiple appointments that have to be scheduled to different resources. To efficiently solve this problem we develop a multi-agent Pareto-improvement appointment

  19. Hard exclusive reactions and generalized parton distributions

    Directory of Open Access Journals (Sweden)

    Hayrapetyan Avetik

    2015-01-01

    Full Text Available The recently developed formalism of Generalized Parton Distributions (GPDs) allows connecting the experimental information of hard exclusive reactions to the spin contribution and even to the angular momentum contribution of quarks in the nucleon. By selecting different quantum numbers of the final state in exclusive productions, different GPDs can be addressed separately. The HERMES experiment at the HERA ring at DESY (Hamburg) made pioneering contributions and provided first constraints on Generalized Parton Distributions (GPDs), using hard exclusive vector meson production (EVMP) and Deeply Virtual Compton Scattering (DVCS). Using a novel recoil detector, HERMES managed to measure DVCS and EVMP free of any significant background. Selected results are highlighted and discussed in this paper.

  20. The PARETO RATING Software System for Pareto-approximation Quality Assessment in Multi-criteria Optimization Problems

    Directory of Open Access Journals (Sweden)

    S. V. Groshev

    2014-01-01

    Full Text Available We consider the task of assessing the quality of a numerical approximation of the Pareto set (front) in a multi-criteria optimization (MOC) problem, where the Pareto approximation is obtained by means of some population-based (e.g., genetic) algorithm. Ultimately, the purpose of this work is a comparative assessment of the efficiency of population-based Pareto-approximation algorithms. A great number of characteristics (indicators) of Pareto-approximation quality have been developed, so the problem of assessing Pareto-approximation quality is itself multi-criteria (multi-indicator). A number of well-known software systems solve this assessment problem to varying degrees. Common drawbacks of these systems are the lack of a web interface and the missing support for multi-indicator assessment of Pareto-approximation quality (though they do support computing the values of a large number of individual indicators). The PARETO RATING software system is intended to eliminate these shortcomings. Since population-based Pareto-approximation algorithms are, as a rule, stochastic, we consider statistical methods for assessing the quality of two or more Pareto approximations (and thereby the algorithms used to obtain them), as follows: methods based on ranking the given approximations; methods based on quality indicators; and methods based on so-called empirical attainment functions. We give a formal statement of the MOC problem and a general scheme of the population-based algorithms for its solution, present reviews of known Pareto-approximation quality indicators and of statistical methods for assessing Pareto-approximation quality, describe the system architecture and the main features of its software implementation, and illustrate the efficiency of the algorithmic and software solutions.

  1. A general setting for symmetric distributions and their relationship to general distributions

    OpenAIRE

    Jupp, P.E.; Regoli, G.; Azzalini, A.

    2016-01-01

    A standard method of obtaining non-symmetrical distributions is that of modulating symmetrical distributions by multiplying the densities by a perturbation factor. This has been considered mainly for central symmetry of a Euclidean space about the origin. This paper enlarges the concept of modulation to the general setting of symmetry under the action of a compact topological group on the sample space. The main structural result relates the density of an arbitrary distribution to the density of ...
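
    In the centrally symmetric Euclidean case that this record generalizes, the modulation construction is the familiar skew-symmetric formula

        $$ f(x) = 2\, f_0(x)\, G\{w(x)\}, $$

    where $f_0$ is a density symmetric about the origin, $G$ is the CDF of a symmetric scalar variable, and $w$ is an odd function; the factor 2 keeps $f$ a proper density.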

  2. Pareto optimization in algebraic dynamic programming.

    Science.gov (United States)

    Saule, Cédric; Giegerich, Robert

    2015-01-01

    Pareto optimization combines independent objectives by computing the Pareto front of its search space, defined as the set of all solutions for which no other candidate solution scores better under all objectives. This gives, in a precise sense, better information than an artificial amalgamation of different scores into a single objective, but is more costly to compute. Pareto optimization naturally occurs with genetic algorithms, albeit in a heuristic fashion. Non-heuristic Pareto optimization so far has been used only with a few applications in bioinformatics. We study exact Pareto optimization for two objectives in a dynamic programming framework. We define a binary Pareto product operator [Formula: see text] on arbitrary scoring schemes. Independent of a particular algorithm, we prove that for two scoring schemes A and B used in dynamic programming, the scoring scheme [Formula: see text] correctly performs Pareto optimization over the same search space. We study different implementations of the Pareto operator with respect to their asymptotic and empirical efficiency. Without artificial amalgamation of objectives, and with no heuristics involved, Pareto optimization is faster than computing the same number of answers separately for each objective. For RNA structure prediction under the minimum free energy versus the maximum expected accuracy model, we show that the empirical size of the Pareto front remains within reasonable bounds. Pareto optimization lends itself to the comparative investigation of the behavior of two alternative scoring schemes for the same purpose. For the above scoring schemes, we observe that the Pareto front can be seen as a composition of a few macrostates, each consisting of several microstates that differ in the same limited way. We also study the relationship between abstract shape analysis and the Pareto front, and find that they extract information of a different nature from the folding space and can be meaningfully combined.
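
    The flavor of the Pareto product is easy to convey with a toy fragment. In the sketch below (hypothetical, with invented fronts; the paper's operator acts on full dynamic programming scoring schemes, not bare score lists), each cell of the DP matrix carries a front of (A, B) score pairs, and combining two cells means adding scores component-wise and pruning dominated pairs:

        def prune(front):
            # sort by first score descending (ties: second descending), then
            # sweep, keeping pairs whose second score strictly improves
            best_b, out = float("-inf"), []
            for a, b in sorted(front, key=lambda p: (-p[0], -p[1])):
                if b > best_b:
                    out.append((a, b))
                    best_b = b
            return out

        def pareto_product(front1, front2):
            # combine two cell fronts and reduce back to a Pareto front
            return prune([(a1 + a2, b1 + b2)
                          for a1, b1 in front1 for a2, b2 in front2])

        print(pareto_product([(3, 1), (1, 4)], [(2, 2), (0, 5)]))
        # -> [(5, 3), (3, 6), (1, 9)]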

  3. New model for nucleon generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Radyushkin, Anatoly V. [JLAB, Newport News, VA (United States)

    2014-01-01

    We describe a new type of model for nucleon generalized parton distributions (GPDs) H and E. It is based on the fact that nucleon GPDs require the use of two forms of double distribution (DD) representations. The outcome of the new treatment is that the usual DD+D-term construction should be amended by an extra term, $\xi E_+^1(x,\xi)$, which has the DD structure $\alpha/\beta\, e(\beta,\alpha)$, with $e(\beta,\alpha)$ being the DD that generates the GPD $E(x,\xi)$. We found that this function, unlike the D-term, has support in the whole $-1 \leq x \leq 1$ region. Furthermore, it does not vanish at the border points $|x|=\xi$.

  4. Generalized Parton Distributions and their Singularities

    Energy Technology Data Exchange (ETDEWEB)

    Anatoly Radyushkin

    2011-04-01

    A new approach to building models of generalized parton distributions (GPDs) is discussed that is based on the factorized DD (double distribution) Ansatz within the single-DD formalism. The latter was not used before, because reconstructing GPDs from the forward limit one should start in this case with a very singular function $f(\beta)/\beta$ rather than with the usual parton density $f(\beta)$. This results in a non-integrable singularity at $\beta=0$, exaggerated by the fact that $f(\beta)$'s, on their own, have a singular $\beta^{-a}$ Regge behavior for small $\beta$. It is shown that the singularity is regulated within the GPD model of Szczepaniak et al., in which the Regge behavior is implanted through a subtracted dispersion relation for the hadron-parton scattering amplitude. It is demonstrated that using proper softening of the quark-hadron vertices in the regions of large parton virtualities results in model GPDs $H(x,\xi)$ that are finite and continuous at the ``border point'' $x=\xi$. Using a simple input forward distribution, we illustrate the implementation of the new approach for explicit construction of model GPDs. As a further development, a more general method of regulating the $\beta=0$ singularities is proposed that is based on the separation of the initial single DD $f(\beta, \alpha)$ into the ``plus'' part $[f(\beta,\alpha)]_{+}$ and the $D$-term. It is demonstrated that the ``DD+D'' separation method allows one to (re)derive GPD sum rules that relate the difference between the forward distribution $f(x)=H(x,0)$ and the border function $H(x,x)$ to the $D$-term function $D(\alpha)$.

  5. Existence of pareto equilibria for multiobjective games without compactness

    OpenAIRE

    Shiraishi, Yuya; Kuroiwa, Daishi

    2013-01-01

    In this paper, we investigate the existence of Pareto and weak Pareto equilibria for multiobjective games without compactness. By employing an existence theorem of Pareto equilibria due to Yu and Yuan ([10]), several existence theorems of Pareto and weak Pareto equilibria for the multiobjective games are established in a similar way to Flores-Bazán.

  6. Strong Convergence Bound of the Pareto Index Estimator under Right Censoring

    Directory of Open Access Journals (Sweden)

    Bao Tao

    2010-01-01

    Full Text Available Let $\{X_n, n \geq 1\}$ be a sequence of positive independent and identically distributed random variables with common Pareto-type distribution function $F(x) = 1 - x^{-1/\gamma} l_F(x)$, where $\gamma > 0$ and $l_F(x)$ represents a slowly varying function at infinity. In this note we study the strong convergence bound of a kind of right-censored Pareto index estimator under second-order regularly varying conditions.
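
    For orientation, the uncensored ancestor of the estimator studied here is the classical Hill estimator of the Pareto index. A minimal sketch in its standard textbook form (not the censored variant of this note):

        import numpy as np

        def hill(x, k):
            # mean log-excess of the k largest observations over the
            # (k+1)-th largest: estimates gamma for Pareto-type tails
            xs = np.sort(x)
            return np.mean(np.log(xs[-k:])) - np.log(xs[-k - 1])

        rng = np.random.default_rng(0)
        gamma = 0.5
        x = rng.pareto(1.0 / gamma, size=5000) + 1.0   # survival x^(-1/gamma)
        print(hill(x, k=200))                          # roughly 0.5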

  7. On Generalized Type 1 Logistic Distribution | Ahsanullah | Afrika ...

    African Journals Online (AJOL)

    Some distributional properties of the generalized type 1 logistic distribution are given. Based on these distributional properties, a characterization of this distribution is presented. Key words: Conditional Expectation; Reversed Hazard Rate; Characterization.

  8. On chiral-odd Generalized Parton Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Wallon, Samuel [Laboratoire de Physique Theorique d' Orsay - LPT, Bat. 210, Univ. Paris-Sud 11, 91405 Orsay Cedex (France); UPMC Univ. Paris 6, Paris (France); Pire, Bernard [Centre de Physique Theorique - CPHT, UMR 7644, Ecole Polytechnique, Bat. 6, RDC, F91128 Palaiseau Cedex (France); Szymanowski, Lech [Soltan Institute for Nuclear Studies, Hoza 69, 00691, Warsaw (Poland)

    2010-07-01

    The chiral-odd transversity generalized parton distributions of the nucleon can be accessed experimentally through the exclusive photoproduction process $\gamma + N \to \pi + \rho + N'$, in the kinematics where the meson pair has a large invariant mass and the final nucleon has a small transverse momentum, provided the vector meson is produced in a transversely polarized state. Estimated counting rates show that the experiment is feasible with the real or quasi-real photon beams expected at JLab at 12 GeV and in the COMPASS experiment (Phys. Lett. B 688, 154, 2010). In addition, a consistent classification of the chiral-odd pion GPDs beyond the leading twist 2 is presented. Based on QCD equations of motion and on the invariance under rotation on the light-cone of any scattering amplitude involving such GPDs, we reduce the basis of these chiral-odd GPDs to a minimal set. (author)

  9. Calculating complete and exact Pareto front for multiobjective optimization: a new deterministic approach for discrete problems.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel

    2013-06-01

    Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.

  10. Tractable Pareto Optimization of Temporal Preferences

    Science.gov (United States)

    Morris, Robert; Morris, Paul; Khatib, Lina; Venable, Brent

    2003-01-01

    This paper focuses on temporal constraint problems where the objective is to optimize a set of local preferences for when events occur. In previous work, a subclass of these problems has been formalized as a generalization of Temporal CSPs, and a tractable strategy for optimization has been proposed, where global optimality is defined as maximizing the minimum of the component preference values. This criterion for optimality, which we call 'Weakest Link Optimization' (WLO), is known to have limited practical usefulness because solutions are compared only on the basis of their worst value; thus, there is no requirement to improve the other values. To address this limitation, we introduce a new algorithm that re-applies WLO iteratively in a way that leads to improvement of all the values. We show the value of this strategy by proving that, with suitable preference functions, the resulting solutions are Pareto Optimal.
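
    A toy rendering of the iterated idea over an enumerated candidate set (a sketch of the principle only; the paper works with temporal constraint problems, and these preference vectors are invented): maximize the smallest preference value, freeze that level, and re-optimize the next-smallest, which amounts to a leximin-style selection.

        def iterated_wlo(candidates):
            # candidates: equal-length preference tuples (higher is better)
            pool = list(candidates)
            for level in range(len(pool[0])):
                best = max(sorted(v)[level] for v in pool)   # best weakest link
                pool = [v for v in pool if sorted(v)[level] == best]
            return pool

        # plain WLO cannot separate the first two (both have worst value 0.6);
        # iterating improves the remaining values as well
        prefs = [(0.6, 0.9, 0.7), (0.6, 0.8, 0.9), (0.5, 1.0, 1.0)]
        print(iterated_wlo(prefs))   # -> [(0.6, 0.8, 0.9)]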

  11. Generalized Parton Distributions in chiral perturbation theory

    Energy Technology Data Exchange (ETDEWEB)

    Kivel, Nikolai; Polyakov, Maxim; Vladimirov, Aleksey [Ruhr Universitaet, Bochum (Germany)

    2009-07-01

    We used the $\chi$PT approach to study the small-$t$ behavior of the Generalized Parton Distributions (GPDs). We demonstrate that in the region of Bjorken $x_{Bj} \propto m_\pi^2/(4\pi F_\pi)^2$ and/or $x_{Bj} \propto |t|/(4\pi F_\pi)^2$ the standard $\chi$PT for the pion GPDs is not sufficient and one must perform an all-order resummation of $\chi$PT. We develop the technique needed to sum the problematic contributions with leading logarithmic accuracy. We apply this approach to the pion GPDs and compute their behavior in the region of small $x_{Bj}$. Explicit resummation allows us to reveal a novel phenomenon: the form of the leading chiral correction to the pion PDFs and GPDs depends on the small-$x$ asymptotics of the pion PDFs. In particular, if the pion PDF in the chiral limit has the Regge-like small-$x$ behavior $q(x) \propto 1/x^{\omega}$, the leading large impact parameter ($b_\perp \to \infty$) asymptotics of the quark distribution in the transverse plane has the form (for $m_\pi = 0$) $q(x, b_\perp) \propto \frac{1}{x^{\omega}}\, \frac{\ln^{\omega}(b_\perp^2)}{b_\perp^{2(1+\omega)}}$. This result is model independent and is controlled completely by the all-order resummed $\chi$PT.

  12. Analysis of extreme drinking in patients with alcohol dependence using Pareto regression.

    Science.gov (United States)

    Das, Sourish; Harel, Ofer; Dey, Dipak K; Covault, Jonathan; Kranzler, Henry R

    2010-05-20

    We developed a novel Pareto regression model with an unknown shape parameter to analyze extreme drinking in patients with Alcohol Dependence (AD). We used the generalized linear model (GLM) framework and the log-link to include the covariate information through the scale parameter of the generalized Pareto distribution. We proposed a Bayesian method based on Ridge prior and Zellner's g-prior for the regression coefficients. A simulation study indicated that the proposed Bayesian method performs better than the existing likelihood-based inference for the Pareto regression. We examined two issues of importance in the study of AD. First, we tested whether a single nucleotide polymorphism within the GABRA2 gene, which encodes a subunit of the GABA(A) receptor and has been associated with AD, influences 'extreme' alcohol intake, and second, the efficacy of three psychotherapies for alcoholism in treating extreme drinking behavior. We found an association between extreme drinking behavior and GABRA2. We also found that, at baseline, men with a high-risk GABRA2 allele had a significantly higher probability of extreme drinking than men with no high-risk allele. However, men with a high-risk allele responded to the therapy better than those with two copies of the low-risk allele. Women with high-risk alleles also responded to the therapy better than those with two copies of the low-risk allele, while women who received the cognitive behavioral therapy had better outcomes than those receiving either of the other two therapies. Among men, motivational enhancement therapy was the best for the treatment of the extreme drinking behavior. Copyright 2010 John Wiley & Sons, Ltd.
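
    A likelihood-based sketch of the model's core, hedged: the authors' inference is Bayesian with Ridge and g-priors, whereas this hypothetical fragment only fits the GPD-with-log-link skeleton by maximum likelihood on simulated data (all names and values are illustrative):

        import numpy as np
        from scipy.optimize import minimize

        def gpd_reg_nll(params, y, Z):
            # shape xi plus regression coefficients; log-link on the scale
            xi, beta = params[0], params[1:]
            sigma = np.exp(Z @ beta)
            u = 1.0 + xi * y / sigma
            if np.any(u <= 0.0):
                return np.inf                     # outside the GPD support
            return np.sum(np.log(sigma) + (1.0 / xi + 1.0) * np.log(u))

        rng = np.random.default_rng(1)
        n = 500
        Z = np.column_stack([np.ones(n), rng.normal(size=n)])
        xi_true, beta_true = 0.3, np.array([0.2, 0.5])
        sigma = np.exp(Z @ beta_true)
        u = rng.uniform(size=n)
        y = sigma * ((1.0 - u) ** (-xi_true) - 1.0) / xi_true   # GPD inverse CDF
        fit = minimize(gpd_reg_nll, x0=np.array([0.1, 0.0, 0.0]),
                       args=(y, Z), method="Nelder-Mead")
        print(fit.x)   # approximately [0.3, 0.2, 0.5]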

  13. Analysis of Extreme Drinking in Patients with Alcohol Dependence Using Pareto Regression†

    Science.gov (United States)

    Das, Sourish; Harel, Ofer; Dey, Dipak K.; Covault, Jonathan; Kranzler, Henry R.

    2010-01-01

    We developed a novel Pareto regression model with an unknown shape parameter to analyze extreme drinking in patients with Alcohol Dependence (AD). We used the generalized linear model (GLM) framework and the log-link to include the covariate information through the scale parameter of the generalized Pareto distribution. We proposed a Bayesian method based on Ridge prior and Zellner's g-prior for the regression coefficients. A simulation study indicated that the proposed Bayesian method performs better than the existing likelihood-based inference for the Pareto regression. We examined two issues of importance in the study of AD. First, we tested whether a single nucleotide polymorphism within the GABRA2 gene, which encodes a subunit of the GABA(A) receptor and has been associated with AD, influences 'extreme' alcohol intake, and second, the efficacy of three psychotherapies for alcoholism in treating extreme drinking behavior. We found an association between extreme drinking behavior and GABRA2. We also found that, at baseline, men with a high-risk GABRA2 allele had a significantly higher probability of extreme drinking than men with no high-risk allele. However, men with a high-risk allele responded to the therapy better than those with two copies of the low-risk allele. Among men, cognitive behavioral therapy was the worst for the treatment of extreme drinking behavior. Women with high-risk alleles also responded to the therapy better than those with two copies of the low-risk allele, while women who received the cognitive behavioral therapy had better outcomes than those receiving either of the other two therapies. PMID:20225194

  14. Feasibility of identification of Gamma Knife planning strategies by identification of Pareto optimal Gamma Knife plans.

    Science.gov (United States)

    Giller, C A

    2011-12-01

    The use of conformity indices to optimize Gamma Knife planning is common, but does not address important tradeoffs between dose to tumor and normal tissue. Pareto analysis has been used for this purpose in other applications, but not for Gamma Knife (GK) planning. The goal of this work is to use computer models to show that Pareto analysis may be feasible for GK planning to identify dosimetric tradeoffs. We define a GK plan A to be Pareto dominant to B if the prescription isodose volume of A covers more tumor but not more normal tissue than B, or if A covers less normal tissue but not less tumor than B. A plan is Pareto optimal if it is not dominated by any other plan. Two different Pareto optimal plans represent different tradeoffs between dose to tumor and normal tissue, because neither plan dominates the other. 'GK simulator' software calculated dose distributions for GK plans and was called repeatedly by a genetic algorithm to calculate Pareto dominant plans. Three irregular tumor shapes were tested in 17 trials using various combinations of shots. The mean number of Pareto dominant plans per trial was 59 ± 17 (SD). Different planning strategies were identified by large differences in shot positions, and 70 of the 153 coordinate plots (46%) showed differences of 5 mm or more. The Pareto dominant plans dominated other nearby plans. Pareto dominant plans represent different dosimetric tradeoffs and can be systematically calculated using genetic algorithms. Automatic identification of non-intuitive planning strategies may be feasible with these methods.

  15. Pareto Optimal Solution Analysis of Convex Multi-Objective Programming Problem

    OpenAIRE

    Li Guo Zhang; Hua Zuo

    2013-01-01

    The main method of solving a multi-objective programming problem is to convert it into a single-objective programming problem and then obtain a Pareto optimal solution. Whether all Pareto optimal solutions can be obtained by an appropriate method is another question; in general, the answer is negative. In this paper, the methods of norm ideal point and membership function are used to solve the multi-objective programming problem. In the norm ideal point method, norm and ideal point are giv...

  16. Characterization through distributional properties of dual generalized order statistics

    Directory of Open Access Journals (Sweden)

    A.H. Khan

    2012-10-01

    Full Text Available Distributional properties of two non-adjacent dual generalized order statistics have been used to characterize distributions. Further, one-sided contraction and dilation for the dual generalized order statistics are discussed, and the results are then deduced for generalized order statistics, order statistics, lower record statistics, upper record statistics and adjacent dual generalized order statistics.

  17. An approach to multiobjective optimization of rotational therapy. II. Pareto optimal surfaces and linear combinations of modulated blocked arcs for a prostate geometry.

    Science.gov (United States)

    Pardo-Montero, Juan; Fenwick, John D

    2010-06-01

    The purpose of this work is twofold: To further develop an approach to multiobjective optimization of rotational therapy treatments recently introduced by the authors [J. Pardo-Montero and J. D. Fenwick, "An approach to multiobjective optimization of rotational therapy," Med. Phys. 36, 3292-3303 (2009)], especially regarding its application to realistic geometries, and to study the quality (Pareto optimality) of plans obtained using such an approach by comparing them with Pareto optimal plans obtained through inverse planning. In the previous work of the authors, a methodology is proposed for constructing a large number of plans, with different compromises between the objectives involved, from a small number of geometrically based arcs, each arc prioritizing different objectives. Here, this method has been further developed and studied. Two different techniques for constructing these arcs are investigated, one based on image-reconstruction algorithms and the other based on more common gradient-descent algorithms. The difficulty of dealing with organs abutting the target, briefly reported in previous work of the authors, has been investigated using partial OAR unblocking. Optimality of the solutions has been investigated by comparison with a Pareto front obtained from inverse planning. A relative Euclidean distance has been used to measure the distance of these plans to the Pareto front, and dose volume histogram comparisons have been used to gauge the clinical impact of these distances. A prostate geometry has been used for the study. For geometries where a blocked OAR abuts the target, moderate OAR unblocking can substantially improve target dose distribution and minimize hot spots while not overly compromising dose sparing of the organ. Image-reconstruction type and gradient-descent blocked-arc computations generate similar results. The Pareto front for the prostate geometry, reconstructed using a large number of inverse plans, presents a hockey-stick shape

  18. Kinetics of wealth and the Pareto law.

    Science.gov (United States)

    Boghosian, Bruce M

    2014-04-01

    An important class of economic models involve agents whose wealth changes due to transactions with other agents. Several authors have pointed out an analogy with kinetic theory, which describes molecules whose momentum and energy change due to interactions with other molecules. We pursue this analogy and derive a Boltzmann equation for the time evolution of the wealth distribution of a population of agents for the so-called Yard-Sale Model of wealth exchange. We examine the solutions to this equation by a combination of analytical and numerical methods and investigate its long-time limit. We study an important limit of this equation for small transaction sizes and derive a partial integrodifferential equation governing the evolution of the wealth distribution in a closed economy. We then describe how this model can be extended to include features such as inflation, production, and taxation. In particular, we show that the model with taxation exhibits the basic features of the Pareto law, namely, a lower cutoff to the wealth density at small values of wealth, and approximate power-law behavior at large values of wealth.
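
    The Yard-Sale dynamics are simple to simulate directly, which complements the record's Boltzmann-equation treatment. A hypothetical agent-based sketch (parameter names and the flat redistribution scheme are illustrative choices, not the paper's exact taxation model):

        import numpy as np

        def yard_sale(n=1000, steps=100_000, frac=0.1, tax=0.0, seed=0):
            # each step: a fair coin moves a fraction of the POORER agent's
            # wealth; an optional proportional levy is redistributed flat
            rng = np.random.default_rng(seed)
            w = np.ones(n)
            for _ in range(steps):
                i, j = rng.integers(n, size=2)
                if i == j:
                    continue
                dw = frac * min(w[i], w[j])
                if rng.random() < 0.5:
                    w[i] += dw; w[j] -= dw
                else:
                    w[i] -= dw; w[j] += dw
                if tax > 0.0:
                    levy = tax * w
                    w = w - levy + levy.sum() / n
            return w

        w = yard_sale(tax=0.01)
        print(np.sort(w)[-5:])   # with tax > 0, top wealths stop running away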

  20. Geometric max stability of Pareto random variables

    OpenAIRE

    Juozulynaitė, Gintarė

    2010-01-01

    In this work I studied the geometric max stability of univariate and bivariate Pareto random variables. I proved that the univariate Pareto distribution is geometrically max stable when alpha=1, but it is not geometrically max stable when alpha is not equal to 1. Using the criterion of geometric max stability for bivariate Pareto random variables, I proved that the bivariate Pareto distribution function is not geometrically max stable when the components of the vector are independent (when alpha=1, beta=1 and alpha not equal to 1...

  1. Project management under uncertainty beyond beta: The generalized bicubic distribution

    Directory of Open Access Journals (Sweden)

    José García Pérez

    2016-01-01

    Full Text Available The beta distribution has traditionally been employed in the PERT methodology and is generally used for modeling bounded continuous random variables based on expert judgment. The impossibility of estimating four parameters from the three values provided by the expert when the beta distribution is assumed to be the underlying distribution has been widely debated. This paper presents the generalized bicubic distribution as a good alternative to the beta distribution since, when the variance depends on the mode, the generalized bicubic distribution approximates the kurtosis of the Gaussian distribution better than the beta distribution. In addition, this distribution presents good properties in the PERT methodology in relation to moderation and conservatism criteria. Two empirical applications are presented to demonstrate the adequacy of this new distribution.
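
    For context, the classical PERT estimates computed from the expert's three values (minimum $a$, most likely value $m$, maximum $b$) under the traditional beta assumption are

        $$ \mathrm{E}[X] = \frac{a + 4m + b}{6}, \qquad \mathrm{Var}[X] = \frac{(b-a)^2}{36}, $$

    and the debate summarized above concerns which four-parameter family these three elicited values should pin down.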

  2. Multiobjective Optimization of Linear Cooperative Spectrum Sensing: Pareto Solutions and Refinement.

    Science.gov (United States)

    Yuan, Wei; You, Xinge; Xu, Jing; Leung, Henry; Zhang, Tianhang; Chen, Chun Lung Philip

    2016-01-01

    In linear cooperative spectrum sensing, the weights of secondary users and the detection threshold should be optimally chosen to minimize missed detection probability and to maximize secondary network throughput. Since these two objectives are not completely compatible, we study this problem from the viewpoint of multiple-objective optimization. We aim to obtain a set of evenly distributed Pareto solutions. To this end, we introduce the normal constraint (NC) method to transform the problem into a set of single-objective optimization (SOO) problems. Each SOO problem usually results in a Pareto solution. However, NC does not provide any solution method to these SOO problems, nor any indication of the optimal number of Pareto solutions. Furthermore, NC has no preference over all Pareto solutions, while a designer may be only interested in some of them. In this paper, we employ a stochastic global optimization algorithm to solve the SOO problems, and then propose a simple method to determine the optimal number of Pareto solutions under a computational complexity constraint. In addition, we extend NC to refine the Pareto solutions and select the ones of interest. Finally, we verify the effectiveness and efficiency of the proposed methods through computer simulations.

  3. Diversity comparison of Pareto front approximations in many-objective optimization.

    Science.gov (United States)

    Li, Miqing; Yang, Shengxiang; Liu, Xiaohui

    2014-12-01

    Diversity assessment of Pareto front approximations is an important issue in the stochastic multiobjective optimization community. Most of the diversity indicators in the literature were designed to work for any number of objectives of Pareto front approximations in principle, but in practice many of these indicators are infeasible or not workable when the number of objectives is large. In this paper, we propose a diversity comparison indicator (DCI) to assess the diversity of Pareto front approximations in many-objective optimization. DCI evaluates relative quality of different Pareto front approximations rather than provides an absolute measure of distribution for a single approximation. In DCI, all the concerned approximations are put into a grid environment so that there are some hyperboxes containing one or more solutions. The proposed indicator only considers the contribution of different approximations to nonempty hyperboxes. Therefore, the computational cost does not increase exponentially with the number of objectives. In fact, the implementation of DCI is of quadratic time complexity, which is fully independent of the number of divisions used in grid. Systematic experiments are conducted using three groups of artificial Pareto front approximations and seven groups of real Pareto front approximations with different numbers of objectives to verify the effectiveness of DCI. Moreover, a comparison with two diversity indicators used widely in many-objective optimization is made analytically and empirically. Finally, a parametric investigation reveals interesting insights of the division number in grid and also offers some suggested settings to the users with different preferences.
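
    The grid step that keeps the cost polynomial is the part worth sketching. The fragment below is a toy, hypothetical rendering of the binning idea only (it computes a naive hyperbox-coverage share, not the published DCI formula; all names are invented):

        import numpy as np

        def boxes(points, lo, hi, div):
            # map objective vectors to integer hyperbox coordinates
            idx = np.floor((np.asarray(points) - lo) / (hi - lo + 1e-12) * div)
            return {tuple(r) for r in np.clip(idx, 0, div - 1).astype(int)}

        def coverage(approximations, div=10):
            # score each approximation by the share of jointly occupied
            # hyperboxes it reaches (a relative comparison, in DCI's spirit)
            pts = np.vstack(approximations)
            lo, hi = pts.min(axis=0), pts.max(axis=0)
            bs = [boxes(a, lo, hi, div) for a in approximations]
            occupied = set().union(*bs)
            return [len(b) / len(occupied) for b in bs]

        rng = np.random.default_rng(2)
        A, B = rng.random((50, 6)), rng.random((20, 6))   # 6 objectives
        print(coverage([A, B]))

    Because only nonempty hyperboxes are stored (never more than the number of points), the cost does not grow exponentially with the number of objectives, which is the abstract's central point.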

  4. Size-biased distributions in the generalized beta distribution family, with applications to forestry

    Science.gov (United States)

    Mark J. Ducey; Jeffrey H. Gove

    2015-01-01

    Size-biased distributions arise in many forestry applications, as well as other environmental, econometric, and biomedical sampling problems. We examine the size-biased versions of the generalized beta of the first kind, generalized beta of the second kind and generalized gamma distributions. These distributions include, as special cases, the Dagum (Burr Type III),...

  5. Pareto-optimal phylogenetic tree reconciliation.

    Science.gov (United States)

    Libeskind-Hadas, Ran; Wu, Yi-Chieh; Bansal, Mukul S; Kellis, Manolis

    2014-06-15

    Phylogenetic tree reconciliation is a widely used method for reconstructing the evolutionary histories of gene families and species, hosts and parasites and other dependent pairs of entities. Reconciliation is typically performed using maximum parsimony, in which each evolutionary event type is assigned a cost and the objective is to find a reconciliation of minimum total cost. It is generally understood that reconciliations are sensitive to event costs, but little is understood about the relationship between event costs and solutions. Moreover, choosing appropriate event costs is a notoriously difficult problem. We address this problem by giving an efficient algorithm for computing Pareto-optimal sets of reconciliations, thus providing the first systematic method for understanding the relationship between event costs and reconciliations. This, in turn, results in new techniques for computing event support values and, for cophylogenetic analyses, performing robust statistical tests. We provide new software tools and demonstrate their use on a number of datasets from evolutionary genomic and cophylogenetic studies. Our Python tools are freely available at www.cs.hmc.edu/∼hadas/xscape. © The Author 2014. Published by Oxford University Press.

  6. Building a generalized distributed system model

    Science.gov (United States)

    Mukkamala, R.

    1993-01-01

    The key elements in the 1992-93 period of the project are the following: (1) extensive use of the simulator to implement and test concurrency control algorithms, an interactive user interface, and replica control algorithms; and (2) investigations into the applicability of data and process replication in real-time systems. In the 1993-94 period of the project, we intend to accomplish the following: (1) concentrate on efforts to investigate the effects of data and process replication on hard and soft real-time systems; especially, we will concentrate on the impact of semantic-based consistency control schemes on a distributed real-time system in terms of improved reliability, improved availability, better resource utilization, and reduced missed task deadlines; and (2) use the prototype to verify the theoretically predicted performance of locking protocols, etc.

  7. Pareto fronts in clinical practice for Pinnacle.

    Science.gov (United States)

    Janssen, Tomas; van Kesteren, Zdenko; Franssen, Gijs; Damen, Eugène; van Vliet, Corine

    2013-03-01

    Our aim was to develop a framework to objectively perform treatment planning studies using Pareto fronts. The Pareto front represents all optimal possible tradeoffs among several conflicting criteria and is an ideal tool with which to study the possibilities of a given treatment technique. The framework should require minimal user interaction and should resemble and be applicable to daily clinical practice. To generate the Pareto fronts, we used the native scripting language of Pinnacle(3) (Philips Healthcare, Andover, MA). The framework generates thousands of plans automatically from which the Pareto front is generated. As an example, the framework is applied to compare intensity modulated radiation therapy (IMRT) with volumetric modulated arc therapy (VMAT) for prostate cancer patients. For each patient and each technique, 3000 plans are generated, resulting in a total of 60,000 plans. The comparison is based on 5-dimensional Pareto fronts. Generating 3000 plans for 10 patients in parallel requires on average 96 h for IMRT and 483 h for VMAT. Using VMAT, compared to IMRT, the maximum dose of the boost PTV was reduced by 0.4 Gy (P=.074), the mean dose in the anal sphincter by 1.6 Gy (P=.055), the conformity index of the 95% isodose (CI(95%)) by 0.02 (P=.005), and the rectal wall V(65 Gy) by 1.1% (P=.008). We showed the feasibility of automatically generating Pareto fronts with Pinnacle(3). Pareto fronts provide a valuable tool for performing objective comparative treatment planning studies. We compared VMAT with IMRT in prostate patients and found VMAT had a dosimetric advantage over IMRT. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Pareto Fronts in Clinical Practice for Pinnacle

    International Nuclear Information System (INIS)

    Janssen, Tomas; Kesteren, Zdenko van; Franssen, Gijs; Damen, Eugène; Vliet, Corine van

    2013-01-01

    Purpose: Our aim was to develop a framework to objectively perform treatment planning studies using Pareto fronts. The Pareto front represents all optimal possible tradeoffs among several conflicting criteria and is an ideal tool with which to study the possibilities of a given treatment technique. The framework should require minimal user interaction and should resemble and be applicable to daily clinical practice. Methods and Materials: To generate the Pareto fronts, we used the native scripting language of Pinnacle(3) (Philips Healthcare, Andover, MA). The framework generates thousands of plans automatically from which the Pareto front is generated. As an example, the framework is applied to compare intensity modulated radiation therapy (IMRT) with volumetric modulated arc therapy (VMAT) for prostate cancer patients. For each patient and each technique, 3000 plans are generated, resulting in a total of 60,000 plans. The comparison is based on 5-dimensional Pareto fronts. Results: Generating 3000 plans for 10 patients in parallel requires on average 96 h for IMRT and 483 h for VMAT. Using VMAT, compared to IMRT, the maximum dose of the boost PTV was reduced by 0.4 Gy (P=.074), the mean dose in the anal sphincter by 1.6 Gy (P=.055), the conformity index of the 95% isodose (CI(95%)) by 0.02 (P=.005), and the rectal wall V(65 Gy) by 1.1% (P=.008). Conclusions: We showed the feasibility of automatically generating Pareto fronts with Pinnacle(3). Pareto fronts provide a valuable tool for performing objective comparative treatment planning studies. We compared VMAT with IMRT in prostate patients and found VMAT had a dosimetric advantage over IMRT

  9. Modified Stieltjes Transform and Generalized Convolutions of Probability Distributions

    Directory of Open Access Journals (Sweden)

    Lev B. Klebanov

    2018-01-01

    Full Text Available The classical Stieltjes transform is modified in such a way as to generalize both Stieltjes and Fourier transforms. This transform allows the introduction of new classes of commutative and non-commutative generalized convolutions. A particular case of such a convolution for degenerate distributions appears to be the Wigner semicircle distribution.

  10. Generalized parton distributions for the nucleon in chiral perturbation theory

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, M. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Manashov, A. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik]|[Sankt-Petersburg State Univ. (Russian Federation). Dept. of Theoretical Physics; Schaefer, A. [Sankt-Petersburg State Univ. (Russian Federation). Dept. of Theoretical Physics

    2006-11-15

    We complete the analysis of twist-two generalized parton distributions of the nucleon in one-loop order of heavy-baryon chiral perturbation theory. Extending our previous study of the chiral-even isosinglet sector, we give results for chiral-even isotriplet distributions and for the chiral-odd sector. We also calculate the one-loop corrections for the chiral-odd generalized parton distributions of the pion. (orig.)

  12. Post Pareto optimization-A case

    Science.gov (United States)

    Popov, Stoyan; Baeva, Silvia; Marinova, Daniela

    2017-12-01

    Simulation performance may be evaluated according to multiple quality measures that are in competition, and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it with multiple stochastic quality measures. We formulate the performance criteria of this use-case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto frontier, we analyze it and prescribe preference-dependent configurations for optimal simulation training.

  13. Exponentiated Transmuted Generalized Rayleigh Distribution: A New Four Parameter Rayleigh Distribution

    Directory of Open Access Journals (Sweden)

    Ahmed Z. Afify

    2015-04-01

    Full Text Available This paper introduces a new four parameter Rayleigh distribution as a generalization of the transmuted generalized Rayleigh distribution introduced by Merovci (2014). The new distribution is referred to as the exponentiated transmuted generalized Rayleigh distribution (ETGRD). Various structural properties of the new distribution, including moments, quantiles, the moment generating function and the Rényi entropy of the subject distribution, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. Two real data sets are used to compare the flexibility of the new model with that of its sub-models.

  14. TopN-Pareto Front Search

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-21

    The JMP Add-In TopN-PFS provides an automated tool for finding layered Pareto fronts to identify the top N solutions from an enumerated list of candidates subject to optimizing multiple criteria. The approach constructs the N layers of Pareto fronts and then provides a suite of graphical tools to explore the alternatives based on different prioritizations of the criteria. The tool is designed to provide a set of alternatives from which the decision-maker can select the best option for their study goals.

  15. Applications of Skew Models Using Generalized Logistic Distribution

    Directory of Open Access Journals (Sweden)

    Pushpa Narayan Rathie

    2016-04-01

    Full Text Available We use the skew distribution generation procedure proposed by Azzalini [Scand. J. Stat., 1985, 12, 171–178] to create three new probability distribution functions. These models make use of the normal, Student-t and generalized logistic distributions; see Rathie and Swamee [Technical Research Report No. 07/2006. Department of Statistics, University of Brasilia: Brasilia, Brazil, 2006]. Expressions for the moments about the origin are derived. Graphical illustrations are also provided. The distributions derived in this paper can be seen as generalizations of the distributions given by Nadarajah and Kotz [Acta Appl. Math., 2006, 91, 1–37]. Applications with unimodal and bimodal data are given to illustrate the applicability of the results derived in this paper. The applications include the analysis of the following data sets: (a) spending on public education in various countries in 2003; (b) total expenditure on health in various countries in 2009; and (c) waiting time between eruptions of the Old Faithful Geyser in Yellowstone National Park, Wyoming, USA. We compare the fit of the distributions introduced in this paper with the distributions given by Nadarajah and Kotz [Acta Appl. Math., 2006, 91, 1–37]. The results show that our distributions, in general, fit the data sets better. The general R codes for fitting the distributions introduced in this paper are given in Appendix A.
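
    The modulation recipe also yields a one-line sampler. A minimal sketch for the skew-normal special case (the standard Azzalini sign-flip representation; the record's generalized-logistic variants would substitute other symmetric bases, and the function name is invented):

        import numpy as np

        def skew_normal(alpha, size, seed=0):
            # density 2*phi(x)*Phi(alpha*x): draw X, T ~ N(0,1) independently,
            # return X when T <= alpha*X, otherwise the reflected value -X
            rng = np.random.default_rng(seed)
            x = rng.normal(size=size)
            t = rng.normal(size=size)
            return np.where(t <= alpha * x, x, -x)

        z = skew_normal(alpha=5.0, size=10_000)
        print(z.mean())   # positive alpha pulls mass to the right of 0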

  16. Tsallis distribution as a standard maximum entropy solution with 'tail' constraint

    International Nuclear Information System (INIS)

    Bercher, J.-F.

    2008-01-01

    We show that Tsallis' distributions can be derived from the standard (Shannon) maximum entropy setting by incorporating a constraint on the divergence between the distribution and another distribution imagined as its tail. In this setting, we find that the underlying entropy is the Rényi entropy. Furthermore, escort distributions and generalized means appear as a direct consequence of the construction. Finally, the 'maximum entropy tail distribution' is identified as a generalized Pareto distribution.
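
    For reference, the 'maximum entropy tail distribution' identified above can be written in the standard generalized Pareto form; this is the textbook parameterization with shape ξ and scale σ, and the paper's own notation may differ:

    ```latex
    f(x \mid \xi, \sigma) = \frac{1}{\sigma}\left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi - 1},
    \qquad x \ge 0, \; \sigma > 0,
    % which reduces to the exponential density \sigma^{-1} e^{-x/\sigma} as \xi \to 0.
    ```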

  17. Generalized parton distributions and deep virtual Compton scattering

    International Nuclear Information System (INIS)

    Hasell, D.; Milner, R.; Takase, K.

    2001-01-01

    A brief description of generalized parton distributions is presented together with a discussion on studying such distributions via deep virtual Compton scattering. The kinematics, estimates of rates, and accuracies achievable for measuring DVCS utilizing a 5+50 GeV ep collider are also provided

  18. Generalized Stacy-Lindley mixture distribution | Maya | Afrika Statistika

    African Journals Online (AJOL)

    In this paper, we introduce a five parameter extension of mixture of two Stacy gamma distributions called generalized Stacy- Lindley mixture distribution. Several statistical properties are derived. Two types of estimation techniques are used for estimating the parameters. Asymptotic confidence interval is also calculated for ...

  19. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of homogeneity and scale hypothesis when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed exclusively. The photoemulsion experiment analysis and scale test with missing time-to-failure observations are present to illustrate the applications of methods discussed

  20. How Well Do We Know Pareto Optimality?

    Science.gov (United States)

    Mathur, Vijay K.

    1991-01-01

    Identifies sources of ambiguity in economics textbooks' discussion of the condition for efficient output mix. Points out that diverse statements without accompanying explanations create confusion among students. Argues that conflicting views concerning the concept of Pareto optimality are one source of ambiguity. Suggests clarifying additions to…

  1. Performance-based Pareto optimal design

    NARCIS (Netherlands)

    Sariyildiz, I.S.; Bittermann, M.S.; Ciftcioglu, O.

    2008-01-01

    A novel approach for performance-based design is presented, where Pareto optimality is pursued. Design requirements may contain linguistic information, which is difficult to bring into computation or make consistent their impartial estimations from case to case. Fuzzy logic and soft computing are

  2. Generalized parton distributions for the pion in chiral perturbation theory

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, M.; Manashov, A. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik]|[Sankt-Peterburgskij Univ., St. Petersburg (Russian Federation). Kafedra Teoreticheskoj Fiziki; Schaefer, A. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik

    2005-05-01

    Generalized parton distributions provide a unified parameterization of hadron structure and allow one to combine information from many different observables. Lattice QCD calculations already provide important input to determine these distributions and hold the promise to become even more important in the future. To this end, a reliable extrapolation of lattice calculations to the physical quark and pion masses is needed. We present an analysis for the moments of generalized parton distributions of the pion in one-loop order of chiral perturbation theory. (orig.)

  3. Generalized parton distributions for the pion in chiral perturbation theory

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, M. [Deutsches Elektronen-Synchroton DESY, D-22603 Hamburg (Germany); Manashov, A. [Institut fuer Theoretische Physik, Universitaet Regensburg, D-93040 Regensburg (Germany) and Department of Theoretical Physics, Sankt-Petersburg State University, St. Petersburg (Russian Federation)]. E-mail: alexander.manashov@physik.uni-regensburg.de; Schaefer, A. [Institut fuer Theoretische Physik, Universitaet Regensburg, D-93040 Regensburg (Germany)

    2005-08-25

    Generalized parton distributions provide a unified parameterization of hadron structure and allow one to combine information from many different observables. Lattice QCD calculations already provide important input to determine these distributions and hold the promise to become even more important in the future. To this end, a reliable extrapolation of lattice calculations to the physical quark and pion masses is needed. We present an analysis for the moments of generalized parton distributions of the pion in one-loop order of chiral perturbation theory.

  4. The Pareto Analysis for Establishing Content Criteria in Surgical Training.

    Science.gov (United States)

    Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N

    2016-01-01

    Current surgical training is still highly dependent on expensive operating room (OR) experience. Although there have been many attempts to transfer more training to the skills laboratory, little research is focused on which technical behaviors can lead to the highest profit when they are trained outside the OR. The Pareto principle states that in any population that contributes to a common effect, a few account for the bulk of the effect. This principle has been widely used in business management to increase company profits. This study uses the Pareto principle for establishing content criteria for more efficient surgical training. A retrospective study was conducted to assess verbal guidance provided by 9 supervising surgeons to 12 trainees performing 64 laparoscopic cholecystectomies in the OR. The verbal corrections were documented, tallied, and clustered according to the aimed change in novice behavior. The corrections were rank ordered, and a cumulative distribution curve was used to calculate which corrections accounted for 80% of the total number of verbal corrections. In total, 253 different verbal corrections were uttered 1587 times and were categorized into 40 different clusters of aimed changes in novice behaviors. The 35 highest-ranking verbal corrections (14%) and the 11 highest-ranking clusters (28%) accounted for 80% of the total number of given verbal corrections. Following the Pareto principle, we were able to identify the aspects of trainee behavior that account for most corrections given by supervisors during a laparoscopic cholecystectomy on humans. This strategy can be used for the development of new training programs to prepare the trainee in advance for the challenges encountered in the clinical setting in an OR. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
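
    The cumulative-distribution step described above amounts to a standard Pareto analysis: rank-order the tallied corrections and keep the smallest set accounting for 80% of the total. A minimal sketch follows; the correction labels and counts are made up for illustration, not taken from the study.

    ```python
    def pareto_cutoff(counts: dict, threshold: float = 0.80) -> list:
        """Return the top-ranked items that together account for `threshold` of the total."""
        ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
        total = sum(c for _, c in ranked)
        cumulative, selected = 0, []
        for item, c in ranked:
            cumulative += c
            selected.append(item)
            if cumulative / total >= threshold:
                break
        return selected

    corrections = {"angle the instrument": 240, "expose the duct": 180,
                   "retract the liver": 90, "hold the camera steady": 60,
                   "check the clip": 30}
    print(pareto_cutoff(corrections))  # the few corrections covering 80% of the tally
    ```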

  5. Improving patient activity schedules by multi-agent Pareto appointment exchanging

    NARCIS (Netherlands)

    I.B. Vermeulen (Ivan); S.M. Bohte (Sander); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2006-01-01

    textabstractWe present a dynamic and distributed approach to the hospital patient scheduling problem: the multi-agent Pareto-improvement appointment exchanging algorithm, MPAEX. It respects the decentralization of scheduling authorities and is capable of continuously adjusting the different patient

  6. Determination of Pareto frontier in multi-objective maintenance optimization

    International Nuclear Information System (INIS)

    Certa, Antonella; Galante, Giacomo; Lupo, Toni; Passannanti, Gianfranco

    2011-01-01

    The objective of a maintenance policy generally is the global maintenance cost minimization that involves not only the direct costs for both the maintenance actions and the spare parts, but also those ones due to the system stop for preventive maintenance and the downtime for failure. For some operating systems, the failure event can be dangerous so that they are asked to operate assuring a very high reliability level between two consecutive fixed stops. The present paper attempts to individuate the set of elements on which performing maintenance actions so that the system can assure the required reliability level until the next fixed stop for maintenance, minimizing both the global maintenance cost and the total maintenance time. In order to solve the previous constrained multi-objective optimization problem, an effective approach is proposed to obtain the best solutions (that is the Pareto optimal frontier) among which the decision maker will choose the more suitable one. As well known, describing the whole Pareto optimal frontier generally is a troublesome task. The paper proposes an algorithm able to rapidly overcome this problem and its effectiveness is shown by an application to a case study regarding a complex series-parallel system.

  7. Evaluation of Preanalytical Quality Indicators by Six Sigma and Pareto's Principle.

    Science.gov (United States)

    Kulkarni, Sweta; Ramesh, R; Srinivasan, A R; Silvia, C R Wilma Delphine

    2018-01-01

    Preanalytical steps are the major sources of error in the clinical laboratory. Analytical errors can be corrected by quality control procedures, but there is a need for stringent quality checks in the preanalytical area, as these processes are carried out outside the laboratory. The sigma value depicts the performance of a laboratory and its quality measures. Hence, in the present study, six sigma and the Pareto principle were applied to preanalytical quality indicators to evaluate the performance of the clinical biochemistry laboratory. This observational study was carried out for a period of 1 year, from November 2015 to November 2016. A total of 144,208 samples and 54,265 test requisition forms were screened for preanalytical errors such as missing patient information, missing sample collection details in forms, and hemolysed, lipemic, inappropriate or insufficient samples, and the total number of errors was calculated and converted into defects per million and the sigma scale. A Pareto chart was drawn using the total number of errors and the cumulative percentage. In 75% of the test requisition forms the diagnosis was not mentioned, and a sigma value of 0.9 was obtained; for other errors such as sample receiving time, stat requests and type of sample, the sigma values were 2.9, 2.6, and 2.8, respectively. For insufficient samples and improper ratio of blood to anticoagulant the sigma value was 4.3. The Pareto chart shows that, of the 80% of errors in requisition forms, 20% is contributed by missing information such as the diagnosis. The development of quality indicators and the application of six sigma and the Pareto principle are quality measures by which not only the preanalytical phase but the total testing process can be improved.
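
    A minimal sketch of the defects-per-million to sigma-scale conversion used in six sigma work is given below. The conventional 1.5-sigma long-term shift is assumed; the study's exact conversion table may differ, so the example only roughly matches the figures reported above.

    ```python
    from scipy.stats import norm

    def sigma_level(defects: int, opportunities: int, shift: float = 1.5) -> float:
        """Convert an error count to defects per million and then to a sigma value."""
        dpm = 1e6 * defects / opportunities
        return norm.ppf(1.0 - dpm / 1e6) + shift

    # Example: a 75% defect rate (e.g. missing diagnosis) yields a sigma value below 1.
    print(round(sigma_level(defects=750, opportunities=1000), 1))
    ```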

  8. A new generalization of the Pareto–geometric distribution

    Directory of Open Access Journals (Sweden)

    M. Nassar

    2013-07-01

    Full Text Available In this paper we introduce a new distribution called the beta Pareto–geometric. We provide a comprehensive treatment of the mathematical properties of the proposed distribution and derive expressions for its moment generating function and the rth generalized moment. We discuss estimation of the parameters by maximum likelihood and obtain the information matrix that is easily numerically determined. We also demonstrate its usefulness on a real data set.

  9. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning.

    Science.gov (United States)

    Teichert, K; Süss, P; Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2011-06-21

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g., photons versus protons) than with the classical method of comparing single treatment plans.

  10. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H; Thieke, C

    2011-01-01

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.

  11. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    Science.gov (United States)

    Teichert, K; Süss, P; Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2011-01-01

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans. PMID:21610294

  12. Skew-Type I Generalized Logistic Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Abdallah Mohamed Abdelfattah

    2015-09-01

    Full Text Available Generalized logistic distributions are very versatile and give useful representations of many physical situations. Skew-symmetric densities have recently received much attention in the literature. In this paper, we introduce a new class of skew-symmetric distributions, which are formulated based on cumulative distributions of skew-symmetric densities. We derive the probability density function (pdf) and cumulative distribution function (CDF) of the skew type I generalized logistic distribution, denoted by S'GLD. The general statistical properties of the S'GLD, such as the moment generating function (mgf), characteristic function (chf), and Laplace and Fourier transformations, are obtained in explicit form. Expressions for the nth moment, skewness and kurtosis are discussed. The mean deviation about the mean and about the median, the Rényi entropy and the order statistics are also given. We consider the general case by inclusion of location and scale parameters. The results of Nadarajah (2009) are obtained as special cases. Graphical illustration of some results is provided. Further, we present a numerical example to illustrate some results of this paper.

  13. Pareto front estimation for decision making.

    Science.gov (United States)

    Giagkiozis, Ioannis; Fleming, Peter J

    2014-01-01

    The set of available multi-objective optimisation algorithms continues to grow. This fact can be partially attributed to their widespread use and applicability. However, this increase also suggests several issues remain to be addressed satisfactorily. One such issue is the diversity and the number of solutions available to the decision maker (DM). Even for algorithms very well suited for a particular problem, it is difficult, mainly due to the computational cost, to use a population large enough to ensure the likelihood of obtaining a solution close to the DM's preferences. In this paper we present a novel methodology that produces additional Pareto optimal solutions from a Pareto optimal set obtained at the end of a run of any multi-objective optimisation algorithm for two-objective and three-objective problem instances.

  14. Multiclass gene selection using Pareto-fronts.

    Science.gov (United States)

    Rajapakse, Jagath C; Mundra, Piyushkumar A

    2013-01-01

    Filter methods are often used for selection of genes in multiclass sample classification by using microarray data. Such techniques usually tend to bias toward a few classes that are easily distinguishable from other classes due to imbalances of strong features and sample sizes of different classes. It could therefore lead to selection of redundant genes while missing the relevant genes, leading to poor classification of tissue samples. In this manuscript, we propose to decompose multiclass ranking statistics into class-specific statistics and then use Pareto-front analysis for selection of genes. This alleviates the bias induced by class intrinsic characteristics of dominating classes. The use of Pareto-front analysis is demonstrated on two filter criteria commonly used for gene selection: F-score and KW-score. A significant improvement in classification performance and reduction in redundancy among top-ranked genes were achieved in experiments with both synthetic and real-benchmark data sets.

  15. Design Optimization for Superconducting Bending Magnets using Pareto Front Curve

    Science.gov (United States)

    Murata, Yukihiro; Abe, Mitsushi; Ando, Ryuya

    2017-09-01

    A novel limit design method for superconducting magnets is presented. It is particularly suitable for iron-core magnets such as those used in accelerators. In general, a stochastic optimization whose objective functions consist of quantities such as the magnetic field, the field experienced by the superconducting coils, the current density, and the multipole field integral is often used. However, it is well known that the obtained solution strongly depends on the initial one. Furthermore, once the calculation model is fixed, the range of solutions is also fixed, i.e., it may be impossible to find the global optimum solution even with many parameter sweeps. In this study, we draw the Pareto front curve to obtain this range and to infer whether a solution is an optimal one. In addition, the Pareto front curve indicates a neighborhood solution that can be substituted for the initial one. After this process, a stochastic optimization is implemented with these initial design parameters. To confirm the validity of the method, we designed a superconducting bending magnet and showed that the method works well.

  16. The size distributions of all Indian cities

    Science.gov (United States)

    Luckstead, Jeff; Devadoss, Stephen; Danforth, Diana

    2017-05-01

    We apply five distributions (lognormal, double-Pareto lognormal, lognormal-upper tail Pareto, Pareto tails-lognormal, and Pareto tails-lognormal with differentiability restrictions) to estimate the size distribution of all Indian cities. Since India contains numerous small cities, it is important to explicitly model the lower-tail behavior when studying the distribution of all Indian cities. Our results rigorously confirm, using both graphical and formal statistical tests, that among these five distributions, Pareto tails-lognormal is a better suited parametrization of the Indian city size data, verifying that the Indian city size distribution exhibits a strong reverse Pareto in the lower tail, lognormal behavior in the mid-range body, and Pareto in the upper tail.
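
    A heavily simplified sketch of a lognormal-body, Pareto-upper-tail fit is given below. The paper estimates the thresholds jointly with the other parameters (and, in one variant, imposes differentiability restrictions); here the tail threshold is simply fixed at a high sample quantile, and the data are synthetic rather than Indian census data.

    ```python
    import numpy as np

    def fit_body_and_tail(sizes: np.ndarray, tail_q: float = 0.95) -> dict:
        """Fit a lognormal body below a fixed threshold and a Pareto tail above it."""
        u = np.quantile(sizes, tail_q)              # fixed tail threshold (assumption)
        body, tail = sizes[sizes <= u], sizes[sizes > u]
        mu, sigma = np.log(body).mean(), np.log(body).std(ddof=1)  # lognormal body
        alpha = len(tail) / np.log(tail / u).sum()  # Pareto tail exponent (Hill/MLE)
        return {"mu": mu, "sigma": sigma, "threshold": u, "alpha": alpha}

    rng = np.random.default_rng(0)
    cities = np.exp(rng.normal(10.0, 1.2, size=5000))   # synthetic city sizes
    print(fit_body_and_tail(cities))
    ```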

  17. Distributed Systems of Generalizing as the Basis of Workplace Learning

    Science.gov (United States)

    Virkkunen, Jaakko; Pihlaja, Juha

    2004-01-01

    This article proposes a new way of conceptualizing workplace learning as distributed systems of appropriation, development and the use of practice-relevant generalizations fixed within mediational artifacts. This article maintains that these systems change historically as technology and increasingly sophisticated forms of production develop.…

  18. A generalized Dirichlet distribution accounting for singularities of the variables

    DEFF Research Database (Denmark)

    Lewy, Peter

    1996-01-01

    A multivariate generalized Dirichlet distribution has been formulated for the case where the stochastic variables are allowed to have singularities at 0 and 1. Small sample properties of the estimates of moments of the variables based on maximum likelihood estimates of the parameters have been co...

  19. Mathematical Modeling of Avidity Distribution and Estimating General Binding Properties of Transcription Factors from Genome-Wide Binding Profiles.

    Science.gov (United States)

    Kuznetsov, Vladimir A

    2017-01-01

    The shape of the experimental frequency distributions (EFD) of diverse molecular interaction events quantifying genome-wide binding is often skewed to the rare but abundant quantities. Such distributions are systematically deviated from standard power-law functions proposed by scale-free network models suggesting that more explanatory and predictive probabilistic model(s) are needed. Identification of the mechanism-based data-driven statistical distributions that provide an estimation and prediction of binding properties of transcription factors from genome-wide binding profiles is the goal of this analytical survey. Here, we review and develop an analytical framework for modeling, analysis, and prediction of transcription factor (TF) DNA binding properties detected at the genome scale. We introduce a mixture probabilistic model of binding avidity function that includes nonspecific and specific binding events. A method for decomposition of specific and nonspecific TF-DNA binding events is proposed. We show that the Kolmogorov-Waring (KW) probability function (PF), modeling the steady state TF binding-dissociation stochastic process, fits well with the EFD for diverse TF-DNA binding datasets. Furthermore, this distribution predicts total number of TF-DNA binding sites (BSs), estimating specificity and sensitivity as well as other basic statistical features of DNA-TF binding when the experimental datasets are noise-rich and essentially incomplete. The KW distribution fits equally well to TF-DNA binding activity for different TFs including ERE, CREB, STAT1, Nanog, and Oct4. Our analysis reveals that the KW distribution and its generalized form provides the family of power-law-like distributions given in terms of hypergeometric series functions, including standard and generalized Pareto and Waring distributions, providing flexible and common skewed forms of the transcription factor binding site (TFBS) avidity distribution function. We suggest that the skewed binding

  20. RNA-Pareto: interactive analysis of Pareto-optimal RNA sequence-structure alignments.

    Science.gov (United States)

    Schnattinger, Thomas; Schöning, Uwe; Marchfelder, Anita; Kestler, Hans A

    2013-12-01

    Incorporating secondary structure information into the alignment process improves the quality of RNA sequence alignments. Instead of using fixed weighting parameters, sequence and structure components can be treated as different objectives and optimized simultaneously. The result is not a single, but a Pareto-set of equally optimal solutions, which all represent different possible weighting parameters. We now provide the interactive graphical software tool RNA-Pareto, which allows a direct inspection of all feasible results to the pairwise RNA sequence-structure alignment problem and greatly facilitates the exploration of the optimal solution set.

  1. Vilfredo Pareto e la fine del Sociale

    Directory of Open Access Journals (Sweden)

    Francesco Antonelli

    2017-08-01

    …in place of that of the State or the public, aims to rebuild a social sphere that governs itself, breaking with global capitalism: Toni Negri (2003) and Paolo Virno (2014) and, more generally, the post-operaists, but also some of the Foucauldians, identify with this radical vision. One of the cultural roots of this theoretical and practical victory of the individual over Society can be traced in the studies of Vilfredo Pareto. Taking his positions into consideration is important also for understanding their ambiguities. In the first section we concentrate on his early positions and then analyze what he maintained in the Trattato di sociologia generale (1916). Finally, in the conclusions we develop some considerations concerning contemporary discourses centered on the subject.

  2. Topology Identification of General Dynamical Network with Distributed Time Delays

    International Nuclear Information System (INIS)

    Zhao-Yan, Wu; Xin-Chu, Fu

    2009-01-01

    General dynamical networks with distributed time delays are studied. The topology of the networks are viewed as unknown parameters, which need to be identified. Some auxiliary systems (also called the network estimators) are designed to achieve this goal. Both linear feedback control and adaptive strategy are applied in designing these network estimators. Based on linear matrix inequalities and the Lyapunov function method, the sufficient condition for the achievement of topology identification is obtained. This method can also better monitor the switching topology of dynamical networks. Illustrative examples are provided to show the effectiveness of this method. (general)

  3. Pareto Optimal Solutions for Network Defense Strategy Selection Simulator in Multi-Objective Reinforcement Learning

    Directory of Open Access Journals (Sweden)

    Yang Sun

    2018-01-01

    Full Text Available Using Pareto optimization in Multi-Objective Reinforcement Learning (MORL) leads to better learning results for network defense games. This is particularly useful for network security agents, who must often balance several goals when choosing what action to take in defense of a network. If the defender knows his preferred reward distribution, the advantages of Pareto optimization can be retained by using a scalarization algorithm prior to the implementation of the MORL. In this paper, we simulate a network defense scenario by creating a multi-objective zero-sum game and using Pareto optimization and MORL to determine optimal solutions and compare those solutions to different scalarization approaches. We build a Pareto Defense Strategy Selection Simulator (PDSSS) system for assisting network administrators with decision-making, specifically with defense strategy selection, and the experiment results show that the Satisficing Trade-Off Method (STOM) scalarization approach performs better than linear scalarization or the GUESS method. The results of this paper can aid network security agents attempting to find an optimal defense policy for network security games.

  4. Deeply Pseudoscalar Meson Electroproduction with CLAS and Generalized Parton Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Guidal, Michel [Institut de Physique Nucleaire, Orsay (France); Kubarovsky, Valery P. [Jefferson Lab., Newport News, VA (United States)

    2015-06-01

    We discuss the recent data on exclusive $\pi^0$ (and $\pi^+$) electroproduction on the proton obtained by the CLAS collaboration at Jefferson Lab. It is observed that the cross sections, which have been decomposed into the $\sigma_T+\epsilon\sigma_L$, $\sigma_{TT}$ and $\sigma_{LT}$ structure functions, are dominated by transverse amplitude contributions. The data can be interpreted in the Generalized Parton Distribution formalism provided that one includes helicity-flip transversity GPDs.

  5. QCD Sum Rules and Models for Generalized Parton Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Anatoly Radyushkin

    2004-10-01

    I use QCD sum rule ideas to construct models for generalized parton distributions. To this end, the perturbative parts of QCD sum rules for the pion and nucleon electromagnetic form factors are interpreted in terms of GPDs, and two models are discussed. One of them takes the double Borel transform at an adjusted value of the Borel parameter as a model for nonforward parton densities, and the other is based on the local duality relation. Possible ways of improving these Ansätze are briefly discussed.

  6. Chiral perturbation theory for nucleon generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, M. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Manashov, A. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik]|[Sankt-Petersburg State Univ. (Russian Federation). Dept. of Theoretical Physics; Schaefer, A. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik

    2006-08-15

    We analyze the moments of the isosinglet generalized parton distributions $H$, $E$, $\tilde{H}$, $\tilde{E}$ of the nucleon in one-loop order of heavy-baryon chiral perturbation theory. We discuss in detail the construction of the operators in the effective theory that are required to obtain all corrections to a given order in the chiral power counting. The results will serve to improve the extrapolation of lattice results to the chiral limit. (orig.)

  7. Chiral perturbation theory for nucleon generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, M. [Deutsches Elektronen-Synchroton DESY, Theory Group, Hamburg (Germany); Manashov, A. [Universitaet Regensburg, Institut fuer Theoretische Physik, Regensburg (Germany); Schaefer, A. [Sankt-Petersburg State University, Department of Theoretical Physics, St.-Petersburg (Russian Federation)

    2006-09-15

    We analyze the moments of the isosinglet generalized parton distributions $H$, $E$, $\tilde{H}$, $\tilde{E}$ of the nucleon in one-loop order of heavy-baryon chiral perturbation theory. We discuss in detail the construction of the operators in the effective theory that are required to obtain all corrections to a given order in the chiral power counting. The results will serve to improve the extrapolation of lattice results to the chiral limit. (orig.)

  8. Partial Generalized Probability Weighted Moments for Exponentiated Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Neema Mohamed El Haroun

    2015-09-01

    Full Text Available The generalized probability weighted moments are widely used in hydrology for estimating the parameters of flood distributions from complete samples. The method of partial generalized probability weighted moments has been used to estimate the parameters of distributions from censored samples. This article offers a new method, called partial generalized probability weighted moments (PGPWMs), for the analysis of censored data. The method of PGPWMs is an extension of partial probability weighted moments. To illustrate the new method, estimation of the unknown parameters of the exponentiated exponential distribution based on a doubly censored sample is considered. PGPWM estimators for right- and left-censored samples are obtained as special cases. A simulation study is conducted to investigate the performance of the estimates for the exponentiated exponential distribution. Comparison between estimators is made through simulation via their biases and mean square errors. An illustration with real data is provided.

  9. Citation distribution profile in Brazilian journals of general medicine.

    Science.gov (United States)

    Lustosa, Luiggi Araujo; Chalco, Mario Edmundo Pastrana; Borba, Cecília de Melo; Higa, André Eizo; Almeida, Renan Moritz Varnier Rodrigues

    2012-01-01

    Impact factors are currently the bibliometric index most used for evaluating scientific journals. However, the way in which they are used, for instance concerning the study or journal types analyzed, can markedly interfere with estimate reliability. This study aimed to analyze the citation distribution pattern in three Brazilian journals of general medicine. This was a descriptive study based on numbers of citations of scientific studies published by three Brazilian journals of general medicine. The journals analyzed were São Paulo Medical Journal, Clinics and Revista da Associação Médica Brasileira. This survey used data available from the Institute for Scientific Information (ISI) platform, from which the total number of papers published in each journal in 2007-2008 and the number of citations of these papers in 2009 were obtained. From these data, the citation distribution was derived and journal impact factors (average number of citations) were estimated. These factors were then compared with those directly available from the ISI Journal of Citation Reports (JCR). Respectively, 134, 203 and 192 papers were published by these journals during the period analyzed. The observed citation distributions were highly skewed, such that many papers had few citations and a small percentage had many citations. It was not possible to identify any specific pattern for the most cited papers or to exactly reproduce the JCR impact factors. Use of measures like "impact factors", which characterize citations through averages, does not adequately represent the citation distribution in the journals analyzed.

  10. Pareto Optimal Design for Synthetic Biology.

    Science.gov (United States)

    Patanè, Andrea; Santoro, Andrea; Costanza, Jole; Carapezza, Giovanni; Nicosia, Giuseppe

    2015-08-01

    Recent advances in synthetic biology call for robust, flexible and efficient in silico optimization methodologies. We present a Pareto design approach for the bi-level optimization problem associated with the overproduction of specific metabolites in Escherichia coli. Our method efficiently explores the high-dimensional genetic manipulation space, finding a number of trade-offs between synthetic and biological objectives, hence furnishing a deeper biological insight into the addressed problem and important results for industrial purposes. We demonstrate the computational capabilities of our Pareto-oriented approach by comparing it with state-of-the-art heuristics in the overproduction problems of i) 1,4-butanediol, ii) myristoyl-CoA, iii) malonyl-CoA, iv) acetate and v) succinate. We show that our algorithms are able to gracefully adapt and scale to more complex models and more biologically relevant simulations of the genetic manipulations allowed. The results obtained for 1,4-butanediol overproduction significantly outperform those previously obtained, in terms of the 1,4-butanediol to biomass formation ratio and knock-out costs. In particular, the overproduction percentage is +662.7%, from 1.425 mmol h⁻¹ gDW⁻¹ (wild type) to 10.869 mmol h⁻¹ gDW⁻¹, with a knockout cost of 6. Moreover, the Pareto-optimal designs we found in the fatty acid optimizations strictly dominate those obtained by the other methodologies, e.g., biomass and myristoyl-CoA exportation improvements of +21.43% (0.17 h⁻¹) and +5.19% (1.62 mmol h⁻¹ gDW⁻¹), respectively. Furthermore, the CPU time required by our heuristic approach is more than halved. Finally, we implement pathway-oriented sensitivity analysis, epsilon-dominance analysis and robustness analysis to enhance our biological understanding of the problem and to improve the optimization algorithm's capabilities.

  11. A Pareto-Improving Minimum Wage

    OpenAIRE

    Danziger, Eliav; Danziger, Leif

    2014-01-01

    This paper shows that a graduated minimum wage, in contrast to a constant minimum wage, can provide a strict Pareto improvement over what can be achieved with an optimal income tax. The reason is that a graduated minimum wage requires high-productivity workers to work more to earn the same income as low-productivity workers, which makes it more difficult for the former to mimic the latter. In effect, a graduated minimum wage allows the low-productivity workers to benefit from second-degree pr...

  12. Pareto-Optimal Estimates of California Precipitation Change

    Science.gov (United States)

    Langenbrunner, Baird; Neelin, J. David

    2017-12-01

    In seeking constraints on global climate model projections under global warming, one commonly finds that different subsets of models perform well under different objective functions, and these trade-offs are difficult to weigh. Here a multiobjective approach is applied to a large set of subensembles generated from the Climate Model Intercomparison Project phase 5 ensemble. We use observations and reanalyses to constrain tropical Pacific sea surface temperatures, upper level zonal winds in the midlatitude Pacific, and California precipitation. An evolutionary algorithm identifies the set of Pareto-optimal subensembles across these three measures, and these subensembles are used to constrain end-of-century California wet season precipitation change. This methodology narrows the range of projections throughout California, increasing confidence in estimates of positive mean precipitation change. Finally, we show how this technique complements and generalizes emergent constraint approaches for restricting uncertainty in end-of-century projections within multimodel ensembles using multiple criteria for observational constraints.

  13. Pareto optimality in infinite horizon linear quadratic differential games

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2013-01-01

    In this article we derive conditions for the existence of Pareto optimal solutions for linear quadratic infinite horizon cooperative differential games. First, we present a necessary and sufficient characterization for Pareto optimality which translates to solving a set of constrained optimal

  14. Pareto Improving Price Regulation when the Asset Market is Incomplete

    NARCIS (Netherlands)

    Herings, P.J.J.; Polemarchakis, H.M.

    1999-01-01

    When the asset market is incomplete, competitive equilibria are constrained suboptimal, which provides a scope for Pareto improving interventions. Price regulation can be such a Pareto improving policy, even when the welfare effects of rationing are taken into account. An appealing aspect of price

  15. Pareto 80/20 Law: Derivation via Random Partitioning

    Science.gov (United States)

    Lipovetsky, Stan

    2009-01-01

    The Pareto 80/20 Rule, also known as the Pareto principle or law, states that a small number of causes (20%) is responsible for a large percentage (80%) of the effect. Although widely recognized as a heuristic rule, this proportion has not been theoretically based. The article considers derivation of this 80/20 rule and some other standard…

  16. Pareto-front shape in multiobservable quantum control

    Science.gov (United States)

    Sun, Qiuyang; Wu, Re-Bing; Rabitz, Herschel

    2017-03-01

    Many scenarios in the sciences and engineering require simultaneous optimization of multiple objective functions, which are usually conflicting or competing. In such problems the Pareto front, where none of the individual objectives can be further improved without degrading some others, shows the tradeoff relations between the competing objectives. This paper analyzes the Pareto-front shape for the problem of quantum multiobservable control, i.e., optimizing the expectation values of multiple observables in the same quantum system. Analytic and numerical results demonstrate that with two commuting observables the Pareto front is a convex polygon consisting of flat segments only, while with noncommuting observables the Pareto front includes convexly curved segments. We also assess the capability of a weighted-sum method to continuously capture the points along the Pareto front. Illustrative examples with realistic physical conditions are presented, including NMR control experiments on a 1H-13C two-spin system with two commuting or noncommuting observables.
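
    A generic sketch of the weighted-sum method assessed above: scan the weight between two objectives (both maximized) and record which candidate wins at each weight. Points lying in concave parts of the front are never returned, which is why the method may fail to capture the whole Pareto front continuously.

    ```python
    import numpy as np

    def weighted_sum_scan(candidates: np.ndarray, n_weights: int = 101) -> np.ndarray:
        """candidates: (n, 2) array of objective values; returns the distinct winners."""
        winners = set()
        for w in np.linspace(0.0, 1.0, n_weights):
            scores = w * candidates[:, 0] + (1.0 - w) * candidates[:, 1]
            winners.add(int(np.argmax(scores)))
        return candidates[sorted(winners)]

    pts = np.array([[0.0, 1.0], [0.4, 0.9], [0.55, 0.55], [0.9, 0.4], [1.0, 0.0]])
    print(weighted_sum_scan(pts))  # (0.55, 0.55) is skipped: it lies in a concave region
    ```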

  17. Citation distribution profile in Brazilian journals of general medicine

    Directory of Open Access Journals (Sweden)

    Luiggi Araujo Lustosa

    Full Text Available CONTEXT AND OBJECTIVE: Impact factors are currently the bibliometric index most used for evaluating scientific journals. However, the way in which they are used, for instance concerning the study or journal types analyzed, can markedly interfere with estimate reliability. This study aimed to analyze the citation distribution pattern in three Brazilian journals of general medicine. DESIGN AND SETTING: This was a descriptive study based on numbers of citations of scientific studies published by three Brazilian journals of general medicine. METHODS: The journals analyzed were São Paulo Medical Journal, Clinics and Revista da Associação Médica Brasileira. This survey used data available from the Institute for Scientific Information (ISI) platform, from which the total number of papers published in each journal in 2007-2008 and the number of citations of these papers in 2009 were obtained. From these data, the citation distribution was derived and journal impact factors (average number of citations) were estimated. These factors were then compared with those directly available from the ISI Journal of Citation Reports (JCR). RESULTS: Respectively, 134, 203 and 192 papers were published by these journals during the period analyzed. The observed citation distributions were highly skewed, such that many papers had few citations and a small percentage had many citations. It was not possible to identify any specific pattern for the most cited papers or to exactly reproduce the JCR impact factors. CONCLUSION: Use of measures like "impact factors", which characterize citations through averages, does not adequately represent the citation distribution in the journals analyzed.

  18. Helicity-dependent generalized parton distributions for nonzero skewness

    Energy Technology Data Exchange (ETDEWEB)

    Mondal, Chandan [Chinese Academy of Sciences, Institute of Modern Physics, Lanzhou (China)

    2017-09-15

    We investigate the helicity-dependent generalized parton distributions (GPDs) in momentum as well as transverse position (impact) space for the u and d quarks in a proton when the momentum transfer in both the transverse and the longitudinal directions is nonzero. The GPDs are evaluated using the light-front wave functions of a quark-diquark model for the nucleon, where the wave functions are constructed by the soft-wall AdS/QCD correspondence. We also express the GPDs in the boost-invariant longitudinal position space. (orig.)

  19. Robust Bayesian Analysis of Generalized Half Logistic Distribution

    Directory of Open Access Journals (Sweden)

    Ajit Chaturvedi

    2017-06-01

    Full Text Available In this paper, robust Bayesian analysis of the generalized half logistic distribution (GHLD) under an $\epsilon$-contamination class of priors for the shape parameter $\lambda$ is considered. ML-II Bayes estimators of the parameters, the reliability function and the hazard function are derived under the squared-error loss function (SELF) and the linear exponential (LINEX) loss function by considering Type II censoring and the sampling scheme of Bartholomew (1963). Both the cases of known and unknown scale parameter are considered under Type II censoring and under the sampling scheme of Bartholomew. A simulation study and the analysis of a real data set are presented.

  20. Generalized reorientation cross section for cylindrically symmetric velocity distributions

    International Nuclear Information System (INIS)

    Generalized reorientation cross sections are derived for the case of atom–molecule collisions where the molecules initially have a velocity distribution cylindrically symmetric about an axis in the laboratory reference frame. This spatial ordering of the velocity can come about, for instance, by exciting molecular electronic states with a light source whose linewidth is much narrower than the Doppler-broadened absorption line. A simple kinetic theory can be set up in terms of state multipoles that are not completely irreducible; the resulting reorientation cross sections are only slightly more complex than the cross sections occurring in a spherically symmetric velocity field. Two approximations are investigated: a McGuire–Kouri $m_j$-conserving model and a semiclassical model where the orientation of the rotation plane is conserved. The import of the generalized cross sections for several types of experiment and the applicability of the approximate models are discussed

  1. Modeling the brain morphology distribution in the general aging population

    Science.gov (United States)

    Huizinga, W.; Poot, D. H. J.; Roshchupkin, G.; Bron, E. E.; Ikram, M. A.; Vernooij, M. W.; Rueckert, D.; Niessen, W. J.; Klein, S.

    2016-03-01

    Both normal aging and neurodegenerative diseases such as Alzheimer's disease cause morphological changes of the brain. To better distinguish between normal and abnormal cases, it is necessary to model changes in brain morphology owing to normal aging. To this end, we developed a method for analyzing and visualizing these changes for the entire brain morphology distribution in the general aging population. The method is applied to 1000 subjects from a large population imaging study in the elderly, from which 900 were used to train the model and 100 were used for testing. The results of the 100 test subjects show that the model generalizes to subjects outside the model population. Smooth percentile curves showing the brain morphology changes as a function of age and spatiotemporal atlases derived from the model population are publicly available via an interactive web application at agingbrain.bigr.nl.

  2. Derivazioni, ripetizione, manipolazione: note sulla recezione implicita di Vilfredo Pareto negli Stati Uniti

    Directory of Open Access Journals (Sweden)

    Angela Maria Zocchi

    2017-08-01

    …not, however, on the explicit reception, for example that of Parsons or the critical one of Wright Mills, but on the implicit one, considering in particular an interesting text by a famous American linguist: Whose Freedom? by George Lakoff. The aim of the work is to highlight the theoretical proximity between the argumentative structure of this text on freedom and some parts of Vilfredo Pareto's Trattato di Sociologia Generale, which, although it appears at first sight «as an immense mass of facts and theories, in a remarkable formal disorder» (Bousquet 1954, p. XI), never ceases to surprise the reader with its topicality (cf. Mongardini 2009) and with the lucidity with which it brings into focus some fundamental strategies of manipulation.

  3. Optimal transmitter power of an intersatellite optical communication system with reciprocal Pareto fading.

    Science.gov (United States)

    Liu, Xian

    2010-02-10

    This paper shows that optical signal transmission over intersatellite links with swaying transmitters can be described as an equivalent fading model. In this model, the instantaneous signal-to-noise ratio is stochastic and follows the reciprocal Pareto distribution. With this model, we show that the transmitter power can be minimized, subject to a specified outage probability, by appropriately adjusting some system parameters, such as the transmitter gain.
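
    A minimal sketch of the power-minimization logic under an assumed reciprocal Pareto model: if the normalized SNR is Z = 1/Y with Y ~ Pareto(α, 1), then P(Z ≤ z) = z^α for 0 < z ≤ 1, and the outage constraint inverts in closed form. This parameterization is an illustrative assumption, not necessarily the one used in the paper.

    ```python
    def min_snr_scale(gamma_th: float, outage: float, alpha: float) -> float:
        """Smallest SNR scale gamma0 such that P(gamma0 * Z < gamma_th) <= outage."""
        # P(gamma0 * Z < gamma_th) = (gamma_th / gamma0) ** alpha, so invert for gamma0.
        return gamma_th * outage ** (-1.0 / alpha)

    # Example: linear SNR threshold 10 (10 dB), outage 1e-3, tail exponent alpha = 2.
    gamma0 = min_snr_scale(gamma_th=10.0, outage=1e-3, alpha=2.0)
    print(f"required SNR scale: {gamma0:.1f}; transmitter power scales proportionally")
    ```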

  4. Microergodicity effects on ebullition of methane modelled by Mixed Poisson process with Pareto mixing variable

    Czech Academy of Sciences Publication Activity Database

    Jordanova, P.; Dušek, Jiří; Stehlík, M.

    2013-01-01

    Roč. 128, OCT 15 (2013), s. 124-134 ISSN 0169-7439 R&D Projects: GA ČR(CZ) GAP504/11/1151; GA MŠk(CZ) ED1.1.00/02.0073 Institutional support: RVO:67179843 Keywords: environmental chemistry * ebullition of methane * mixed Poisson processes * renewal process * Pareto distribution * moving average process * robust statistics * sedge–grass marsh Subject RIV: EH - Ecology, Behaviour Impact factor: 2.381, year: 2013

  5. Simultaneous navigation of multiple Pareto surfaces, with an application to multicriteria IMRT planning with multiple beam angle configurations.

    Science.gov (United States)

    Craft, David; Monz, Michael

    2010-02-01

    To introduce a method to simultaneously explore a collection of Pareto surfaces. The method will allow radiotherapy treatment planners to interactively explore treatment plans for different beam angle configurations as well as different treatment modalities. The authors assume a convex optimization setting and represent the Pareto surface for each modality or given beam set by a set of discrete points on the surface. Weighted averages of these discrete points produce a continuous representation of each Pareto surface. The authors calculate a set of Pareto surfaces and use linear programming to navigate across the individual surfaces, allowing switches between surfaces. The switches are organized such that the plan profits in the requested way, while trying to keep the change in dose as small as possible. The system is demonstrated on a phantom pancreas IMRT case using 100 different five-beam configurations and a multicriteria formulation with six objectives. The system has intuitive behavior and is easy to control. Also, because the underlying linear programs are small, the system is fast enough to offer real-time exploration of the Pareto surfaces of the given beam configurations. The system presented offers a sound starting point for building clinical systems for multicriteria exploration of different modalities and offers a controllable way to explore hundreds of beam angle configurations in IMRT planning, allowing the users to focus their attention on the dose distribution and treatment planning objectives instead of spending excessive time on the technicalities of delivery.
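
    The discrete surface representation described above can be illustrated in a few lines: each precomputed plan is a vector of objective values, and convex (weighted-average) combinations of these anchor plans give a continuous surface to navigate. The numbers are arbitrary, and the linear-programming machinery for switching between surfaces is omitted.

    ```python
    import numpy as np

    anchor_plans = np.array([    # rows: precomputed plans; columns: objective values
        [10.0, 60.0, 35.0],
        [25.0, 40.0, 30.0],
        [40.0, 30.0, 50.0],
    ])

    def navigate(weights) -> np.ndarray:
        """Objective vector of the convex combination of the anchor plans."""
        w = np.asarray(weights, dtype=float)
        assert np.all(w >= 0.0) and abs(w.sum() - 1.0) < 1e-9, "weights must be convex"
        return w @ anchor_plans

    print(navigate([0.5, 0.3, 0.2]))  # an interpolated plan on the represented surface
    ```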

  6. A Comparison of Generalized Hyperbolic Distribution Models for Equity Returns

    Directory of Open Access Journals (Sweden)

    Virginie Konlack Socgnia

    2014-01-01

    Full Text Available We discuss the calibration of the univariate and multivariate generalized hyperbolic distributions, as well as their hyperbolic, variance gamma, normal inverse Gaussian, and skew Student’s t-distribution subclasses, for the daily log-returns of seven of the most liquid mining stocks listed on the Johannesburg Stock Exchange. To estimate the model parameters from historic distributions, we use an expectation maximization based algorithm for the univariate case and a multicycle expectation conditional maximization estimation algorithm for the multivariate case. We assess the goodness of fit statistics using the log-likelihood, the Akaike information criterion, and the Kolmogorov-Smirnov distance. Finally, we inspect the temporal stability of parameters and note implications as criteria for distinguishing between models. To better understand the dependence structure of the stocks, we fit the MGHD and subclasses to both the stock returns and the two leading principal components derived from the price data. While the MGHD could fit both data subsets, we observed that the multivariate normality of the stock return residuals, computed by removing shared components, suggests that the departure from normality can be explained by the structure in the common factors.

  7. A new mechanism for maintaining diversity of Pareto archive in multi-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Hájek, J.; Szöllös, A.; Šístek, Jakub

    2010-01-01

    Roč. 41, 7-8 (2010), s. 1031-1057 ISSN 0965-9978 R&D Projects: GA AV ČR IAA100760702 Institutional research plan: CEZ:AV0Z10190503 Keywords: multi-objective optimization * micro-genetic algorithm * diversity * Pareto archive Subject RIV: BA - General Mathematics Impact factor: 1.004, year: 2010 http://www.sciencedirect.com/science/article/pii/S0965997810000451

  8. Projections onto the Pareto surface in multicriteria radiation therapy optimization

    International Nuclear Information System (INIS)

    Bokrantz, Rasmus; Miettinen, Kaisa

    2015-01-01

    Purpose: To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. Methods: The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested where dose–volume histogram constraints are used to prevent that the projection causes a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window and spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate and a head and neck case. Results: The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose–volume histogram constraints were used. No consistent improvements in target homogeneity were observed. Conclusions: There are situations when Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan
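
    Restricted to a discrete representation of the Pareto surface, the projection step can be phrased as a small linear program: find the convex combination of the precomputed plans that is no worse than the navigated objective vector in every objective and improves the total as much as possible. The sketch below makes that discretization assumption explicit and omits the paper's dose-volume-constrained variant.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def project_onto_surface(P: np.ndarray, y: np.ndarray) -> np.ndarray:
        """P: (n_plans, n_objectives), all minimized; y: navigated objective vector."""
        n = P.shape[0]
        res = linprog(
            c=P.sum(axis=1),                   # minimize the sum of combined objectives
            A_ub=P.T, b_ub=y,                  # combined plan must be <= y everywhere
            A_eq=np.ones((1, n)), b_eq=[1.0],  # convex combination weights
            bounds=[(0.0, 1.0)] * n,
        )
        assert res.success, "no dominating combination exists for this navigated point"
        return res.x @ P

    P = np.array([[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]])
    print(project_onto_surface(P, y=np.array([2.5, 2.5])))  # projects to (2.0, 2.0)
    ```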

  9. Projections onto the Pareto surface in multicriteria radiation therapy optimization.

    Science.gov (United States)

    Bokrantz, Rasmus; Miettinen, Kaisa

    2015-10-01

    To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested, in which dose-volume histogram constraints are used to prevent the projection from violating some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window, and for spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate and a head and neck case. The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose-volume histogram constraints were used. No consistent improvements in target homogeneity were observed. There are situations when Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan.

  10. Improving Polyp Detection Algorithms for CT Colonography: Pareto Front Approach.

    Science.gov (United States)

    Huang, Adam; Li, Jiang; Summers, Ronald M; Petrick, Nicholas; Hara, Amy K

    2010-03-21

    We investigated a Pareto front approach to improving polyp detection algorithms for CT colonography (CTC). A dataset of 56 CTC colon surfaces with 87 proven positive detections of 53 polyps sized 4 to 60 mm was used to evaluate the performance of a one-step and a two-step curvature-based region growing algorithm. The algorithmic performance was statistically evaluated and compared based on the Pareto optimal solutions from 20 experiments by evolutionary algorithms. The false positive rate was significantly lower (p < 0.05) in this comparison, and the Pareto optimization process can effectively help in fine-tuning and redesigning polyp detection algorithms.
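
    A minimal sketch of the nondominated filtering that underlies a Pareto front analysis of detector operating points, on made-up (sensitivity, false-positive-rate) pairs; higher sensitivity and lower false-positive rate are better.

        import numpy as np

        def pareto_front(points):
            # points[:, 0] = sensitivity (maximize), points[:, 1] = FP rate (minimize)
            keep = []
            for i, p in enumerate(points):
                dominated = any(
                    (q[0] >= p[0] and q[1] <= p[1]) and (q[0] > p[0] or q[1] < p[1])
                    for j, q in enumerate(points) if j != i
                )
                if not dominated:
                    keep.append(i)
            return points[keep]

        pts = np.array([[0.90, 8.0], [0.85, 3.0], [0.92, 9.5], [0.80, 3.5]])
        print(pareto_front(pts))   # [0.80, 3.5] is dominated by [0.85, 3.0] and drops out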

  11. Towards a global estimate for generalized parton distributions

    International Nuclear Information System (INIS)

    Lautenschlager, Tobias

    2015-01-01

    In this thesis, we give an extensive review of the phenomenology of generalized parton distributions (GPDs) utilizing the perturbative framework. Starting from basic principles, we derive the differential cross sections of deeply virtual Compton scattering and deeply virtual meson production (DVMP) in the twist-2 approximation. A special focus lies on the hard scattering amplitudes of DVMP at NLO of perturbation theory. The framework for the global analysis of GPDs relies on the use of conformal symmetry. We give a short introduction leading to the Mellin-Barnes representation of the hard scattering amplitudes. We then derive the imaginary parts of the hard scattering amplitudes of DVMP. We utilize probability theory as extended logic to estimate GPDs, deriving the formulas for inference from the product and sum rules. Afterwards, we present the results for the GPDs.

  12. Nucleon form factors, generalized parton distributions and quark angular momentum

    International Nuclear Information System (INIS)

    Diehl, Markus; Kroll, Peter; Regensburg Univ.

    2013-02-01

    We extract the individual contributions from u and d quarks to the Dirac and Pauli form factors of the proton, after a critical examination of the available measurements of electromagnetic nucleon form factors. From this data we determine generalized parton distributions for valence quarks, assuming a particular form for their functional dependence. The result allows us to study various aspects of nucleon structure in the valence region. In particular, we evaluate Ji's sum rule and estimate the total angular momentum carried by valence quarks at the scale μ = 2 GeV to be J^u_v = 0.230^{+0.009}_{-0.024} and J^d_v = -0.004^{+0.010}_{-0.016}.

  13. Nucleon form factors, generalized parton distributions and quark angular momentum

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, Markus [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Kroll, Peter [Bergische Univ., Wuppertal (Germany). Fachbereich Physik; Regensburg Univ. (Germany). Institut fuer Theoretische Physik

    2013-02-15

    We extract the individual contributions from u and d quarks to the Dirac and Pauli form factors of the proton, after a critical examination of the available measurements of electromagnetic nucleon form factors. From this data we determine generalized parton distributions for valence quarks, assuming a particular form for their functional dependence. The result allows us to study various aspects of nucleon structure in the valence region. In particular, we evaluate Ji's sum rule and estimate the total angular momentum carried by valence quarks at the scale μ = 2 GeV to be J^u_v = 0.230^{+0.009}_{-0.024} and J^d_v = -0.004^{+0.010}_{-0.016}.

  14. Residual distribution for general time-dependent conservation laws

    International Nuclear Information System (INIS)

    Ricchiuto, Mario; Csik, Arpad; Deconinck, Herman

    2005-01-01

    We consider the second-order accurate numerical solution of general time-dependent hyperbolic conservation laws over unstructured grids in the framework of the Residual Distribution method. In order to achieve full conservation of the linear, monotone and first-order space-time schemes of (Csik et al., 2003) and (Abgrall et al., 2000), we extend the conservative residual distribution (CRD) formulation of (Csik et al., 2002) to prismatic space-time elements. We then study the design of second-order accurate and monotone schemes via the nonlinear mapping of the local residuals of linear monotone schemes. We derive sufficient and necessary conditions for the well-posedness of the mapping. We prove that the schemes obtained with the CRD formulation satisfy these conditions by construction. Thus the nonlinear schemes proposed in this paper are always well defined. The performance of the linear and nonlinear schemes is evaluated on a series of test problems involving the solution of the Euler equations and of a two-phase flow model. We consider the resolution of strong shocks and complex interacting flow structures. The results demonstrate the robustness, accuracy and non-oscillatory character of the proposed schemes.

  15. Chiral perturbation theory for generalized parton distributions and baryon distribution amplitudes

    Energy Technology Data Exchange (ETDEWEB)

    Wein, Philipp

    2016-05-06

    In this thesis we apply low-energy effective field theory to the first moments of generalized parton distributions and to baryon distribution amplitudes, which are both highly relevant for the parametrization of the nonperturbative part in hard processes. These quantities yield complementary information on hadron structure, since the former treat hadrons as a whole and, thus, give information about the (angular) momentum carried by an entire parton species on average, while the latter parametrize the momentum distribution within an individual Fock state. By performing one-loop calculations within covariant baryon chiral perturbation theory, we obtain sensible parametrizations of the quark mass dependence that are ideally suited for the subsequent analysis of lattice QCD data.

  16. A Pareto Optimal Auction Mechanism for Carbon Emission Rights

    Directory of Open Access Journals (Sweden)

    Mingxi Wang

    2014-01-01

    Full Text Available Carbon emission rights do not fit well into the framework of existing multi-item auction mechanisms because of their unique features. This paper proposes a new auction mechanism which converges to a unique Pareto optimal equilibrium in a finite number of periods. In the proposed auction mechanism, the assignment outcome is Pareto efficient and the carbon emission rights are used efficiently. For commercial application and theoretical completeness, both discrete and continuous markets, represented by discrete and continuous bid prices, respectively, are examined, and the results show the existence of a Pareto optimal equilibrium under the constraint of individual rationality. With no ties, the Pareto optimal equilibrium can be further proven to be unique.

  17. Variational principle for the Pareto power law.

    Science.gov (United States)

    Chakraborti, Anirban; Patriarca, Marco

    2009-11-27

    A mechanism is proposed for the appearance of power-law distributions in various complex systems. It is shown that in a conservative mechanical system composed of subsystems with different numbers of degrees of freedom a robust power-law tail can appear in the equilibrium distribution of energy as a result of certain superpositions of the canonical equilibrium energy densities of the subsystems. The derivation only uses a variational principle based on the Boltzmann entropy, without assumptions outside the framework of canonical equilibrium statistical mechanics. Two examples are discussed, free diffusion on a complex network and a kinetic model of wealth exchange. The mechanism is illustrated in the general case through an exactly solvable mechanical model of a dimensionally heterogeneous system.
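
    As an illustration of the kinetic wealth-exchange class of models mentioned above, a minimal sketch of a conservative pairwise-exchange simulation with heterogeneous saving propensities (a Chatterjee-Chakrabarti-Manna style model, chosen here for concreteness rather than taken from the paper); such models are known to develop Pareto-like power-law tails. All parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        N, steps = 1000, 200_000
        w = np.ones(N)                     # initial wealth; total wealth is conserved
        lam = rng.uniform(0.0, 1.0, N)     # agent-specific saving propensities

        for _ in range(steps):
            i, j = rng.integers(N, size=2)
            if i == j:
                continue
            pool = (1 - lam[i]) * w[i] + (1 - lam[j]) * w[j]   # wealth put at stake
            eps = rng.random()
            w[i] = lam[i] * w[i] + eps * pool
            w[j] = lam[j] * w[j] + (1 - eps) * pool
        # the upper tail of w can then be inspected, e.g. on a log-log rank-size plot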

  18. The geometry of the Pareto front in biological phenotype space

    Science.gov (United States)

    Sheftel, Hila; Shoval, Oren; Mayo, Avi; Alon, Uri

    2013-01-01

    When organisms perform a single task, selection leads to phenotypes that maximize performance at that task. When organisms need to perform multiple tasks, a trade-off arises because no phenotype can optimize all tasks. Recent work addressed this trade-off and assumed that the performance at each task decays with distance in trait space from the best phenotype at that task. Under this assumption, the best-fitness solutions (termed the Pareto front) lie on simple low-dimensional shapes in trait space: line segments, triangles and other polygons. The vertices of these polygons are specialists at a single task. Here, we generalize this finding by considering performance functions of general form, not necessarily functions that decay monotonically with distance from their peak. We find that, except for performance functions with highly eccentric contours, simple shapes in phenotype space are still found, but with mildly curving edges instead of straight ones. In a wide range of systems, complex data on multiple quantitative traits, which might be expected to fill a high-dimensional phenotype space, is predicted instead to collapse onto low-dimensional shapes; phenotypes near the vertices of these shapes are predicted to be specialists, and can thus suggest which tasks may be at play. PMID:23789060

  19. Nucleon-generalized parton distributions in the light-front quark model

    Indian Academy of Sciences (India)

    2016-01-12

    generalized parton distributions in the light-front quark model ... We calculate the generalized parton distributions (GPDs) for the up and down quarks in the nucleon using the effective light-front wavefunction. The results obtained for ...

  20. Phase transitions in Pareto optimal complex networks.

    Science.gov (United States)

    Seoane, Luís F; Solé, Ricard

    2015-09-01

    The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks or in graphs where cost constraints are at work, as it occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need for drastic topological changes does not determine the presence or the nature of the phase transitions encountered. Instead, the functions under optimization play a determining role. This reinforces the view that phase transitions do not arise from intrinsic properties of a system alone, but from the interplay of that system with its external constraints.

  1. Pareto-path multitask multiple kernel learning.

    Science.gov (United States)

    Li, Cong; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2015-01-01

    A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving a better classification performance, when compared with other similar MTL approaches.

  2. Pareto-optimal multi-objective design of airplane control systems

    Science.gov (United States)

    Schy, A. A.; Johnson, K. G.; Giesy, D. P.

    1980-01-01

    A constrained minimization algorithm for the computer aided design of airplane control systems to meet many requirements over a set of flight conditions is generalized using the concept of Pareto-optimization. The new algorithm yields solutions on the boundary of the achievable domain in objective space in a single run, whereas the older method required a sequence of runs to approximate such a limiting solution. However, Pareto-optimality does not guarantee a satisfactory design, since such solutions may emphasize some objectives at the expense of others. The designer must still interact with the program to obtain a well-balanced set of objectives. Using the example of a fighter lateral stability augmentation system (SAS) design over five flight conditions, several effective techniques are developed for obtaining well-balanced Pareto-optimal solutions. For comparison, one of these techniques is also used in a recently developed algorithm of Kreisselmeier and Steinhauser, which replaces the hard constraints with soft constraints, using a special penalty function. It is shown that comparable results can be obtained.

  3. Classification as clustering: a Pareto cooperative-competitive GP approach.

    Science.gov (United States)

    McIntyre, Andrew R; Heywood, Malcolm I

    2011-01-01

    Intuitively, population-based algorithms such as genetic programming provide a natural environment for supporting solutions that learn to decompose the overall task between multiple individuals, or a team. This work presents a framework for evolving teams without recourse to prespecifying the number of cooperating individuals. To do so, each individual evolves a mapping to a distribution of outcomes that, following clustering, establishes the parameterization of a (Gaussian) local membership function. This gives individuals the opportunity to represent subsets of tasks, where the overall task is that of classification under the supervised learning domain. Thus, rather than each team member representing an entire class, individuals are free to identify unique subsets of the overall classification task. The framework is supported by techniques from evolutionary multiobjective optimization (EMO) and Pareto competitive coevolution. EMO establishes the basis for encouraging individuals to provide accurate yet nonoverlapping behaviors, whereas competitive coevolution provides the mechanism for scaling to potentially large unbalanced datasets. Benchmarking is performed against recent examples of nonlinear SVM classifiers over 12 UCI datasets with between 150 and 200,000 training instances. Solutions from the proposed coevolutionary multiobjective GP framework appear to provide a good balance between classification performance and model complexity, especially as the dataset instance count increases.

  4. Vegetation patchiness: Pareto statistics, cluster dynamics and desertification.

    Science.gov (United States)

    Shnerb, N. M.

    2009-04-01

    Recent studies [1-4] of cluster distribution of vegetation in the dryland revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that this self-organized criticality is a manifestation of the law of proportionate effect: mapping the stochastic model to a Markov birth-death process, the transition rates are shown to scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (like desertification) manifest themselves in a drastic change of the stability properties of spatial colonies, as the chance of a cluster to disappear depends logarithmically, rather than linearly, on its size. [1] Scanlon et al., Nature 449, 209-212 (2007). [2] Kefi et al., Nature 449, 213-217 (2007). [3] Sole, R., Nature 449, 151 (2007). [4] Vandermeer et al., Nature 451, 457 (2008).

  5. Overview of contaminant arrival distributions as general evaluation requirements

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The environmental consequences of subsurface contamination problems can be completely and effectively evaluated by fulfilling the following five requirements: Determine each present or future outflow boundary of contaminated groundwater; provide the location/arrival-time distributions; provide the location/outflow-quantity distributions; provide these distributions for each individual chemical or biological constituent of environmental importance; and use the arrival distributions to determine the quantity and concentration of each contaminant that will interface with the environment as time passes. The arrival distributions on which these requirements are based provide a reference point for communication among scientists and public decision makers by enabling complicated scientific analyses to be presented as simple summary relationships

  6. Pareto-optimal estimates that constrain mean California precipitation change

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2017-12-01

    Global climate model (GCM) projections of greenhouse gas-induced precipitation change can exhibit notable uncertainty at the regional scale, particularly in regions where the mean change is small compared to internal variability. This is especially true for California, which is located in a transition zone between robust precipitation increases to the north and decreases to the south, and where GCMs from the Coupled Model Intercomparison Project phase 5 (CMIP5) archive show no consensus on mean change (in either magnitude or sign) across the central and southern parts of the state. With the goal of constraining this uncertainty, we apply a multiobjective approach to a large set of subensembles (subsets of models from the full CMIP5 ensemble). These constraints are based on subensemble performance in three fields important to California precipitation: tropical Pacific sea surface temperatures, upper-level zonal winds in the midlatitude Pacific, and precipitation over the state. An evolutionary algorithm is used to sort through and identify the set of Pareto-optimal subensembles across these three measures in the historical climatology, and we use this information to constrain end-of-century California wet season precipitation change. This technique narrows the range of projections throughout the state and increases confidence in estimates of positive mean change. Furthermore, these methods complement and generalize emergent constraint approaches that aim to restrict uncertainty in end-of-century projections, and they have applications to even broader aspects of uncertainty quantification, including parameter sensitivity and model calibration.

  7. The Urbanik generalized convolutions in the non-commutative ...

    Indian Academy of Sciences (India)

    … x^(−s) ν(dx) < ∞. Now we apply this construction to the Kendall convolution case, starting with the weakly stable measure δ_1. Example 1. Let △ be the Kendall convolution, i.e. the generalized convolution with the probability kernel δ_1 △ δ_a = (1 − a)δ_1 + a π_2 for a ∈ [0, 1], where π_2 is the Pareto distribution with the density π_2(dx) = …

  8. A generalized information management system applied to electrical distribution

    Energy Technology Data Exchange (ETDEWEB)

    Geisler, K.I.; Neumann, S.A.; Nielsen, T.D.; Bower, P.K. (Empros Systems International (US)); Hughes, B.A.

    1990-07-01

    This article presents a system solution approach that meets the requirements being imposed by industry trends and the electric utility customer. Specifically, the solution addresses electric distribution management systems. Electrical distribution management is a particularly well suited area of application because it involves a high diversity of tasks, which are currently supported by a proliferation of automated islands. Islands of automation which currently exist include (among others) distribution operations, load management, automated mapping, facility management, work order processing, and planning.

  9. On the limit distribution of lower extreme generalized order statistics

    Indian Academy of Sciences (India)

    m−gOs (as well as the classical extreme value theory of ordinary order statistics) yields three types of limit distributions that are possible in the case of linear normalization. In this paper, a similar classification of limit distributions is shown to hold for extreme gOs, where the parameters γ_j, j = 1, ..., n, are assumed to be pairwise different.

  10. Statement of Problem of Pareto Frontier Management and Its Solution in the Analysis and Synthesis of Optimal Systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2015-01-01

    Full Text Available The article concerns multi-criteria optimization (MCO), which assumes that the quality criteria of system operation are independent, and specifies ways to improve the values of these criteria. The mutual contradiction of some criteria is a major problem in MCO, and one of the most important areas of research is obtaining the so-called Pareto-optimal options. The subject of research is the Pareto front, also called the Pareto frontier. The article discusses classifications of the front by its geometric representation for the two-criteria case and presents a mathematical description of the front characteristics using gradients and their projections. A review of the current domestic and foreign literature reveals that existing work constructs the Pareto frontier under uncertainty, in stochastic settings, and without restrictions; topologies in both the two- and three-dimensional cases are considered, and the targets of modern applications are multi-agent systems and groups of players in differential games. None of the considered works, however, addresses active management of the front. The objective of this article is to pose the Pareto frontier problem in a new formulation, namely with the active participation of system developers and/or decision makers (DM) in managing the Pareto frontier. Such a formulation differs from the traditionally accepted approach based on the analysis of already existing solutions. The article discusses three ways to describe the quality of an object's control system. The first uses direct quality criteria for a closed-loop system model with an oscillatory element of general form. The second studies a specific two-loop aircraft control system using the angular velocity and normal acceleration loops. The third uses integrated quality criteria. In all three cases, the selected criteria are

  11. A Generalization of the Skew-Normal Distribution: The Beta Skew-Normal

    OpenAIRE

    Mameli, Valentina; Musio, Monica

    2011-01-01

    The aim of this article is to introduce a new family of distributions, which generalizes the skew normal distribution (SN). This new family, called Beta skew-normal (BSN), arises naturally when we consider the distributions of order statistics of the SN. The BSN can also be obtained as a special case of the Beta generated distribution (Jones (2004)). In this work we pay attention to three other generalizations of the SN distribution: the Balakrishnan skew-normal (SNB) (Balakrishnan (2002), as...

  12. Birds shed RNA-viruses according to the pareto principle.

    Science.gov (United States)

    Jankowski, Mark D; Williams, Christopher J; Fair, Jeanne M; Owen, Jennifer C

    2013-01-01

    A major challenge in disease ecology is to understand the role of individual variation of infection load on disease transmission dynamics and how this influences the evolution of resistance or tolerance mechanisms. Such information will improve our capacity to understand, predict, and mitigate pathogen-associated disease in all organisms. In many host-pathogen systems, particularly macroparasites and sexually transmitted diseases, it has been found that approximately 20% of the population is responsible for approximately 80% of the transmission events. Although host contact rates can account for some of this pattern, pathogen transmission dynamics also depend upon host infectiousness, an area that has received relatively little attention. Therefore, we conducted a meta-analysis of pathogen shedding rates of 24 host (avian) - pathogen (RNA-virus) studies, including 17 bird species and five important zoonotic viruses. We determined that viral count data followed the Weibull distribution, the mean Gini coefficient (an index of inequality) was 0.687 (0.036 SEM), and that 22.0% (0.90 SEM) of the birds shed 80% of the virus across all studies, suggesting an adherence of viral shedding counts to the Pareto Principle. The relative position of a bird in a distribution of viral counts was affected by factors extrinsic to the host, such as exposure to corticosterone and to a lesser extent reduced food availability, but not to intrinsic host factors including age, sex, and migratory status. These data provide a quantitative view of heterogeneous virus shedding in birds that may be used to better parameterize epidemiological models and understand transmission dynamics.
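
    A minimal sketch of the two summary statistics used here, computed on synthetic Weibull-like shedding counts: the Gini coefficient and the fraction of individuals responsible for 80% of total shedding. Parameter values are illustrative, not those of the meta-analysis.

        import numpy as np

        rng = np.random.default_rng(1)
        counts = rng.weibull(0.5, size=200) * 1e4   # synthetic per-bird viral counts

        def gini(x):
            # Gini coefficient from sorted values: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

        def frac_shedding_80(x):
            # smallest fraction of individuals accounting for 80% of the total
            x = np.sort(x)[::-1]                    # heaviest shedders first
            share = np.cumsum(x) / x.sum()
            return (np.argmax(share >= 0.8) + 1) / x.size

        print(gini(counts), frac_shedding_80(counts))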

  13. Birds shed RNA-viruses according to the pareto principle.

    Directory of Open Access Journals (Sweden)

    Mark D Jankowski

    Full Text Available A major challenge in disease ecology is to understand the role of individual variation of infection load on disease transmission dynamics and how this influences the evolution of resistance or tolerance mechanisms. Such information will improve our capacity to understand, predict, and mitigate pathogen-associated disease in all organisms. In many host-pathogen systems, particularly macroparasites and sexually transmitted diseases, it has been found that approximately 20% of the population is responsible for approximately 80% of the transmission events. Although host contact rates can account for some of this pattern, pathogen transmission dynamics also depend upon host infectiousness, an area that has received relatively little attention. Therefore, we conducted a meta-analysis of pathogen shedding rates of 24 host (avian) - pathogen (RNA-virus) studies, including 17 bird species and five important zoonotic viruses. We determined that viral count data followed the Weibull distribution, the mean Gini coefficient (an index of inequality) was 0.687 (0.036 SEM), and that 22.0% (0.90 SEM) of the birds shed 80% of the virus across all studies, suggesting an adherence of viral shedding counts to the Pareto Principle. The relative position of a bird in a distribution of viral counts was affected by factors extrinsic to the host, such as exposure to corticosterone and to a lesser extent reduced food availability, but not to intrinsic host factors including age, sex, and migratory status. These data provide a quantitative view of heterogeneous virus shedding in birds that may be used to better parameterize epidemiological models and understand transmission dynamics.

  14. 26 CFR 1.316-2 - Sources of distribution in general.

    Science.gov (United States)

    2010-04-01

    § 1.316-2 Sources of distribution in general. (a) For the purpose of income taxation every distribution made by a corporation is...

  15. 26 CFR 1.661(a)-1 - Estates and trusts accumulating income or distributing corpus; general.

    Science.gov (United States)

    2010-04-01

    § 1.661(a)-1 Estates and trusts accumulating income or distributing corpus; general. Subpart C, part I, subchapter J, chapter 1 of the Code, is applicable to all...

  16. Can we reach Pareto optimal outcomes using bottom-up approaches?

    NARCIS (Netherlands)

    V. Sanchez-Anguix (Victor); R. Aydoğan (Reyhan); T. Baarslag (Tim); C.M. Jonker (Catholijn)

    2016-01-01

    Classically, disciplines like negotiation and decision making have focused on reaching Pareto optimal solutions due to their stability and efficiency properties. Despite the fact that many practical and theoretical algorithms have successfully attempted to provide Pareto optimal solutions,

  17. Inference for exponentiated general class of distributions based on record values

    Directory of Open Access Journals (Sweden)

    Samah N. Sindi

    2017-09-01

    Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC) of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases, and the exponentiated Gompertz distribution are applied as members of the EGC of distributions.

  18. PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning.

    Science.gov (United States)

    Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew

    2011-09-01

    In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows promise in optimizing the number

  19. PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning

    International Nuclear Information System (INIS)

    Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew

    2011-01-01

    Purpose: In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. Methods: pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. Results: pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows

  20. General scaling in $p^T/M$ distributions

    CERN Document Server

    Sampaio Alves, Catarina

    2017-01-01

    This report presents an overview of the work done during the CERN Summer Student Programme 2017 through the period from 26/6 to 01/09. During this time, I worked on a project in which I searched for similarities between different particles and energies in distributions of the cross section as a function of $p^T/M$.

  1. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. The entropy theory allows the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicated that the five generalized distributions fitted the extreme rainfall data well. Among them, the GB2 and the Halphen family generally gave a better fit according to the Akaike information criterion (AIC) values. Therefore, these generalized distributions are among the best choices for frequency analysis, and the entropy-based derivation opens a new way to the frequency analysis of hydrometeorological extremes.
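
    A minimal sketch of the model-comparison step on synthetic data, with SciPy's gengamma standing in for the GG family; the GB2 and Halphen families have no SciPy implementation and would require custom densities, and all parameter values are illustrative.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        rain = stats.gengamma.rvs(2.0, 0.7, scale=15.0, size=500, random_state=rng)  # synthetic extremes

        # fit candidate families by maximum likelihood and rank them by AIC
        for name, dist in [('gengamma', stats.gengamma), ('gamma', stats.gamma), ('lognorm', stats.lognorm)]:
            params = dist.fit(rain)
            loglik = np.sum(dist.logpdf(rain, *params))
            print(name, 'AIC =', 2 * len(params) - 2 * loglik)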

  2. Moments of nucleon spin-dependent generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Wolfram Schroers; Richard Brower; Patrick Dreher; Robert Edwards; George Fleming; P. Hagler; Urs Heller; Thomas Lippert; John Negele; Andrew Pochinsky; Dru Renner; David Richards; Klaus Schilling

    2004-03-01

    We present a lattice measurement of the first two moments of the spin-dependent GPD H-tilde(x,xi,t). From these we obtain the axial coupling constant and the second moment of the spin-dependent forward parton distribution. The measurements are done in full QCD using Wilson fermions. In addition, we also present results from a first exploratory study of full QCD using Asqtad sea and domain-wall valence fermions.

  3. On the Limit Distribution of Lower Extreme Generalized Order Statistics

    Indian Academy of Sciences (India)

    In a wide subclass of generalized order statistics (gOs), which contains most of the known and important models of ordered random variables, weak convergence of lower extremes is developed. A recent result of extreme value theory of m−gOs (as well as the classical extreme value theory of ordinary order statistics) ...

  4. Rank distributions: A panoramic macroscopic outlook

    Science.gov (United States)

    Eliazar, Iddo I.; Cohen, Morrel H.

    2014-01-01

    This paper presents a panoramic macroscopic outlook of rank distributions. We establish a general framework for the analysis of rank distributions, which classifies them into five macroscopic "socioeconomic" states: monarchy, oligarchy-feudalism, criticality, socialism-capitalism, and communism. Oligarchy-feudalism is shown to be characterized by discrete macroscopic rank distributions, and socialism-capitalism is shown to be characterized by continuous macroscopic size distributions. Criticality is a transition state between oligarchy-feudalism and socialism-capitalism, which can manifest allometric scaling with multifractal spectra. Monarchy and communism are extreme forms of oligarchy-feudalism and socialism-capitalism, respectively, in which the intrinsic randomness vanishes. The general framework is applied to three different models of rank distributions—top-down, bottom-up, and global—and unveils each model's macroscopic universality and versatility. The global model yields a macroscopic classification of the generalized Zipf law, an omnipresent form of rank distributions observed across the sciences. An amalgamation of the three models establishes a universal rank-distribution explanation for the macroscopic emergence of a prevalent class of continuous size distributions, ones governed by unimodal densities with both Pareto and inverse-Pareto power-law tails.

  5. 3-D Index Distribution for Generalized Optical Measurement

    Science.gov (United States)

    2016-12-01

    capabilities has been the continuous advances in focal plane technology. [1,2] Sometimes overlooked, however, is the fact that all of these impressive... convenience and despite extraordinary progress in FPA technology, imaging optics and the associated OD activities haven't really changed much over the... volume holography (VH) as a tool for the realization of generalized non-identity optical mappings. This task has focused on a quantitative

  6. Two sample Bayesian prediction intervals for order statistics based on the inverse exponential-type distributions using right censored sample

    Directory of Open Access Journals (Sweden)

    M.M. Mohie El-Din

    2011-10-01

    Full Text Available In this paper, two sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of the inverse exponential-type distributions using a right censored sample. A general class of prior density functions is used and the predictive cumulative function is obtained in the two-sample case. The class of the inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model such as the inverse exponential model and the inverse Rayleigh model are considered.

  7. Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing

    NARCIS (Netherlands)

    Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.

    2006-01-01

    The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval

  8. Efficient approximation of black-box functions and Pareto sets

    NARCIS (Netherlands)

    Rennen, G.

    2009-01-01

    In the case of time-consuming simulation models or other so-called black-box functions, we determine a metamodel which approximates the relation between the input- and output-variables of the simulation model. To solve multi-objective optimization problems, we approximate the Pareto set, i.e. the

  9. Pareto optimality for nonlinear infinite dimensional control systems

    Directory of Open Access Journals (Sweden)

    Evgenios P. Avgerinos

    1990-01-01

    Full Text Available In this note we establish the existence of Pareto optimal solutions for nonlinear, infinite dimensional control systems with state dependent control constraints and an integral criterion taking values in a separable, reflexive Banach lattice. An example is also presented in detail. Our result extends earlier ones obtained by Cesari and Suryanarayana.

  10. Adaptive Pareto Set Estimation for Stochastic Mixed Variable Design Problems

    Science.gov (United States)

    2009-03-01


  11. Bienestar social, óptimos de Pareto y equilibrios Walrasianos

    Directory of Open Access Journals (Sweden)

    Elvio Accinelli

    2008-01-01

    Full Text Available In this paper we analyze the relationship between Pareto optima and social welfare. Each Pareto-optimal distribution is associated with a set of social weights of a social utility function. Every possible weighting of the agents in the economy is associated with a nonempty set of feasible Pareto-optimal allocations. Modifying the social welfare associated with the Pareto optima attainable by an economy means modifying the structure of social weights. Nevertheless, given a distribution of initial endowments, not every distribution of social weights has an associated feasible allocation of resources that is, at the same time, a Walrasian equilibrium. This possibility depends on the distribution of the initial endowments. Not every distribution necessarily yields an equilibrium corresponding to the highest possible level of aggregate welfare. Is it possible to reach this level of welfare in a decentralized way, without the participation of a central planner? The answer is that, in general, a reallocation of the social endowments will be needed.

  12. Generalized fluctuation relation for power-law distributions.

    Science.gov (United States)

    Budini, Adrián A

    2012-07-01

    Strong violations of existing fluctuation theorems may arise in nonequilibrium steady states characterized by distributions with power-law tails. The ratio of the probabilities of positive and negative fluctuations of equal magnitude behaves in an anomalous nonmonotonic way [H. Touchette and E. G. D. Cohen, Phys. Rev. E 76, 020101(R) (2007)]. Here, we propose an alternative definition of fluctuation relation (FR) symmetry that, in the power-law regime, is characterized by a monotonic linear behavior. The proposal is consistent with a large deviationlike principle. As an example, we study the fluctuations of the work done on a dragged particle immersed in a complex environment able to induce power-law tails. When the environment is characterized by spatiotemporal temperature fluctuations, distributions arising in nonextensive statistical mechanics define the work statistics. In that situation, we find that the FR symmetry is solely defined by the average bath temperature. The case of a dragged particle subjected to a Lévy noise is also analyzed in detail.

  13. Skew Generalized Extreme Value Distribution: Probability Weighted Moments Estimation and Application to Block Maxima Procedure

    OpenAIRE

    Ribereau, Pierre; Masiello, Esterina; Naveau, Philippe

    2014-01-01

    Following the work of Azzalini ([2] and [3]) on the skew normal distribution, we propose an extension of the Generalized Extreme Value (GEV) distribution, the SGEV. This new distribution allows for a better fit of maxima and can be interpreted as both the distribution of maxima when maxima are taken on dependent data and when maxima are taken over a random block size. We propose to estimate the parameters of the SGEV distribution via the Probability Weighted Moments method...
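
    The record truncates at the estimation step; as background, a minimal sketch of classical probability weighted moments (PWM) estimation for the ordinary GEV distribution following Hosking et al. (1985), not of the SGEV extension itself. The synthetic data and names are illustrative.

        import numpy as np
        from scipy.special import gamma

        def pwm_b(x, r):
            # r-th sample probability weighted moment b_r (ascending order statistics)
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            i = np.arange(1, n + 1)
            w = np.ones(n)
            for s in range(1, r + 1):
                w *= (i - s) / (n - s)
            return np.mean(w * x)

        def gev_pwm(x):
            b0, b1, b2 = (pwm_b(x, r) for r in (0, 1, 2))
            c = (2 * b1 - b0) / (3 * b2 - b0) - np.log(2) / np.log(3)
            k = 7.8590 * c + 2.9554 * c ** 2   # shape; Hosking's k is minus the usual xi
            a = (2 * b1 - b0) * k / (gamma(1 + k) * (1 - 2 ** (-k)))   # scale
            xi = b0 + a * (gamma(1 + k) - 1) / k                        # location
            return xi, a, k

        maxima = np.random.default_rng(5).gumbel(loc=10.0, scale=2.0, size=400)
        print(gev_pwm(maxima))   # expect location near 10, scale near 2, shape near 0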

  14. Distribution of scholarly publications among academic radiology departments.

    Science.gov (United States)

    Morelli, John N; Bokhari, Danial

    2013-03-01

    The aim of this study was to determine whether the distribution of publications among academic radiology departments in the United States is Gaussian (ie, the bell curve) or Paretian. The search affiliation feature of the PubMed database was used to search for publications in 3 general radiology journals with high Impact Factors, originating at radiology departments in the United States affiliated with residency training programs. The distribution of the number of publications among departments was examined using χ² test statistics to determine whether it followed a Pareto or a Gaussian distribution more closely. A total of 14,219 publications contributed since 1987 by faculty members in 163 departments with residency programs were available for assessment. The data acquired were more consistent with a Pareto (χ² = 80.4) than a Gaussian (χ² = 659.5) distribution. The mean number of publications for departments was 79.9 ± 146 (range, 0-943). The median number of publications was 16.5. The majority (>50%) of major radiology publications from academic departments with residency programs thus followed a Pareto rather than a normal distribution. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.
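
    A minimal sketch of the χ² comparison described here, on synthetic per-department counts; bin edges, sample sizes, and fitted families are illustrative assumptions rather than the study's exact procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        pubs = stats.pareto.rvs(1.2, scale=5, size=163, random_state=rng)  # synthetic counts

        def chi2_stat(data, dist, params, edges):
            # compare observed bin counts with counts expected under the fitted model
            observed, _ = np.histogram(data, bins=edges)
            expected = len(data) * np.diff(dist.cdf(edges, *params))
            mask = expected > 0
            return np.sum((observed[mask] - expected[mask]) ** 2 / expected[mask])

        edges = np.histogram_bin_edges(pubs, bins=10)
        print('Pareto chi2:', chi2_stat(pubs, stats.pareto, stats.pareto.fit(pubs, floc=0), edges))
        print('Normal chi2:', chi2_stat(pubs, stats.norm, stats.norm.fit(pubs), edges))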

  15. A Regionalization Approach to select the final watershed parameter set among the Pareto solutions

    Science.gov (United States)

    Park, G. H.; Micheletty, P. D.; Carney, S.; Quebbeman, J.; Day, G. N.

    2017-12-01

    The calibration of hydrological models often results in model parameters that are inconsistent with those from neighboring basins. Considering that physical similarity exists within neighboring basins, some of the physically related parameters should be consistent among them. Traditional manual calibration techniques require an iterative process to make the parameters consistent, which takes additional effort in model calibration. We developed a multi-objective optimization procedure to calibrate the National Weather Service (NWS) Research Distributed Hydrological Model (RDHM), using the Nondominated Sorting Genetic Algorithm (NSGA-II) with expert knowledge of the model parameter interrelationships as one objective function. The multi-objective algorithm enables us to obtain diverse parameter sets that are equally acceptable with respect to the objective functions and to choose one from the pool of the parameter sets during a subsequent regionalization step. Although all Pareto solutions are non-inferior, we exclude some of the parameter sets that show extreme values for any of the objective functions to expedite the selection process. We use an a priori model parameter set derived from the physical properties of the watershed (Koren et al., 2000) to assess the similarity for a given parameter across basins. Each parameter is assigned a weight based on its assumed similarity, such that parameters that are similar across basins are given higher weights. The parameter weights are used to compute a closeness measure between Pareto sets of nearby basins. The regionalization approach chooses the Pareto parameter sets that minimize the closeness measure of the basin being regionalized. The presentation will describe the results of applying the regionalization approach to a set of pilot basins in the Upper Colorado basin as part of a NASA-funded project.
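
    A minimal sketch of the closeness computation described above, under invented weights, parameter values, and a simple similarity-weighted relative distance; the project's actual measure may differ.

        import numpy as np

        weights = np.array([1.0, 0.8, 0.2])       # higher weight = parameter expected similar across basins
        neighbor = np.array([0.30, 1.50, 40.0])   # calibrated parameters of a physically similar basin

        def closeness(candidate):
            # similarity-weighted relative distance; smaller = more regionally consistent
            return np.sum(weights * np.abs(candidate - neighbor) / np.abs(neighbor))

        pareto_sets = np.array([[0.32, 1.40, 55.0],
                                [0.50, 1.55, 38.0],
                                [0.29, 2.10, 42.0]])
        chosen = pareto_sets[np.argmin([closeness(p) for p in pareto_sets])]
        print(chosen)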

  16. The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space.

    Science.gov (United States)

    Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri

    2015-10-01

    When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes--phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass.

  17. The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space

    Science.gov (United States)

    Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri

    2015-01-01

    When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes—phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass. PMID:26465336

  18. The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space.

    Directory of Open Access Journals (Sweden)

    Pablo Szekely

    2015-10-01

    Full Text Available When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes--phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass.

  19. Modeling fractal structure of city-size distributions using correlation functions.

    Science.gov (United States)

    Chen, Yanguang

    2011-01-01

    Zipf's law is one of the most conspicuous empirical facts for cities; however, there is no convincing explanation for the scaling relation between rank and size or for its scaling exponent. Using ideas from general fractals and scaling, I propose a dual competition hypothesis of city development to explain the value intervals and the special value, 1, of the power exponent. Zipf's law and Pareto's law can be mathematically transformed into one another, but they represent different processes of urban evolution. Based on the Pareto distribution, a frequency correlation function can be constructed; by scaling analysis and the multifractal spectrum, the parameter interval of the Pareto exponent is derived as (0.5, 1]. Based on the Zipf distribution, a size correlation function can be built, and it is the converse of the first one; by this second correlation function and the multifractal notion, the Pareto exponent interval is derived as [1, 2). The process of urban evolution thus falls under two effects: the Pareto effect, indicating growth in city numbers (external complexity), and the Zipf effect, indicating growth in city sizes (internal complexity). Because of the competition between the two effects, the scaling exponent varies from 0.5 to 2; but if the two effects reach equilibrium with each other, the scaling exponent approaches 1. A series of mathematical experiments on hierarchical correlation are employed to verify the models, and a conclusion can be drawn that if cities in a given region follow Zipf's law, the frequency and size correlations will follow the scaling law. This theory can be generalized to interpret inverse power-law distributions in various fields of the physical and social sciences.
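
    To make the rank-size relation above concrete, here is a minimal Python sketch (synthetic sizes, not the paper's data or method) that estimates the Zipf exponent by least squares on the log-log rank-size plot:

```python
# A minimal sketch, not from the paper: estimating the Zipf (rank-size)
# scaling exponent q of hypothetical city sizes by ordinary least squares
# on the log-log rank-size plot.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical city sizes drawn from a Pareto-tailed distribution.
sizes = np.sort((rng.pareto(1.0, 500) + 1.0) * 1e4)[::-1]

ranks = np.arange(1, sizes.size + 1)
# Zipf's law: size ~ rank**(-q); fit q on logarithmic scales.
slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
q = -slope
print(f"estimated Zipf exponent q = {q:.3f}")  # near 1 for classical Zipf
```

    With the classical Zipf picture the fitted exponent sits near 1; heavier or lighter tails push it toward the ends of the interval discussed above.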

  20. Ultrawide Bandwidth Receiver Based on a Multivariate Generalized Gaussian Distribution

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2015-04-01

    Multivariate generalized Gaussian density (MGGD) is used to approximate the multiple access interference (MAI) and additive white Gaussian noise in pulse-based ultrawide bandwidth (UWB) system. The MGGD probability density function (pdf) is shown to be a better approximation of a UWB system as compared to multivariate Gaussian, multivariate Laplacian and multivariate Gaussian-Laplacian mixture (GLM). The similarity between the simulated and the approximated pdf is measured with the help of modified Kullback-Leibler distance (KLD). It is also shown that MGGD has the smallest KLD as compared to Gaussian, Laplacian and GLM densities. A receiver based on the principles of minimum bit error rate is designed for the MGGD pdf. As the requirement is stringent, the adaptive implementation of the receiver is also carried out in this paper. Training sequence of the desired user is the only requirement when implementing the detector adaptively. © 2002-2012 IEEE.
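
    As a rough univariate illustration of why a generalized Gaussian can out-fit a Gaussian on impulsive interference (the record above treats the multivariate case; this sketch uses SciPy's univariate gennorm and made-up Laplacian noise):

```python
# A univariate toy stand-in for the paper's multivariate setting: fit a
# generalized Gaussian (SciPy's gennorm) and an ordinary Gaussian to
# heavy-tailed samples and compare average log-likelihoods, a rough proxy
# for which density is closer in Kullback-Leibler terms.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = stats.laplace.rvs(size=5000, random_state=rng)  # impulsive "MAI-like" noise

beta, loc, scale = stats.gennorm.fit(x)             # shape beta near 1 for Laplacian
mu, sigma = stats.norm.fit(x)

ll_gg = stats.gennorm.logpdf(x, beta, loc, scale).mean()
ll_g = stats.norm.logpdf(x, mu, sigma).mean()
print(f"gennorm shape beta = {beta:.2f}")
print(f"mean log-likelihood: gennorm {ll_gg:.4f} vs normal {ll_g:.4f}")
```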

  1. Implementation of strength pareto evolutionary algorithm II in the multiobjective burnable poison placement optimization of KWU pressurized water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gharari, Rahman [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of); Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi [Nuclear Engineering Dept, Shahid Beheshti University, Tehran (Iran, Islamic Republic of)

    2016-10-15

    In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is sought for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (K-eff) for gaining possibly longer operation cycles, along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor.

  2. Implementation of strength pareto evolutionary algorithm II in the multiobjective burnable poison placement optimization of KWU pressurized water reactor

    International Nuclear Information System (INIS)

    Gharari, Rahman; Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi

    2016-01-01

    In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is sought for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (K-eff) for gaining possibly longer operation cycles, along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor.

  3. Implementation of Strength Pareto Evolutionary Algorithm II in the Multiobjective Burnable Poison Placement Optimization of KWU Pressurized Water Reactor

    Directory of Open Access Journals (Sweden)

    Rahman Gharari

    2016-10-01

    Full Text Available In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is sought for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (Keff) for gaining possibly longer operation cycles, along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor.
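
    The Pareto machinery at the core of SPEA-II can be illustrated compactly. The sketch below uses hypothetical objective values, not the authors' reactor code: it implements dominance checking and SPEA-II's strength and raw-fitness assignment for a minimization problem, where a BPP-style problem would encode maximizing Keff and minimizing the peaking factor as the two objectives.

```python
# A minimal sketch of SPEA-II's Pareto bookkeeping (hypothetical data):
# each candidate's "strength" is the number of candidates it dominates,
# and its raw fitness is the sum of the strengths of its dominators.
import numpy as np

def dominates(a, b):
    """True if a Pareto-dominates b (minimization in every objective)."""
    return bool(np.all(a <= b) and np.any(a < b))

def spea2_raw_fitness(objs):
    """SPEA-II strength and raw fitness; raw == 0 marks nondominated points."""
    n = len(objs)
    strength = [sum(dominates(objs[i], objs[j]) for j in range(n) if j != i)
                for i in range(n)]
    raw = [sum(strength[j] for j in range(n)
               if j != i and dominates(objs[j], objs[i]))
           for i in range(n)]
    return strength, raw

# Hypothetical candidates: minimize (-k_eff, radial power peaking factor).
objs = np.array([[-1.02, 1.8], [-1.01, 1.5], [-1.03, 2.1],
                 [-1.00, 1.4], [-1.00, 1.9]])
strength, raw = spea2_raw_fitness(objs)
print("strength:", strength)   # how many candidates each one dominates
print("raw fitness:", raw)     # 0 for Pareto-front members
```

    Candidates with raw fitness 0 are nondominated and form the current front; the full SPEA-II additionally adds a density term to break ties, which is omitted here.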

  4. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques

    DEFF Research Database (Denmark)

    Ottosson, Rickard O; Engstrom, Per E; Sjöström, David

    2008-01-01

    Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head and neck IMRT cases.

  5. Effects of the financial crisis on the wealth distribution of Korea's companies

    Science.gov (United States)

    Lim, Kyuseong; Kim, Soo Yong; Swanson, Todd; Kim, Jooyun

    2017-02-01

    We investigated the distribution functions of Korea's top-rated companies during two financial crises. A power-law scaling for the rank distribution, as well as for the cumulative probability distribution, was observed as a general pattern. Similar distributions appear in other studies of wealth and income distributions. In our study, the Pareto exponents characterizing the distribution differed before and after the crisis. The companies covered in this research are divided into two subgroups over the period in which the subprime mortgage crisis occurred. Various industrial sectors of Korea's companies were found to respond differently during the two financial crises, especially the construction sector, financial sectors, and insurance groups.

  6. A Study of Fitting the Generalized Lambda Distribution to Solar Radiation Data.

    Science.gov (United States)

    Öztürk, A.; Dale, R. F.

    1982-07-01

    The increased interest in the climatology of solar radiation dictates a need for a distribution to fit daily solar radiation totals which tend to have negatively-skewed probability distributions. Even daily mean solar radiation for weekly periods tends to have non-normal distributions. The generalized lambda distribution, which includes a wide variety of curve shapes, is discussed for fitting these data. The underlying probability distribution is a generalization of the lambda distribution from three to four parameters. Using the weekly averages of daily solar radiation totals for each of 12 weeks during the growing season and daily totals for the week 5-11 July at West Lafayette, Indiana, it is shown that the generalized lambda distribution model fits the data well. Some results concerning percentiles and quantiles, parameter estimates, and goodness-of-fit tests are also discussed.
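
    For readers unfamiliar with the generalized lambda distribution, it is defined through its quantile function, which makes percentile computation and sampling by inversion immediate. A minimal sketch follows, with illustrative Ramberg-Schmeiser parameters rather than values fitted to the solar radiation data:

```python
# A minimal sketch (parameters are illustrative, not fitted to the Indiana
# data): the four-parameter Ramberg-Schmeiser generalized lambda
# distribution is defined by its quantile function.
import numpy as np

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Q(u) = lam1 + (u**lam3 - (1-u)**lam4) / lam2, for 0 < u < 1."""
    return lam1 + (u**lam3 - (1.0 - u)**lam4) / lam2

# Illustrative parameter set giving a negatively skewed shape (long left tail).
params = (20.0, 0.5, 0.2, 1.2)
for p in (0.05, 0.50, 0.95):
    print(f"{int(p * 100):2d}th percentile: {gld_quantile(p, *params):6.2f}")

# Sampling by the inversion method: feed uniform variates through Q.
rng = np.random.default_rng(2)
sample = gld_quantile(rng.uniform(size=10000), *params)
print("sample skewness sign:", np.sign(((sample - sample.mean())**3).mean()))
```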

  7. Estimations of parameters in Pareto reliability model in the presence of masked data

    International Nuclear Information System (INIS)

    Sarhan, Ammar M.

    2003-01-01

    Estimation of the parameters of the individual lifetime distributions of the components of a series system is considered in this paper, based on masked system life test data. We consider a series system of two independent components, each with a Pareto-distributed lifetime. The maximum likelihood and Bayes estimators for the parameters, and for the values of the reliability of the system's components at a specific time, are obtained. Symmetrical triangular prior distributions are assumed for the unknown parameters in obtaining their Bayes estimators. Large simulation studies are carried out in order to: (i) explain how one can utilize the theoretical results obtained; (ii) compare the maximum likelihood and Bayes estimates of the underlying parameters; and (iii) study the influence of the masking level and the sample size on the accuracy of the estimates obtained.
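
    As a simplified, hypothetical illustration of the estimation problem (ignoring masking and the two-component structure), the Pareto shape parameter has a closed-form maximum likelihood estimator when the scale is known:

```python
# A simplified, hypothetical illustration (masking ignored): with complete
# lifetimes t_1..t_n from a Pareto distribution with known scale x_m, the
# maximum likelihood estimator of the shape is
#   alpha_hat = n / sum(log(t_i / x_m)).
import numpy as np

rng = np.random.default_rng(3)
x_m, alpha_true = 1.0, 2.5
# numpy's pareto draws Lomax variates; 1 + X follows classical Pareto(x_m=1).
t = x_m * (1.0 + rng.pareto(alpha_true, size=1000))

alpha_hat = t.size / np.log(t / x_m).sum()
print(f"true alpha = {alpha_true}, ML estimate = {alpha_hat:.3f}")
# Component reliability at t0 >= x_m: R(t0) = (x_m / t0)**alpha.
print("estimated reliability at t0 = 2:", (x_m / 2.0)**alpha_hat)
```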

  8. Risk finance for catastrophe losses with Pareto-calibrated Lévy-stable severities.

    Science.gov (United States)

    Powers, Michael R; Powers, Thomas Y; Gao, Siwei

    2012-11-01

    For catastrophe losses, the conventional risk finance paradigm of enterprise risk management identifies transfer, as opposed to pooling or avoidance, as the preferred solution. However, this analysis does not necessarily account for differences between light- and heavy-tailed characteristics of loss portfolios. Of particular concern are the decreasing benefits of diversification (through pooling) as the tails of severity distributions become heavier. In the present article, we study a loss portfolio characterized by nonstochastic frequency and a class of Lévy-stable severity distributions calibrated to match the parameters of the Pareto II distribution. We then propose a conservative risk finance paradigm that can be used to prepare the firm for worst-case scenarios with regard to both (1) the firm's intrinsic sensitivity to risk and (2) the heaviness of the severity's tail. © 2012 Society for Risk Analysis.

  9. 46 CFR 113.25-8 - Distribution of general emergency alarm system feeders and branch circuits.

    Science.gov (United States)

    2010-10-01

    Where the vessel is divided by main vertical fire bulkheads, the general emergency alarm system must be arranged into vertical service zones. (46 CFR 113.25-8, Shipping, Title 46, 2010-10-01: Distribution of general emergency alarm system feeders and branch circuits; Electrical Engineering; Communication and Alarm Systems and Equipment; General Emergency Alarm Systems.)

  10. Pareto-depth for multiple-query image retrieval.

    Science.gov (United States)

    Hsiao, Ko-Jen; Calder, Jeff; Hero, Alfred O

    2015-02-01

    Most content-based image retrieval systems consider either a single query, or multiple queries that include the same object or represent the same semantic information. In this paper, we consider the content-based image retrieval problem for multiple query images corresponding to different image semantics. We propose a novel multiple-query information retrieval algorithm that combines the Pareto front method with efficient manifold ranking. We show that our proposed algorithm outperforms state-of-the-art multiple-query retrieval algorithms on real-world image databases. We attribute this performance improvement to concavity properties of the Pareto fronts, and prove a theoretical result that characterizes the asymptotic concavity of the fronts.
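
    The central object here, the first Pareto front of the per-query scores, can be computed directly. A minimal sketch (random scores standing in for the two per-query dissimilarities; the paper's manifold-ranking step is not reproduced):

```python
# A minimal sketch of the core operation (hypothetical data): given two
# dissimilarity scores per database item, return the first Pareto front,
# i.e. the items not dominated in both criteria simultaneously.
import numpy as np

def first_pareto_front(points):
    """Indices of nondominated points; smaller is better in each column."""
    n = points.shape[0]
    front = []
    for i in range(n):
        dominated = np.any(
            np.all(points <= points[i], axis=1) & np.any(points < points[i], axis=1)
        )
        if not dominated:
            front.append(i)
    return front

rng = np.random.default_rng(4)
scores = rng.random((200, 2))   # dissimilarity to query 1 and to query 2
print("first Pareto front:", first_pareto_front(scores))
```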

  11. Decomposition and Simplification of Multivariate Data using Pareto Sets.

    Science.gov (United States)

    Huettenberger, Lars; Heine, Christian; Garth, Christoph

    2014-12-01

    Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.

  12. Pareto Distance for Multi-layer Network Analysis

    DEFF Research Database (Denmark)

    Magnani, Matteo; Rossi, Luca

    2013-01-01

    services, e.g., Facebook, Twitter, LinkedIn and Foursquare. As a result, the analysis of on-line social networks requires a wider scope and, more technically speaking, models for the representation of this fragmented scenario. The recent introduction of more realistic layered models has however determined new research problems related to the extension of traditional single-layer network measures. In this paper we take a step forward over existing approaches by defining a new concept of geodesic distance that includes heterogeneous networks and connections with very limited assumptions regarding the strength of the connections. This is achieved by exploiting the concept of Pareto efficiency to define a simple and at the same time powerful measure that we call Pareto distance, of which geodesic distance is a particular case when a single layer (or network) is analyzed. The limited assumptions...

  13. Small Sample Robust Testing for Normality against Pareto Tails

    Czech Academy of Sciences Publication Activity Database

    Stehlík, M.; Fabián, Zdeněk; Střelec, L.

    2012-01-01

    Roč. 41, č. 7 (2012), s. 1167-1194 ISSN 0361-0918 Grant - others:Aktion(CZ-AT) 51p7, 54p21, 50p14, 54p13 Institutional research plan: CEZ:AV0Z10300504 Keywords : consistency * Hill estimator * t-Hill estimator * location functional * Pareto tail * power comparison * returns * robust tests for normality Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.295, year: 2012

  14. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures are not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
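
    A minimal sketch of the Pareto depth idea follows (hypothetical two-criterion data; the full PDA scoring is more involved): repeatedly peel off nondominated fronts and record, for each point, the index of the front on which it lies. Points on deep fronts are dominated by many others, i.e. jointly large under all criteria.

```python
# A minimal sketch of Pareto depth by front peeling (hypothetical data):
# depth 1 is the first nondominated front, depth 2 the front of what
# remains after removing it, and so on.
import numpy as np

def pareto_depths(points):
    """Depth (1 = first front) per point; minimization in every column."""
    n = points.shape[0]
    depths = np.zeros(n, dtype=int)
    remaining = np.arange(n)
    depth = 0
    while remaining.size:
        depth += 1
        pts = points[remaining]
        on_front = []
        for k in range(pts.shape[0]):
            dom = np.all(pts <= pts[k], axis=1) & np.any(pts < pts[k], axis=1)
            if not dom.any():
                on_front.append(k)
        idx = remaining[on_front]
        depths[idx] = depth
        remaining = np.setdiff1d(remaining, idx)
    return depths

rng = np.random.default_rng(5)
d = pareto_depths(rng.random((100, 2)))
print("front sizes by depth:", np.bincount(d)[1:])
```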

  15. Pareto optimal design of sectored toroidal superconducting magnet for SMES

    International Nuclear Information System (INIS)

    Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok

    2014-01-01

    Highlights: • The optimization approach minimizes both the magnet size and the necessary cable length of a sectored toroidal SMES unit. • The design approach is suitable for low temperature superconducting cable for a medium size SMES unit. • It investigates coil parameters with respect to practical engineering aspects. - Abstract: A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering practical engineering constraints. The objectives include the minimization of the necessary superconductor length and of the torus overall size or volume, which determines a significant part of the cost towards realization of the SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions; a compact magnet size leads to an increase in the required superconducting cable length, or vice versa. The final choice among Pareto-optimal configurations can be made in relation to other issues such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low temperature niobium–titanium based Rutherford type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy.

  16. Computing gap free Pareto front approximations with stochastic search algorithms.

    Science.gov (United States)

    Schütze, Oliver; Laumanns, Marco; Tantar, Emilia; Coello, Carlos A Coello; Talbi, El-Ghazali

    2010-01-01

    Recently, a convergence proof of stochastic search algorithms toward finite size Pareto set approximations of continuous multi-objective optimization problems has been given. The focus was on obtaining a finite approximation that captures the entire solution set in some suitable sense, which was defined by the concept of ε-dominance. Though bounds on the quality of the limit approximation (which are entirely determined by the archiving strategy and the value of ε) have been obtained, the strategies do not guarantee to obtain a gap free approximation of the Pareto front. That is, such approximations A can reveal gaps in the sense that points f in the Pareto front can exist such that the distance of f to any image point F(a), a ∈ A, is "large." Since such gap free approximations are desirable in certain applications, and the related archiving strategies can be advantageous when memetic strategies are included in the search process, we are aiming in this work for such methods. We present two novel strategies that accomplish this task in the probabilistic sense and under mild assumptions on the stochastic search algorithm. In addition to the convergence proofs, we give some numerical results to visualize the behavior of the different archiving strategies. Finally, we demonstrate the potential for a possible hybridization of a given stochastic search algorithm with a particular local search strategy (multi-objective continuation methods) by showing that the concept of ε-dominance can be integrated into this approach in a suitable way.

  17. Improving Polyp Detection Algorithms for CT Colonography: Pareto Front Approach

    Science.gov (United States)

    Huang, Adam; Li, Jiang; Summers, Ronald M.; Petrick, Nicholas; Hara, Amy K.

    2010-01-01

    We investigated a Pareto front approach to improving polyp detection algorithms for CT colonography (CTC). A dataset of 56 CTC colon surfaces with 87 proven positive detections of 53 polyps sized 4 to 60 mm was used to evaluate the performance of a one-step and a two-step curvature-based region growing algorithm. The algorithmic performance was statistically evaluated and compared based on the Pareto optimal solutions from 20 experiments by evolutionary algorithms. The false positive rate was lower (p < 0.05) for the two-step algorithm than for the one-step algorithm at 63% of all possible operating points. While operating at a suitable sensitivity level such as 90.8% (79/87) or 88.5% (77/87), the false positive rate was reduced by 24.4% (95% confidence intervals 17.9–31.0%) or 45.8% (95% confidence intervals 40.1–51.0%), respectively. We demonstrated that, with a proper experimental design, the Pareto optimization process can effectively help in fine-tuning and redesigning polyp detection algorithms. PMID:20548966

  18. Orthogonality of the Mean and Error Distribution in Generalized Linear Models.

    Science.gov (United States)

    Huang, Alan; Rathouz, Paul J

    2017-01-01

    We show that the mean-model parameter is always orthogonal to the error distribution in generalized linear models. Thus, the maximum likelihood estimator of the mean-model parameter will be asymptotically efficient regardless of whether the error distribution is known completely, known up to a finite vector of parameters, or left completely unspecified, in which case the likelihood is taken to be an appropriate semiparametric likelihood. Moreover, the maximum likelihood estimator of the mean-model parameter will be asymptotically independent of the maximum likelihood estimator of the error distribution. This generalizes some well-known results for the special cases of normal, gamma and multinomial regression models, and, perhaps more interestingly, suggests that asymptotically efficient estimation and inferences can always be obtained if the error distribution is nonparametrically estimated along with the mean. In contrast, estimation and inferences using misspecified error distributions or variance functions are generally not efficient.

  19. 78 FR 37713 - General Regulations; National Park System, Demonstrations, Sale or Distribution of Printed Matter

    Science.gov (United States)

    2013-06-24

    ...] RIN 1024-AD91 General Regulations; National Park System, Demonstrations, Sale or Distribution of... printed matter applicable to most units of the National Park System. The rule clarifies provisions regarding permits for demonstrations or distributing printed matter and in the management of two or more small...

  20. A generalized CAPM model with asymmetric power distributed errors with an application to portfolio construction

    NARCIS (Netherlands)

    Bao, T.; Diks, C.; Li, H.

    We estimate the CAPM model on European stock market data, allowing for asymmetric and fat-tailed return distributions using independent and identically asymmetric power distributed (IIAPD) innovations. The results indicate that the generalized CAPM with IIAPD errors has desirable properties. It is

  1. ML-Estimation in the Location-Scale-Shape Model of the Generalized Logistic Distribution

    OpenAIRE

    Abberger, Klaus

    2002-01-01

    A three parameter (location, scale, shape) generalization of the logistic distribution is fitted to data. Local maximum likelihood estimators of the parameters are derived. Although the likelihood function is unbounded, the likelihood equations have a consistent root. ML-estimation combined with the ECM algorithm allows the distribution to be easily fitted to data.
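
    A minimal sketch of such a fit using SciPy's Type I generalized logistic (shape, location, scale) as a stand-in for the paper's parameterization; SciPy's generic fit() call maximizes the likelihood numerically, echoing the local-ML estimation discussed above:

```python
# A minimal sketch (synthetic data): fit the three-parameter generalized
# logistic distribution by numerical maximum likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
data = stats.genlogistic.rvs(c=2.0, loc=1.0, scale=0.5, size=2000,
                             random_state=rng)

c_hat, loc_hat, scale_hat = stats.genlogistic.fit(data)
print(f"fitted shape={c_hat:.2f}, location={loc_hat:.2f}, scale={scale_hat:.2f}")
```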

  2. Explicit expressions for European option pricing under a generalized skew normal distribution

    OpenAIRE

    Doostparast, Mahdi

    2017-01-01

    Under a generalized skew normal distribution we consider the problem of European option pricing. Existence of the martingale measure is proved. An explicit expression for a given European option price is presented in terms of the cumulative distribution function of the univariate skew normal and the bivariate standard normal distributions. Some special cases are investigated in a greater detail. To carry out the sensitivity of the option price to the skew parameters, numerical methods are app...

  3. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques

    International Nuclear Information System (INIS)

    Ottosson, Rickard O.; Sjoestroem, David; Behrens, Claus F.; Karlsson, Anna; Engstroem, Per E.; Knoeoes, Tommy; Ceberg, Crister

    2009-01-01

    Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head and neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered

  4. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques.

    Science.gov (United States)

    Ottosson, Rickard O; Engstrom, Per E; Sjöström, David; Behrens, Claus F; Karlsson, Anna; Knöös, Tommy; Ceberg, Crister

    2009-01-01

    Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head & neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered.

  5. A Novel Generalized Normal Distribution for Human Longevity and other Negatively Skewed Data

    Science.gov (United States)

    Robertson, Henry T.; Allison, David B.

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by 1. An intuitively straightforward genesis; 2. Closed forms for the pdf, cdf, mode, quantile, and hazard functions; and 3. Accessibility to non-statisticians, based on its close relationship to the normal distribution. PMID:22623974

  6. A novel generalized normal distribution for human longevity and other negatively skewed data.

    Science.gov (United States)

    Robertson, Henry T; Allison, David B

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by 1. An intuitively straightforward genesis; 2. Closed forms for the pdf, cdf, mode, quantile, and hazard functions; and 3. Accessibility to non-statisticians, based on its close relationship to the normal distribution.

  7. Distributional Assumptions in Educational Assessments Analysis: Normal Distributions versus Generalized Beta Distribution in Modeling the Phenomenon of Learning

    Science.gov (United States)

    Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire

    2013-01-01

    This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike normal model, GB model allows us to capture some real characteristics of data and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…

  8. Pareto-optimal electricity tariff rates in the Republic of Armenia

    International Nuclear Information System (INIS)

    Kaiser, M.J.

    2000-01-01

    The economic impact of electricity tariff rates on the residential sector of Yerevan, Armenia, is examined. The effect of tariff design on revenue generation and equity measures is considered, and the combination of energy pricing and compensatory social policies which provides the best mix of efficiency and protection for poor households is examined. An equity measure is defined in terms of a cumulative distribution function which describes the percent of the population that spends x percent or less of their income on electricity consumption. An optimal (Pareto-efficient) tariff is designed based on the analysis of survey data and an econometric model, and the Armenian tariff rate effective 1 January 1997 to 15 September 1997 is shown to be non-optimal relative to this rate. 22 refs
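
    The equity measure described above is straightforward to compute from household data. A minimal sketch with synthetic incomes and electricity bills (all parameters invented for illustration):

```python
# A minimal sketch of the equity measure (synthetic household data): the
# share of the population spending at most x percent of income on
# electricity, i.e. the cumulative distribution of the budget burden.
import numpy as np

rng = np.random.default_rng(10)
income = rng.lognormal(mean=10.0, sigma=0.6, size=5000)
bill = 0.04 * income * rng.lognormal(0.0, 0.4, size=5000)  # electricity spend

burden = 100.0 * bill / income                 # percent of income
for x in (2, 5, 10):
    share = np.mean(burden <= x)
    print(f"P(burden <= {x:2d}%) = {share:.2f}")
```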

  9. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    International Nuclear Information System (INIS)

    Meenachi, N. Madurai; Baba, M. Sai

    2017-01-01

    This article describes the need for ontology matching and the methods to achieve it. Efforts have been put into the implementation of a semantic web based knowledge management system for the nuclear domain, which necessitated the use of methods for the development of ontology matching. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms such as the Jaro-Winkler distance, the Needleman-Wunsch algorithm, bigram, Kullback and cosine divergence are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching in diversity in the nuclear reactor domain, and the same is illustrated.

  10. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    Energy Technology Data Exchange (ETDEWEB)

    Meenachi, N. Madurai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Planning and Human Resource Management Div.; Baba, M. Sai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Resources Management Group

    2017-12-15

    This article describes the need for ontology matching and the methods to achieve it. Efforts have been put into the implementation of a semantic web based knowledge management system for the nuclear domain, which necessitated the use of methods for the development of ontology matching. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms such as the Jaro-Winkler distance, the Needleman-Wunsch algorithm, bigram, Kullback and cosine divergence are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching in diversity in the nuclear reactor domain, and the same is illustrated.
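
    One of the listed string measures, bigram (Dice) similarity, is simple enough to sketch directly; the concept labels below are invented for illustration, and a real matcher would combine several such scores:

```python
# A minimal sketch of bigram (Dice) similarity, one of the string measures
# named above, as it might score candidate concept matches between two
# ontologies. The term pairs are made up for illustration.
def bigrams(s):
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bigram_similarity(a, b):
    """Dice coefficient over character bigrams, in [0, 1]."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2.0 * len(ba & bb) / (len(ba) + len(bb))

pairs = [("control rod", "control rods"), ("coolant pump", "primary pump")]
for a, b in pairs:
    print(f"{a!r} vs {b!r}: {bigram_similarity(a, b):.2f}")
```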

  11. Regular distributive efficiency and the distributive liberal social contract.

    OpenAIRE

    Jean Mercier Ythier

    2009-01-01

    We consider abstract social systems of private property, made of n individuals endowed with non-paternalistic interdependent preferences, who interact through exchanges on competitive markets and Pareto-efficient lump-sum transfers. The transfers follow from a distributive liberal social contract defined as a redistribution of initial endowments such that the resulting market equilibrium allocation is, both, Pareto-efficient relative to individual interdependent preferences, and unanimously we...

  12. Competition and fragmentation: a simple model generating lognormal-like distributions

    International Nuclear Information System (INIS)

    Schwaemmle, V; Queiros, S M D; Brigatti, E; Tchumatchenko, T

    2009-01-01

    The current distribution of language size in terms of speaker population is generally described using a lognormal distribution. Analyzing the original real data we show how the double-Pareto lognormal distribution can give an alternative fit that indicates the existence of a power law tail. A simple Monte Carlo model is constructed based on the processes of competition and fragmentation. The results reproduce the power law tails of the real distribution well and give better results for a poorly connected topology of interactions.
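
    A toy Monte Carlo in the spirit of the competition-and-fragmentation mechanism described above (the rules and rates here are guesses for illustration, not the authors' model): units grow multiplicatively and occasionally split, and the resulting size distribution develops a heavy upper tail.

```python
# A toy competition/fragmentation simulation (illustrative rules only):
# multiplicative growth produces a lognormal-like body, and fragmentation
# feeds a heavy, roughly power-law upper tail.
import numpy as np

rng = np.random.default_rng(7)
sizes = list(np.full(100, 1000.0))
for _ in range(20000):
    i = rng.integers(len(sizes))
    if rng.random() < 0.95:                      # competition: random growth
        sizes[i] *= np.exp(rng.normal(0.0, 0.1))
    else:                                        # fragmentation: split in two
        f = rng.uniform(0.2, 0.8)
        sizes.append(sizes[i] * (1.0 - f))
        sizes[i] *= f

s = np.sort(sizes)[::-1]
# Crude tail estimate from the top decile of the rank-size plot.
k = max(len(s) // 10, 2)
slope, _ = np.polyfit(np.log(np.arange(1, k + 1)), np.log(s[:k]), 1)
print(f"units: {len(s)}, tail rank-size slope = {slope:.2f}")
```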

  13. Pareto-Optimal Model Selection via SPRINT-Race.

    Science.gov (United States)

    Zhang, Tiantian; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2018-02-01

    In machine learning, the notion of multi-objective model selection (MOMS) refers to the problem of identifying the set of Pareto-optimal models that trade off more than one predefined objective simultaneously. This paper introduces SPRINT-Race, the first multi-objective racing algorithm in a fixed-confidence setting, which is based on the sequential probability ratio test with an indifference zone. SPRINT-Race addresses the problem of MOMS with multiple stochastic optimization objectives in the proper Pareto-optimality sense. In SPRINT-Race, a pairwise dominance or non-dominance relationship is statistically inferred via a non-parametric, ternary-decision, dual-sequential probability ratio test. The overall probability of falsely eliminating any Pareto-optimal models or mistakenly returning any clearly dominated models is strictly controlled by a sequential Holm's step-down family-wise error rate control method. As a fixed-confidence model selection algorithm, the objective of SPRINT-Race is to minimize the computational effort required to achieve a prescribed confidence level about the quality of the returned models. The performance of SPRINT-Race is first examined via an artificially constructed MOMS problem with known ground truth. Subsequently, SPRINT-Race is applied to two real-world applications: 1) hybrid recommender system design and 2) multi-criteria stock selection. The experimental results verify that SPRINT-Race is an effective and efficient tool for such MOMS problems. The code of SPRINT-Race is available at https://github.com/watera427/SPRINT-Race.

  14. Moments of generalized Husimi distributions and complexity of many-body quantum states

    International Nuclear Information System (INIS)

    Sugita, Ayumu

    2003-01-01

    We consider generalized Husimi distributions for many-body systems, and show that their moments are good measures of complexity of many-body quantum states. Our construction of the Husimi distribution is based on the coherent state of the single-particle transformation group. Then the coherent states are independent-particle states, and, at the same time, the most localized states in the Husimi representation. Therefore delocalization of the Husimi distribution, which can be measured by the moments, is a sign of many-body correlation (entanglement). Since the delocalization of the Husimi distribution is also related to chaoticity of the dynamics, it suggests a relation between entanglement and chaos. Our definition of the Husimi distribution can be applied not only to systems of distinguishable particles, but also to those of identical particles, i.e., fermions and bosons. We derive an algebraic formula to evaluate the moments of the Husimi distribution

  15. Application of Generalized Student’s T-Distribution In Modeling The Distribution of Empirical Return Rates on Selected Stock Exchange Indexes

    Directory of Open Access Journals (Sweden)

    Purczyński Jan

    2014-07-01

    Full Text Available This paper examines the application of the so-called generalized Student's t-distribution in modeling the distribution of empirical return rates on selected Warsaw Stock Exchange indexes. The distribution parameters are estimated by means of the method of logarithmic moments, the maximum likelihood method and the method of moments. The generalized Student's t-distribution ensures a better fit to empirical data than the classical Student's t-distribution.
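
    A minimal sketch of the fitting step (synthetic returns; SciPy ships only the classical Student's t, used here as a stand-in for the generalized variant):

```python
# A minimal sketch (synthetic returns): maximum likelihood fit of the
# classical Student's t and a comparison with a normal fit, showing why
# a t-type model suits heavy-tailed return data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
returns = stats.t.rvs(df=4, scale=0.01, size=2500, random_state=rng)

df_hat, loc_hat, scale_hat = stats.t.fit(returns)
ll_t = stats.t.logpdf(returns, df_hat, loc_hat, scale_hat).mean()
ll_n = stats.norm.logpdf(returns, *stats.norm.fit(returns)).mean()
print(f"fitted degrees of freedom: {df_hat:.1f}")
print(f"mean log-likelihood: t {ll_t:.3f} vs normal {ll_n:.3f}")
```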

  16. Pareto front–based multi-objective real-time traffic signal control model for intersections using particle swarm optimization algorithm

    Directory of Open Access Journals (Sweden)

    Pengpeng Jiao

    2016-08-01

    Full Text Available Real-time traffic control is very important for urban transportation systems. Due to conflicts among different optimization objectives, existing multi-objective models are often converted into single-objective problems through the weighted sum method. To obtain real-time signal parameters and evaluation indices, this article puts forward a Pareto front–based multi-objective traffic signal control model using a particle swarm optimization algorithm. The article first formulates a control model for intersections based on detected real-time link volumes, with minimum delay time, minimum number of stops, and maximum effective capacity as three objectives. Moreover, this article designs a step-by-step particle swarm optimization algorithm based on the Pareto front for its solution. The Pareto dominance relation and density distance are employed for ranking, tournament selection is used to select and weed out particles, and the Pareto front for the signal timing plan is then obtained, including time-varying cycle length and split. Finally, based on actual survey data, scenario analyses determine the optimal parameters of the particle swarm algorithm, comparisons with the current situation and existing models demonstrate its excellent performance, and experiments incorporating outliers in the input data or total failure of detectors further prove its robustness. Generally, the proposed methodology is effective and robust enough for real-time traffic signal control.

  17. Statistical distribution for generalized ideal gas of fractional-statistics particles

    International Nuclear Information System (INIS)

    Wu, Y.

    1994-01-01

    We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed

  18. Generalized Huberman-Rudnick scaling law and robustness of q-Gaussian probability distributions

    Science.gov (United States)

    Afsar, Ozgur; Tirnakli, Ugur

    2013-01-01

    We generalize the Huberman-Rudnick universal scaling law for all periodic windows of the logistic map and show the robustness of q-Gaussian probability distributions in the vicinity of the chaos threshold. Our scaling relation is universal for the self-similar windows of the map which exhibit period-doubling subharmonic bifurcations. Using this generalized scaling argument, for all periodic windows, as the chaos threshold is approached, a developing convergence to a q-Gaussian is numerically obtained both in the central regions and in the tails of the probability distributions of sums of iterates.

  19. Self-organization property of Kohonen's map with general type of stimuli distribution.

    Science.gov (United States)

    Sadeghi, Ali A.

    1998-12-01

    Here the self-organization property of one-dimensional Kohonen's algorithm in its 2k-neighbor setting with a general type of stimuli distribution and non-increasing learning rate is considered. A new definition of the winner is given, which coincides with the usual definition in implementations of the algorithm. We prove that the probability of self-organization for all initial weights of neurons is uniformly positive. For the special case of a constant learning rate, it implies that the algorithm self-organizes with probability one. The conditions imposed on the neighborhood function, stimuli distribution and learning rate are quite general.
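
    A minimal sketch of the one-dimensional Kohonen update with a k-neighbour update rule and a plain nearest-weight winner (the 2k-neighbour winner definition analysed in the paper is more refined):

```python
# A minimal sketch of a one-dimensional Kohonen (SOM) chain: a plain
# nearest-weight winner, a k-neighbour update, and a non-increasing
# learning rate. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(9)
m, k = 20, 2                       # number of neurons, neighbourhood radius
w = rng.random(m)                  # initial weights

for t in range(1, 20001):
    x = rng.random()               # stimulus from a uniform distribution
    eta = 0.5 / (1.0 + 1e-3 * t)   # non-increasing learning rate
    win = np.argmin(np.abs(w - x)) # winner: closest weight
    lo, hi = max(win - k, 0), min(win + k, m - 1)
    w[lo:hi + 1] += eta * (x - w[lo:hi + 1])

# Self-organization: the weights typically end up monotone along the chain.
print("ordered:", bool(np.all(np.diff(w) > 0) or np.all(np.diff(w) < 0)))
```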

  20. Novel formulation of the ℳ model through the Generalized-K distribution for atmospheric optical channels.

    Science.gov (United States)

    Garrido-Balsells, José María; Jurado-Navas, Antonio; Paris, José Francisco; Castillo-Vazquez, Miguel; Puerta-Notario, Antonio

    2015-03-09

    In this paper, a novel and deeper physical interpretation of the recently published Málaga or ℳ statistical distribution is provided. This distribution, which has gained wide acceptance in the scientific community, models the optical irradiance scintillation induced by atmospheric turbulence. Here, the analytical expressions previously published are modified in order to express them through a mixture of the known Generalized-K and discrete Binomial and Negative Binomial distributions. In particular, the probability density function (pdf) of the ℳ model is now obtained as a linear combination of Generalized-K pdfs, in which the coefficients depend directly on the parameters of the ℳ distribution. In this way, the Málaga model can be physically interpreted as a superposition of different optical sub-channels, each of them described by the corresponding Generalized-K fading model and weighted by the ℳ-dependent coefficients. The expressions proposed here are simpler than the equations of the original ℳ model and are validated by means of numerical simulations, by generating ℳ-distributed random sequences and their associated histograms. This novel interpretation of the Málaga statistical distribution provides a valuable tool for analyzing the performance of atmospheric optical channels under every turbulence condition.

  1. Feynman quasi probability distribution for spin-(1/2), and its generalizations

    International Nuclear Information System (INIS)

    Colucci, M.

    1999-01-01

    Feynman's paper Negative probability is examined; in it, after a discussion of the possibility of attributing a real physical meaning to quasi probability distributions, he introduces a new kind of distribution for spin-(1/2), with a possible method of generalization to systems with an arbitrary number of states. The principal aim of this article is to shed light upon the method of construction of these distributions, taking into consideration their application to some experiments, and discussing their positive and negative aspects.

  2. Exclusive neutrino production of a charmed vector meson and transversity gluon generalized parton distributions

    Science.gov (United States)

    Pire, B.; Szymanowski, L.

    2017-12-01

    We calculate at the leading order in αs the QCD amplitude for exclusive neutrino production of a D* or Ds* charmed vector meson on a nucleon. We work in the framework of the collinear QCD approach where generalized parton distributions (GPDs) factorize from perturbatively calculable coefficient functions. We include O(mc) terms in the coefficient functions and the O(mD) term in the definition of heavy meson distribution amplitudes. We show that the analysis of the angular distribution of the decay D(s)* → D(s)π allows us to access the transversity gluon GPDs.

  3. A new model for describing remission times: the generalized beta-generated Lindley distribution

    Directory of Open Access Journals (Sweden)

    MARIA DO CARMO S. LIMA

    Full Text Available New generators are required to define wider distributions for modeling real data in survival analysis. To that end we introduce the four-parameter generalized beta-generated Lindley distribution. It has explicit expressions for the ordinary and incomplete moments, mean deviations, generating and quantile functions. We propose a maximum likelihood procedure to estimate the model parameters, which is assessed through a Monte Carlo simulation study. We also derive an additional estimation scheme based on least squares between percentiles. The usefulness of the proposed distribution for describing remission times of cancer patients is illustrated by means of an application to real data.

  4. Spatial redistribution of irregularly-spaced Pareto fronts for more intuitive navigation and solution selection

    NARCIS (Netherlands)

    A. Bouter (Anton); K. Pirpinia (Kleopatra); T. Alderliesten (Tanja); P.A.N. Bosman (Peter)

    2017-01-01

    A multi-objective optimization approach is often followed by an a posteriori decision-making process, during which the most appropriate solution of the Pareto set is selected by a professional in the field. Conventional visualization methods do not correct for Pareto fronts with

  5. Necessary and sufficient conditions for a Pareto optimal allocation in a discontinuous Gale economic model

    Directory of Open Access Journals (Sweden)

    Anna Michalak

    2014-01-01

    Full Text Available In this paper we examine the concept of Pareto optimality in a simplified Gale economic model without assuming continuity of the utility functions. We apply some existing results on higher-order optimality conditions to get necessary and sufficient conditions for a locally Pareto optimal allocation.

  6. Analysis of generalized negative binomial distributions attached to hyperbolic Landau levels

    Energy Technology Data Exchange (ETDEWEB)

    Chhaiba, Hassan, E-mail: chhaiba.hassan@gmail.com [Department of Mathematics, Faculty of Sciences, Ibn Tofail University, P.O. Box 133, Kénitra (Morocco); Demni, Nizar, E-mail: nizar.demni@univ-rennes1.fr [IRMAR, Université de Rennes 1, Campus de Beaulieu, 35042 Rennes Cedex (France); Mouayn, Zouhair, E-mail: mouayn@fstbm.ac.ma [Department of Mathematics, Faculty of Sciences and Technics (M’Ghila), Sultan Moulay Slimane, P.O. Box 523, Béni Mellal (Morocco)

    2016-07-15

    To each hyperbolic Landau level of the Poincaré disc is attached a generalized negative binomial distribution. In this paper, we compute the moment generating function of this distribution and supply its atomic decomposition as a perturbation of the negative binomial distribution by a finitely supported measure. Using the Mandel parameter, we also discuss the nonclassical nature of the associated coherent states. Next, we derive a Lévy-Khintchine-type representation of its characteristic function when the latter does not vanish and deduce that it is quasi-infinitely divisible except for the lowest hyperbolic Landau level corresponding to the negative binomial distribution. By considering the total variation of the obtained quasi-Lévy measure, we introduce a new infinitely divisible distribution for which we derive the characteristic function.

  7. Analysis of generalized negative binomial distributions attached to hyperbolic Landau levels

    International Nuclear Information System (INIS)

    Chhaiba, Hassan; Demni, Nizar; Mouayn, Zouhair

    2016-01-01

    To each hyperbolic Landau level of the Poincaré disc is attached a generalized negative binomial distribution. In this paper, we compute the moment generating function of this distribution and supply its atomic decomposition as a perturbation of the negative binomial distribution by a finitely supported measure. Using the Mandel parameter, we also discuss the nonclassical nature of the associated coherent states. Next, we derive a Lévy-Khintchine-type representation of its characteristic function when the latter does not vanish and deduce that it is quasi-infinitely divisible except for the lowest hyperbolic Landau level corresponding to the negative binomial distribution. By considering the total variation of the obtained quasi-Lévy measure, we introduce a new infinitely divisible distribution for which we derive the characteristic function.

  8. Evolutionary tradeoffs, Pareto optimality and the morphology of ammonite shells.

    Science.gov (United States)

    Tendler, Avichai; Mayo, Avraham; Alon, Uri

    2015-03-07

    Organisms that need to perform multiple tasks face a fundamental tradeoff: no design can be optimal at all tasks at once. Recent theory based on Pareto optimality showed that such tradeoffs lead to a highly defined range of phenotypes, which lie in low-dimensional polyhedra in the space of traits. The vertices of these polyhedra are called archetypes: the phenotypes that are optimal at a single task. To rigorously test this theory requires measurements of thousands of species over hundreds of millions of years of evolution. Ammonoid fossil shells provide an excellent model system for this purpose. Ammonoids have a well-defined geometry that can be parameterized using three dimensionless features of their logarithmic-spiral-shaped shells. Their evolutionary history includes repeated mass extinctions. We find that ammonoids fill out a pyramid in morphospace, suggesting five specific tasks, one for each vertex of the pyramid. After mass extinctions, surviving species evolve to refill essentially the same pyramid, suggesting that the tasks are unchanging. We infer putative tasks for each archetype, related to economy of shell material, rapid shell growth, hydrodynamics and compactness. These results support Pareto optimality theory as an approach to study evolutionary tradeoffs, and demonstrate how this approach can be used to infer the putative tasks that may shape the natural selection of phenotypes.

  9. Energy distributions of Bianchi type-VIh Universe in general relativity ...

    Indian Academy of Sciences (India)

    2017-03-16

    Mar 16, 2017 ... Energy distributions of Bianchi type-VIh Universe in general relativity and teleparallel gravity. Şeref Özkurt (Institute for Natural and Applied Sciences, Çanakkale Onsekiz Mart University, 17020 Çanakkale, Turkey) and Sezgin Aygün (Astrophysics Research Center, Çanakkale Onsekiz ...

  10. Environmental Assessment for Proposed General Purpose Warehouse Construction at Defense Distribution Officer Oklahoma City, Oklahoma (DDOO)

    Science.gov (United States)

    2008-05-01

    ENVIRONMENTAL ASSESSMENT for PROPOSED GENERAL PURPOSE WAREHOUSE CONSTRUCTION at DEFENSE DISTRIBUTION OFFICE OKLAHOMA CITY...facility. There are no residences that might house children in close proximity to the proposed action area, and no hazardous materials will be generated...or stored at the GPW facility. Consequently, implementation of the proposed action should not adversely impact children. (EA Section 4.7)

  11. Distribution theory and transform analysis an introduction to generalized functions, with applications

    CERN Document Server

    Zemanian, AH

    2010-01-01

    This well-known text provides a relatively elementary introduction to distribution theory and describes generalized Fourier and Laplace transformations and their applications to integrodifferential equations, difference equations, and passive systems. Suitable for a graduate course for engineering and science students or for an advanced undergraduate course for mathematics majors. 1965 edition.

  12. Effect of a generalized particle momentum distribution on plasma nuclear fusion rates

    International Nuclear Information System (INIS)

    Kim, Yeong E.; Zubarev, Alexander L.

    2006-01-01

    We investigate the effect of a generalized particle momentum distribution derived by Galitskii and Yakimets (GY) on nuclear reaction rates in plasma. We derive an approximate semi-analytical formula for the nuclear fusion reaction rate between nuclei in a plasma (quantum plasma nuclear fusion, or QPNF). The QPNF formula is applied to calculate the deuteron-deuteron fusion rate in a plasma, and the results are compared with those calculated with the conventional Maxwell-Boltzmann velocity distribution. As an application, we investigate the deuteron-deuteron fusion rate for mobile deuterons in a deuterated metal/alloy. The calculated deuteron-deuteron fusion rates at low energies are enormously enhanced due to the modified tail of the GY generalized momentum distribution. Our preliminary estimates also indicate that the deuteron-lithium (D+Li) and proton-lithium (p+Li) fusion rates in a metal/alloy at ambient temperatures are substantially enhanced. (author)

  13. Pion generalized parton distributions within a fully covariant constituent quark model

    Energy Technology Data Exchange (ETDEWEB)

    Fanelli, Cristiano [Massachusetts Institute of Technology, Cambridge, MA (United States). Lab. for Nuclear Science]; Pace, Emanuele ['Tor Vergata' Univ., Rome (Italy). Physics Dept.; INFN Sezione di Tor Vergata, Rome (Italy)]; Romanelli, Giovanni [Rutherford-Appleton Laboratory, Didcot (United Kingdom). STFC]; Salme, Giovanni [Istituto Nazionale di Fisica Nucleare, Rome (Italy)]; Salmistraro, Marco [Rome La Sapienza Univ. (Italy). Physics Dept.; I.I.S. G. De Sanctis, Rome (Italy)]

    2016-05-15

    We extend the investigation of the generalized parton distribution for a charged pion within a fully covariant constituent quark model, in two respects: (1) calculating the tensor distribution and (2) adding the treatment of the evolution, needed for achieving a meaningful comparison with both the experimental parton distribution and the lattice evaluation of the so-called generalized form factors. Distinct features of our phenomenological covariant quark model are: (1) a 4D Ansatz for the pion Bethe-Salpeter amplitude, to be used in the Mandelstam formula for matrix elements of the relevant current operators, and (2) only two parameters, namely a quark mass assumed to be m_q = 220 MeV and a free parameter fixed through the value of the pion decay constant. The possibility of increasing the dynamical content of our covariant constituent quark model is briefly discussed in the context of the Nakanishi integral representation of the Bethe-Salpeter amplitude. (orig.)

  14. Using the Pareto principle in genome-wide breeding value estimation.

    Science.gov (United States)

    Yu, Xijiang; Meuwissen, Theo H E

    2011-11-01

    Genome-wide breeding value (GWEBV) estimation methods can be classified based on the prior distribution assumptions of marker effects. Genome-wide BLUP methods assume a normal prior distribution for all markers with a constant variance, and are computationally fast. In Bayesian methods, more flexible prior distributions of SNP effects are applied that allow for very large SNP effects although most are small or even zero; these methods are often computationally demanding as they rely on Markov chain Monte Carlo sampling. In this study, we adopted the Pareto principle to weight available marker loci, i.e., we consider that x% of the loci explain (100 - x)% of the total genetic variance. Assuming this principle, it is also possible to define the variances of the prior distribution of the 'big' and 'small' SNP: the relatively few large SNP explain a large proportion of the genetic variance, while the majority of the SNP show small effects and explain a minor proportion of the genetic variance. We name this method MixP, as the prior distribution is a mixture of two normal distributions, i.e. one with a big variance and one with a small variance. Simulation results, using a real Norwegian Red cattle pedigree, show that MixP is at least as accurate as the other methods in all studied cases. The method also reduces the hyper-parameters of the prior distribution from two (proportion and variance of SNP with big effects) to one (proportion of SNP with big effects), assuming the overall genetic variance is known. The mixture-of-normals prior made it possible to solve the equations iteratively, which reduced the computational load by two orders of magnitude. In the era of marker densities reaching the millions and of whole-genome sequence data, MixP provides a computationally feasible Bayesian method of analysis.
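
    A minimal sketch of the Pareto-principle prior described above, in Python (not the authors' implementation; the 20/80 split and all numbers are illustrative assumptions):

```python
import numpy as np

def mixp_prior_variances(v_g, m, pi_big=0.2):
    """Two-component normal mixture variances under the Pareto principle:
    a fraction pi_big of the m loci explains (1 - pi_big) of the total
    genetic variance v_g, and vice versa."""
    var_big = (1.0 - pi_big) * v_g / (pi_big * m)
    var_small = pi_big * v_g / ((1.0 - pi_big) * m)
    return var_big, var_small

rng = np.random.default_rng(1)
m, v_g = 10_000, 1.0
var_big, var_small = mixp_prior_variances(v_g, m, pi_big=0.2)

# Sample marker effects: each locus is 'big' with probability 0.2
is_big = rng.random(m) < 0.2
effects = np.where(is_big,
                   rng.normal(0.0, np.sqrt(var_big), m),
                   rng.normal(0.0, np.sqrt(var_small), m))
# Mean per-locus effect variance should be close to v_g / m
print("empirical:", effects.var(), " expected:", v_g / m)
```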

  15. An R Package for a General Class of Inverse Gaussian Distributions

    Directory of Open Access Journals (Sweden)

    Victor Leiva

    2007-03-01

    Full Text Available The inverse Gaussian distribution is a positively skewed probability model that has received great attention in the last 20 years. Recently, a family that generalizes this model called inverse Gaussian type distributions has been developed. The new R package named ig has been designed to analyze data from inverse Gaussian type distributions. This package contains basic probabilistic functions, lifetime indicators and a random number generator from this model. Also, parameter estimates and diagnostics analysis can be obtained using likelihood methods by means of this package. In addition, goodness-of-fit methods are implemented in order to detect the suitability of the model to the data. The capabilities and features of the ig package are illustrated using simulated and real data sets. Furthermore, some new results related to the inverse Gaussian type distribution are also obtained. Moreover, a simulation study is conducted for evaluating the estimation method implemented in the ig package.
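
    The ig package itself is written for R; as a rough Python analogue of its fit-and-check workflow, scipy.stats provides the two-parameter inverse Gaussian model (a sketch only; the broader inverse Gaussian *type* family of the paper is not available in SciPy):

```python
import numpy as np
from scipy import stats

# Simulate inverse Gaussian data (shape parameter mu, unit scale)
rng = np.random.default_rng(7)
data = stats.invgauss.rvs(mu=0.5, size=2_000, random_state=rng)

# Maximum likelihood fit; pin the location at 0, as is usual for lifetimes
mu_hat, loc_hat, scale_hat = stats.invgauss.fit(data, floc=0)
print(f"fitted mu = {mu_hat:.3f}, scale = {scale_hat:.3f}")

# Goodness of fit: Kolmogorov-Smirnov test against the fitted model
# (p-value is optimistic because the parameters were estimated from the data)
ks = stats.kstest(data, "invgauss", args=(mu_hat, loc_hat, scale_hat))
print(f"KS statistic = {ks.statistic:.4f}, p-value = {ks.pvalue:.3f}")
```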

  16. Income- and energy-taxation for redistribution in general equilibrium

    International Nuclear Information System (INIS)

    FitzRoy, F.R.

    1993-01-01

    In a 3-factor General Equilibrium (GE) model with a continuum of ability, the employed choose optimal labour supply, and equilibrium unemployment is determined by benefits funded by wage and energy taxes. Aggregate labour and the net wage may increase or decrease with taxation (and unemployment), and conditions for a reduction in redistributive wage taxes to be Pareto-improving are derived. A small energy tax always raises the net wage, provided the wage tax is reduced to maintain constant employment and a balanced budget. High-ability households prefer higher energy taxes when externalities are uniformly distributed and non-distorting. (author)

  17. A Collaborative Neurodynamic Approach to Multiple-Objective Distributed Optimization.

    Science.gov (United States)

    Yang, Shaofu; Liu, Qingshan; Wang, Jun

    2018-04-01

    This paper is concerned with multiple-objective distributed optimization. Based on objective weighting and decision space decomposition, a collaborative neurodynamic approach to multiobjective distributed optimization is presented. In the approach, a system of collaborative neural networks is developed to search for Pareto optimal solutions, where each neural network is associated with one objective function and given constraints. Sufficient conditions are derived for ascertaining the convergence to a Pareto optimal solution of the collaborative neurodynamic system. In addition, it is proved that each connected subsystem can generate a Pareto optimal solution when the communication topology is disconnected. Then, a switching-topology-based method is proposed to compute multiple Pareto optimal solutions for a discretized approximation of the Pareto front. Finally, simulation results are discussed to substantiate the performance of the collaborative neurodynamic approach. A portfolio selection application is also given.

  18. Estimation of two-dimensional velocity distribution profile using General Index Entropy in open channels

    Science.gov (United States)

    Shojaeezadeh, Shahab Aldin; Amiri, Seyyed Mehrab

    2018-02-01

    Estimation of the velocity distribution profile is a challenging subject of open channel hydraulics. In this study, an entropy-based method is used to derive the two-dimensional velocity distribution profile. The General Index Entropy (GIE) can be considered the generalized form of Shannon entropy, suitable for combination with different forms of the Cumulative Distribution Function (CDF). Using the principle of maximum entropy (POME), the velocity distribution is derived by maximizing the GIE, treating the velocity as a random variable. The combination of GIE and the CDF proposed by Marini et al. (2011) was utilized to introduce an efficient entropy model whose results are comparable with several well-known experimental and field data sets. Consequently, although the model's parameters are less sensitive to flow conditions and the model is less complex to apply than other entropy-based methods, it estimates the velocity distribution profile more accurately, both near the boundaries and near the free surface of the flow.

  19. Fluid limit of the continuous-time random walk with general Levy jump distribution functions

    Energy Technology Data Exchange (ETDEWEB)

    Cartea, A. [Birbeck College, University of London; Del-Castillo-Negrete, Diego B [ORNL

    2007-01-01

    The continuous time random walk (CTRW) is a natural generalization of the Brownian random walk that allows the incorporation of waiting time distributions ψ(t) and general jump distribution functions η(x). There are two well-known fluid limits of this model in the uncoupled case. For exponentially decaying waiting times and Gaussian jump distribution functions the fluid limit leads to the diffusion equation. On the other hand, for algebraically decaying waiting times, ψ(t) ~ t^-(1+β), and algebraically decaying jump distributions, η(x) ~ x^-(1+α), corresponding to Lévy stable processes, the fluid limit leads to the fractional diffusion equation of order α in space and order β in time. However, these are two special cases of a wider class of models. Here we consider the CTRW for the most general Lévy stochastic processes in the Lévy-Khintchine representation for the jump distribution function and obtain an integrodifferential equation describing the dynamics in the fluid limit. The resulting equation contains as special cases the regular and the fractional diffusion equations. As an application we consider the case of CTRWs with exponentially truncated Lévy jump distribution functions. In this case the fluid limit leads to a transport equation with exponentially truncated fractional derivatives which describes the interplay between memory, long jumps, and truncation effects in the intermediate asymptotic regime. The dynamics exhibits a transition from superdiffusion to subdiffusion with the crossover time scaling as τ_c ~ λ^-α/β, where 1/λ is the truncation length scale. The asymptotic behavior of the propagator (Green's function) of the truncated fractional equation exhibits a transition from algebraic decay for t ≪ τ_c to stretched Gaussian decay for t ≫ τ_c.
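
    As an illustration of the uncoupled CTRW described above, the Python sketch below samples walkers with Pareto waiting times and symmetric α-stable jumps (illustrative only, not the authors' code; no exponential truncation is applied and all parameter values are arbitrary):

```python
import numpy as np
from scipy import stats

def ctrw_sample(n_steps, alpha=1.5, beta_t=0.8, n_walkers=1_000, seed=0):
    """Sample total elapsed times and end positions of an uncoupled CTRW
    with Pareto waiting times psi(t) ~ t^-(1+beta_t) and symmetric
    alpha-stable jumps eta(x) (Levy flights)."""
    rng = np.random.default_rng(seed)
    # Heavy-tailed waiting times: survival function ~ t^-beta_t
    waits = stats.pareto.rvs(beta_t, size=(n_walkers, n_steps),
                             random_state=rng)
    # Symmetric alpha-stable jumps (skewness parameter set to 0)
    jumps = stats.levy_stable.rvs(alpha, 0.0, size=(n_walkers, n_steps),
                                  random_state=rng)
    return waits.sum(axis=1), jumps.sum(axis=1)

t_final, x_final = ctrw_sample(n_steps=500)
print("median elapsed time:", np.median(t_final))
print("interquartile spread of x:",
      np.subtract(*np.percentile(x_final, [75, 25])))
```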

  20. Pareto optimization in computational protein design with multiple objectives.

    Science.gov (United States)

    Suárez, María; Tortosa, Pablo; Carrera, Javier; Jaramillo, Alfonso

    2008-12-01

    The optimization for function in computational design requires the treatment of multiple, often competing, objectives. Current algorithms reduce the problem to a single-objective optimization problem, with the consequent loss of relevant solutions. We present a procedure, based on a variant of a Pareto algorithm, to optimize various competing objectives in protein design that reduces the search of the solution space by several orders of magnitude. Our methodology maintains the diversity of solutions and provides an iterative way to incorporate automatic design methods in the design of functional proteins. We have applied our systematic procedure to design enzymes optimized for both catalysis and stability. However, this methodology can be applied to any computational chemistry application requiring multi-objective combinatorial optimization techniques. (c) 2008 Wiley Periodicals, Inc.

  1. Optimal PMU Placement with Uncertainty Using Pareto Method

    Directory of Open Access Journals (Sweden)

    A. Ketabi

    2012-01-01

    Full Text Available This paper proposes a method for optimal placement of Phasor Measurement Units (PMUs) in state estimation considering uncertainty. State estimation is first turned into an optimization exercise in which the objective function is the number of unobservable buses, determined based on Singular Value Decomposition (SVD). For the normal condition, the Differential Evolution (DE) algorithm is used to find the optimal placement of PMUs. Considering uncertainty, a multiobjective optimization exercise is then formulated. To solve it, a DE algorithm based on the Pareto optimum method is proposed here. The suggested strategy is applied to the IEEE 30-bus test system in several case studies to evaluate the optimal PMU placement.

  2. Pareto analysis of critical factors affecting technical institution evaluation

    Directory of Open Access Journals (Sweden)

    Victor Gambhir

    2012-08-01

    Full Text Available With the change of education policy in 1991, more and more technical institutions are being set up in India. Some of these institutions provide quality education, but others merely concentrate on quantity. Stakeholders are thus in a state of confusion about which institute to select for their higher educational studies. Although various agencies, including the print media, provide rankings of these institutions every year, their results are controversial and biased. In this paper, the authors have made an endeavor to find the critical factors for technical institution evaluation from a literature survey. A Pareto analysis has also been performed to find the intensity of these critical factors in evaluation. This will not only help stakeholders take the right decisions but will also help the management of institutions in benchmarking, by identifying the most important critical areas in which to improve the existing system. This will in turn help the Indian economy.
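
    A Pareto analysis of this kind is commonly visualized as a Pareto chart: frequency bars sorted in descending order with a cumulative-percentage line. A minimal sketch in Python (the factor names and counts below are invented placeholders, not the paper's data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical frequencies of critical factors, sorted descending
causes = ["Faculty quality", "Infrastructure", "Placement", "Fees", "Other"]
counts = np.array([48, 27, 14, 7, 4])
cum_pct = 100 * counts.cumsum() / counts.sum()

fig, ax1 = plt.subplots()
ax1.bar(causes, counts)
ax1.set_ylabel("Frequency")

ax2 = ax1.twinx()                      # cumulative-percentage axis
ax2.plot(causes, cum_pct, marker="o", color="tab:red")
ax2.set_ylabel("Cumulative %")
ax2.axhline(80, linestyle="--")        # the classic 80% reference line

plt.title("Pareto chart of critical factors (illustrative data)")
plt.tight_layout()
plt.show()
```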

  3. Pareto Efficient Solutions of Attack-Defence Trees

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Nielson, Flemming

    2015-01-01

    Attack-defence trees are a promising approach for representing threat scenarios and possible countermeasures in a concise and intuitive manner. An attack-defence tree describes the interaction between an attacker and a defender, and is evaluated by assigning parameters to the nodes, such as probability or cost of attacks and defences. In case of multiple parameters most analytical methods optimise one parameter at a time, e.g., minimise cost or maximise probability of an attack. Such methods may lead to sub-optimal solutions when optimising conflicting parameters, e.g., minimising cost while maximising probability. In order to tackle this challenge, we devise automated techniques that optimise all parameters at once. Moreover, in the case of conflicting parameters our techniques compute the set of all optimal solutions, defined in terms of Pareto efficiency. The developments are carried out...
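
    The core of Pareto efficiency is the dominance check. A minimal sketch in Python for two attack parameters, cost (minimised) and success probability (maximised); this is generic illustration only, the paper's automated techniques operate on the tree structure itself:

```python
from typing import List, Tuple

def pareto_front(solutions: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return the Pareto-efficient subset of (cost, probability) pairs,
    minimising cost while maximising success probability.

    A solution is dominated if another one is no worse in both objectives
    and strictly better in at least one. O(n^2), fine for small sets."""
    front = []
    for c, p in solutions:
        dominated = any(c2 <= c and p2 >= p and (c2 < c or p2 > p)
                        for c2, p2 in solutions)
        if not dominated:
            front.append((c, p))
    return sorted(set(front))

attacks = [(10, 0.9), (4, 0.6), (4, 0.7), (7, 0.7), (12, 0.95), (3, 0.2)]
print(pareto_front(attacks))  # [(3, 0.2), (4, 0.7), (10, 0.9), (12, 0.95)]
```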

  4. Pareto optimization of an industrial ecosystem: sustainability maximization

    Directory of Open Access Journals (Sweden)

    J. G. M.-S. Monteiro

    2010-09-01

    Full Text Available This work investigates a procedure to design an Industrial Ecosystem for sequestrating CO2 and consuming glycerol in a Chemical Complex with 15 integrated processes. The Complex is responsible for the production of methanol, ethylene oxide, ammonia, urea, dimethyl carbonate, ethylene glycol, glycerol carbonate, β-carotene, 1,2-propanediol and olefins, and is simulated using UNISIM Design (Honeywell). The process environmental impact (EI) is calculated using the Waste Reduction Algorithm, while Profit (P) is estimated using classic cost correlations. MATLAB (The MathWorks Inc.) is connected to UNISIM to enable optimization. The objective is to achieve maximum process sustainability, which involves finding a compromise between high profitability and low environmental impact. Sustainability maximization is therefore understood as a multi-criteria optimization problem, addressed by means of the Pareto optimization methodology for trading off P vs. EI.

  5. Dictatorship, liberalism and the Pareto rule: Possible and impossible

    Directory of Open Access Journals (Sweden)

    Boričić Branislav

    2009-01-01

    Full Text Available The current economic crisis has shaken belief in the capacity of neoliberal 'free market' policies. Numerous supporters of state intervention have emerged, and interest in social choice theory has revived. In this paper we consider three standard properties for aggregating individual preferences into social preferences: dictatorship, liberalism and the Pareto rule, together with their formal negations. The context of pure first-order classical logic makes it possible to show how some combinations of the above-mentioned conditions, under the hypothesis of unrestricted domain, form simple and reasonable examples of possible or impossible social choice systems. Due to their simplicity, these examples, including the famous 'liberal paradox', could have a particular didactic value.

  6. Self-Intersection Local Times of Generalized Mixed Fractional Brownian Motion as White Noise Distributions

    International Nuclear Information System (INIS)

    Suryawan, Herry P.; Gunarso, Boby

    2017-01-01

    The generalized mixed fractional Brownian motion is defined by taking linear combinations of a finite number of independent fractional Brownian motions with different Hurst parameters. It is a Gaussian process with stationary increments, possesses the self-similarity property, and, in general, is neither a Markov process nor a martingale. In this paper we study the generalized mixed fractional Brownian motion within the white noise analysis framework. As a main result, we prove that for any spatial dimension and for arbitrary Hurst parameter the self-intersection local times of the generalized mixed fractional Brownian motions, after a suitable renormalization, are well-defined as Hida white noise distributions. The chaos expansions of the self-intersection local times in terms of Wick powers of white noises are also presented. (paper)

  7. Self-Intersection Local Times of Generalized Mixed Fractional Brownian Motion as White Noise Distributions

    Science.gov (United States)

    Suryawan, Herry P.; Gunarso, Boby

    2017-06-01

    The generalized mixed fractional Brownian motion is defined by taking linear combinations of a finite number of independent fractional Brownian motions with different Hurst parameters. It is a Gaussian process with stationary increments, possesses the self-similarity property, and, in general, is neither a Markov process nor a martingale. In this paper we study the generalized mixed fractional Brownian motion within the white noise analysis framework. As a main result, we prove that for any spatial dimension and for arbitrary Hurst parameter the self-intersection local times of the generalized mixed fractional Brownian motions, after a suitable renormalization, are well-defined as Hida white noise distributions. The chaos expansions of the self-intersection local times in terms of Wick powers of white noises are also presented.

  8. Derivative-free generation and interpolation of convex Pareto optimal IMRT plans

    International Nuclear Information System (INIS)

    Hoffmann, Aswin L; Siem, Alex Y D; Hertog, Dick den; Kaanders, Johannes H A M; Huizenga, Henk

    2006-01-01

    In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning.

  9. The Evolutionary Algorithm to Find Robust Pareto-Optimal Solutions over Time

    Directory of Open Access Journals (Sweden)

    Meirong Chen

    2015-01-01

    Full Text Available In dynamic multiobjective optimization problems, the environmental parameters change over time, which makes the true Pareto fronts shift. So far, most research on dynamic multiobjective optimization methods has concentrated on detecting the changed environment and triggering population-based optimization methods so as to track the moving Pareto fronts over time. Yet, in many real-world applications, it is not necessary to find the optimal nondominated solutions in each dynamic environment. To address this weakness, a novel method called robust Pareto-optimal solutions over time is proposed. The idea is to replace the optimal Pareto front at each time-varying moment with a series of robust Pareto-optimal solutions, so that each robust solution can fit more than one time-varying moment. Two metrics, the average survival time and the average robust generational distance, are presented to measure the robustness of the robust Pareto solution set. Another contribution is an algorithm framework that searches for robust Pareto-optimal solutions over time based on the survival time. Experimental results indicate that this approach is a more practical and time-saving method of addressing dynamic multiobjective optimization problems that change over time.

  10. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

    Full Text Available Interesting properties and propositions in many branches of science, such as economics, have been obtained from the concavity properties of the cumulative distribution function of a random variable. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice-differentiable real-valued function g whose domain is an interval on the extended real line. A function g is said to be log-concave on the interval (a,b) if ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b), or to (ln(g))'' ≤ 0 on (a,b). These authors have obtained log-concavity for distributions such as normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, found the log-concavity for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. For the Burr family these cases have been calculated, even for each distribution that belongs to the Burr family. Also, log-concavity results for distributions such as generalized gamma distributions, Feller-Pareto distributions, generalized inverse Gaussian distributions and generalized log-normal distributions have been obtained.
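
    A quick numerical check of the (ln g)'' criterion, sketched in Python with finite differences on a grid (the distributions and parameters are illustrative; shape parameters matter, e.g. a gamma with shape < 1 is not log-concave):

```python
import numpy as np
from scipy import stats

def is_log_concave(pdf, grid):
    """Numerically test log-concavity of a density on a grid:
    g is log-concave iff (ln g)'' <= 0 on its support."""
    log_g = np.log(pdf(grid))
    second = np.gradient(np.gradient(log_g, grid), grid)
    return bool(np.all(second <= 1e-8))  # small tolerance for finite differences

x_pos = np.linspace(0.1, 10, 2_000)
x_par = np.linspace(1.001, 10, 2_000)   # Pareto support is x >= 1
print("gamma(k=2): ", is_log_concave(stats.gamma(2).pdf, x_pos))   # True
print("pareto(b=2):", is_log_concave(stats.pareto(2).pdf, x_par))  # False: log-convex
```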

  11. A Dirichlet process mixture of generalized Dirichlet distributions for proportional data modeling.

    Science.gov (United States)

    Bouguila, Nizar; Ziou, Djemel

    2010-01-01

    In this paper, we propose a clustering algorithm based on both Dirichlet processes and the generalized Dirichlet distribution, which has been shown to be very flexible for proportional data modeling. Our approach can be viewed as an extension of the finite generalized Dirichlet mixture model to the infinite case. The extension is based on nonparametric Bayesian analysis. This clustering algorithm does not require the number of mixture components to be specified in advance and estimates it in a principled manner. Our approach is Bayesian and relies on the estimation of the posterior distribution of clusterings using a Gibbs sampler. Through applications involving real-data classification and image database categorization using visual words, we show that clustering via infinite mixture models offers a more powerful and robust performance than classic finite mixtures.
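
    The Dirichlet process machinery underlying such infinite mixtures is often made concrete via the stick-breaking construction, sketched below in Python (illustrative only; the paper's base measure is a generalized Dirichlet distribution and its Gibbs-sampling inference is not shown):

```python
import numpy as np

def stick_breaking_weights(alpha, n_atoms, rng):
    """Truncated stick-breaking construction of Dirichlet process weights:
    v_k ~ Beta(1, alpha), w_k = v_k * prod_{j<k} (1 - v_j)."""
    v = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

rng = np.random.default_rng(3)
w = stick_breaking_weights(alpha=2.0, n_atoms=50, rng=rng)
print("effectively used components:", np.sum(w > 1e-3))
print("weight captured by the truncation:", w.sum())  # < 1, remainder truncated
```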

  12. Leading twist nuclear shadowing, nuclear generalized parton distributions and nuclear DVCS at small x

    Energy Technology Data Exchange (ETDEWEB)

    Guzey, Vadim; Goeke, Klaus; Siddikov, Marat

    2009-01-01

    We generalize the leading twist theory of nuclear shadowing and calculate quark and gluon generalized parton distributions (GPDs) of spinless nuclei. We predict very large nuclear shadowing for nuclear GPDs. In the limit of purely transverse momentum transfer, our nuclear GPDs become impact parameter dependent nuclear parton distributions (PDFs). Nuclear shadowing induces non-trivial correlations between the impact parameter $b$ and the light-cone fraction $x$. We make predictions for the deeply virtual Compton scattering (DVCS) amplitude and the DVCS cross section on $^{208}$Pb at high energies. We calculate the cross section of the Bethe-Heitler (BH) process and address the issue of the extraction of the DVCS signal from the $eA \to e\gamma A$ cross section. We find that the $eA \to e\gamma A$ differential cross section is dominated by DVCS at the momentum transfer $t$ near the minima of the nuclear form factor. We also find that nuclear shadowing leads

  13. Topology of event distributions as a generalized definition of phase transitions in finite systems

    International Nuclear Information System (INIS)

    Chomaz, Ph.; Duflot, V.; Gulminelli, F.; Duflot, V.

    2000-01-01

    We propose a definition of phase transitions in finite systems based on topology anomalies of the event distribution in the space of observations. This generalizes all the definitions based on the curvature anomalies of thermodynamical potentials and provides a natural definition of order parameters. It is directly operational from the experimental point of view. It allows to study phase transitions in Gibbs equilibria as well as in other ensembles such as the Tsallis ensemble. (author)

  14. Self-organization property of Kohonen's map with general type of stimuli distribution

    OpenAIRE

    Sadeghi, Ali A.

    1997-01-01

    Here the self-organization property of one-dimensional Kohonen's algorithm in its 2k-neighbour setting with a general type of stimuli distribution and non-increasing learning rate is considered. We prove that the probability of self-organization for all initial values of neurons is uniformly positive. For the special case of a constant learning rate, it implies that the algorithm self-organizes with probability one.

  15. Calculation of Stationary, Free Molecular Flux Distributions in General 3D Environments

    Science.gov (United States)

    Labello, Jesse

    2011-10-01

    This article presents an application of the angular coefficient method for diffuse reflection to calculate stationary molecular flux distributions in general three dimensional environments. The method of angular coefficients is reviewed and the integration of the method into Blender, a free, open-source, 3D modeling software package, is described. Some example calculations are compared to analytical and Direct Simulation Monte Carlo (DSMC) results with excellent agreement.

  16. A dielectric tensor for magnetoplasmas comprising components with generalized Lorentzian distributions

    International Nuclear Information System (INIS)

    Mace, R.L.

    1996-01-01

    We report on a new form for the dielectric tensor for a plasma containing superthermal particles. The individual particle components are modelled by 3-dimensional isotropic kappa, or generalized Lorentzian, distributions with arbitrary real-valued index κ. The new dielectric tensor is valid for arbitrary wavevectors. The dielectric tensor, which resembles Trubnikov's dielectric tensor for a relativistic plasma, is compared with the familiar Maxwellian form. When the dielectric tensor is used in the plasma dispersion relation for waves propagating parallel to the magnetic field it reproduces previously derived dispersion relations for various electromagnetic and electrostatic waves in plasmas modelled by Lorentzian particle distributions. Within the constraints of propagation parallel to the ambient magnetic field, we extend the above results to incorporate loss-cone Lorentzian particle distributions, which have important applications in laboratory mirror devices, as well as in space and astrophysical environments. (orig.)
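
    For reference, the isotropic kappa (generalized Lorentzian) distribution underlying such dielectric tensors can be evaluated directly. A sketch in Python using the standard Summers-Thorne normalization (an assumption here; the paper's own conventions may differ, and κ > 3/2 is required for the normalization to hold):

```python
import numpy as np
from scipy.special import gamma as G

def kappa_dist(v, theta, kappa, n=1.0):
    """Isotropic generalized Lorentzian (kappa) speed distribution;
    approaches a Maxwellian as kappa -> infinity."""
    norm = n * G(kappa + 1) / (np.pi**1.5 * theta**3 * kappa**1.5
                               * G(kappa - 0.5))
    return norm * (1.0 + v**2 / (kappa * theta**2)) ** (-(kappa + 1))

v = np.linspace(0.0, 10.0, 200)
for kap in (2, 6, 50):                 # smaller kappa => harder superthermal tail
    f = kappa_dist(v, theta=1.0, kappa=kap)
    print(f"kappa={kap:3d}: f(5*theta)/f(0) = {f[np.searchsorted(v, 5)]/f[0]:.2e}")
```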

  17. Chiral perturbation theory and the first moments of the generalized parton distributions in a nucleon

    Energy Technology Data Exchange (ETDEWEB)

    Dorati, Marina [Dipartimento di Fisica Nucleare e Teorica, Universita' degli Studi di Pavia and INFN, Pavia (Italy); Physik-Department, Theoretische Physik T39, TU Muenchen, D-85747 Garching (Germany); Gail, Tobias A. [Physik-Department, Theoretische Physik T39, TU Muenchen, D-85747 Garching (Germany)], E-mail: tgail@ph.tum.de; Hemmert, Thomas R. [Physik-Department, Theoretische Physik T39, TU Muenchen, D-85747 Garching (Germany)

    2008-01-15

    We discuss the first moments of the parity-even Generalized Parton Distributions (GPDs) in a nucleon, corresponding to six (generalized) vector form factors. We evaluate these fundamental properties of baryon structure at low energies, utilizing the methods of covariant Chiral Perturbation Theory in the baryon sector (BChPT). Our analysis is performed at leading-one-loop order in BChPT, predicting both the momentum and the quark-mass dependence for the three (generalized) isovector and (generalized) isoscalar form factors, which are currently under investigation in lattice QCD analyses of baryon structure. We also study the limit of vanishing four-momentum transfer where the GPD moments reduce to the well-known moments of Parton Distribution Functions (PDFs). For the isovector moment ⟨x⟩_{u-d} our BChPT calculation predicts a new mechanism for chiral curvature, connecting the high values for this moment typically found in lattice QCD studies for large quark masses with the smaller value known from phenomenology. Likewise, we analyze the quark-mass dependence of the isoscalar moments in the forward limit and extract the contribution of quarks to the total spin of the nucleon. We close with a first glance at the momentum dependence of the isoscalar C-form factor of the nucleon.

  18. Pareto-Optimization of HTS CICC for High-Current Applications in Self-Field

    Directory of Open Access Journals (Sweden)

    Giordano Tomassetti

    2018-01-01

    Full Text Available The ENEA superconductivity laboratory developed a novel design for Cable-in-Conduit Conductors (CICCs) comprised of stacks of 2nd-generation REBCO coated conductors. In its original version, the cable was made up of 150 HTS tapes distributed in five slots, twisted along an aluminum core. In this work, taking advantage of a 2D finite element model able to estimate the cable's current distribution in the cross-section, a multiobjective optimization procedure was implemented. The aim of the optimization was to simultaneously maximize both the engineering current density and the total current flowing inside the tapes when operating in self-field, by varying the cross-section layout. Since the optimization process involved both integer and real geometrical variables, together with multiple objectives, a nonstandard, fast-converging evolutionary search algorithm was the natural choice for approaching the problem numerically. By means of this algorithm, the Pareto frontiers for the different configurations were calculated, providing a powerful tool for the designer to achieve the desired preliminary operating conditions in terms of engineering current density and/or total current, depending on the specific application field, that is, power transmission cables and bus bar systems.

  19. Probability distribution of flood flows in Tunisia

    Science.gov (United States)

    Abida, H.; Ellouze, M.

    2008-05-01

    L (Linear) moments are used in identifying regional flood frequency distributions for different zones across Tunisia. 1134 site-years of annual maximum stream flow data from a total of 42 stations, with an average record length of 27 years, are considered. The country is divided into two homogeneous regions (northern and central/southern Tunisia) using a heterogeneity measure based on the spread of the sample L-moments among the sites in a given region. Then, selection of the corresponding distribution is achieved through goodness-of-fit comparisons in L-moment diagrams and verified using an L-moment-based regional test that compares observed to theoretical values of L-skewness and L-kurtosis for various candidate distributions. The distributions used, which represent five of the most frequently used distributions in the analysis of hydrologic extreme variables, are: (i) Generalized Extreme Value (GEV), (ii) Pearson Type III (P3), (iii) Generalized Logistic (GLO), (iv) Generalized Normal (GNO), and (v) Generalized Pareto (GPA) distributions. Spatial trends, with respect to the best-fit flood frequency distribution, are distinguished: northern Tunisia was shown to be represented by the GNO distribution, while the GNO and GEV distributions give the best fit in central/southern Tunisia.
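
    Sample L-moments of a station record can be computed directly from probability weighted moments (Hosking, 1990). A sketch in Python with synthetic annual maxima (the Gumbel parameters below are arbitrary; a real analysis would loop over station records and plot L-skewness against L-kurtosis):

```python
import numpy as np

def sample_lmoments(x):
    """First four sample L-moments via probability weighted moments
    (Hosking, 1990). Returns (l1, l2, t3, t4) with t3 = L-skewness
    and t4 = L-kurtosis."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3)
                / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2

rng = np.random.default_rng(0)
annual_maxima = rng.gumbel(loc=100.0, scale=25.0, size=60)  # synthetic floods
l1, l2, t3, t4 = sample_lmoments(annual_maxima)
print(f"L-CV={l2/l1:.3f}, L-skew={t3:.3f}, L-kurt={t4:.3f}")
# Gumbel population values for comparison: L-skew ~ 0.170, L-kurt ~ 0.150
```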

  20. Coordinated Pitch & Torque Control of Large-Scale Wind Turbine Based on Pareto Efficiency Analysis

    DEFF Research Database (Denmark)

    Lin, Zhongwei; Chen, Zhenyu; Wu, Qiuwei

    2018-01-01

    ...to optimize the controller coordination based on the Pareto optimization theory. Three solutions are obtained through the optimization, which include the optimal torque solution, the optimal power solution, and a satisfactory solution. Detailed comparisons evaluate the performance of the three selected solutions...

  1. Accident investigation of construction sites in Qom city using Pareto chart (2009-2012)

    Directory of Open Access Journals (Sweden)

    M. H. Beheshti

    2015-07-01

    Conclusions: Employing Pareto charts as a method for analyzing and identifying accident causes can play an effective role in the management of work-related accidents and in the proper allocation of funds and time.

  2. Selection of influential spreaders in complex networks using Pareto Shell decomposition

    Science.gov (United States)

    Yeruva, Sujatha; Devi, T.; Reddy, Y. Samtha

    2016-06-01

    The selection of prominent nodes so as to maximize spreading ability is crucial in complex networks. The well-known K-Shell method, which identifies nodes located at the core of a network, is better than degree centrality and betweenness centrality at capturing the spreading ability of a single origin spreader. For multiple origin spreaders, however, the K-Shell method fails to yield similar results when compared to degree centrality. The current research proposes a Pareto-Shell decomposition that employs the Pareto front function. Its Pareto optimal set comprises non-dominated spreaders with a high out-degree-to-in-degree ratio and a high in-degree. Pareto-Shell decomposition outperforms K-Shell and degree centrality for multiple origin spreaders, as shown by simulation of the epidemic spreading process.

  3. Global WASF-GA: An Evolutionary Algorithm in Multiobjective Optimization to Approximate the Whole Pareto Optimal Front.

    Science.gov (United States)

    Saborido, Rubén; Ruiz, Ana B; Luque, Mariano

    2017-01-01

    In this article, we propose a new evolutionary algorithm for multiobjective optimization called Global WASF-GA (global weighting achievement scalarizing function genetic algorithm), which falls within the aggregation-based evolutionary algorithms. The main purpose of Global WASF-GA is to approximate the whole Pareto optimal front. Its fitness function is defined by an achievement scalarizing function (ASF) based on the Tchebychev distance, in which two reference points are considered (both the utopian and the nadir objective vectors) and the weight vector used is taken from a set of weight vectors whose inverses are well-distributed. At each iteration, all individuals are classified into different fronts. Each front is formed by the solutions with the lowest values of the ASF for the different weight vectors in the set, using the utopian vector and the nadir vector as reference points simultaneously. Varying the weight vector in the ASF while considering the utopian and the nadir vectors at the same time enables the algorithm to obtain a final set of nondominated solutions that approximates the whole Pareto optimal front. We compared Global WASF-GA to MOEA/D (different versions) and NSGA-II on two-, three-, and five-objective problems. The computational results obtained permit us to conclude that Global WASF-GA achieves better performance, regarding the hypervolume metric and the epsilon indicator, than the other two algorithms in many cases, especially in three- and five-objective problems.
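
    The Tchebychev-type achievement scalarizing function at the heart of this algorithm is compact enough to sketch. In Python (a simplified illustration; Global WASF-GA's exact front-classification scheme, which uses both reference points at once, is not reproduced here):

```python
import numpy as np

def asf(f, ref, weights):
    """Standard achievement scalarizing function based on the Tchebychev
    distance to a reference point ref (all objectives minimized)."""
    return np.max(weights * (f - ref))

utopian = np.array([0.0, 0.0])   # best attainable objective values
nadir = np.array([1.0, 1.0])     # worst values over the Pareto front
w = np.array([0.5, 0.5])         # one vector from a well-spread weight set

# Two candidate solutions of a bi-objective minimization problem
for f in (np.array([0.2, 0.9]), np.array([0.5, 0.5])):
    print(f, "ASF to utopian:", asf(f, utopian, w),
             "ASF to nadir:", asf(f, nadir, w))
```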

  4. A divide and conquer approach to determine the Pareto frontier for optimization of protein engineering experiments

    Science.gov (United States)

    He, Lu; Friedman, Alan M.; Bailey-Kellogg, Chris

    2016-01-01

    In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability vs. novelty, affinity vs. specificity, activity vs. immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not “dominated”; i.e., no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), in order to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, PEPFR (Protein Engineering Pareto FRontier), that hierarchically subdivides the objective space, employing appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. PMID:22180081

  5. Computing the Pareto-Nash equilibrium set in finite multi-objective mixed-strategy games

    Directory of Open Access Journals (Sweden)

    Victoria Lozan

    2013-10-01

    Full Text Available The Pareto-Nash equilibrium set (PNES) is described as the intersection of graphs of efficient response mappings. The problem of PNES computing in finite multi-objective mixed-strategy games (Pareto-Nash games) is considered. A method for PNES computing is studied. Mathematics Subject Classification 2010: 91A05, 91A06, 91A10, 91A43, 91A44.

  6. Calculation of the dielectric tensor for a generalized Lorentzian (kappa) distribution function

    International Nuclear Information System (INIS)

    Summers, D.; Xue, S.; Thorne, R.M.

    1994-01-01

    Expressions are derived for the elements of the dielectric tensor for linear waves propagating at an arbitrary angle to a uniform magnetic field in a fully hot plasma whose constituent particle species σ are modeled by generalized Lorentzian distribution functions. The expressions involve readily computable single integrals whose integrands involve only elementary functions, Bessel functions, and modified plasma dispersion functions, the latter being available in the form of finite algebraic series. Analytical forms for the integrals are derived in the limits λ → 0 and λ → ∞, where λ = (k⊥ρ_Lσ)²/2, with k⊥ the component of the wave vector perpendicular to the ambient magnetic field and ρ_Lσ the Larmor radius for particle species σ. Consideration is given to the important limits of wave propagation parallel and perpendicular to the ambient magnetic field, and also to the cold plasma limit. Since most space plasmas are well modeled by generalized Lorentzian particle distribution functions, the results obtained in this paper provide a powerful tool for analyzing kinetic (micro-)instabilities in space plasmas in a very general context, limited only by the assumptions of linear plasma theory.

  7. Generalized parton distributions: confining potential effects within AdS/QCD

    Energy Technology Data Exchange (ETDEWEB)

    Traini, Marco [Universite Paris Saclay, CEA, Institut de Physique Theorique, Gif-sur-Yvette (France); Universita degli Studi di Trento, Dipartimento di Fisica, Trento (Italy); INFN-TIFPA, Trento (Italy)

    2017-04-15

    Generalized parton distributions are investigated within a holographic approach where the string modes in the fifth dimension describe the nucleon in a bottom-up or AdS/QCD framework. The aim is to bring the AdS/QCD results into the realm of phenomenology in order to extract consequences and predictions. Two main aspects are studied: (i) the role of the confining potential needed for breaking conformal invariance and introducing confinement (both the classic soft-wall and more recent infrared potentials are investigated); (ii) the extension of the predicted GPDs to the entire range of off-forward kinematics by means of double distributions. Higher Fock states are included, describing the nucleon as a superposition of three valence quarks and quark-antiquark pairs and gluons. (orig.)

  8. Evaluation of water vapor distribution in general circulation models using satellite observations

    Science.gov (United States)

    Soden, Brian J.; Bretherton, Francis P.

    1994-01-01

    This paper presents a comparison of the water vapor distribution obtained from two general circulation models, the European Centre for Medium-Range Weather Forecasts (ECMWF) model and the National Center for Atmospheric Research (NCAR) Community Climate Model (CCM), with satellite observations of total precipitable water (TPW) from Special Sensor Microwave/Imager (SSM/I) and upper tropospheric relative humidity (UTH) from GOES. Overall, both models are successful in capturing the primary features of the observed water vapor distribution and its seasonal variation. For the ECMWF model, however, a systematic moist bias in TPW is noted over well-known stratocumulus regions in the eastern subtropical oceans. Comparison with radiosonde profiles suggests that this problem is attributable to difficulties in modeling the shallowness of the boundary layer and large vertical water vapor gradients which characterize these regions. In comparison, the CCM is more successful in capturing the low values of TPW in the stratocumulus regions, although it tends to exhibit a dry bias over the eastern half of the subtropical oceans and a corresponding moist bias in the western half. The CCM also significantly overestimates the daily variability of the moisture fields in convective regions, suggesting a problem in simulating the temporal nature of moisture transport by deep convection. Comparison of the monthly mean UTH distribution indicates generally larger discrepancies than were noted for TPW owing to the greater influence of large-scale dynamical processes in determining the distribution of UTH. In particular, the ECMWF model exhibits a distinct dry bias along the Intertropical Convergence Zone (ITCZ) and a moist bias over the subtropical descending branches of the Hadley cell, suggesting an underprediction in the strength of the Hadley circulation. The CCM, on the other hand, demonstrates greater discrepancies in UTH than are observed for the ECMWF model, but none that are as

  9. Cross-channel analysis of quark and gluon generalized parton distributions with helicity flip

    International Nuclear Information System (INIS)

    Pire, B.; Semenov-Tian-Shansky, K.; Szymanowski, L.; Wallon, S.

    2014-01-01

    Quark and gluon helicity flip generalized parton distributions (GPDs) address the transversity quark and gluon structure of the nucleon. In order to construct a theoretically consistent parametrization of these hadronic matrix elements, we work out the set of combinations of those GPDs suitable for the SO(3) partial wave (PW) expansion in the cross-channel. This universal result will help to build up a flexible parametrization of these important hadronic non-perturbative quantities, using, for instance, the approaches based on the conformal PW expansion of GPDs such as the Mellin-Barnes integral or the dual parametrization techniques. (orig.)

  10. Cross-channel analysis of quark and gluon generalized parton distributions with helicity flip

    Energy Technology Data Exchange (ETDEWEB)

    Pire, B. [CNRS, CPhT, Ecole Polytechnique, Palaiseau (France); Semenov-Tian-Shansky, K. [Universite de Liege, IFPA, Departement AGO, Liege (Belgium); Szymanowski, L. [National Centre for Nuclear Research (NCBJ), Warsaw (Poland); Wallon, S. [Universite de Paris-Sud, CNRS, LPT, Orsay (France); Universite Paris 06, Faculte de Physique, UPMC, Paris (France)

    2014-05-15

    Quark and gluon helicity flip generalized parton distributions (GPDs) address the transversity quark and gluon structure of the nucleon. In order to construct a theoretically consistent parametrization of these hadronic matrix elements, we work out the set of combinations of those GPDs suitable for the SO(3) partial wave (PW) expansion in the cross-channel. This universal result will help to build up a flexible parametrization of these important hadronic non-perturbative quantities, using, for instance, the approaches based on the conformal PW expansion of GPDs such as the Mellin-Barnes integral or the dual parametrization techniques. (orig.)

  11. A General Combinatorial Ant System-based Distributed Routing Algorithm for Communication Networks

    Directory of Open Access Journals (Sweden)

    Jose Aguilar

    2007-08-01

    Full Text Available In this paper, a general Combinatorial Ant System-based distributed routing algorithm, modeled as a dynamic combinatorial optimization problem, is presented. In the proposed algorithm, the solution space of the dynamic combinatorial optimization problem is mapped into the space where the ants walk, and the transition probability and the pheromone update formula of the Ant System are defined according to the objective function of the communication problem. The general nature of the approach allows the optimized routing function to be applied in different types of networks by simply changing the performance criteria to be optimized. In fact, we test and compare the performance of our routing algorithm against well-known routing schemes for wired and wireless networks, and show its superior performance in terms of throughput, delay and energy efficiency.
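
    The two Ant System ingredients named above have standard textbook forms, sketched here in Python (an illustration of the generic formulas, not this paper's network-specific objective; all parameter values are arbitrary):

```python
import numpy as np

def transition_probs(tau, eta, alpha=1.0, beta=2.0, visited=()):
    """Ant System transition probabilities from the current node:
    p_j proportional to tau_j^alpha * eta_j^beta over unvisited neighbours."""
    p = (tau ** alpha) * (eta ** beta)
    p[list(visited)] = 0.0            # never revisit a node on this tour
    return p / p.sum()

def update_pheromone(tau, deposits, rho=0.5):
    """Global pheromone update: evaporation plus new deposits,
    tau <- (1 - rho) * tau + sum of ant deposits."""
    return (1.0 - rho) * tau + deposits

tau = np.ones(4)                                  # pheromone on 4 candidate links
eta = 1.0 / np.array([2.0, 5.0, 1.0, 4.0])        # heuristic: inverse link cost
print(transition_probs(tau, eta, visited=(0,)))   # node 0 excluded
```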

  12. Pareto-Optimal Multi-objective Inversion of Geophysical Data

    Science.gov (United States)

    Schnaidt, Sebastian; Conway, Dennis; Krieger, Lars; Heinson, Graham

    2018-01-01

    In the process of modelling geophysical properties, jointly inverting different data sets can greatly improve model results, provided that the data sets are compatible, i.e., sensitive to similar features. Such a joint inversion requires a relationship between the different data sets, which can either be analytic or structural. Classically, the joint problem is expressed as a scalar objective function that combines the misfit functions of multiple data sets and a joint term which accounts for the assumed connection between the data sets. This approach suffers from two major disadvantages: first, it can be difficult to assess the compatibility of the data sets, and second, the aggregation of misfit terms introduces a weighting of the data sets. We present a Pareto-optimal multi-objective joint inversion approach based on an existing genetic algorithm. The algorithm treats each data set as a separate objective, avoiding forced weighting and generating curves of the trade-off between the different objectives. These curves are analysed by their shape and evolution to evaluate data set compatibility. Furthermore, the statistical analysis of the generated solution population provides valuable estimates of model uncertainty.

  13. Using Pareto points for model identification in predictive toxicology

    Science.gov (United States)

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of the toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration, but the management and automated identification of relevant models from available collections of models remain an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649

  14. Influence of Pareto optimality on the maximum entropy methods

    Science.gov (United States)

    Peddavarapu, Sreehari; Sunil, Gujjalapudi Venkata Sai; Raghuraman, S.

    2017-07-01

    Galerkin meshfree schemes are emerging as a viable substitute for the finite element method in solving partial differential equations for large deformation as well as crack propagation problems. The introduction of the Shannon-Jaynes entropy principle into scattered data approximation has, however, departed from the usual way of defining approximation functions, resulting in maximum entropy approximants. In addition, an objective functional that controls the degree of locality results in local maximum entropy approximants. These are based on an information-theoretic Pareto optimality between entropy and degree of locality that defines the basis functions on the scattered nodes. The degree of locality in turn relies on the choice of the locality parameter and the prior (weight) function, and the proper choice of both plays a vital role in attaining the desired accuracy. The present work focuses on the choice of the locality parameter, which defines the degree of locality, and of the priors - Gaussian, cubic spline and quartic spline functions - and on their effect on the behavior of local maximum entropy approximants.

  15. A Pareto-optimal refinement method for protein design scaffolds.

    Science.gov (United States)

    Nivón, Lucas Gregorio; Moretti, Rocco; Baker, David

    2013-01-01

    Computational design of protein function involves a search for amino acids with the lowest energy subject to a set of constraints specifying function. In many cases a set of natural protein backbone structures, or "scaffolds", are searched to find regions where functional sites (an enzyme active site, ligand binding pocket, protein-protein interaction region, etc.) can be placed, and the identities of the surrounding amino acids are optimized to satisfy functional constraints. Input native protein structures almost invariably have regions that score very poorly with the design force field, and any design based on these unmodified structures may result in mutations away from the native sequence solely as a result of the energetic strain. Because the input structure is already a stable protein, it is desirable to keep the total number of mutations to a minimum and to avoid mutations resulting from poorly-scoring input structures. Here we describe a protocol using cycles of minimization with combined backbone/sidechain restraints that is Pareto-optimal with respect to RMSD to the native structure and energetic strain reduction. The protocol should be broadly useful in the preparation of scaffold libraries for functional site design.

  16. A Pareto-optimal refinement method for protein design scaffolds.

    Directory of Open Access Journals (Sweden)

    Lucas Gregorio Nivón

    Full Text Available Computational design of protein function involves a search for amino acids with the lowest energy subject to a set of constraints specifying function. In many cases a set of natural protein backbone structures, or "scaffolds", are searched to find regions where functional sites (an enzyme active site, ligand binding pocket, protein-protein interaction region, etc.) can be placed, and the identities of the surrounding amino acids are optimized to satisfy functional constraints. Input native protein structures almost invariably have regions that score very poorly with the design force field, and any design based on these unmodified structures may result in mutations away from the native sequence solely as a result of the energetic strain. Because the input structure is already a stable protein, it is desirable to keep the total number of mutations to a minimum and to avoid mutations resulting from poorly-scoring input structures. Here we describe a protocol using cycles of minimization with combined backbone/sidechain restraints that is Pareto-optimal with respect to RMSD to the native structure and energetic strain reduction. The protocol should be broadly useful in the preparation of scaffold libraries for functional site design.

  17. Generalized Extreme Value Distribution Models for the Assessment of Seasonal Wind Energy Potential of Debuncha, Cameroon

    Directory of Open Access Journals (Sweden)

    Nkongho Ayuketang Arreyndip

    2016-01-01

    Full Text Available The generalized extreme value family of distributions (Weibull, Gumbel, and Frechet) is employed for the first time to assess the wind energy potential of Debuncha, South-West Cameroon, and to study the variation of energy over the seasons at this site. The 29-year (1983-2013) average daily wind speed data over Debuncha (with missing values in the years 1992 and 1994) were obtained from NASA satellite data through the RETScreen software tool provided by CANMET Canada. The data are partitioned into min-monthly, mean-monthly, and max-monthly series and fitted by the maximum likelihood method to the two-parameter Weibull, Gumbel, and Frechet distributions in order to determine the best fit for assessing the wind energy potential of the site; the respective shape and scale parameters are estimated. Based on the P values of the Kolmogorov-Smirnov (K-S) statistic and on standard error (s.e.) analysis, the results show that the Frechet distribution fits the min-monthly, mean-monthly, and max-monthly data better than the Weibull and Gumbel distributions. Wind speed distributions and wind power densities of the wet and dry seasons are compared; the wind power density of the wet season is higher than that of the dry season. The wind speeds at this site are quite low, with maximum wind speeds between 3.1 and 4.2 m/s, below the cut-in wind speed of many modern turbines (6-10 m/s). We therefore recommend the installation of low cut-in wind turbines such as the Savonius or Aircon (10 kW) for stand-alone low-energy needs.
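
    As a hedged illustration of the fitting-and-selection step described above, the sketch below fits two-parameter Weibull, Gumbel, and Frechet distributions by maximum likelihood and ranks them with the Kolmogorov-Smirnov test; the wind-speed sample is synthetic stand-in data, not the Debuncha record, and SciPy's invweibull is used for the Frechet law.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Stand-in "monthly wind speed" sample drawn from a Frechet law.
wind = stats.invweibull(c=5.0, scale=3.5).rvs(size=360, random_state=rng)

candidates = {
    "Weibull": (stats.weibull_min, dict(floc=0)),  # loc fixed -> two parameters
    "Gumbel":  (stats.gumbel_r, {}),               # location-scale family
    "Frechet": (stats.invweibull, dict(floc=0)),   # inverse Weibull in SciPy
}
for name, (dist, fixed) in candidates.items():
    params = dist.fit(wind, **fixed)               # maximum likelihood estimates
    ks = stats.kstest(wind, dist.cdf, args=params) # goodness-of-fit test
    print(f"{name:8s} params={tuple(round(p, 3) for p in params)} "
          f"K-S p-value={ks.pvalue:.3f}")
```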

  18. Fitness function distributions over generalized search neighborhoods in the q-ary hypercube.

    Science.gov (United States)

    Sutton, Andrew M; Chicano, Francisco; Whitley, L Darrell

    2013-01-01

    The frequency distribution of a fitness function over regions of its domain is an important quantity for understanding the behavior of algorithms that employ randomized sampling to search the function. In general, exactly characterizing this distribution is at least as hard as the search problem, since the solutions typically live in the tails of the distribution. However, in some cases it is possible to efficiently retrieve a collection of quantities (called moments) that describe the distribution. In this paper, we consider functions of bounded epistasis that are defined over length-n strings from a finite alphabet of cardinality q. Many problems in combinatorial optimization can be specified as search problems over functions of this type. Employing Fourier analysis of functions over finite groups, we derive an efficient method for computing the exact moments of the frequency distribution of fitness functions over Hamming regions of the q-ary hypercube. We then use this approach to derive equations that describe the expected fitness of the offspring of any point undergoing uniform mutation. The results we present provide insight into the statistical structure of the fitness function for a number of combinatorial problems. For the graph coloring problem, we apply our results to efficiently compute the average number of constraint violations that lie within a certain number of steps of any coloring. We derive an expression for the mutation rate that maximizes the expected fitness of an offspring at each fitness level. We also apply the results to the slightly more complex frequency assignment problem, a relevant application in the domain of the telecommunications industry. As with the graph coloring problem, we provide formulas for the average value of the fitness function in Hamming regions around a solution and the expectation-optimal mutation rate.

  19. Distributing Correlation Coefficients of Linear Structure-Activity/Property Models

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACA

    2011-12-01

    Full Text Available Quantitative structure-activity/property relationships are mathematical relationships linking chemical structure and activity/property in a quantitative manner. These in silico approaches are frequently used to reduce animal testing and risk assessment, as well as to increase time- and cost-effectiveness in the characterization and identification of active compounds. The aim of our study was to investigate the distribution pattern of the correlation coefficients associated with simple linear relationships linking compound structure to activity. A limited data set of the most common ordnance compounds found at naval facilities, with a range of toxicities on the aquatic ecosystem and a set of seven properties, was studied. Statistically significant models were selected and investigated. The probability density function of the correlation coefficients was investigated using a series of candidate continuous distribution laws. Almost 48% of the correlation coefficients proved to fit the Beta distribution, 40% fit the Generalized Pareto distribution, and 12% fit the Pert distribution.

  20. Deeply virtual Compton scattering with the CLAS detector for the study of generalized parton distributions

    International Nuclear Information System (INIS)

    Girod, F.X.

    2006-12-01

    The structure of the nucleon, among the first fundamental problems in hadronic physics, is the subject of renewed interest. The lightest baryonic state has historically been described in two complementary approaches: through elastic scattering, measuring form factors which reflect the spatial shape of charge distributions, and through deep inelastic scattering, providing access to parton distribution functions which encode the momentum content carried by the constituents. The recently developed formalism of Generalized Parton Distributions unifies these approaches and provides access to new information. The cleanest process sensitive to GPDs is deeply virtual Compton scattering (DVCS), contributing to the ep → epγ reaction. This work deals with a dedicated experiment performed with the CLAS detector, complemented with two specific pieces of equipment: a lead tungstate calorimeter covering photon detection at small angles, and a superconducting solenoid actively shielding the electromagnetic background. The entire project is covered: from the upgrade of the experimental setup, through the update of the software, data taking and analysis, up to a first comparison of the beam spin asymmetry with model predictions. (author)

  1. Statistical analysis of latent generalized correlation matrix estimation in transelliptical distribution.

    Science.gov (United States)

    Han, Fang; Liu, Han

    2017-02-01

    The correlation matrix plays a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions with possible outliers. As a robust alternative, Han and Liu (2013b) advocated the use of a transformed version of the Kendall's tau sample correlation matrix in estimating the high dimensional latent generalized correlation matrix under the transelliptical distribution family (or elliptical copula). The transelliptical family assumes that after unspecified marginal monotone transformations, the data follow an elliptical distribution. In this paper, we study the theoretical properties of the Kendall's tau sample correlation matrix and its transformed version proposed in Han and Liu (2013b) for estimating the population Kendall's tau correlation matrix and the latent Pearson's correlation matrix under both spectral and restricted spectral norms. With regard to the spectral norm, we highlight the role of "effective rank" in quantifying the rate of convergence. With regard to the restricted spectral norm, we for the first time present a "sign subgaussian condition" which is sufficient to guarantee that the rank-based correlation matrix estimator attains the optimal rate of convergence. In both cases, we do not need any moment condition.
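
    The rank-based estimator discussed above is straightforward to compute. The sketch below, a simplified stand-in for the Han and Liu construction, builds the pairwise Kendall's tau matrix and applies the sine transform sin(πτ/2), which consistently recovers the latent Pearson correlation under an elliptical copula even after unknown monotone marginal transformations; the data and the exp() marginal are illustrative.

```python
import numpy as np
from scipy.stats import kendalltau

def latent_correlation(X):
    """X: (n, d) data matrix; transformed Kendall's tau correlation estimate."""
    n, d = X.shape
    R = np.eye(d)
    for j in range(d):
        for k in range(j + 1, d):
            tau, _ = kendalltau(X[:, j], X[:, k])
            R[j, k] = R[k, j] = np.sin(np.pi * tau / 2.0)  # sine transform
    return R

rng = np.random.default_rng(2)
latent = np.array([[1.0, 0.6],
                   [0.6, 1.0]])
Z = rng.multivariate_normal(np.zeros(2), latent, size=2000)
X = np.exp(Z)                            # unknown monotone marginal transformation
print(latent_correlation(X).round(3))    # recovers ~0.6 despite the transform
```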

  2. Fluctuations in the heliospheric hydrogen distribution induced by generalized and time-dependent interstellar boundary conditions

    International Nuclear Information System (INIS)

    Scherer, K.; Fahr, H.J.

    1990-01-01

    It is well known that the neutral component of the local interstellar medium (LISM) can effectively pass through the plasma interface ahead of the solar system and can penetrate deeply into the inner heliosphere. Here we present a newly-developed theoretical approach to describe the distribution function of LISM neutral hydrogen in the heliosphere, also taking into account time-dependent solar and interstellar boundary conditions. For this purpose we start from a Boltzmann-Vlasov equation, Fourier-transformed with respect to space and time coordinates, in connection with correspondingly transformed solar radiation forces and ionization rates, and then arrive at semi-analytic solutions for the transformed hydrogen velocity distribution function. As interstellar boundary conditions we allow for very general, non-Maxwellian and time-dependent distribution functions to account for the case that some LISM turbulence patterns or non-linear wave-like shock structures pass over the solar system. We consider this theoretical approach to be an ideal instrument for the synoptic interpretation of huge data samples on interplanetary Ly-α resonance glow intensities registered from different celestial directions over extended periods of time. In addition we feel that the theoretical approach presented here, when applied to interplanetary resonance glow data, may permit the detection of genuine fluctuations in the local interstellar medium. (author)

  3. Entropy maximization under the constraints on the generalized Gini index and its application in modeling income distributions

    Science.gov (United States)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2015-11-01

    In economics and the social sciences, inequality measures such as the Gini index, the Pietra index, etc., are commonly used to measure statistical dispersion. There is a generalization of the Gini index which includes it as a special case. In this paper, we use the principle of maximum entropy to approximate the model of income distribution with a given mean and generalized Gini index. Many distributions have been used as descriptive models for the distribution of income; the most widely known of these models are the generalized beta of the second kind and its subclass distributions. The obtained maximum entropy distributions are fitted to the US family total money income in 2009, 2011 and 2013, and their relative performances with respect to the generalized beta of the second kind family are compared.

  4. The Aggregation of Individual Distributive Preferences through the Distributive Liberal Social Contract: Normative Analysis.

    OpenAIRE

    Jean Mercier-Ythier

    2010-01-01

    We consider abstract social systems of private property, made of n individuals endowed with non-paternalistic interdependent preferences, who interact through exchanges on competitive markets and Pareto-efficient lumpsum transfers. The transfers follow from a distributive liberal social contract defined as a redistribution of initial endowments such that the resulting market equilibrium allocation is both Pareto-efficient relative to individual interdependent preferences, and unanimously weak...

  5. An all-timescales rainfall probability distribution

    Science.gov (United States)

    Papalexiou, S. M.; Koutsoyiannis, D.

    2009-04-01

    The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance, as the hydraulic design typically depends strongly on the choice of rainfall model. It is well known that the rainfall distribution may have a long tail and is highly skewed at fine timescales, and that it tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable of adequately describing rainfall at all timescales, is a difficult task. A search of the hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as the Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, and three- and four-parameter Kappa distributions, among many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well-known distributions and is capable of describing rainfall over a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution on various rainfall samples from different areas and for timescales varying from sub-hourly to annual.

  6. Efficient behavior of photosynthetic organelles via Pareto optimality, identifiability, and sensitivity analysis.

    Science.gov (United States)

    Carapezza, Giovanni; Umeton, Renato; Costanza, Jole; Angione, Claudio; Stracquadanio, Giovanni; Papini, Alessio; Lió, Pietro; Nicosia, Giuseppe

    2013-05-17

    In this work, we develop methodologies for analyzing and cross-comparing metabolic models. We investigate three important metabolic networks to discuss the complexity of the biological organization of organisms, modeling, and system properties. In particular, we analyze these metabolic networks because of their biotechnological and basic-science importance: the photosynthetic carbon metabolism in a general leaf, the Rhodobacter spheroides bacterium, and the Chlamydomonas reinhardtii alga. We adopt single- and multi-objective optimization algorithms to maximize the CO2 uptake rate and the production of metabolites of industrial interest or for ecological purposes. We focus both on the level of genes (e.g., finding genetic manipulations to increase the production of one or more metabolites) and on finding enzyme concentrations that improve CO2 consumption. We find that R. spheroides is able to absorb up to 57.452 mmol h⁻¹ gDW⁻¹ of CO2, while C. reinhardtii reaches a maximum of 6.7331 mmol h⁻¹ gDW⁻¹. We report that Pareto front analysis proves extremely useful for comparing different organisms, as well as providing the possibility of investigating them within the same framework. By using sensitivity and robustness analysis, our framework identifies the most sensitive and fragile components of the biological systems we consider, allowing us to compare their models. We adopt identifiability analysis to detect functional relations among enzymes; we observe that RuBisCO, GAPDH, and FBPase belong to the same functional group, as suggested also by the sensitivity analysis.

  7. Parameter Estimation for Coupled Hydromechanical Simulation of Dynamic Compaction Based on Pareto Multiobjective Optimization

    Directory of Open Access Journals (Sweden)

    Wei Wang

    2015-01-01

    Full Text Available This paper presents a parameter estimation method based on a coupled hydromechanical model of dynamic compaction and the Pareto multiobjective optimization technique. The hydromechanical model of dynamic compaction is established in the FEM program LS-DYNA. The multiobjective optimization algorithm, the Nondominated Sorting Genetic Algorithm (NSGA-II), is integrated with the numerical model to identify soil parameters using multiple sources of field data. A field case study is used to demonstrate the capability of the proposed method. The observed pore water pressure and crater depth at the early blows of dynamic compaction are used simultaneously to estimate the soil parameters. The robustness of the back-estimated parameters is further illustrated by a forward prediction. Results show that the back-analyzed soil parameters can reasonably predict lateral displacements and give generally acceptable predictions of dynamic compaction for an adjacent location. In addition, for the prediction of the ground response at continuous blows, the prediction based on the second blow is more accurate than that based on the first blow, due to the hardening and strengthening of the soil during continuous compaction.

  8. Diversity shrinkage: Cross-validating pareto-optimal weights to enhance diversity via hiring practices.

    Science.gov (United States)

    Song, Q Chelsea; Wee, Serena; Newman, Daniel A

    2017-12-01

    To reduce adverse impact potential and improve diversity outcomes from personnel selection, one promising technique is De Corte, Lievens, and Sackett's (2007) Pareto-optimal weighting strategy. De Corte et al.'s strategy has been demonstrated on (a) a composite of cognitive and noncognitive (e.g., personality) tests (De Corte, Lievens, & Sackett, 2008) and (b) a composite of specific cognitive ability subtests (Wee, Newman, & Joseph, 2014). Both studies illustrated how Pareto weighting (in contrast to unit weighting) could lead to substantial improvement in diversity outcomes (i.e., diversity improvement), sometimes more than doubling the number of job offers for minority applicants. The current work addresses a key limitation of the technique: the possibility of shrinkage, especially diversity shrinkage, in the Pareto-optimal solutions. Using Monte Carlo simulations, sample size and predictor combinations were varied and cross-validated Pareto-optimal solutions were obtained. Although diversity shrinkage was sizable for a composite of cognitive and noncognitive predictors when sample size was at or below 500, diversity shrinkage was typically negligible for a composite of specific cognitive subtest predictors when sample size was at least 100. Diversity shrinkage was larger when the Pareto-optimal solution suggested substantial diversity improvement. When sample size was at least 100, cross-validated Pareto-optimal weights typically outperformed unit weights, suggesting that diversity improvement is often possible despite diversity shrinkage. Implications for Pareto-optimal weighting, adverse impact, sample size of validation studies, and optimizing the diversity-job performance tradeoff are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
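
    A toy version of the Pareto-optimal weighting idea can be sketched directly: sweep the composite weights, score each weighting on criterion validity and on an adverse-impact ratio, and keep the non-dominated weightings. All data, subgroup effects, and the 30% selection rate below are illustrative assumptions, not values from De Corte et al. or from the simulations above.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
minority = rng.random(n) < 0.3                    # assumed 30% minority applicants
cognitive = rng.normal(-0.8 * minority, 1.0)      # assumed subgroup mean gap
personality = rng.normal(0.0, 1.0, n)             # assumed no subgroup gap
performance = 0.5 * cognitive + 0.3 * personality + rng.normal(0.0, 1.0, n)

results = []
for w in np.linspace(0.0, 1.0, 101):              # weight on the cognitive test
    composite = w * cognitive + (1.0 - w) * personality
    hired = composite >= np.quantile(composite, 0.70)   # top 30% get offers
    validity = np.corrcoef(composite, performance)[0, 1]
    ai_ratio = hired[minority].mean() / hired[~minority].mean()
    results.append((w, validity, ai_ratio))

# Keep the non-dominated weightings (maximize validity and the AI ratio).
pareto = [a for a in results
          if not any(b[1] >= a[1] and b[2] >= a[2] and
                     (b[1] > a[1] or b[2] > a[2]) for b in results)]
for w, v, ai in pareto:
    print(f"w_cognitive={w:.2f}  validity={v:.3f}  AI ratio={ai:.3f}")
```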

  9. Mass hierarchy and energy scaling of the Tsallis - Pareto parameters in hadron productions at RHIC and LHC energies

    Science.gov (United States)

    Bíró, Gábor; Barnaföldi, Gergely Gábor; Biró, Tamás Sándor; Shen, Keming

    2018-02-01

    The latest high-accuracy identified-hadron spectra measurements in high-energy nuclear collisions led us to investigate strongly interacting particles and collective effects in small systems. Since microscopical processes result in a statistical Tsallis - Pareto distribution, the fit parameters q and T are well suited for identifying system-size scalings and initial conditions. Moreover, the parameter values provide information on the deviation from extensive Boltzmann - Gibbs statistics in finite volumes. We apply here the fit procedure developed in our earlier studies of proton-proton collisions [1, 2]. The observed mass and center-of-mass energy trends in hadron production are compared to RHIC dAu and LHC pPb data in different centrality/multiplicity classes. Here we present new results on the mass hierarchy in pp and pA collisions from light to heavy hadrons.
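
    For illustration, the Tsallis - Pareto form referred to above, f(pT) = A[1 + (q-1)mT/T]^(-1/(q-1)) with mT = sqrt(pT² + m²), can be fitted to a spectrum with a standard least-squares routine; the synthetic pion "data" and the starting values below are assumptions, not the RHIC/LHC measurements analyzed in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

M_PION = 0.13957                          # GeV, charged-pion mass

def tsallis_pareto(pt, A, q, T):
    """Tsallis - Pareto yield as a function of transverse momentum pT."""
    mt = np.sqrt(pt**2 + M_PION**2)       # transverse mass
    return A * (1.0 + (q - 1.0) * mt / T) ** (-1.0 / (q - 1.0))

pt = np.linspace(0.3, 5.0, 40)            # GeV/c, synthetic spectrum grid
rng = np.random.default_rng(4)
yields = tsallis_pareto(pt, 50.0, 1.10, 0.16) * rng.normal(1.0, 0.03, pt.size)

popt, _ = curve_fit(tsallis_pareto, pt, yields, p0=[40.0, 1.05, 0.15])
print("A = {:.2f}, q = {:.4f}, T = {:.4f} GeV".format(*popt))
```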

  10. A distributed-memory hierarchical solver for general sparse linear systems

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Chao [Stanford Univ., CA (United States). Inst. for Computational and Mathematical Engineering; Pouransari, Hadi [Stanford Univ., CA (United States). Dept. of Mechanical Engineering; Rajamanickam, Sivasankaran [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Center for Computing Research; Boman, Erik G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Center for Computing Research; Darve, Eric [Stanford Univ., CA (United States). Inst. for Computational and Mathematical Engineering and Dept. of Mechanical Engineering

    2017-12-20

    We present a parallel hierarchical solver for general sparse linear systems on distributed-memory machines. For large-scale problems, this fully algebraic algorithm is faster and more memory-efficient than sparse direct solvers because it exploits the low-rank structure of fill-in blocks. Depending on the accuracy of low-rank approximations, the hierarchical solver can be used either as a direct solver or as a preconditioner. The parallel algorithm is based on data decomposition and requires only local communication for updating boundary data on every processor. Moreover, the computation-to-communication ratio of the parallel algorithm is approximately the volume-to-surface-area ratio of the subdomain owned by every processor. We also provide various numerical results to demonstrate the versatility and scalability of the parallel algorithm.

  11. On a general class of regular rotating black holes based on a smeared mass distribution

    Directory of Open Access Journals (Sweden)

    Alexis Larranaga

    2015-04-01

    Full Text Available In this work we investigate the behavior of a new general class of rotating regular black holes based on a non-Gaussian smeared mass distribution. It is shown that the existence of a fundamental minimal length cures the well-known problems in the terminal phase of black hole evaporation, since we find that there is a finite maximum temperature that the black hole reaches before cooling down to absolute zero, so that the evaporation ends in a zero-temperature extremal black hole whose mass and size depend on the value of the fundamental length and on the rotation parameter of the black hole. We also study the geodesic structure in these spacetimes and calculate the shadows that these black holes produce.

  12. Optimization of Wind Turbine Airfoil Using Nondominated Sorting Genetic Algorithm and Pareto Optimal Front

    Directory of Open Access Journals (Sweden)

    Ziaul Huque

    2012-01-01

    Full Text Available A Computational Fluid Dynamics (CFD) and response-surface-based multiobjective design optimization was performed for six different 2D airfoil profiles, and the Pareto optimal front of each airfoil is presented. FLUENT, a commercial CFD simulation code, was used to determine the relevant aerodynamic loads. The Lift Coefficient (CL) and Drag Coefficient (CD) data for angles of attack (α) ranging from 0° to 12° and at three different Reynolds numbers (Re = 68,459; 479,210; and 958,422) were obtained for all six airfoils. The realizable k-ε turbulence model with a second-order upwind solution method was used in the simulations. The standard least squares method was used to generate response surfaces with the statistical code JMP. The elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) was used to determine the Pareto optimal set based on the response surfaces. Each Pareto optimal solution represents a different compromise between the design objectives, which gives the designer a choice to select the design compromise that best suits the requirements from a set of optimal solutions. The Pareto solution set is presented in the form of a Pareto optimal front.
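
    The core of the NSGA-II algorithm used above is non-dominated sorting. A compact, self-contained version is sketched below on made-up (CD, CL) samples; lift is negated so that both objectives are minimized, and the first returned front is the Pareto optimal set.

```python
import numpy as np

def non_dominated_fronts(F):
    """F: (n, m) array of objective values, all minimized. Returns index fronts."""
    n = len(F)
    dominates = [set() for _ in range(n)]   # dominates[i]: points beaten by i
    counts = np.zeros(n, dtype=int)         # counts[j]: how many points beat j
    for i in range(n):
        for j in range(n):
            if np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                dominates[i].add(j)
                counts[j] += 1
    fronts, current = [], [i for i in range(n) if counts[i] == 0]
    while current:                          # peel off successive fronts
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominates[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

rng = np.random.default_rng(5)
cd = rng.uniform(0.010, 0.050, 30)          # made-up drag coefficients
cl = rng.uniform(0.40, 1.20, 30)            # made-up lift coefficients
F = np.column_stack([cd, -cl])              # minimize CD, maximize CL
print("Pareto-optimal designs:", non_dominated_fronts(F)[0])
```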

  13. Level Diagrams analysis of Pareto Front for multiobjective system redundancy allocation

    International Nuclear Information System (INIS)

    Zio, E.; Bazzo, R.

    2011-01-01

    Reliability-based and risk-informed design, operation, maintenance and regulation lead to multiobjective (multicriteria) optimization problems. In this context, the Pareto Front and Set found in a multiobjective optimality search provide a family of solutions among which the decision maker has to look for the best choice according to his or her preferences. Efficient visualization techniques for Pareto Front and Set analyses are needed for helping decision makers in the selection task. In this paper, we consider the multiobjective optimization of system redundancy allocation and use the recently introduced Level Diagrams technique for graphically representing the resulting Pareto Front and Set. Each objective and decision variable is represented on separate diagrams where the points of the Pareto Front and Set are positioned according to their proximity to ideally optimal points, as measured by a metric of normalized objective values. All diagrams are synchronized across all objectives and decision variables. On the basis of the analysis of the Level Diagrams, we introduce a procedure for reducing the number of solutions in the Pareto Front; from the reduced set of solutions, the decision maker can more easily identify his or her preferred solution.
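
    The Level Diagrams construction lends itself to a short sketch: normalize each objective over the Pareto front, assign each point the Euclidean norm of its normalized objective vector (its proximity to the ideal point), and plot that same norm against every objective separately. The two-objective front below (reliability vs. cost) is invented for illustration and is not from the paper.

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented two-objective Pareto front: maximize reliability, minimize cost.
front = np.array([[0.950, 120.0], [0.970, 150.0], [0.980, 200.0],
                  [0.990, 280.0], [0.995, 400.0]])
F = np.column_stack([-front[:, 0], front[:, 1]])    # cast both as "minimize"
F_norm = (F - F.min(axis=0)) / (F.max(axis=0) - F.min(axis=0))
levels = np.linalg.norm(F_norm, axis=1)             # distance from the ideal point

fig, axes = plt.subplots(1, 2, figsize=(8, 3))
for k, (ax, label) in enumerate(zip(axes, ["reliability", "cost"])):
    ax.plot(front[:, k], levels, "o")               # one Level Diagram per objective
    ax.set_xlabel(label)
    ax.set_ylabel("norm of normalized objectives")
plt.tight_layout()
plt.show()
```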

  14. Level Diagrams analysis of Pareto Front for multiobjective system redundancy allocation

    Energy Technology Data Exchange (ETDEWEB)

    Zio, E., E-mail: enrico.zio@ecp.f [European Foundation for New Energy-EDF, Systems Science and Energetic Challenge, Ecole Centrale Paris-Supelec, Paris (France); Dipartimento di Energia, Politecnico di Milano, Milano (Italy); Bazzo, R. [Dipartimento di Energia, Politecnico di Milano, Milano (Italy)

    2011-05-15

    Reliability-based and risk-informed design, operation, maintenance and regulation lead to multiobjective (multicriteria) optimization problems. In this context, the Pareto Front and Set found in a multiobjective optimality search provide a family of solutions among which the decision maker has to look for the best choice according to his or her preferences. Efficient visualization techniques for Pareto Front and Set analyses are needed for helping decision makers in the selection task. In this paper, we consider the multiobjective optimization of system redundancy allocation and use the recently introduced Level Diagrams technique for graphically representing the resulting Pareto Front and Set. Each objective and decision variable is represented on separate diagrams where the points of the Pareto Front and Set are positioned according to their proximity to ideally optimal points, as measured by a metric of normalized objective values. All diagrams are synchronized across all objectives and decision variables. On the basis of the analysis of the Level Diagrams, we introduce a procedure for reducing the number of solutions in the Pareto Front; from the reduced set of solutions, the decision maker can more easily identify his or her preferred solution.

  15. Coherent deeply virtual Compton scattering off 3He and neutron generalized parton distributions

    Directory of Open Access Journals (Sweden)

    Rinaldi Matteo

    2014-06-01

    Full Text Available It has been recently proposed to study coherent deeply virtual Compton scattering (DVCS) off 3He nuclei to access neutron generalized parton distributions (GPDs). In particular, it has been shown that, in Impulse Approximation (IA) and at low momentum transfer, the sum of the quark helicity conserving GPDs of 3He, H and E, is dominated by the neutron contribution. This peculiar result makes the 3He target very promising for accessing the neutron information. We present here the IA calculation of the spin-dependent GPD H̃ of 3He. Also for this quantity the neutron contribution is found to be the dominant one at low momentum transfer. The known forward limit of the IA calculation of H̃, yielding the polarized parton distributions of 3He, is correctly recovered. The extraction of the neutron information could nevertheless be non-trivial, so that a procedure able to take into account the nuclear effects encoded in the IA analysis is proposed. These calculations, essential for the evaluation of the coherent DVCS cross section asymmetries, which depend on the GPDs H, E, and H̃, represent a crucial step in planning possible experiments at Jefferson Lab.

  16. Online Learning of Hierarchical Pitman-Yor Process Mixture of Generalized Dirichlet Distributions With Feature Selection.

    Science.gov (United States)

    Fan, Wentao; Sallay, Hassen; Bouguila, Nizar

    2017-09-01

    In this paper, a novel statistical generative model based on hierarchical Pitman-Yor process and generalized Dirichlet distributions (GDs) is presented. The proposed model allows us to perform joint clustering and feature selection thanks to the interesting properties of the GD distribution. We develop an online variational inference algorithm, formulated in terms of the minimization of a Kullback-Leibler divergence, of our resulting model that tackles the problem of learning from high-dimensional examples. This variational Bayes formulation allows simultaneously estimating the parameters, determining the model's complexity, and selecting the appropriate relevant features for the clustering structure. Moreover, the proposed online learning algorithm allows data instances to be processed in a sequential manner, which is critical for large-scale and real-time applications. Experiments conducted using challenging applications, namely, scene recognition and video segmentation, where our approach is viewed as an unsupervised technique for visual learning in high-dimensional spaces, showed that the proposed approach is suitable and promising.

  17. Species abundance distributions in neutral models with immigration or mutation and general lifetimes.

    Science.gov (United States)

    Lambert, Amaury

    2011-07-01

    We consider a general, neutral, dynamical model of biodiversity. Individuals have i.i.d. lifetime durations, which are not necessarily exponentially distributed, and each individual gives birth independently at constant rate λ. Thus, the population size is a homogeneous, binary Crump-Mode-Jagers process (which is not necessarily a Markov process). We assume that types are clonally inherited. We consider two classes of speciation models in this setting. In the immigration model, new individuals of an entirely new species singly enter the population at constant rate μ (e.g., from the mainland into the island). In the mutation model, each individual independently experiences point mutations in its germ line, at constant rate θ. We are interested in the species abundance distribution, i.e., in the numbers, denoted I_n(k) in the immigration model and A_n(k) in the mutation model, of species represented by k individuals, k = 1, 2, …, n, when there are n individuals in the total population. In the immigration model, we prove that the numbers (I_t(k); k ≥ 1) of species represented by k individuals at time t are independent Poisson variables with parameters as in Fisher's log-series. When conditioning on the total size of the population to equal n, this results in species abundance distributions given by Ewens' sampling formula. In particular, I_n(k) converges as n → ∞ to a Poisson r.v. with mean γ/k, where γ := μ/λ. In the mutation model, as n → ∞, we obtain the almost sure convergence of n⁻¹ A_n(k) to a nonrandom explicit constant. In the case of a critical, linear birth-death process, this constant is given by Fisher's log-series, namely n⁻¹ A_n(k) converges to α^k/k, where α := λ/(λ + θ). In both models, the abundances of the most abundant species are briefly discussed.
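
    The immigration-model limit quoted above can be checked by simulation. The sketch below uses a Hoppe urn (a brand-new species appears with probability γ/(γ+i) at the i-th arrival), whose species partition follows Ewens' sampling formula, as a stand-in for the Crump-Mode-Jagers dynamics, and compares the simulated E[I_n(k)] with γ/k.

```python
import numpy as np
from collections import Counter

def hoppe_urn(n, gamma, rng):
    """Species labels for n individuals under Hoppe-urn (Ewens) dynamics."""
    labels, next_label = [], 0
    for i in range(n):
        if rng.random() < gamma / (gamma + i):   # immigration: brand-new species
            labels.append(next_label)
            next_label += 1
        else:                                    # birth: copy a random individual
            labels.append(labels[rng.integers(i)])
    return labels

rng = np.random.default_rng(6)
n, gamma, reps = 1000, 2.0, 500
hist = Counter()
for _ in range(reps):
    abundances = Counter(hoppe_urn(n, gamma, rng)).values()
    hist.update(abundances)                      # count species of each abundance

for k in (1, 2, 3, 5):
    print(f"k={k}: simulated E[I_n({k})] = {hist[k] / reps:.3f}   "
          f"gamma/k = {gamma / k:.3f}")
```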

  18. ComPASS : a tool for distributed parallel finite volume discretizations on general unstructured polyhedral meshes

    Directory of Open Access Journals (Sweden)

    Dalissier E.

    2013-12-01

    Full Text Available The objective of the ComPASS project is to develop a parallel multiphase Darcy flow simulator adapted to general unstructured polyhedral meshes (in a general sense, with possibly non-planar faces) and to the parallelization of advanced finite volume discretizations with various choices of the degrees of freedom, such as cell centres, vertices, or face centres. The main targeted applications are the simulation of CO2 geological storage, nuclear waste repositories, and reservoir simulations. The CEMRACS 2012 summer school devoted to high performance computing was an ideal framework to start this collaborative project. This paper describes what was achieved during the four weeks of the CEMRACS project, which focused on the implementation of basic features of the code such as the distributed unstructured polyhedral mesh, the synchronization of the degrees of freedom, and the connection to scientific libraries including the partitioner METIS, the visualization tool PARAVIEW, and the parallel linear solver library PETSc. The parallel efficiency of this first version of the ComPASS code has been validated on a toy parabolic problem using the Vertex Approximate Gradient finite volume spatial discretization with both cell and vertex degrees of freedom, combined with an Euler implicit time integration.

  19. Influence of distributed delays on the dynamics of a generalized immune system cancerous cells interactions model

    Science.gov (United States)

    Piotrowska, M. J.; Bodnar, M.

    2018-01-01

    We present a generalisation of the mathematical models describing the interactions between the immune system and tumour cells which takes into account distributed time delays. For the analytical study we do not assume any particular form of the stimulus function describing the immune system reaction to the presence of tumour cells; we only postulate its general properties. We analyse basic mathematical properties of the considered model, such as existence and uniqueness of solutions. Next, we discuss the existence of stationary solutions and analytically investigate their stability depending on the form of the considered probability densities, that is: Erlang, triangular, and uniform probability densities, separated or not from zero. Particular instability results are obtained for a general type of probability density. Our results are compared with those for the model with discrete delays known from the literature. In addition, for each considered type of probability density, the model is fitted to the experimental data for the mice B-cell lymphoma, showing mean square errors at the same comparable level. For the estimated sets of parameters we discuss the possibility of stabilising the tumour dormant steady state; instability of this steady state results in uncontrolled tumour growth. In order to perform numerical simulations, following the idea of the linear chain trick, we derive numerical procedures that allow us to solve systems with the considered probability densities using standard algorithms for ordinary differential equations or differential equations with discrete delays.

  20. Universal cervical length screening for singleton pregnancies with no history of preterm delivery, or the inverse of the Pareto principle.

    Science.gov (United States)

    Rozenberg, P

    2017-06-01

    Ultrasound measurement of cervical length in the general population enables the identification of women at risk for spontaneous preterm delivery. Vaginal progesterone is effective in reducing the risk of preterm delivery in this population. This screening associated with treatment by vaginal progesterone is cost-effective. Universal screening of cervical length can therefore be considered justified. Nonetheless, this screening will not appreciably reduce the preterm birth prevalence: in France or UK, where the preterm delivery rate is around 7.4%, this strategy would make it possible to reduce it only to 7.0%. This small benefit must be set against the considerable effort required in terms of screening ultrasound scans. Universal ultrasound screening of cervical length is the inverse of Pareto's principle: a small benefit against a considerable effort. © 2016 Royal College of Obstetricians and Gynaecologists.

  1. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    Science.gov (United States)

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.

  2. Strength Pareto particle swarm optimization and hybrid EA-PSO for multi-objective optimization.

    Science.gov (United States)

    Elhossini, Ahmed; Areibi, Shawki; Dony, Robert

    2010-01-01

    This paper proposes an efficient particle swarm optimization (PSO) technique that can handle multi-objective optimization problems. It is based on the strength Pareto approach originally used in evolutionary algorithms (EA). The proposed modified particle swarm algorithm is used to build three hybrid EA-PSO algorithms to solve different multi-objective optimization problems. This algorithm and its hybrid forms are tested using seven benchmarks from the literature and the results are compared to the strength Pareto evolutionary algorithm (SPEA2) and a competitive multi-objective PSO using several metrics. The proposed algorithm shows a slower convergence, compared to the other algorithms, but requires less CPU time. Combining PSO and evolutionary algorithms leads to superior hybrid algorithms that outperform SPEA2, the competitive multi-objective PSO (MO-PSO), and the proposed strength Pareto PSO based on different metrics.

  3. Multi-objective particle swarm optimization using Pareto-based set and aggregation approach

    Science.gov (United States)

    Huang, Song; Wang, Yan; Ji, Zhicheng

    2017-07-01

    Multi-objective optimization problems (MOPs) frequently need to be solved in real-world applications. In this paper, a multi-objective particle swarm optimization based on a Pareto set and an aggregation approach (MULPSO) is proposed to deal with MOPs. First, velocities and positions are updated as in standard PSO. Second, a global-best set is defined in the particle swarm optimizer to preserve the Pareto-based set obtained by the population; specifically, a hybrid updating strategy based on the Pareto set and the aggregation approach is introduced to update the global-best set, and local search is carried out on it. Third, personal-best positions are updated in a decomposition way, and the global-best position is selected from the global-best set. Finally, ZDT and DTLZ instances are selected to evaluate the performance of MULPSO, and the results show the validity of the proposed algorithm for MOPs.

  4. [Modeling of Drug Interaction Systems Using Fuzzy Inference System and Pareto Optimality]

    Directory of Open Access Journals (Sweden)

    Elena Yustina

    2013-01-01

    Pareto optimality is a popular concept for determining the optimal solution of multiobjective problems. In determining the optimal solution of a multiobjective problem, attention must be paid to each objective function, and the objective functions frequently conflict. The interaction of two drugs involves two objective functions, maximizing the positive effects and minimizing the negative effects, so an optimal solution must be found to achieve the expected therapy. This research uses a Fuzzy Inference System (FIS) to determine the appropriate medication to keep the blood pressure and blood glucose levels of patients with hypertension and diabetes under control in the normal range, and Pareto optimality to determine the optimal drug solution. The Fuzzy Inference System generates as output a choice of drug classes based on fuzzy rules in accordance with the patient's condition. Pareto optimality produces a pair of diabetes and hypertension drugs that satisfies the thresholds of the minimum effective level (Minimum Effective Concentration; MEC) and the maximum toxic level (Minimum Toxic Concentration; MTC) of each drug.

  5. General

    Indian Academy of Sciences (India)

    Page S20: NMR compound 4i. Page S22: NMR compound 4j. General: Chemicals were purchased from Fluka, Merck and Aldrich Chemical Companies. All the products were characterized by comparison of their IR, 1H NMR and 13C NMR spectroscopic data and their melting points with reported values. General procedure ...

  6. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    Science.gov (United States)

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.

  7. Bi-objective optimization for multi-modal transportation routing planning problem based on Pareto optimality

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2015-09-01

    Full Text Available Purpose: The purpose of this study is to solve the multi-modal transportation routing planning problem, which aims to select an optimal route to move a consignment of goods from its origin to its destination through the multi-modal transportation network, with optimization from two viewpoints: cost and time. Design/methodology/approach: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem, with minimization of the total transportation cost and of the total transportation time set as the optimization objectives. In order to balance the benefit between the two objectives, Pareto optimality is utilized to solve the model by obtaining its Pareto frontier, which can provide the multi-modal transportation operator (MTO) and customers with better decision support; the frontier is obtained by the normalized normal constraint method. Then, an experimental case study is designed to verify the feasibility of the model and of Pareto optimality using the mathematical programming software Lingo. Finally, a sensitivity analysis of demand and supply in the multi-modal transportation organization is performed on the designed case. Findings: The calculation results indicate that the proposed model and Pareto optimality perform well in dealing with the bi-objective optimization, and the sensitivity analysis clearly shows the influence of variations in demand and supply on the multi-modal transportation organization. Therefore, this method can be further promoted to practice. Originality/value: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem, and a Pareto frontier based sensitivity analysis of demand and supply in the multi-modal transportation organization is performed on the designed case.

  8. Histopathologic Distribution of Appendicitis at Dr. Hasan Sadikin General Hospital, Bandung, Indonesia, in 2012

    Directory of Open Access Journals (Sweden)

    Tara Zhafira

    2017-03-01

    Full Text Available Background: Appendicitis is a medical emergency and a common cause of emergency surgeries worldwide. Its frequency varies with many factors, including age and sex. Histopathologic examination is the gold standard for diagnosis, and complications like gangrene formation and perforation lead to high mortality and morbidity in almost all age groups. This study was conducted to describe the distribution pattern of appendicitis according to age, sex, and histopathologic type. Methods: This cross-sectional study was carried out in the Department of Pathology Anatomy, Dr. Hasan Sadikin General Hospital, Bandung, Indonesia, from August-October 2013. Secondary data were obtained from medical records of January 1st to December 31st, 2012. A total of 503 out of 516 cases were included for review. Age, sex, and histopathologic type from the medical records were then evaluated; any specific case and perforation were also noted. Results: Data showed the highest prevalence of appendicitis occurred in the 10-19 age group (28.4%) and in the female group (52.3%). Acute appendicitis was more common than chronic appendicitis in both sexes and all age groups. The perforation rate was high (41.4%), and perforation was more prevalent in males (54.9%) and in the 0-9 age group (65.7%). Conclusions: Appendicitis, both acute and chronic, is most common in the second decade of life and is slightly more prevalent in females. Acute cases are more common than chronic ones. The perforation rate is significant and peaks in the first decade and in males. [AMJ. 2017;4(1):36-41]

  9. Distribution functions for a family of general-relativistic hypervirial models in the collisionless regime

    Science.gov (United States)

    Gauy, Henrique Matheus; Ramos-Caro, Javier

    2018-03-01

    By considering the Einstein-Vlasov system for static spherically symmetric distributions of matter, we show that configurations with constant anisotropy parameter β, leading to asymptotically flat spacetimes, necessarily have a distribution function (DF) of the form F = l^(-2β) ξ(ε), where ε = E/m and l = L/m are the relativistic energy and angular momentum per unit rest mass, respectively. We exploit this result to obtain DFs for the general relativistic extension of the hypervirial family introduced by Nguyen and Lingam [Mon. Not. R. Astron. Soc. 436, 2014 (2013), 10.1093/mnras/stt1719], whose Newtonian potential is given by φ(r) = -φ₀/[1 + (r/a)^n]^(1/n) (a and φ₀ are positive free parameters, n = 1, 2, …). Such DFs can be written in the form F_n = l^(n-2) ξ_n(ε). For odd n, we find that ξ_n is a polynomial of order 2n + 1 in ε, as in the case of the Hernquist model (n = 1), for which F_1 ∝ l^(-1) (2ε - 1)(ε - 1)². For even n, we can write ξ_n in terms of incomplete beta functions (the Plummer model, n = 2, is an example). Since we demand that F ≥ 0 throughout phase space, the particular form of each ξ_n leads to restrictions on the values of φ₀. For example, for the Hernquist model we find that 0 ≤ φ₀ ≤ 2/3, i.e., an upper bound smaller than the one obtained by Nguyen and Lingam (0 ≤ φ₀ ≤ 1) based on energy conditions.

  10. A New Methodology to Select the Preferred Solutions from the Pareto-optimal Set: Application to Polymer Extrusion

    International Nuclear Information System (INIS)

    Ferreira, Jose C.; Gaspar-Cunha, Antonio; Fonseca, Carlos M.

    2007-01-01

    Most real-world optimization problems involve multiple, usually conflicting, optimization criteria. Generating Pareto optimal solutions plays an important role in multi-objective optimization, and the problem is considered solved when the Pareto optimal set, i.e., the set of non-dominated solutions, is found. Multi-Objective Evolutionary Algorithms based on the principle of Pareto optimality are designed to produce the complete set of non-dominated solutions. However, this is not always enough, since the aim is not only to know the Pareto set but also to obtain one solution from it. Thus, a methodology able to select a single solution from the set of non-dominated solutions (or a region of the Pareto frontier), taking into account the preferences of a Decision Maker (DM), is necessary. A different method, based on a weighted stress function, is proposed here; it is able to integrate the user's preferences in order to find the region of the Pareto frontier that best matches those preferences. The method was tested on benchmark problems with two and three criteria, and on a polymer extrusion problem, and it efficiently selects the Pareto-frontier region corresponding to the specified relative importance of the criteria.

  11. Calculation of Pareto-optimal solutions to multiple-objective problems using threshold-of-acceptability constraints

    Science.gov (United States)

    Giesy, D. P.

    1978-01-01

    A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage to both limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
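
    The same threshold-of-acceptability idea can be demonstrated on a toy bi-objective linear program: minimize one objective while sweeping an acceptability bound on the other, so that each single-objective solve returns one Pareto-optimal point. The LP below is an invented example, solved with SciPy.

```python
import numpy as np
from scipy.optimize import linprog

# Toy problem: minimize f1 = x0 + 2*x1 and f2 = 3*x0 + x1
# subject to x0 + x1 >= 1 and x >= 0.
c1 = np.array([1.0, 2.0])
c2 = np.array([3.0, 1.0])
A_base = np.array([[-1.0, -1.0]])       # -(x0 + x1) <= -1
b_base = np.array([-1.0])

for eps in np.linspace(1.0, 3.0, 5):    # threshold of acceptability on f2
    A_ub = np.vstack([A_base, c2])      # append the constraint f2 <= eps
    b_ub = np.append(b_base, eps)
    res = linprog(c1, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    if res.success:                     # each solve yields one Pareto point
        f1, f2 = c1 @ res.x, c2 @ res.x
        print(f"eps={eps:.2f}  x={np.round(res.x, 3)}  f1={f1:.3f}  f2={f2:.3f}")
```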

  12. Application of the Pareto chart and Ishikawa diagram for the identification of major defects in metal composite castings

    Directory of Open Access Journals (Sweden)

    K. Gawdzińska

    2011-04-01

    Full Text Available The author discusses the use of selected quality management tools, i.e. the Pareto chart and the Ishikawa fishbone diagram, for the description of composite casting defects. The Pareto chart makes it possible to determine defect priorities for metallic composite castings, while the Ishikawa diagram indicates the causes of defect formation and enables the calculation of defect weights.

  13. The Successor Function and Pareto Optimal Solutions of Cooperative Differential Systems with Concavity. I

    DEFF Research Database (Denmark)

    Andersen, Kurt Munk; Sandqvist, Allan

    1997-01-01

    We investigate the domain of definition and the domain of values for the successor function of a cooperative differential system x' = f(t,x), where the coordinate functions are concave in x for any fixed value of t. Moreover, we give a characterization of a weakly Pareto optimal solution.

  14. On Usage of Pareto curves to Select Wind Turbine Controller Tunings to the Wind Turbulence Level

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh

    2015-01-01

    Model predictive control has in recent publications shown its potential for lowering the cost of energy of modern wind turbines. Pareto curves can be used to evaluate the performance of these controllers with respect to the multiple conflicting objectives of power and fatigue loads. In this paper an approach is presented to update a model predictive wind turbine controller tuning as the wind turbulence level increases, since increased turbulence levels result in higher loads for the same controller tuning. The Pareto curves are computed using an industrial high-fidelity aero-elastic model. Simulations show...

  15. [Distribution of rubidium, cesium, beryllium, strontium, and barium in blood and urine in general Chinese population].

    Science.gov (United States)

    Ding, Chunguang; Pan, Yajuan; Zhang, Aihua; Zhu, Chun; Liu, Deye; Xu, Guang; Zheng, Yuxin; Yan, Huifang

    2015-12-01

    To investigate the distribution of rubidium (Rb), cesium (Cs), beryllium (Be), strontium (Sr), and barium (Ba) in blood and urine in the general Chinese population. A total of 18 120 subjects aged 6-60 years were enrolled from 24 regions in 8 provinces in Eastern, Central, and Western China from 2009 to 2010, based on the method of cluster random sampling. A questionnaire survey was conducted to collect data on living environment and health status. Blood and urine samples were collected from these subjects, and the levels of Rb, Cs, Be, Sr, and Ba in these samples were determined by inductively coupled plasma mass spectrometry. The distribution of these elements in blood and urine in male and female subjects living in different regions was analyzed statistically. In the general Chinese population, the concentration of Be in whole blood was below the detection limit (0.06 μg/L); the geometric mean (GM) of Ba in whole blood was below the detection limit (0.45 μg/L), with a 95th percentile (P95) of 1.37 μg/L; the GMs (95% CI) of Rb, Cs, and Sr in whole blood were 2 374 (2 357-2 392) μg/L, 2.01 (1.98-2.05) μg/L, and 23.5 (23.3-23.7) μg/L, respectively; in males and females, the GMs (95% CI) of blood Rb, Cs, and Sr were 2 506 (2 478-2 533) μg/L and 2 248 (2 227-2 270) μg/L, 1.88 (1.83-1.94) μg/L and 2.16 (2.11-2.20) μg/L, and 23.4 (23.1-23.7) μg/L and 23.6 (23.3-23.9) μg/L, respectively (P0.05, and P>0.05). In the general Chinese population, the GM of urine Be was below the detection limit (0.06 μg/L), while the GMs (95% CI) of urine Rb, Cs, Sr, and Ba were 854 (836-873) μg/L, 3.65 (3.56-3.74) μg/L, 39.5 (38.4-40.6) μg/L, and 1.10 (1.07-1.12) μg/L, respectively; in males and females, the GMs (95% CI) of urine Rb, Cs, Sr, and Ba were 876 (849-904) μg/L and 832 (807-858) μg/L, 3.83 (3.70-3.96) μg/L and 3.47 (3.35-3.60) μg/L, 42.5 (40.9-44.2) μg/L and 36.6 (35.1-38.0) μg/L, and 1.15 (1.12-1.19) μg/L and 1.04 (1.01-1.07) μg/L, respectively (all P< 0

  16. [Distribution of copper and zinc in blood among general population from 8 provinces in China].

    Science.gov (United States)

    Pan, Xingfu; Ding, Chunguang; Pan, Yajuan; Zhang, Aihua; Wu, Banghua; Huang, Hanlin; Zhu, Chun; Liu, Deye; Zhu, Baoli; Xu, Guang; Shao, Hua; Peng, Shanzhuo; Jiang, Xianlong; Zhao, Chunxiang; Han, Changcheng; Ji, Hongrong; Yu, Shanfa; Zhang, Xiaoxi; Zhang, Longlian; Zheng, Yuxin; Yan, Huifang

    2014-02-01

    To investigate the levels of zinc (Zn) and copper (Cu) in whole blood among the general population from 8 provinces in China, and to analyze the characteristics of their distribution among different regions. This cross-sectional study was performed in 8 provinces of eastern, middle and western China between 2009 and 2010, covering 13 110 subjects from 24 regions, from whom blood and urine samples were collected. ICP-MS was applied to determine the contents of Cu and Zn in the blood samples, and the results were used to analyze the characteristics of the contents and distributions of Zn and Cu across age, gender and region groups. Overall, the mean (95% CI) contents of Cu and Zn in blood were 795 (791-799) µg/L and 3 996 (3 976-4 015) µg/L, respectively. For Cu, the content in males was lower than that in females (males: 767 µg/L; females: 822 µg/L; t = -13.302, P < 0.05); the contents in the groups aged 6-12, 13-16, 17-20, 21-30, 31-45 and 46-60 years were 860 (853-868), 758 (748-769), 734 (728-734), 782 (774-790), 811 (795-827) and 820 (815-826) µg/L, respectively, and the differences were statistically significant (F = 78.77, P < 0.05); the content in eastern China (800 µg/L) was also significantly higher than those in middle (774 µg/L) and western China (782 µg/L) (F = 10.94, P < 0.05). For Zn, the contents in the groups aged 6-12, 13-16, 17-20, 21-30, 31-45 and 46-60 years were 3 306 (3 261-3 350), 3 888 (3 839-3 937), 3 948 (3 902-3 994), 4 272 (4 228-4 315), 4 231 (4 180-4 281) and 4 250 (4 205-4 294) µg/L, respectively, showing statistically significant differences (F = 233.68, P < 0.05); the content in eastern China (3 938 µg/L) was significantly lower than those in middle (4 237 µg/L) and western China (4 105 µg/L) (F = 53.16, P < 0.05). The contents of Cu and Zn in blood thus differ among ages, genders and regions, and the baseline data of this study provide reliable scientific evidence for further research.

  17. Derived properties from the dipole and generalized oscillator strength distributions of an endohedral confined hydrogen atom

    Science.gov (United States)

    Martínez-Flores, C.; Cabrera-Trujillo, R.

    2018-03-01

    We report the electronic properties of a hydrogen atom confined by a fullerene molecule by obtaining the eigenvalues and eigenfunctions of the time-independent Schrödinger equation by means of a finite-differences approach. The hydrogen atom confinement by a C60 fullerene cavity is accounted for by two model potentials: a square-well and a Woods–Saxon. The Woods–Saxon potential is implemented to study the role of a smooth cavity on the hydrogen atom generalized oscillator strength distribution. Both models characterize the cavity by an inner radius R0, thickness Δ, and well depth V0. We use two different values for R0 and Δ, found in the literature, that characterize H@C60 to analyze the role of the fullerene cage size and width. The electronic properties of the confined hydrogen atom are reported as a function of the well depth V0, emulating different electronic configurations of the endohedral cavity. We report results for the hyperfine splitting, nuclear magnetic screening, dipole oscillator strength, the static and dynamic polarizability, mean excitation energy, photo-ionization, and stopping cross section for the confined hydrogen atom. We find that there is a critical potential well depth value around V0 = 0.7 a.u. for the first set of parameters and around V0 = 0.9 a.u. for the second set of parameters, which produce a drastic change in the electronic properties of the endohedral hydrogen system. These values correspond to the first avoided crossing on the energy levels. Furthermore, a clear discrepancy is found between the square-well and Woods–Saxon model potential results on the hydrogen atom generalized oscillator strength due to the square-well discontinuity. These differences are reflected in the stopping cross section for protons colliding with H@C60.
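
    A minimal numerical sketch of the kind of calculation described above: the radial Schrödinger equation for an s-state of hydrogen inside an attractive shell, discretized by finite differences and solved for the ground-state energy under both model potentials. The cage parameters R0 = 5.75 a.u. and delta = 1.89 a.u., and the Woods-Saxon softness a, are illustrative assumptions here rather than the paper's values.

        import numpy as np
        from scipy.linalg import eigh_tridiagonal

        # Radial grid for the reduced wavefunction u(r) = r*R(r); u(0) = u(r_max) = 0.
        N, r_max = 4000, 60.0
        r = np.linspace(0.0, r_max, N + 2)[1:-1]
        h = r[1] - r[0]

        def shell_square(r, R0, delta, V0):
            """Square-well shell of depth V0 between R0 and R0 + delta (a.u.)."""
            return np.where((r >= R0) & (r <= R0 + delta), -V0, 0.0)

        def shell_woods_saxon(r, R0, delta, V0, a=0.1):
            """Smooth Woods-Saxon-like shell centred on the cage wall; a sets the softness."""
            rc = R0 + 0.5 * delta
            return -V0 / (1.0 + np.exp((np.abs(r - rc) - 0.5 * delta) / a))

        def ground_state_energy(shell, V0, R0=5.75, delta=1.89):
            V = -1.0 / r + shell(r, R0, delta, V0)   # Coulomb plus confining shell
            diag = 1.0 / h**2 + V                    # finite-difference -u''/2 + V*u
            off = -0.5 / h**2 * np.ones(N - 1)
            E, _ = eigh_tridiagonal(diag, off, select='i', select_range=(0, 0))
            return E[0]

        # Sweep the well depth through the critical region discussed in the abstract.
        for V0 in (0.0, 0.5, 0.7, 0.9):
            print(f"V0={V0:.1f}  E0(square)={ground_state_energy(shell_square, V0):.4f}"
                  f"  E0(Woods-Saxon)={ground_state_energy(shell_woods_saxon, V0):.4f} a.u.")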

  18. Measurements of ELF electromagnetic exposure of the general public from Belgian power distribution substations.

    Science.gov (United States)

    Joseph, Wout; Verloock, Leen; Martens, Luc

    2008-01-01

    In this paper, the exposure of the general public due to distribution substations of 11/0.22-0.4 kV is investigated. The substations are categorized according to their location (substations in buildings, detached substations, substations between two houses, and underground substations in the pavement), and eight relevant substations are selected to perform measurements of the electromagnetic fields. The purpose of this paper is to determine the "minimum distances" for the general public--defined as the distances outside which the field levels do not exceed a certain field value--of these substations. In total, 637 field measurements were performed: 358 measurements of the magnetic field and 279 measurements of the electric field in different locations. Measured momentary magnetic field values are within the range of 0.025 to 47.39 µT. Electric fields are within the range 0.1 to 536 V/m. Also, magnetic field measurements as a function of the height above the ground were performed. The maximal magnetic (values over one day) and electric fields for all the investigated substations were below 100 µT and 5 kV/m, respectively. For exposure over a year, all substations except one delivered values below 100 µT. For the substation producing a magnetic field above 100 µT, a minimum distance of about 0.5 m was obtained. When comparing the average exposure with the value of 0.4 µT, minimum distances of maximally 5.4 m (average day) and 7.2 m (average year) were obtained.

  19. Comment 1 on workshop in economics - a note on benefit-cost analysis and the distribution of benefits: The greenhouse effect

    International Nuclear Information System (INIS)

    Quinn, K.G.

    1992-01-01

    The application of benefit-cost analysis to environmental problems in general, and to global warming as demonstrated by Kosobud in particular, is a very useful tool. Depending upon the limitations of the relevant data available, benefit-cost analysis can offer information to society about how to improve its condition. However, beyond the criticism of its estimate of the Pareto optimal point, benefit-cost analysis suffers from a fundamental weakness: it cannot speak to the distribution of the net benefits of implementation of an international greenhouse policy. Within an individual country, debate on a particular policy intervention can effectively separate the issues of achieving a potential Pareto optimum and distributing the benefits necessary to actually accomplish Pareto optimality. This situation occurs because (theoretically, anyway) these decisions are made in the presence of a binding enforcement regime that can redistribute benefits as seen fit. A policy can then be introduced in the manner that achieves the best overall net benefits, and the allocation of these benefits can be treated as a stand-alone problem.

  20. TU-C-17A-01: A Data-Based Development for Practical Pareto Optimality Assessment and Identification

    International Nuclear Information System (INIS)

    Ruan, D; Qi, S; DeMarco, J; Kupelian, P; Low, D

    2014-01-01

    Purpose: To develop an efficient Pareto optimality assessment scheme to support plan comparison and practical determination of best-achievable treatment plan goals. Methods: Pareto efficiency reflects the tradeoffs among competing target coverage and normal tissue sparing in multi-criterion optimization (MCO) based treatment planning. Assessing and understanding Pareto optimality provides insightful guidance for future planning. However, current MCO-driven Pareto estimation makes relaxed assumptions about the Pareto structure and insufficiently accounts for practical limitations in beam complexity, leading to performance upper bounds that may be unachievable. This work proposed an alternative data-driven approach that implicitly incorporates the practical limitations, and identifies the Pareto frontier subset by eliminating dominated plans incrementally using the Edgeworth Pareto hull (EPH). The exactness of this elimination process also permits the development of a hierarchical procedure for speedup when the plan cohort size is large, by partitioning the cohort and performing elimination in each subset before a final aggregated elimination. The developed algorithm was first tested in 2D and 3D, where accuracy can be reliably assessed. As a specific application, the algorithm was applied to compare systematic plan quality for the lower head-and-neck amongst 4 competing treatment modalities. Results: The algorithm agrees exactly with brute-force pairwise comparison and visual inspection in low dimensions. The hierarchical algorithm shows a sqrt(k)-fold speedup, with k being the number of data points in the plan cohort, demonstrating good efficiency enhancement for heavy testing tasks. Application to plan performance comparison showed superiority of tomotherapy plans for the lower head-and-neck, and revealed a potential nonconvex Pareto frontier structure. Conclusion: An accurate and efficient scheme to identify the Pareto frontier from a plan cohort has been developed.
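
    The elimination step the abstract describes can be sketched with a generic dominance filter plus the partition-then-aggregate speedup; because dominance is transitive, filtering within chunks before a final aggregated pass discards exactly the same plans as one global pass. This is a schematic stand-in, not the authors' EPH implementation; plans are rows of criteria values with lower taken as better, and the chunk size is an arbitrary assumption.

        import numpy as np

        def pareto_front(points):
            """Return the non-dominated rows (minimization in every column)."""
            pts = np.asarray(points, dtype=float)
            keep = np.ones(len(pts), dtype=bool)
            for i, p in enumerate(pts):
                # p is dominated if some plan is <= everywhere and < somewhere.
                if (np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)).any():
                    keep[i] = False
            return pts[keep]

        def pareto_front_hierarchical(points, chunk=256):
            """Eliminate within each partition, then aggregate -- exact, but faster
            for large cohorts because most plans die in the cheap first round."""
            pts = np.asarray(points, dtype=float)
            if len(pts) <= chunk:
                return pareto_front(pts)
            survivors = [pareto_front(pts[i:i + chunk]) for i in range(0, len(pts), chunk)]
            return pareto_front(np.vstack(survivors))

        rng = np.random.default_rng(0)
        cohort = rng.random((5000, 3))            # 5000 plans, 3 criteria
        print(len(pareto_front_hierarchical(cohort)), "non-dominated plans")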

  1. Identifying the preferred subset of enzymatic profiles in nonlinear kinetic metabolic models via multiobjective global optimization and Pareto filters.

    Directory of Open Access Journals (Sweden)

    Carlos Pozo

    Full Text Available Optimization models in metabolic engineering and systems biology focus typically on optimizing a unique criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analysis using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail in capturing the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objectives values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study

  2. Identifying the preferred subset of enzymatic profiles in nonlinear kinetic metabolic models via multiobjective global optimization and Pareto filters.

    Science.gov (United States)

    Pozo, Carlos; Guillén-Gosálbez, Gonzalo; Sorribas, Albert; Jiménez, Laureano

    2012-01-01

    Optimization models in metabolic engineering and systems biology focus typically on optimizing a unique criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analysis using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail in capturing the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objectives values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes the
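
    The epsilon-constraint step mentioned in both records can be sketched in a few lines: keep one objective as the cost function, turn the others into inequality constraints, and sweep the constraint bound to generate a set of Pareto-optimal alternatives. The two toy objectives below are placeholders for illustration, not a GMA kinetic model.

        import numpy as np
        from scipy.optimize import minimize

        # Two competing objectives of a 2-dimensional "enzyme profile" x, both maximized.
        f1 = lambda x: x[0]                             # e.g. synthesis rate
        f2 = lambda x: 1.0 - x[0]**2 - 0.1 * x[1]**2    # e.g. a conflicting criterion

        def epsilon_constraint_front(n_points=11):
            """Maximize f1 subject to f2 >= eps, sweeping eps over its range."""
            front = []
            for eps in np.linspace(0.0, 1.0, n_points):
                res = minimize(lambda x: -f1(x), x0=[0.5, 0.0],
                               bounds=[(0.0, 1.0), (-1.0, 1.0)],
                               constraints=[{'type': 'ineq',
                                             'fun': lambda x, e=eps: f2(x) - e}])
                if res.success:
                    front.append((f1(res.x), f2(res.x)))
            return front

        for point in epsilon_constraint_front():
            print("f1=%.3f  f2=%.3f" % point)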

  3. Evaluation of a compound distribution based on weather pattern subsampling for extreme rainfall in Norway

    Directory of Open Access Journals (Sweden)

    J. Blanchet

    2015-12-01

    SCHADEX method for extreme flood estimation. Regional scores of evaluation are used in a split sample framework to compare the MEWP distribution with more general heavy-tailed distributions, in this case the Multi Generalized Pareto Weather Pattern (MGPWP) distribution. The analysis shows the clear benefit obtained from seasonal and weather pattern-based subsampling for extreme value estimation. The MEWP distribution is found to have an overall better performance as compared with the MGPWP, which tends to overfit the data and lacks robustness. Finally, we take advantage of the split sample framework to present evidence for an increase in extreme rainfall in the southwestern part of Norway during the period 1979–2009, relative to 1948–1978.

  4. Concentration and size distribution of particles in abstracted groundwater.

    Science.gov (United States)

    van Beek, C G E M; de Zwart, A H; Balemans, M; Kooiman, J W; van Rosmalen, C; Timmer, H; Vandersluys, J; Stuyfzand, P J

    2010-02-01

    Particle number concentrations have been counted and particle size distributions calculated in groundwater derived from abstraction wells. Both concentration and size distribution are governed by the discharge rate: the higher this rate, the higher the concentration and the higher the proportion of larger particles. However, the particle concentration in groundwater derived from abstraction wells, with high groundwater flow velocities, is much lower than in groundwater from monitor wells, with minimal flow velocities. This inconsistency points to exhaustion of the particle supply in the aquifer around wells due to groundwater abstraction for many years. The particle size distribution can be described with the help of a power law or Pareto distribution. Comparing the measured particle size distribution with the Pareto distribution shows that particles with a diameter >7 µm are under-represented. As the particle size distribution is dependent on the flow velocity, so is the value of the "Pareto" slope beta. (c) 2009 Elsevier Ltd. All rights reserved.
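
    For a Pareto (power-law) size distribution N(>d) proportional to d^(-beta), the slope beta can be estimated from measured diameters by maximum likelihood (the Hill estimator). A minimal sketch, with the lower cut-off d_min standing in for an assumed detection limit:

        import numpy as np

        def pareto_slope(diameters, d_min):
            """Maximum-likelihood estimate of the power-law slope beta for
            particle diameters at or above the cut-off d_min (same units)."""
            d = np.asarray(diameters, dtype=float)
            d = d[d >= d_min]
            return len(d) / np.sum(np.log(d / d_min))

        # Synthetic check: draw from a Pareto law with beta = 2.5 and recover it.
        rng = np.random.default_rng(0)
        d = rng.pareto(2.5, size=10_000) + 1.0    # numpy's pareto is shifted by 1
        print(pareto_slope(d, d_min=1.0))         # close to 2.5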

  5. Optimal Allocation of Generalized Power Sources in Distribution Network Based on Multi-Objective Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Li Ran

    2017-01-01

    Full Text Available Optimal allocation of generalized power sources in distribution network is researched. A simple index of voltage stability is put forward. Considering the investment and operation benefit, the stability of voltage and the pollution emissions of generalized power sources in distribution network, a multi-objective optimization planning model is established. A multi-objective particle swarm optimization algorithm is proposed to solve the optimal model. In order to improve the global search ability, the strategies of fast non-dominated sorting, elitism and crowding distance are adopted in this algorithm. Finally, tested the model and algorithm by IEEE-33 node system to find the best configuration of GP, the computed result shows that with the generalized power reasonable access to the active distribution network, the investment benefit and the voltage stability of the system is improved, and the proposed algorithm has better global search capability.

  6. Generalized least squares and empirical Bayes estimation in regional partial duration series index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional estimation procedure that combines the index-flood concept with an empirical Bayes method for inferring regional information is introduced. The model is based on the partial duration series approach with generalized Pareto (GP) distributed exceedances. The prior information of the model parameters is inferred from regional data using generalized least squares (GLS) regression. Two different Bayesian T-year event estimators are introduced: a linear estimator that requires only some moments of the prior distributions to be specified and a parametric estimator that is based on specified families of prior distributions. The regional method is applied to flood records from 48 New Zealand catchments. In the case of a strongly heterogeneous intersite correlation structure, the GLS procedure provides a more efficient estimate of the regional GP shape parameter as compared to the usually
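
    A hedged sketch of the at-site core of such a model, without the regional GLS/Bayesian machinery: fit the generalized Pareto (GP) distribution to threshold exceedances of a partial duration series and convert it into a T-year event estimate via the annual exceedance rate.

        import numpy as np
        from scipy.stats import genpareto

        def t_year_event(peaks, threshold, n_years, T):
            """T-year return level from GP-distributed exceedances; lam is the
            mean number of exceedances per year."""
            exc = np.asarray(peaks, dtype=float) - threshold
            xi, _, sigma = genpareto.fit(exc, floc=0.0)   # location fixed at 0
            lam = len(exc) / n_years
            if abs(xi) < 1e-8:                            # exponential limit
                return threshold + sigma * np.log(lam * T)
            return threshold + sigma / xi * ((lam * T)**xi - 1.0)

        # Synthetic example: 30 years of record, about 5 exceedances per year.
        peaks = 50.0 + genpareto.rvs(0.1, scale=12.0, size=150, random_state=1)
        print(t_year_event(peaks, threshold=50.0, n_years=30, T=100))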

  7. Approximating the Pareto set of multiobjective linear programs via robust optimization

    NARCIS (Netherlands)

    Gorissen, B.L.; den Hertog, D.

    2012-01-01

    We consider problems with multiple linear objectives and linear constraints and use adjustable robust optimization and polynomial optimization as tools to approximate the Pareto set with polynomials of arbitrarily large degree. The main difference with existing techniques is that we optimize a

  8. Necessary and Sufficient Conditions for Pareto Optimality in Infinite Horizon Cooperative Differential Games

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2011-01-01

    In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for infinite horizon cooperative differential games. We consider games defined by non-autonomous and discounted autonomous systems. The obtained results are used to analyze the regular

  9. Discrepancies between selected Pareto optimal plans and final deliverable plans in radiotherapy multi-criteria optimization.

    Science.gov (United States)

    Kyroudi, Archonteia; Petersson, Kristoffer; Ghandour, Sarah; Pachoud, Marc; Matzinger, Oscar; Ozsahin, Mahmut; Bourhis, Jean; Bochud, François; Moeckli, Raphaël

    2016-08-01

    Multi-criteria optimization provides decision makers with a range of clinical choices through Pareto plans that can be explored during real-time navigation and then converted into deliverable plans. Our study shows that dosimetric differences can arise between the two steps, which could compromise the clinical choices made during navigation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Prederivatives of gamma paraconvex set-valued maps and Pareto optimality conditions for set optimization problems.

    Science.gov (United States)

    Huang, Hui; Ning, Jixian

    2017-01-01

    Prederivatives play an important role in the research of set optimization problems. First, we establish several existence theorems of prederivatives for γ-paraconvex set-valued mappings in Banach spaces with [Formula: see text]. Then, in terms of prederivatives, we establish both necessary and sufficient conditions for the existence of Pareto minimal solutions of set optimization problems.

  11. Approximating the Pareto Set of Multiobjective Linear Programs via Robust Optimization

    NARCIS (Netherlands)

    Gorissen, B.L.; den Hertog, D.

    2012-01-01

    Abstract: The Pareto set of a multiobjective optimization problem consists of the solutions for which one or more objectives cannot be improved without deteriorating one or more other objectives. We consider problems with linear objectives and linear constraints and use Adjustable Robust

  12. Searching for the Pareto frontier in multi-objective protein design.

    Science.gov (United States)

    Nanda, Vikas; Belure, Sandeep V; Shir, Ofer M

    2017-08-01

    The goal of protein engineering and design is to identify sequences that adopt three-dimensional structures of desired function. Often, this is treated as a single-objective optimization problem, identifying the sequence-structure solution with the lowest computed free energy of folding. However, many design problems are multi-state, multi-specificity, or otherwise require concurrent optimization of multiple objectives. There may be tradeoffs among objectives, where improving one feature requires compromising another. The challenge lies in determining solutions that are part of the Pareto optimal set: designs where no further improvement can be achieved in any of the objectives without degrading one of the others. Pareto optimality problems are found in all areas of study, from economics to engineering to biology, and computational methods have been developed specifically to identify the Pareto frontier. We review progress in multi-objective protein design, the development of Pareto optimization methods, and present a specific case study using multi-objective optimization methods to model the tradeoff among three parameters (stability, specificity, and complexity) of a set of interacting synthetic collagen peptides.

  13. Household Labour Supply in Britain and Denmark: Some Interpretations Using a Model of Pareto Optimal Behaviour

    DEFF Research Database (Denmark)

    Barmby, Tim; Smith, Nina

    1996-01-01

    This paper analyses the labour supply behaviour of households in Denmark and Britain. It employs models in which the preferences of individuals within the household are explicitly represented. The households are then assumed to decide on their labour supply in a Pareto-Optimal fashion. Describing...

  14. Sensitivity versus accuracy in multiclass problems using memetic Pareto evolutionary neural networks.

    Science.gov (United States)

    Fernández Caballero, Juan Carlos; Martínez, Francisco José; Hervás, César; Gutiérrez, Pedro Antonio

    2010-05-01

    This paper proposes a multiclassification algorithm using multilayer perceptron neural network models. It tries to boost two conflicting main objectives of multiclassifiers: a high correct classification rate level and a high classification rate for each class. This last objective is not usually optimized in classification, but is considered here given the need to obtain high precision in each class in real problems. To solve this machine learning problem, we use a Pareto-based multiobjective optimization methodology based on a memetic evolutionary algorithm. We consider a memetic Pareto evolutionary approach based on the NSGA2 evolutionary algorithm (MPENSGA2). Once the Pareto front is built, two strategies for automatic individual selection are used: the best model in accuracy and the best model in sensitivity (extremes in the Pareto front). These methodologies are applied to solve 17 classification benchmark problems obtained from the University of California at Irvine (UCI) repository and one complex real classification problem. The models obtained show high accuracy and a high classification rate for each class.

  15. A sensitivity measure of the Pareto set in a vector l∞-extreme combinatorial problem

    Directory of Open Access Journals (Sweden)

    V.A. Emelichev

    2001-12-01

    Full Text Available We consider a vector minimization problem on a system of subsets of a finite set, with the Chebyshev norm in the space of perturbing parameters. The behavior of the Pareto set as a function of the parameters of partial criteria of the MINMAX type applied to absolute values is investigated.

  16. Any Non-Individualistic Social Welfare Function Violates the Pareto Principle

    OpenAIRE

    Louis Kaplow; Steven Shavell

    1999-01-01

    The public at large, many policymakers, and some economists hold views of social welfare that attach some importance to factors other than individuals' utilities. This note shows that any such non-individualistic notion of social welfare conflicts with the Pareto principle.

  17. Spectral-Efficiency - Illumination Pareto Front for Energy Harvesting Enabled VLC System

    KAUST Repository

    Abdelhady, Amr Mohamed Abdelaziz

    2017-12-13

    The continuous improvement in optical energy harvesting devices motivates visible light communication (VLC) system developers to utilize such available free energy sources. An outdoor VLC system is considered where an optical base station sends data to multiple users that are capable of harvesting the optical energy. The proposed VLC system serves multiple users using time division multiple access (TDMA) with unequal time and power allocation, which are allocated to improve the system performance. The adopted optical system provides users with illumination and data communication services. The outdoor optical design objective is to maximize the illumination, while the communication design objective is to maximize the spectral efficiency (SE). The design objectives are shown to be conflicting, therefore, a multiobjective optimization problem is formulated to obtain the Pareto front performance curve for the proposed system. To this end, the marginal optimization problems are solved first using low complexity algorithms. Then, based on the proposed algorithms, a low complexity algorithm is developed to obtain an inner bound of the Pareto front for the illumination-SE tradeoff. The inner bound for the Pareto-front is shown to be close to the optimal Pareto-frontier via several simulation scenarios for different system parameters.

  18. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations in order to remain functional and sustainable, safety must remain a priority and respective efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of

  19. Particle size-shape distributions: the general spheroid problem. I. Mathematical model.

    Science.gov (United States)

    Orive, L M

    1976-08-01

    The development of stereological methods for the study of dilute phases of particles, voids or organelles embedded in a matrix, from measurements made on plane or linear intercepts through the aggregate, has attracted a great deal of effort. With almost no exception, the problem of describing the particulate phase is reduced to that of identifying the statistical distribution--histogram in practice--of a relevant size parameter, with the previous assumption that the particles are modelled by geometrical objects of a constant shape (e.g. spheres). Therefore, particles exhibiting a random variation about a given type of shape as well as a random variation in size escape previous analyses. Such is the case of unequiaxed particles modelled by triaxial ellipsoids of variable size and eccentricity parameters. It has been conjectured (Moran, 1972) that this problem is indeterminate in its generality (i.e. the elliptical sections do not furnish sufficient information to permit a complete description of the ellipsoids). A proof of this conjecture is given in the Appendix. When the ellipsoids are biaxial (spheroids) and of the same type (prolate or oblate), the problem is identifiable. Previous attempts to solve it assume statistical independence between size and shape. A complete, theoretical solution of the spheroids problem--with the independence condition relaxed--is presented. A number of exact relationships--some of them of a striking simplicity--linking particle properties (e.g. mean caliper length, mean axial ratio, correlation coefficient between principal diameters, etc.) on the one hand, with the major and minor dimensions of the ellipses of section on the other, emerge, and natural, consistent estimators of the mentioned properties are made easily accessible for practical computation. Finally, the scope and limitations of the mathematical model are discussed.

  20. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2009-01-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed to be very powerful.
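
    The building block of this family is the Kaniadakis kappa-exponential. Under the parameterization F(x) = 1 - exp_kappa(-beta*x^alpha) used in this line of work (stated here as an assumption), the survival function has an exponential-like bulk and a Pareto power-law tail of exponent alpha/kappa:

        import numpy as np

        def exp_kappa(u, kappa):
            """Kaniadakis kappa-exponential; reduces to exp(u) as kappa -> 0."""
            if kappa == 0.0:
                return np.exp(u)
            return (np.sqrt(1.0 + kappa**2 * u**2) + kappa * u) ** (1.0 / kappa)

        def survival(x, alpha, beta, kappa):
            """P(X > x) for the kappa-generalized income distribution."""
            return exp_kappa(-beta * np.asarray(x, dtype=float) ** alpha, kappa)

        x = np.logspace(-1, 2, 5)
        print(survival(x, alpha=2.0, beta=0.5, kappa=0.7))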

  1. Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning.

    Science.gov (United States)

    Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2009-10-21

    One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.

  2. Pareto navigation-algorithmic foundation of interactive multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Monz, M; Kuefer, K H; Bortfeld, T R; Thieke, C

    2008-01-01

    Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle -- a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interactions, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far.

  3. Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Serna, J I; Monz, M; Kuefer, K H; Thieke, C

    2009-01-01

    One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.

  4. Ranking of microRNA target prediction scores by Pareto front analysis.

    Science.gov (United States)

    Sahoo, Sudhakar; Albrecht, Andreas A

    2010-12-01

    Over the past ten years, a variety of microRNA target prediction methods has been developed, and many of the methods are constantly improved and adapted to recent insights into miRNA-mRNA interactions. In a typical scenario, different methods return different rankings of putative targets, even if the ranking is reduced to selected mRNAs that are related to a specific disease or cell type. For the experimental validation it is then difficult to decide in which order to process the predicted miRNA-mRNA bindings, since each validation is a laborious task and therefore only a limited number of mRNAs can be analysed. We propose a new ranking scheme that combines ranked predictions from several methods and - unlike standard thresholding methods - utilises the concept of Pareto fronts as defined in multi-objective optimisation. In the present study, we attempt a proof of concept by applying the new ranking scheme to hsa-miR-21, hsa-miR-125b, and hsa-miR-373 and prediction scores supplied by PITA and RNAhybrid. The scores are interpreted as a two-objective optimisation problem, and the elements of the Pareto front are ranked by the STarMir score with a subsequent re-calculation of the Pareto front after removal of the top-ranked mRNA from the basic set of prediction scores. The method is evaluated on validated targets of the three miRNAs, and the ranking is compared to scores from DIANA-microT and TargetScan. We observed that the new ranking method performs well and consistently, and the first validated targets are elements of Pareto fronts at a relatively early stage of the recurrent procedure, which encourages further research towards a higher-dimensional analysis of Pareto fronts. Copyright © 2010 Elsevier Ltd. All rights reserved.
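
    A schematic version of the recurrent procedure: extract the two-objective Pareto front of the prediction scores, rank the front by a third score (the role STarMir plays above), remove the top-ranked mRNA, and recompute. Score orientations (lower = better for every score) are assumptions for illustration; real PITA/RNAhybrid scores may need sign flips.

        import numpy as np

        def pareto_front_indices(scores):
            """Indices of the non-dominated points (minimization in both objectives)."""
            s = np.asarray(scores, dtype=float)
            return [i for i in range(len(s))
                    if not any(np.all(s[j] <= s[i]) and np.any(s[j] < s[i])
                               for j in range(len(s)))]

        def rank_by_recurrent_fronts(scores, tiebreak):
            """One target per iteration: take the current front, pick its best
            member by the tie-break score, remove it, and recompute the front."""
            remaining = list(range(len(scores)))
            ranking = []
            while remaining:
                sub = [scores[i] for i in remaining]
                front = [remaining[i] for i in pareto_front_indices(sub)]
                best = min(front, key=lambda i: tiebreak[i])
                ranking.append(best)
                remaining.remove(best)
            return ranking

        scores = [(0.2, 0.9), (0.5, 0.3), (0.1, 0.8), (0.9, 0.1)]   # two methods' scores
        starmir = [0.7, 0.2, 0.5, 0.9]                              # tie-break score
        print(rank_by_recurrent_fronts(scores, starmir))            # e.g. [1, 2, 0, 3]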

  5. Pareto navigation: algorithmic foundation of interactive multi-criteria IMRT planning.

    Science.gov (United States)

    Monz, M; Küfer, K H; Bortfeld, T R; Thieke, C

    2008-02-21

    Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle -- a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interactions, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far.

  6. Approximate analysis of non-stationary loss queues and networks of loss queues with general service time distributions

    OpenAIRE

    Izady, N; Worthington, D J

    2011-01-01

    A Fixed Point Approximation (FPA) method has recently been suggested for non-stationary analysis of loss queues and networks of loss queues with Exponential service times. Deriving exact equations relating time-dependent mean numbers of busy servers to blocking probabilities, we generalize the FPA method to loss systems with general service time distributions. These equations are combined with associated formulae for stationary analysis of loss systems in steady state through a carried load t...

  7. Power Flow Calculation for Weakly Meshed Distribution Networks with Multiple DGs Based on Generalized Chain-table Storage Structure

    DEFF Research Database (Denmark)

    Chen, Shuheng; Hu, Weihao; Chen, Zhe

    2014-01-01

    Based on generalized chain-table storage structure (GCTSS), a novel power flow method is proposed, which can be used to solve the power flow of weakly meshed distribution networks with multiple distributed generators (DGs). GCTSS is designed based on chain-table technology, and its target is to describe the topology of radial distribution networks with a clear logic and a small memory size. The strategies of compensating the equivalent currents of break-point branches and the reactive power outputs of PV-type DGs are presented on the basis of the superposition theorem, and their formulations are given. Case studies are done on the modified version of the IEEE 69-bus distribution system. The results verify that the proposed method can keep a good efficiency level. Hence, it is promising to calculate the power flow of weakly meshed distribution networks with multiple DGs.

  8. Combining Generalized Renewal Processes with Non-Extensive Entropy-Based q-Distributions for Reliability Applications

    Directory of Open Access Journals (Sweden)

    Isis Didier Lins

    2018-03-01

    Full Text Available The Generalized Renewal Process (GRP is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from the Tsallis’ non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative for the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for their parameters’ estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest GRP plus q-distributions are promising techniques for the analyses of repairable systems.
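
    A small sketch of the q-distributions named above, under one common parameterization (conventions vary, so this is an assumption): the survival function is built from the Tsallis q-exponential, recovering the ordinary Weibull as q -> 1 and the q-exponential model for beta = 1.

        import numpy as np

        def q_exp(x, q):
            """Tsallis q-exponential; recovers exp(x) as q -> 1."""
            if q == 1.0:
                return np.exp(x)
            base = 1.0 + (1.0 - q) * x
            return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

        def q_weibull_survival(t, q, beta, eta):
            """q-Weibull survival: power-law tail for q > 1, Weibull for q = 1."""
            t = np.asarray(t, dtype=float)
            return q_exp(-(t / eta) ** beta, q) ** (2.0 - q)

        t = np.linspace(0.0, 10.0, 6)
        print(q_weibull_survival(t, q=1.2, beta=1.5, eta=2.0))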

  9. Energy distributions of Bianchi type-VIh Universe in general relativity ...

    Indian Academy of Sciences (India)

    We have found exactly the same results for Einstein, Bergmann–Thomson and Landau–Lifshitz energy–momentum distributions in the Bianchi type-VIh metric for different gravitation theories. The energy–momentum distributions of the Bianchi type-VIh metric are found to be zero for h = −1 in GR and TG. However, our ...

  10. 26 CFR 1.332-1 - Distributions in liquidation of subsidiary corporation; general.

    Science.gov (United States)

    2010-04-01

    Excerpt from Title 26 (Internal Revenue), Code of Federal Regulations (2010-04-01): Department of the Treasury (continued), Income Tax (continued), Income Taxes, Corporate Liquidations, § 1.332-1, Distributions in liquidation of subsidiary corporation; general. The section concerns property received upon complete liquidations such as described in this section; see section 453(d)(4)(A...

  11. The randomly renewed general item and the randomly inspected item with exponential life distribution

    International Nuclear Information System (INIS)

    Schneeweiss, W.G.

    1979-01-01

    For a randomly renewed item the probability distributions of the time to failure and of the duration of down time and the expectations of these random variables are determined. Moreover, it is shown that the same theory applies to randomly checked items with exponential probability distribution of life such as electronic items. The case of periodic renewals is treated as an example. (orig.) [de

  12. The generalized 20/80 law using probabilistic fractals applied to petroleum field size

    Science.gov (United States)

    Crovelli, R.A.

    1995-01-01

    Fractal properties of the Pareto probability distribution are used to generalize "the 20/80 law." The 20/80 law is a heuristic law that has evolved over the years into the following rule of thumb for many populations: 20 percent of the population accounts for 80 percent of the total value. The general p100/q100 law in probabilistic form is defined with q as a function of p, where p is the population proportion and q is the proportion of total value. Using the Pareto distribution, the p100/q100 law in fractal form is derived with the parameter q being a fractal, where q unexpectedly possesses the scale invariance property. The 20/80 law is a special case of the p100/q100 law in fractal form. The p100/q100 law in fractal form is applied to petroleum field-size data to obtain p and q such that p100% of the oil fields greater than any specified scale or size in a geologic play account for q100% of the total oil of the fields. The theoretical percentages of total resources of oil using the fractal q are extremely close to the empirical percentages from the data using the statistic q. Also, the empirical scale invariance property of the statistic q for the petroleum field-size data is in excellent agreement with the theoretical scale invariance property of the fractal q. © 1995 Oxford University Press.
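
    The relation itself is one line: for a Pareto population with shape alpha > 1, the largest fraction p of the population holds the share q = p^(1 - 1/alpha) of total value, and q is scale invariant because truncating a Pareto distribution leaves alpha unchanged. A short check, solving for the alpha implied by the 20/80 rule:

        import numpy as np

        def value_share(p, alpha):
            """Share q of total value held by the top fraction p of a
            Pareto(alpha) population (finite mean requires alpha > 1)."""
            return p ** (1.0 - 1.0 / alpha)

        # alpha implied by "20% hold 80%": solve 0.8 = 0.2**(1 - 1/alpha).
        alpha = np.log(0.2) / (np.log(0.2) - np.log(0.8))   # about 1.16
        print(alpha, value_share(0.2, alpha))               # q recovers 0.8
        print(value_share(0.1, alpha))                      # top 10% -> ~73% of value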

  13. Modelling and Pareto optimization of mechanical properties of friction stir welded AA7075/AA5083 butt joints using neural network and particle swarm algorithm

    International Nuclear Information System (INIS)

    Shojaeefard, Mohammad Hasan; Behnagh, Reza Abdi; Akbari, Mostafa; Givi, Mohammad Kazem Besharati; Farhani, Foad

    2013-01-01

    Highlights: ► Defect-free friction stir welds have been produced for AA5083-O/AA7075-O. ► Back-propagation was sufficient for predicting hardness and tensile strength. ► A hybrid multi-objective algorithm is proposed to deal with this MOP. ► Multi-objective particle swarm optimization was used to find the Pareto solutions. ► TOPSIS is used to rank the given alternatives of the Pareto solutions. -- Abstract: Friction Stir Welding (FSW) has been successfully used to weld similar and dissimilar cast and wrought aluminium alloys, especially aircraft aluminium alloys, which generally present low weldability in the traditional fusion welding process. This paper focuses on the microstructural and mechanical properties of the Friction Stir Welding (FSW) of AA7075-O to AA5083-O aluminium alloys. Weld microstructures, hardness and tensile properties were evaluated in the as-welded condition. Tensile tests indicated that mechanical properties of the joint were better than in the base metals. An Artificial Neural Network (ANN) model was developed to simulate the correlation between the Friction Stir Welding parameters and mechanical properties. Performance of the ANN model was excellent and the model was employed to predict the ultimate tensile strength and hardness of the butt joint of AA7075–AA5083 as functions of weld and rotational speeds. The multi-objective particle swarm optimization was used to obtain the Pareto-optimal set. Finally, the Technique for Order Preference by Similarity to the Ideal Solution (TOPSIS) was applied to determine the best compromised solution.

  14. Calculating and controlling the error of discrete representations of Pareto surfaces in convex multi-criteria optimization.

    Science.gov (United States)

    Craft, David

    2010-10-01

    A discrete set of points and their convex combinations can serve as a sparse representation of the Pareto surface in multiple objective convex optimization. We develop a method to evaluate the quality of such a representation, and show by example that in multiple objective radiotherapy planning, the number of Pareto optimal solutions needed to represent Pareto surfaces of up to five dimensions grows at most linearly with the number of objectives. The method described is also applicable to the representation of convex sets. Copyright © 2009 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. A Novel Multivariate Generalized Skew-Normal Distribution with Two Parameters BGSNn, m (λ1, λ2)

    Directory of Open Access Journals (Sweden)

    Fathi B.

    2014-07-01

    Full Text Available In this paper we first introduce a new class of multivariate generalized asymmetric skew-normal distributions with two parameters λ1, λ2, which we denote by BGSNn,m(λ1, λ2), and we finally obtain some special properties of BGSNn,m(λ1, λ2).

  16. A Novel Multivariate Generalized Skew-Normal Distribution with Two Parameters BGSNn, m (λ1, λ2)

    OpenAIRE

    Fathi B.; Hasanalipour P.

    2014-01-01

    In this paper we first introduce a new class of multivariate generalized asymmetric skew-normal distributions with two parameters λ1, λ2, which we denote by BGSNn,m(λ1, λ2), and we finally obtain some special properties of BGSNn,m(λ1, λ2).

  17. Comparison of general obesity and measures of body fat distribution in older adults in relation to cancer risk

    NARCIS (Netherlands)

    Freisling, Heinz; Arnold, Melina; Soerjomataram, Isabelle; O'Doherty, Mark George; Ordóñez-Mena, José Manuel; Bamia, Christina; Kampman, Ellen; Leitzmann, Michael; Romieu, Isabelle; Kee, Frank

    2017-01-01

    Background: We evaluated the associations of anthropometric indicators of general obesity (body mass index, BMI), an established risk factor for various cancers, and body fat distribution (waist circumference, WC; hip circumference, HC; and waist-to-hip ratio, WHR), which may better reflect

  18. Fisher's method of combining dependent statistics using generalizations of the gamma distribution with applications to genetic pleiotropic associations.

    Science.gov (United States)

    Li, Qizhai; Hu, Jiyuan; Ding, Juan; Zheng, Gang

    2014-04-01

    A classical approach to combine independent test statistics is Fisher's combination of p-values, which follows the χ2 distribution. When the test statistics are dependent, the gamma distribution (GD) is commonly used for the Fisher's combination test (FCT). We propose to use two generalizations of the GD: the generalized and the exponentiated GDs. We study some properties of mis-using the GD for the FCT to combine dependent statistics when one of the two proposed distributions is true. Our results show that both generalizations have better control of type I error rates than the GD, which tends to have inflated type I error rates at more extreme tails. In practice, common model selection criteria (e.g. Akaike information criterion/Bayesian information criterion) can be used to help select a better distribution to use for the FCT. A simple strategy for using the two generalizations of the GD in genome-wide association studies is discussed. Applications of the results to genetic pleiotropic associations are described, where multiple traits are tested for association with a single marker.
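
    For orientation, a sketch of the baseline the paper generalizes: the independent-case chi-square combination and a plain moment-matched gamma approximation for dependent statistics (Brown-style). The paper's generalized and exponentiated GDs add extra shape parameters and are not reproduced here; var_T is a placeholder that would have to be derived from the correlation structure.

        import numpy as np
        from scipy.stats import chi2, gamma

        def fisher_independent(pvals):
            """Classical Fisher combination: T = -2*sum(log p) ~ chi2 with 2k d.o.f."""
            T = -2.0 * np.sum(np.log(pvals))
            return chi2.sf(T, df=2 * len(pvals))

        def fisher_gamma(pvals, var_T):
            """Gamma approximation for dependent statistics: match the mean 2k and
            a variance var_T that reflects the dependence among the p-values."""
            T = -2.0 * np.sum(np.log(pvals))
            mean_T = 2.0 * len(pvals)
            shape, scale = mean_T**2 / var_T, var_T / mean_T
            return gamma.sf(T, a=shape, scale=scale)

        p = np.array([0.01, 0.04, 0.20])
        print(fisher_independent(p))
        print(fisher_gamma(p, var_T=1.5 * 4 * len(p)))   # variance inflated by dependence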

  19. Molecular dynamics equation designed for realizing arbitrary density: Application to sampling method utilizing the Tsallis generalized distribution

    International Nuclear Information System (INIS)

    Fukuda, Ikuo; Nakamura, Haruki

    2010-01-01

    Several molecular dynamics techniques applying the Tsallis generalized distribution are presented. We have developed a deterministic dynamics to generate an arbitrary smooth density function ρ. It creates a measure-preserving flow with respect to the measure ρdω and realizes the density ρ under the assumption of ergodicity. It can thus be used to investigate physical systems that obey such a distribution density. Using this technique, the Tsallis distribution density based on a full energy function form along with the Tsallis index q ≥ 1 can be created. Because the effective support of the Tsallis distribution in phase space is broad compared with that of the conventional Boltzmann-Gibbs (BG) distribution, and because the corresponding energy-surface deformation does not change energy minimum points, the dynamics enhances physical state sampling, in particular for a rugged energy surface spanned by a complicated system. Another feature of the Tsallis distribution is that it provides a greater degree of nonlinearity in the deterministic dynamics equation than the BG distribution, which is very useful for effectively attaining the ergodicity of the dynamical system constructed according to the scheme. Combining such methods with the reconstruction technique of the BG distribution, we can obtain information consistent with the BG ensemble and create the corresponding free energy surface. We demonstrate several sampling results obtained from systems typical for benchmark tests in MD and from biomolecular systems.
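
    The Tsallis weight at the heart of such schemes is simple to state; as a stand-in for the paper's measure-preserving deterministic dynamics (not reproduced here), the sketch below samples the corresponding density with a toy Metropolis walker on a rugged one-dimensional landscape. All parameter values are illustrative assumptions.

        import numpy as np

        def tsallis_weight(E, beta, q):
            """Unnormalized Tsallis weight; q = 1 recovers the Boltzmann-Gibbs factor."""
            if q == 1.0:
                return np.exp(-beta * E)
            base = 1.0 - (1.0 - q) * beta * E
            return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

        def metropolis_tsallis(energy, x0, steps, beta=1.0, q=1.3, step=0.5, seed=0):
            """q > 1 fattens the tails of the sampled density, easing barrier crossing."""
            rng = np.random.default_rng(seed)
            x, samples = x0, []
            for _ in range(steps):
                x_new = x + step * rng.normal()
                ratio = tsallis_weight(energy(x_new), beta, q) / \
                        max(tsallis_weight(energy(x), beta, q), 1e-300)
                if rng.random() < min(1.0, ratio):
                    x = x_new
                samples.append(x)
            return np.asarray(samples)

        # Rugged double-well energy surface:
        samples = metropolis_tsallis(lambda x: (x**2 - 1.0)**2 + 0.3 * x, x0=1.0, steps=5000)
        print(samples.mean(), samples.std())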

  20. "From Plato to Pareto": The Western Civilization Course Reconsidered.

    Science.gov (United States)

    Mullaney, Marie Marmo

    1986-01-01

    Discusses the importance of historical study within general education. Reviews the rise and fall of the Western Civilization course as the core of general education in the humanities. Suggests ways a revised version of this course can be restored to a central place in the curriculum. (AYC)

  1. Generalization of DNA microarray dispersion properties: microarray equivalent of t-distribution

    DEFF Research Database (Denmark)

    Novak, Jaroslav P; Kim, Seon-Young; Xu, Jun

    2006-01-01

    ... different types, and we demonstrate that the Gaussian (normal) frequency distribution is characteristic for the variability of gene expression values. However, typically 5 to 15% of the samples deviate from normality. Furthermore, it is shown that the frequency distributions of the difference of expression... deviation derived from the consecutive samples is equivalent to the standard deviation obtained from individual genes. Finally, we determine the boundaries of probability intervals and demonstrate that the coefficients defining the intervals are independent of sample characteristics, variability of data, laboratory conditions and type of chips. These coefficients are very closely correlated with Student's t-distribution. CONCLUSION: In this study we ascertained that the non-systematic variations possess Gaussian distribution, determined the probability intervals and demonstrated that the K...

  2. An Encoding Technique for Multiobjective Evolutionary Algorithms Applied to Power Distribution System Reconfiguration

    Directory of Open Access Journals (Sweden)

    J. L. Guardado

    2014-01-01

    Full Text Available Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2 and the Nondominated Sorting Genetic Algorithm II (NSGA-II. The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  3. An encoding technique for multiobjective evolutionary algorithms applied to power distribution system reconfiguration.

    Science.gov (United States)

    Guardado, J L; Rivas-Davalos, F; Torres, J; Maximov, S; Melgoza, E

    2014-01-01

    Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  4. Distribution of cocaine on banknotes in general circulation in England and Wales.

    Science.gov (United States)

    Aitken, C G G; Wilson, A; Sleeman, R; Morgan, B E M; Huish, J

    2017-01-01

    A study of the quantities of cocaine on banknotes in general circulation was conducted to investigate regional variations across England and Wales. No meaningful support was found for the proposition that there is regional variation in the quantities of cocaine in banknotes in general circulation in England and Wales. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. Generalized negative binomial distribution: a promising statistical distribution for Oncomelania hupensis in the lake- and marsh-land regions of China.

    Science.gov (United States)

    Zhang, Z J; Ong, S H; Lynn, H S; Peng, W X; Zhou, Y B; Zhao, G M; Jiang, Q W

    2008-09-01

    A new generalization of the negative binomial distribution (GNBD) is introduced and fitted to counts of Oncomelania hupensis, the intermediate host of Schistosoma japonicum, made, in areas of Chinese lakeland and marshland, early in the winter of 2005 and late in the spring of 2006. The GNBD was found to fit the snail data better than the standard negative binomial distribution (NBD) that has previously been widely used to model the distribution of O. hupensis. With two more parameters than the NBD, the GNBD can integrate many discrete distributions and is more flexible than the NBD in modelling O. hupensis. It also provides a better theoretical distribution for the quantitative study of O. hupensis, especially in building an accurate prediction model of snail density. The justification for adopting the GNBD is discussed. The GNBD allows researchers to broaden the field in the quantitative study not only of O. hupensis and schistosomiasis japonica but also of other environment-related helminthiases and family-clustered diseases that have, traditionally, been modelled using the NBD.
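
    A hedged sketch of one common parameterization of a generalized negative binomial pmf, the Jain-Consul form, which reduces to the standard NBD at beta = 1; the paper's exact variant may differ, and the parameter values below are illustrative only.

        # Jain-Consul generalized negative binomial pmf (one common form;
        # the paper's exact parameterization may differ).  beta = 1
        # recovers the standard negative binomial distribution.
        import numpy as np
        from scipy.special import gammaln

        def gnbd_logpmf(x, m, beta, theta):
            x = np.asarray(x, dtype=float)
            n = m + beta * x
            return (np.log(m) - np.log(n)
                    + gammaln(n + 1) - gammaln(x + 1) - gammaln(n - x + 1)
                    + x * np.log(theta) + (n - x) * np.log1p(-theta))

        # Sanity check: probabilities over a wide support should sum to ~1
        # when the constraints hold (m > 0, 0 < theta < 1, theta*beta < 1).
        xs = np.arange(0, 400)
        p = np.exp(gnbd_logpmf(xs, m=2.0, beta=1.5, theta=0.3))
        print(p.sum())  # ~1.0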

  6. Optimization of externalities using DTM measures: a Pareto optimal multi objective optimization using the evolutionary algorithm SPEA2+

    NARCIS (Netherlands)

    Wismans, Luc Johannes Josephus; van Berkum, Eric C.; Bliemer, Michiel; Allkim, T.P.; van Arem, Bart

    2010-01-01

    Multi objective optimization of externalities of traffic is performed solving a network design problem in which Dynamic Traffic Management measures are used. The resulting Pareto optimal set is determined by employing the SPEA2+ evolutionary algorithm.

  7. MULTI-OBJECTIVE OPTIMAL DESIGN OF GROUNDWATER REMEDIATION SYSTEMS: APPLICATION OF THE NICHED PARETO GENETIC ALGORITHM (NPGA). (R826614)

    Science.gov (United States)

    A multiobjective optimization algorithm is applied to a groundwater quality management problem involving remediation by pump-and-treat (PAT). The multiobjective optimization framework uses the niched Pareto genetic algorithm (NPGA) and is applied to simultaneously minimize the...

  8. General solution of the chemical master equation and modality of marginal distributions for hierarchic first-order reaction networks.

    Science.gov (United States)

    Reis, Matthias; Kromer, Justus A; Klipp, Edda

    2018-01-20

    Multimodality is a phenomenon which complicates the analysis of statistical data based exclusively on mean and variance. Here, we present criteria for multimodality in hierarchic first-order reaction networks, consisting of catalytic and splitting reactions. Those networks are characterized by independent and dependent subnetworks. First, we prove the general solvability of the Chemical Master Equation (CME) for this type of reaction network and thereby extend the class of solvable CMEs. Our general solution is analytical in the sense that it allows for a detailed analysis of its statistical properties. Given Poisson/deterministic initial conditions, we then prove the independent species to be Poisson/binomially distributed, while the dependent species exhibit generalized Poisson/Khatri Type B distributions. Generalized Poisson/Khatri Type B distributions are multimodal for an appropriate choice of parameters. We illustrate our criteria for multimodality by several basic models, as well as the well-known two-stage transcription-translation network and Bateman's model from nuclear physics. For both examples, multimodality was previously not reported.

  9. Multivariate Pareto Minification Processes | Umar | Journal of the ...

    African Journals Online (AJOL)

    Autoregressive (AR) and autoregressive moving average (ARMA) processes with multivariate exponential (ME) distribution are presented and discussed. The theory of positive dependence is used to show that in many cases, multivariate exponential autoregressive (MEAR) and multivariate autoregressive moving average ...

  10. Pareto Principle in Datamining: an Above-Average Fencing Algorithm

    Directory of Open Access Journals (Sweden)

    K. Macek

    2008-01-01

    Full Text Available This paper formulates a new datamining problem: finding the subset of the input space with the relatively highest output, where the minimal size of this subset is given. This can be useful where usual datamining methods fail because of error-distribution asymmetry. The paper provides a novel algorithm for this datamining problem and compares it with clustering of above-average individuals.

  11. A divide-and-conquer approach to determine the Pareto frontier for optimization of protein engineering experiments.

    Science.gov (United States)

    He, Lu; Friedman, Alan M; Bailey-Kellogg, Chris

    2012-03-01

    In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability versus novelty, affinity versus specificity, activity versus immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not "dominated"; that is, no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, Protein Engineering Pareto FRontier (PEPFR), that hierarchically subdivides the objective space, using appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. Copyright © 2011 Wiley Periodicals, Inc.
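
    PEPFR couples divide-and-conquer over objective space with dynamic or integer programming optimizers; the sketch below shows only the generic sort-and-scan extraction of a two-objective Pareto frontier (both objectives maximized), with hypothetical (stability, diversity) scores.

        # Sort-and-scan extraction of a two-objective Pareto frontier
        # (both objectives maximized).  This is only the final frontier
        # step; PEPFR's DP/IP design optimizers are not shown.

        def pareto_frontier_2d(designs):
            """designs: list of (obj1, obj2) pairs, both to be maximized."""
            frontier, best2 = [], float("-inf")
            for d in sorted(designs, key=lambda d: (-d[0], -d[1])):
                if d[1] > best2:          # strictly better in the 2nd objective
                    frontier.append(d)
                    best2 = d[1]
            return frontier

        # Hypothetical (stability, diversity) scores of candidate designs:
        print(pareto_frontier_2d([(0.9, 1.0), (0.8, 2.5), (0.7, 2.0), (0.6, 3.1)]))
        # -> [(0.9, 1.0), (0.8, 2.5), (0.6, 3.1)]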

  12. A clinical distance measure for evaluating treatment plan quality difference with Pareto fronts in radiotherapy

    Directory of Open Access Journals (Sweden)

    Kristoffer Petersson

    2017-07-01

    Full Text Available We present a clinical distance measure for Pareto front evaluation studies in radiotherapy, which we show strongly correlates (r = 0.74 and 0.90) with clinical plan quality evaluation. For five prostate cases, sub-optimal treatment plans located at a clinical distance value of >0.32 (0.28–0.35) from fronts of Pareto optimal plans were assessed to be of lower plan quality by our 12 observers (p < .05). In conclusion, the clinical distance measure can be used to determine if the difference between a front and a given plan (or between different fronts) corresponds to a clinically significant plan quality difference.

  13. Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives

    International Nuclear Information System (INIS)

    Warmflash, Aryeh; Siggia, Eric D; Francois, Paul

    2012-01-01

    The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input–output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria. (paper)

  14. A Pareto Algorithm for Efficient De Novo Design of Multi-functional Molecules.

    Science.gov (United States)

    Daeyaert, Frits; Deem, Michael W

    2017-01-01

    We have introduced a Pareto sorting algorithm into Synopsis, a de novo design program that generates synthesizable molecules with desirable properties. We give a detailed description of the algorithm and illustrate its working in 2 different de novo design settings: the design of putative dual and selective FGFR and VEGFR inhibitors, and the successful design of organic structure determining agents (OSDAs) for the synthesis of zeolites. We show that the introduction of Pareto sorting not only enables the simultaneous optimization of multiple properties but also greatly improves the performance of the algorithm to generate molecules with hard-to-meet constraints. This in turn allows us to suggest approaches to address the problem of false positive hits in de novo structure based drug design by introducing structural and physicochemical constraints in the designed molecules, and by forcing essential interactions between these molecules and their target receptor. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives.

    Science.gov (United States)

    Warmflash, Aryeh; Francois, Paul; Siggia, Eric D

    2012-10-01

    The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input-output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria.

  16. Applying Pareto multi-criteria decision making in concurrent engineering: A case study of polyethylene industry

    Directory of Open Access Journals (Sweden)

    Akbar A. Tabriz

    2011-07-01

    Full Text Available Concurrent engineering (CE) is one of the widest known techniques for simultaneous planning of product and process design. In concurrent engineering, design processes are often complicated, with multiple conflicting criteria and discrete sets of feasible alternatives. Thus multi-criteria decision making (MCDM) techniques are integrated into CE to perform concurrent design. This paper proposes a design framework governed by an MCDM technique, in which the criteria are in conflict in the sense of competing for common resources to achieve various performance objectives (financial, functional, environmental, etc.). The Pareto MCDM model is applied to polyethylene pipe concurrent design governed by four criteria to determine the best alternative design, the Pareto-compromise design.

  17. Abdominal Adiposity Distribution Quantified by Ultrasound Imaging and Incident Hypertension in a General Population

    DEFF Research Database (Denmark)

    Seven, Ekim; Thuesen, Betina H; Linneberg, Allan

    2016-01-01

    Abdominal obesity is a major risk factor for hypertension. However, different distributions of abdominal adipose tissue may affect hypertension risk differently. The main purpose of this study was to explore the association of subcutaneous abdominal adipose tissue (SAT) and visceral adipose tissue...

  18. Improving predicted protein loop structure ranking using a Pareto-optimality consensus method.

    Science.gov (United States)

    Li, Yaohang; Rata, Ionel; Chiu, See-wing; Jakobsson, Eric

    2010-07-20

    Accurate protein loop structure models are important to understand functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. We have developed a Pareto Optimal Consensus (POC) method, which is a consensus model ranking approach to integrate multiple knowledge- or physics-based scoring functions. The procedure of identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops of 4 to 12 residues in length, using a functional space composed of several carefully selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of approximately 20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding the best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation and identifying a near-native model (RMSD …). Based on Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set.

  19. Efficiency of Pareto joint inversion of 2D geophysical data using global optimization methods

    Science.gov (United States)

    Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek

    2016-04-01

    Pareto joint inversion of two or more sets of data is a promising new tool of modern geophysical exploration. In the first stage of our investigation we created software enabling execution of forward solvers for two geophysical methods (2D magnetotelluric and gravity), as well as inversion with the possibility of constraining the solution with seismic data. In the MT forward solver, Helmholtz's equations, the finite element method and Dirichlet boundary conditions were applied. The gravity forward solver was based on Talwani's algorithm. To limit the dimensionality of the solution space, we decided to describe the model as sets of polygons, using the Sharp Boundary Interface (SBI) approach. The main inversion engine was created using a Particle Swarm Optimization (PSO) algorithm adapted to handle two or more target functions and to prevent acceptance of solutions which are non-realistic or incompatible with the Pareto scheme. Each inversion run generates a single Pareto solution, which can be added to the Pareto front. The PSO inversion engine was parallelized using the OpenMP standard, which enabled executing the code with a practically unlimited number of threads at once, thereby significantly decreasing the computing time of the inversion process. Furthermore, computing efficiency increases with the number of PSO iterations. In this contribution we analyze the efficiency of the created software, taking into consideration details of the chosen global optimization engine used as the main joint minimization engine. Additionally, we study the possible decrease of computational time achieved by different methods of parallelization applied to both forward solvers and the inversion algorithm. All tests were done for 2D magnetotelluric and gravity data based on real geological media. The obtained results show that even for a relatively simple mid-range computational infrastructure, the proposed solution of the inversion problem can be applied in practice and used for real-life problems of geophysical inversion and interpretation.

  20. Application of ISA and Pareto diagram in the management of the plots of Lagoa Carapebus, Serra/ES

    OpenAIRE

    Neumann, Bruna; Calmon, Ana Paula Santos; Aguiar, Marluce Martins

    2013-01-01

    Application of the Indicator of Environmental Health (ISA), with further elaboration of a Pareto diagram, allowed verification of the sanitary and environmental conditions of the plots of Lagoa Carapebus, together with the use of primary data (field information) and the application of forms to the local community. These management tools include qualitative and quantitative aspects of public services. The final ISA presented a situation of average health, because high scores of some components of the ISA provided th...
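
    A generic Pareto diagram of the kind described above can be drawn with a few lines of matplotlib; the component labels and deficit counts below are hypothetical stand-ins, not data from the study.

        # Generic Pareto diagram: bars sorted by magnitude plus a cumulative
        # percentage line.  Labels and counts are hypothetical stand-ins for
        # sanitation-indicator deficits, not data from the study.
        import matplotlib.pyplot as plt
        import numpy as np

        labels = ["water supply", "sewage", "solid waste", "drainage", "vectors"]
        counts = np.array([120, 90, 45, 25, 10])
        order = np.argsort(counts)[::-1]
        counts, labels = counts[order], [labels[i] for i in order]
        cum_pct = 100 * np.cumsum(counts) / counts.sum()

        fig, ax1 = plt.subplots()
        ax1.bar(range(len(counts)), counts)
        ax1.set_xticks(range(len(counts)))
        ax1.set_xticklabels(labels, rotation=45, ha="right")
        ax2 = ax1.twinx()                                   # cumulative-% axis
        ax2.plot(range(len(counts)), cum_pct, marker="o", color="tab:red")
        ax2.axhline(80, linestyle="--", color="gray")       # the 80% threshold
        ax2.set_ylim(0, 105)
        plt.tight_layout()
        plt.show()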

  1. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array.

    Science.gov (United States)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-20

    This research proposes various versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, adaptive inertia weight to control the positions exploration of the potential best host nests (solutions), and dynamic discovery rate to manage the fraction probability of finding the best host nests in 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele's (ZDT's) test functions. Pareto optimum trade-offs are made to generate a set of three non-dominated solutions, which are locations, excitation amplitudes, and excitation phases of array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining a high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.

  2. Tapped density optimisation for four agricultural wastes - Part II: Performance analysis and Taguchi-Pareto

    Directory of Open Access Journals (Sweden)

    Ajibade Oluwaseyi Ayodele

    2016-01-01

    Full Text Available In this attempt, which is the second part of discussions on tapped density optimisation for four agricultural wastes (particles of coconut, periwinkle, palm kernel and egg shells), a performance analysis for a comparative basis is made. This paper pioneers a study direction in which optimisation of process variables is pursued using the Taguchi method integrated with the Pareto 80-20 rule. Negative percentage improvements resulted when the optimal tapped density was compared with the average tapped density. However, the performance analysis between optimal tapped density and the peak tapped density values yielded positive percentage improvements for the four filler particles. The performance analysis results validate the effectiveness of using the Taguchi method in improving the tapped density properties of the filler particles. The application of the Pareto 80-20 rule to the table of parameters and levels produced revised tables of parameters and levels, which helped to identify the factor-level positions of each parameter that are economical to optimality. The Pareto 80-20 rule also produced revised S/N response tables, which were used to identify the S/N ratios relevant to optimality.

  3. Application of the Pareto principle to identify and address drug-therapy safety issues.

    Science.gov (United States)

    Müller, Fabian; Dormann, Harald; Pfistermeister, Barbara; Sonst, Anja; Patapovas, Andrius; Vogler, Renate; Hartmann, Nina; Plank-Kiegele, Bettina; Kirchner, Melanie; Bürkle, Thomas; Maas, Renke

    2014-06-01

    Adverse drug events (ADE) and medication errors (ME) are common causes of morbidity in patients presenting at emergency departments (ED). Recognition of ADE as being drug related and prevention of ME are key to enhancing pharmacotherapy safety in ED. We assessed the applicability of the Pareto principle (~80 % of effects result from 20 % of causes) to address locally relevant problems of drug therapy. In 752 cases consecutively admitted to the nontraumatic ED of a major regional hospital, ADE, ME, contributing drugs, preventability, and detection rates of ADE by ED staff were investigated. Symptoms, errors, and drugs were sorted by frequency in order to apply the Pareto principle. In total, 242 ADE were observed, and 148 (61.2 %) were assessed as preventable. ADE contributed to 110 inpatient hospitalizations. The ten most frequent symptoms were causally involved in 88 (80.0 %) inpatient hospitalizations. Only 45 (18.6 %) ADE were recognized as drug-related problems until discharge from the ED. A limited set of 33 drugs accounted for 184 (76.0 %) ADE; ME contributed to 57 ADE. Frequency-based listing of ADE, ME, and drugs involved allowed identification of the most relevant problems and development of easy-to-implement safety measures, such as wall and pocket charts. The Pareto principle provides a method for identifying the locally most relevant ADE, ME, and involved drugs. This permits subsequent development of interventions to increase patient safety in the ED admission process that best suit local needs.
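
    The frequency-based listing described above amounts to sorting causes by count and cutting at a cumulative share; a minimal sketch with hypothetical drug names and ADE counts:

        # Pareto-principle triage: find the smallest set of drugs accounting
        # for >= 80% of adverse drug events.  Drug names and counts are
        # hypothetical, not the study's data.
        from collections import Counter

        ade_by_drug = Counter({"drug A": 48, "drug B": 31, "drug C": 22,
                               "drug D": 15, "drug E": 9, "drug F": 4})
        total = sum(ade_by_drug.values())
        running, selected = 0, []
        for drug, n in ade_by_drug.most_common():
            selected.append(drug)
            running += n
            if running / total >= 0.80:
                break
        print(selected, f"{running/total:.0%}")
        # -> ['drug A', 'drug B', 'drug C', 'drug D'] 90%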

  4. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array

    Science.gov (United States)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A. Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-01

    This research proposes various versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, adaptive inertia weight to control the positions exploration of the potential best host nests (solutions), and dynamic discovery rate to manage the fraction probability of finding the best host nests in 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele’s (ZDT’s) test functions. Pareto optimum trade-offs are made to generate a set of three non-dominated solutions, which are locations, excitation amplitudes, and excitation phases of array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining a high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.

  5. General regularities of Sr-90 distribution in the soil-plant system under natural conditions

    International Nuclear Information System (INIS)

    Gudeliene, I.; Marchiulioniene, D.; Petroshius, R.

    2006-01-01

    Sr-90 distribution in the system 'soil - underground part of plant - aboveground part of plant' was investigated. It was determined that the Sr-90 activity concentration in the underground and aboveground parts of plants and in mosses did not depend on its activity concentration in soil. There was a direct dependence of the Sr-90 activity concentration in the aboveground parts on that in the underground parts of plants. The Sr-90 transfer factor from soil to the underground parts of plants and to mosses depended directly on this radionuclide's activity concentration in them. (authors)

  6. Statistical analysis of latent generalized correlation matrix estimation in transelliptical distribution

    OpenAIRE

    Han, Fang; Liu, Han

    2016-01-01

    Correlation matrices play a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions. As a robust alternative, Han and Liu [J. Am. Stat. Assoc. 109 (2015) 275-2...

  7. A two-component generalized extreme value distribution for precipitation frequency analysis

    Czech Academy of Sciences Publication Activity Database

    Rulfová, Zuzana; Buishand, A.; Roth, M.; Kyselý, Jan

    2016-01-01

    Vol. 534, March (2016), pp. 659-668. ISSN 0022-1694. R&D Projects: GA ČR(CZ) GA14-18675S. Institutional support: RVO:68378289. Keywords: precipitation extremes * two-component extreme value distribution * regional frequency analysis * convective precipitation * stratiform precipitation * Central Europe. Subject RIV: DG - Atmosphere Sciences, Meteorology. Impact factor: 3.483, year: 2016. http://www.sciencedirect.com/science/article/pii/S0022169416000500
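
    For orientation, a standard single-GEV fit to annual maxima with scipy is sketched below (the paper's two-component model, which treats convective and stratiform extremes separately, is not reproduced); the data are synthetic, and note that scipy's shape parameter c is the negative of the usual GEV shape ξ.

        # Baseline single-GEV fit to annual precipitation maxima; the paper's
        # two-component extension is not reproduced.  Data are synthetic.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        annual_max = stats.genextreme.rvs(c=-0.1, loc=30, scale=8,
                                          size=60, random_state=rng)

        # Note: scipy's shape c has the opposite sign of the usual GEV xi.
        c, loc, scale = stats.genextreme.fit(annual_max)
        # 100-year return level = quantile with exceedance probability 1/100.
        rl_100 = stats.genextreme.ppf(1 - 1/100, c, loc=loc, scale=scale)
        print(c, loc, scale, rl_100)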

  8. Bayesian inference for generalized linear mixed model based on the multivariate t distribution in population pharmacokinetic study.

    Science.gov (United States)

    Yan, Fang-Rong; Huang, Yuan; Liu, Jun-Lin; Lu, Tao; Lin, Jin-Guan

    2013-01-01

    This article provides a fully Bayesian approach for modeling of single-dose and complete pharmacokinetic data in a population pharmacokinetic (PK) model. To overcome the impact of outliers and the difficulty of computation, a generalized linear model is chosen with the hypothesis that the errors follow a multivariate Student t distribution, which is a heavy-tailed distribution. The aim of this study is to investigate and implement the performance of the multivariate t distribution in analyzing population pharmacokinetic data. Bayesian predictive inferences and the Metropolis-Hastings algorithm schemes are used to handle the intractable posterior integration. The precision and accuracy of the proposed model are illustrated by simulated data and a real example of theophylline data.
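
    The univariate analogue of the article's heavy-tailed error model is sketched below: maximum-likelihood fitting of a Student-t location/scale model (synthetic data) shows how a t likelihood downweights outliers relative to a Gaussian.

        # Heavy-tailed alternative to Gaussian errors: fit a location/scale
        # model by maximum likelihood under a Student-t likelihood, the
        # univariate analogue of the multivariate-t error model above.
        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(1)
        data = np.concatenate([rng.normal(10, 2, 95),
                               [40, 45, -20, 38, 50]])   # a few gross outliers

        def neg_loglik(params):
            mu, log_sigma, log_df = params
            return -stats.t.logpdf(data, df=np.exp(log_df),
                                   loc=mu, scale=np.exp(log_sigma)).sum()

        res = optimize.minimize(neg_loglik, x0=[np.median(data), 0.0, 1.0])
        mu_hat = res.x[0]
        print(mu_hat, data.mean())  # t-based location is far less outlier-driven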

  9. On the distribution of discounted loss reserves using generalized linear models

    NARCIS (Netherlands)

    Hoedemakers, T.; Beirlant, J.; Goovaerts, M.J.; Dhaene, J.

    2005-01-01

    Renshaw and Verrall [11] specified the generalized linear model (GLM) underlying the chain-ladder technique and suggested some other GLMs which might be useful in claims reserving. The purpose of this paper is to construct bounds for the discounted loss reserve within the framework of GLMs. Exact
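
    The GLM underlying the chain-ladder technique can be sketched as an over-dispersed Poisson regression with origin-year and development-year factors; the toy incremental triangle below is made up, and the reserve estimate is the sum of the predicted lower-triangle cells.

        # Over-dispersed Poisson GLM reproducing chain-ladder reserve
        # estimates (Renshaw & Verrall).  The toy triangle is made up.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rows = []  # long format: origin year, development year, incremental claims
        tri = [[100, 60, 30], [110, 70, None], [120, None, None]]
        for i, row in enumerate(tri):
            for j, c in enumerate(row):
                rows.append({"origin": i, "dev": j, "claims": c})
        df = pd.DataFrame(rows)

        fit = smf.glm("claims ~ C(origin) + C(dev)",
                      data=df.dropna(),
                      family=sm.families.Poisson()).fit(scale="X2")  # quasi-Poisson

        future = df[df["claims"].isna()]
        reserve = fit.predict(future).sum()  # sum of predicted lower-triangle cells
        print(reserve)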

  10. Skewness of the generalized centrifugal force divergence for a joint normal distribution of strain and vorticity components

    Science.gov (United States)

    Hua, Bach Lien

    1994-09-01

    This note attempts to connect the skewness of the probability distribution function (PDF) of pressure, which is commonly observed in two-dimensional turbulence, to differences in the geometry of the strain and vorticity fields. This paper illustrates analytically the respective roles of strain and vorticity in shaping the PDF of pressure, in the particular case of a joint normal distribution of strain and vorticity components. The latter assumption is not valid in general in direct numerical simulations (DNS) of two-dimensional turbulence but may apply to geostrophic turbulence in the presence of a differential rotation (β effect). In essence, minus the Laplacian of pressure is the difference of squared strain and vorticity, a quantity which is named the generalized centrifugal force divergence (GCFD). Squared strain and vorticity distributions follow chi-square statistics with unequal numbers of degrees of freedom, when one assumes a joint normal distribution of their components. Squared strain has two degrees of freedom and squared vorticity only one, thereby causing a skewness of the PDF of GCFD and hence of pressure.
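
    The chi-square argument can be checked by direct simulation: with independent standard-normal strain components and vorticity, squared strain is chi-square with 2 degrees of freedom, squared vorticity with 1, and their difference is positively skewed.

        # Monte Carlo check of the chi-square argument: chi2(2) - chi2(1)
        # has positive skewness, as the note's reasoning predicts.
        import numpy as np
        from scipy.stats import skew

        rng = np.random.default_rng(42)
        n = 1_000_000
        s1, s2 = rng.standard_normal(n), rng.standard_normal(n)  # strain components
        omega = rng.standard_normal(n)                           # vorticity
        gcfd = (s1**2 + s2**2) - omega**2                        # ~ chi2(2) - chi2(1)

        print(skew(gcfd))  # positive (analytically 8 / 6**1.5 ~ 0.54)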

  11. Distributing learning over time: the spacing effect in children's acquisition and generalization of science concepts.

    Science.gov (United States)

    Vlach, Haley A; Sandhofer, Catherine M

    2012-01-01

    The spacing effect describes the robust finding that long-term learning is promoted when learning events are spaced out in time rather than presented in immediate succession. Studies of the spacing effect have focused on memory processes rather than for other types of learning, such as the acquisition and generalization of new concepts. In this study, early elementary school children (5- to 7-year-olds; N = 36) were presented with science lessons on 1 of 3 schedules: massed, clumped, and spaced. The results revealed that spacing lessons out in time resulted in higher generalization performance for both simple and complex concepts. Spaced learning schedules promote several types of learning, strengthening the implications of the spacing effect for educational practices and curriculum. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.

  12. Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution

    Science.gov (United States)

    Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.

    1987-01-01

    Over the last two years the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS (the Micro-ADS Processing System). A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.

  13. Estimation of aerosol particle number distributions with Kalman Filtering – Part 1: Theory, general aspects and statistical validity

    Directory of Open Access Journals (Sweden)

    T. Viskari

    2012-12-01

    Full Text Available Aerosol characteristics can be measured with different instruments providing observations that are not trivially inter-comparable. The Extended Kalman Filter (EKF) is introduced here as a method to estimate aerosol particle number size distributions from multiple simultaneous observations. The focus here in Part 1 of the work is on general aspects of EKF in the context of Differential Mobility Particle Sizer (DMPS) measurements. Additional instruments and their implementations are discussed in Part 2 of the work. The University of Helsinki Multi-component Aerosol model (UHMA) is used to propagate the size distribution in time. At each observation time (10 min apart), the time-evolved state is updated with the raw particle mobility distributions, measured with two DMPS systems. The EKF approach was validated by calculating the bias and the standard deviation for the estimated size distributions with respect to the raw measurements. These were compared to corresponding bias and standard deviation values for particle number size distributions obtained from raw measurements by an inversion of the instrument kernel matrix method. Despite the assumptions made in the EKF implementation, EKF was found to be more accurate than the inversion of the instrument kernel matrix in terms of bias, and comparable in terms of standard deviation. Potential further improvements of the EKF implementation are discussed.
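
    A single predict/update cycle of the (here linear, hence ordinary) Kalman filter underlying the EKF machinery is sketched below, without the UHMA model or DMPS instrument kernels; all matrices are illustrative stand-ins.

        # One predict/update cycle of a linear Kalman filter; the EKF adds
        # linearization of the model and observation operators around the
        # current state, which is omitted here.
        import numpy as np

        x = np.array([10.0, 1.0])                 # state: e.g. two size-bin concentrations
        P = np.diag([4.0, 1.0])                   # state covariance
        F = np.array([[1.0, 0.1], [0.0, 1.0]])    # stand-in time-evolution model
        Q = 0.01 * np.eye(2)                      # model-error covariance
        H = np.array([[1.0, 0.0]])                # observe only the first component
        R = np.array([[0.5]])                     # observation-error covariance
        y = np.array([11.2])                      # a measurement

        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        x = x + K @ (y - H @ x)
        P = (np.eye(2) - K @ H) @ P
        print(x, np.diag(P))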

  14. Ion and electron beam effects on kinetic Alfven wave with general loss-cone distribution function-kinetic approach

    International Nuclear Information System (INIS)

    Shukla, Nidhi; Mishra, Ruchi; Varma, P; Tiwari, M S

    2008-01-01

    This work studies the effect of ion and electron beams on the kinetic Alfven wave (KAW) with a general loss-cone distribution function. Kinetic theory has been adopted to evaluate the dispersion relation and damping rate of the wave in the presence of loss-cone distribution indices J. The variations in wave frequency ω and damping rate with k⊥ρi (k⊥ is the perpendicular wave number and ρi the ion gyroradius) and with the parallel wave number k∥ are studied. It is found that the distribution index J and the ion beam velocity enhance the wave frequency at lower k⊥ρi, whereas the electron beam velocity enhances the wave frequency at higher k⊥ρi. The calculated values of frequency correspond to the observed values in the range 0.1-4 Hz. An increase in damping rate due to higher distribution indices J and ion beam velocity is observed. The effect of the electron beam is to reduce the damping rate at higher k⊥ρi. Plasma parameters appropriate to the plasma sheet boundary layer are used. The results may explain the transfer of Poynting flux from the magnetosphere to the ionosphere. It is also found that in the presence of the loss-cone distribution function the ion beam becomes a sensitive parameter to reduce the Poynting flux of KAW propagating towards the ionosphere

  15. Distributed cerebellar plasticity implements generalized multiple-scale memory components in real-robot sensorimotor tasks

    Directory of Open Access Journals (Sweden)

    Claudia eCasellato

    2015-02-01

    Full Text Available The cerebellum plays a crucial role in motor learning and it acts as a predictive controller. Modeling it and embedding it into sensorimotor tasks allows us to create functional links between plasticity mechanisms, neural circuits and behavioral learning. Moreover, if applied to real-time control of a neurorobot, the cerebellar model has to deal with a real noisy and changing environment, thus showing its robustness and effectiveness in learning. A biologically inspired cerebellar model with distributed plasticity, both at cortical and nuclear sites, has been used. Two cerebellum-mediated paradigms have been designed: an associative Pavlovian task and a vestibulo-ocular reflex, with multiple sessions of acquisition and extinction and with different stimuli and perturbation patterns. The cerebellar controller succeeded in generating conditioned responses and finely tuned eye movement compensation, thus reproducing human-like behaviors. Through a productive plasticity transfer from cortical to nuclear sites, the distributed cerebellar controller showed in both tasks the capability to optimize learning on multiple time-scales, to store motor memory and to effectively adapt to dynamic ranges of stimuli.

  16. Exact Solutions of Fragmentation Equations with General Fragmentation Rates and Separable Particles Distribution Kernels

    Directory of Open Access Journals (Sweden)

    S. C. Oukouomi Noutchie

    2014-01-01

    Full Text Available We make use of Laplace transform techniques and the method of characteristics to solve fragmentation equations explicitly. Our result is a breakthrough in the analysis of pure fragmentation equations as this is the first instance where an exact solution is provided for the fragmentation evolution equation with general fragmentation rates. This paper is the key for resolving most of the open problems in fragmentation theory including “shattering” and the sudden appearance of infinitely many particles in some systems with initial finite particles number.
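
    For orientation, the pure fragmentation equation with fragmentation rate a(x) and particle-distribution kernel b(x|y) is commonly written in the standard textbook form below; the paper's separability assumption means the kernel factorizes.

        \partial_t u(x,t) = -a(x)\, u(x,t) + \int_x^{\infty} a(y)\, b(x \mid y)\, u(y,t)\, dy,
        \qquad b(x \mid y) = \beta(x)\, \gamma(y) \quad \text{(separable case)}.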

  17. Tight finite-key analysis for passive decoy-state quantum key distribution under general attacks

    Science.gov (United States)

    Zhou, Chun; Bao, Wan-Su; Li, Hong-Wei; Wang, Yang; Li, Yuan; Yin, Zhen-Qiang; Chen, Wei; Han, Zheng-Fu

    2014-05-01

    For quantum key distribution (QKD) using spontaneous parametric-down-conversion sources (SPDCSs), the passive decoy-state protocol has been proved to be efficiently close to the theoretical limit of an infinite decoy-state protocol. In this paper, we apply a tight finite-key analysis for the passive decoy-state QKD using SPDCSs. Combining the security bound based on the uncertainty principle with the passive decoy-state protocol, a concise and stringent formula for calculating the key generation rate for QKD using SPDCSs is presented. The simulation shows that the secure distance under our formula can reach up to 182 km when the number of sifted data is 10^10. Our results also indicate that, under the same deviation of statistical fluctuation due to finite-size effects, the passive decoy-state QKD with SPDCSs can perform as well as the active decoy-state QKD with a weak coherent source.

  18. Oscillation for equations with positive and negative coefficients and with distributed delay I: General results

    Directory of Open Access Journals (Sweden)

    Leonid Berezansky

    2003-02-01

    Full Text Available We study a scalar delay differential equation with a bounded distributed delay, $$ \dot{x}(t) + \int_{h(t)}^t x(s)\,d_s R(t,s) - \int_{g(t)}^t x(s)\,d_s T(t,s) = 0, $$ where $R(t,s)$, $T(t,s)$ are nonnegative and nondecreasing in $s$ for any $t$, $$ R(t,h(t)) = T(t,g(t)) = 0, \quad R(t,s) \geq T(t,s). $$ We establish a connection between non-oscillation of this differential equation and the corresponding differential inequalities, and between positiveness of the fundamental function and the existence of a nonnegative solution for a nonlinear integral inequality that is constructed explicitly. We also present comparison theorems, and explicit non-oscillation and oscillation results. In a separate publication (part II), we will consider applications of this theory to differential equations with several concentrated delays, integrodifferential equations, and mixed equations.

  19. Financial Intermediation, Moral Hazard and Pareto Inferior Trade

    DEFF Research Database (Denmark)

    Olai Hansen, Bodil; Keiding, Hans

    2004-01-01

    We consider a simple model of international trade under uncertainty, where production takes time and is subject to uncertainty. The riskiness of production depends on the choices of the producers, not observable to the general public, and these choices are influenced by the availability and cost of credit. If investment is financed by a bond market, then a situation may arise where otherwise identical countries end up with different levels of interest and different choices of technique, which again implies differences in achieved level of welfare. Under suitable conditions on the parameters of the model, the market may not be able to supply credits to one of the countries. The introduction of financial intermediaries with the ability to control the debtors may change this situation in a direction which is welfare improving (in a suitable sense) by increasing expected output in the country with high interest rates, while opening up for new problems of asymmetric information with respect to the monitoring activity of the banks. Keywords: Capital outflow, financial intermediaries, moral hazard. JEL classification: F36, D92, E44.

  20. A Knowledge-Informed and Pareto-Based Artificial Bee Colony Optimization Algorithm for Multi-Objective Land-Use Allocation

    Directory of Open Access Journals (Sweden)

    Lina Yang

    2018-02-01

    Full Text Available Land-use allocation is of great significance in urban development. This type of allocation is usually considered to be a complex multi-objective spatial optimization problem, whose optimized result is a set of Pareto-optimal solutions (Pareto front) reflecting different tradeoffs in several objectives. However, obtaining a Pareto front is a challenging task, and the Pareto front obtained by state-of-the-art algorithms is still not sufficient. To achieve better Pareto solutions, taking the grid-representative land-use allocation problem with two objectives as an example, an artificial bee colony optimization algorithm for multi-objective land-use allocation (ABC-MOLA) is proposed. In this algorithm, the traditional ABC’s search direction guiding scheme and solution maintaining process are modified. In addition, a knowledge-informed neighborhood search strategy, which utilizes the auxiliary knowledge of natural geography and spatial structures to facilitate the neighborhood spatial search around each solution, is developed to further improve the Pareto front’s quality. A series of comparison experiments (a simulated experiment with small data volume and a real-world data experiment for a large area) shows that all the Pareto fronts obtained by ABC-MOLA totally dominate the Pareto fronts by other algorithms, which demonstrates ABC-MOLA’s effectiveness in achieving Pareto fronts of high quality.

  1. A Coupled Ocean General Circulation, Biogeochemical, and Radiative Model of the Global Oceans: Seasonal Distributions of Ocean Chlorophyll and Nutrients

    Science.gov (United States)

    Gregg, Watson W.; Busalacchi, Antonio (Technical Monitor)

    2000-01-01

    A coupled ocean general circulation, biogeochemical, and radiative model was constructed to evaluate and understand the nature of seasonal variability of chlorophyll and nutrients in the global oceans. Biogeochemical processes in the model are determined from the influences of circulation and turbulence dynamics, irradiance availability, and the interactions among three functional phytoplankton groups (diatoms, chlorophytes, and picoplankton) and three nutrients (nitrate, ammonium, and silicate). Basin scale (greater than 1000 km) model chlorophyll results are in overall agreement with CZCS pigments in many global regions. Seasonal variability observed in the CZCS is also represented in the model. Synoptic scale (100-1000 km) comparisons of imagery are generally in conformance although occasional departures are apparent. Model nitrate distributions agree with in situ data, including seasonal dynamics, except for the equatorial Atlantic. The overall agreement of the model with satellite and in situ data sources indicates that the model dynamics offer a reasonably realistic simulation of phytoplankton and nutrient dynamics on synoptic scales. This is especially true given that initial conditions are homogeneous chlorophyll fields. The success of the model in producing a reasonable representation of chlorophyll and nutrient distributions and seasonal variability in the global oceans is attributed to the application of a generalized, processes-driven approach as opposed to regional parameterization, and to the existence of multiple phytoplankton groups with different physiological and physical properties. These factors enable the model to simultaneously represent many aspects of the great diversity of physical, biological, chemical, and radiative environments encountered in the global oceans.

  2. SU-F-J-105: Towards a Novel Treatment Planning Pipeline Delivering Pareto- Optimal Plans While Enabling Inter- and Intrafraction Plan Adaptation

    International Nuclear Information System (INIS)

    Kontaxis, C; Bol, G; Lagendijk, J; Raaymakers, B; Breedveld, S; Sharfo, A; Heijmen, B

    2016-01-01

    Purpose: To develop a new IMRT treatment planning methodology suitable for the new generation of MR-linear accelerator machines. The pipeline is able to deliver Pareto-optimal plans and can be utilized for conventional treatments as well as for inter- and intrafraction plan adaptation based on real-time MR-data. Methods: A Pareto-optimal plan is generated using the automated multicriterial optimization approach Erasmus-iCycle. The resulting dose distribution is used as input to the second part of the pipeline, an iterative process which generates deliverable segments that target the latest anatomical state and gradually converges to the prescribed dose. This process continues until a certain percentage of the dose has been delivered. Under a conventional treatment, a Segment Weight Optimization (SWO) is then performed to ensure convergence to the prescribed dose. In the case of inter- and intrafraction adaptation, post-processing steps like SWO cannot be employed due to the changing anatomy. This is instead addressed by transferring the missing/excess dose to the input of the subsequent fraction. In this work, the resulting plans were delivered on a Delta4 phantom as a final Quality Assurance test. Results: A conventional static SWO IMRT plan was generated for two prostate cases. The sequencer faithfully reproduced the input dose for all volumes of interest. For the two cases the mean relative dose difference of the PTV between the ideal input and sequenced dose was 0.1% and −0.02% respectively. Both plans were delivered on a Delta4 phantom and passed the clinical Quality Assurance procedures by achieving 100% pass rate at a 3%/3mm gamma analysis. Conclusion: We have developed a new sequencing methodology capable of online plan adaptation. In this work, we extended the pipeline to support Pareto-optimal input and clinically validated that it can accurately achieve these ideal distributions, while its flexible design enables inter- and intrafraction plan adaptation.

  3. SU-F-J-105: Towards a Novel Treatment Planning Pipeline Delivering Pareto- Optimal Plans While Enabling Inter- and Intrafraction Plan Adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Kontaxis, C; Bol, G; Lagendijk, J; Raaymakers, B [University Medical Center Utrecht, Utrecht (Netherlands); Breedveld, S; Sharfo, A; Heijmen, B [Erasmus University Medical Center Rotterdam, Rotterdam (Netherlands)

    2016-06-15

    Purpose: To develop a new IMRT treatment planning methodology suitable for the new generation of MR-linear accelerator machines. The pipeline is able to deliver Pareto-optimal plans and can be utilized for conventional treatments as well as for inter- and intrafraction plan adaptation based on real-time MR-data. Methods: A Pareto-optimal plan is generated using the automated multicriterial optimization approach Erasmus-iCycle. The resulting dose distribution is used as input to the second part of the pipeline, an iterative process which generates deliverable segments that target the latest anatomical state and gradually converges to the prescribed dose. This process continues until a certain percentage of the dose has been delivered. Under a conventional treatment, a Segment Weight Optimization (SWO) is then performed to ensure convergence to the prescribed dose. In the case of inter- and intrafraction adaptation, post-processing steps like SWO cannot be employed due to the changing anatomy. This is instead addressed by transferring the missing/excess dose to the input of the subsequent fraction. In this work, the resulting plans were delivered on a Delta4 phantom as a final Quality Assurance test. Results: A conventional static SWO IMRT plan was generated for two prostate cases. The sequencer faithfully reproduced the input dose for all volumes of interest. For the two cases the mean relative dose difference of the PTV between the ideal input and sequenced dose was 0.1% and −0.02% respectively. Both plans were delivered on a Delta4 phantom and passed the clinical Quality Assurance procedures by achieving 100% pass rate at a 3%/3mm gamma analysis. Conclusion: We have developed a new sequencing methodology capable of online plan adaptation. In this work, we extended the pipeline to support Pareto-optimal input and clinically validated that it can accurately achieve these ideal distributions, while its flexible design enables inter- and intrafraction plan adaptation.

  4. Financial Intermediation, Moral Hazard and Pareto Inferior Trade

    DEFF Research Database (Denmark)

    Olai Hansen, Bodil; Keiding, Hans

    2004-01-01

    We consider a simple model of international trade under uncertainty, where production takes time and is subject to uncertainty. The riskiness of production depends on the choices of the producers, not observable to the general public, and these choices are influenced by the availability and cost of credit. If investment is financed by a bond market, then a situation may arise where otherwise identical countries end up with different levels of interest and different choices of technique, which again implies differences in achieved level of welfare. Under suitable conditions on the parameters of the model, the market may not be able to supply credits to one of the countries. The introduction of financial intermediaries with the ability to control the debtors may change this situation in a direction which is welfare improving (in a suitable sense) by increasing expected output in the country with high interest rates, while opening up for new problems of asymmetric information with respect to the monitoring activity of the banks. Keywords: Capital outflow, financial intermediaries, moral hazard. JEL classification: F36, D92, E44.

  5. Photoproduction of a πρT pair with a large invariant mass and transversity generalized parton distribution

    International Nuclear Information System (INIS)

    El Beiyad, M.; Pire, B.; Segond, M.; Szymanowski, L.; Wallon, S.

    2010-01-01

    The chiral-odd transversity generalized parton distributions (GPDs) of the nucleon can be accessed experimentally through the exclusive photoproduction process γ + N → π + ρ + N′, in the kinematics where the meson pair has a large invariant mass and the final nucleon has a small transverse momentum, provided the vector meson is produced in a transversally polarized state. We calculate perturbatively the scattering amplitude at leading order in α_s. We build a simple model for the dominant transversity GPD H_T(x,ξ,t) based on the concept of double distribution. We estimate the unpolarized differential cross section for this process in the kinematics of the JLab and Compass experiments. Counting rates show that the experiment looks feasible with the real photon beam characteristics expected at JLab-12 GeV, and with the quasi-real photon beam in the Compass experiment.

  6. Photoproduction of a πρT pair with a large invariant mass and transversity generalized parton distribution

    Energy Technology Data Exchange (ETDEWEB)

    El Beiyad, M. [Centre de Physique Theorique, Ecole Polytechnique, CNRS, 91128 Palaiseau (France); LPT, Universite d' Orsay, CNRS, 91404 Orsay (France); Pire, B. [Centre de Physique Theorique, Ecole Polytechnique, CNRS, 91128 Palaiseau (France); Segond, M. [Institut fuer Theoretische Physik, Universitaet Leipzig, D-04009 Leipzig (Germany); Szymanowski, L. [Centre de Physique Theorique, Ecole Polytechnique, CNRS, 91128 Palaiseau (France); Soltan Institute for Nuclear Studies, Warsaw (Poland); Wallon, S., E-mail: Samuel.Wallon@th.u-psud.f [LPT, Universite d' Orsay, CNRS, 91404 Orsay (France); UPMC, Univ. Paris 06, Faculte de physique, 4 place Jussieu, 75252 Paris Cedex 05 (France)

    2010-05-03

    The chiral-odd transversity generalized parton distributions (GPDs) of the nucleon can be accessed experimentally through the exclusive photoproduction process γ + N → π + ρ + N′, in the kinematics where the meson pair has a large invariant mass and the final nucleon has a small transverse momentum, provided the vector meson is produced in a transversally polarized state. We calculate perturbatively the scattering amplitude at leading order in α_s. We build a simple model for the dominant transversity GPD H_T(x,ξ,t) based on the concept of double distribution. We estimate the unpolarized differential cross section for this process in the kinematics of the JLab and Compass experiments. Counting rates show that the experiment looks feasible with the real photon beam characteristics expected at JLab-12 GeV, and with the quasi-real photon beam in the Compass experiment.

  7. POPULATION GROWTH AND PREFERENCE CHANGE IN A GENERALIZED SOLOW GROWTH MODEL WITH GENDER TIME DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Wei-Bin Zhang

    2016-09-01

    Full Text Available The study builds a model of dynamic interactions between the birth rate, the mortality rate, the population, wealth accumulation, time distribution between work, leisure and children caring, habit formation and preference change. The production technology and markets are built on the Solow growth model. We base our modeling of the population dynamics on the Haavelmo population model and the Barro-Becker fertility choice model. This study takes account of habit formation and preference change. Although it is influenced by the Ramsey growth theory with time preference and habit formation, it uses Zhang’s approach to the household with habit formation and preference change. We synthesize different dynamic forces in a compact framework, using the utility function proposed by Zhang. Analytically, we focus on transitional processes as well as economic equilibrium. As the economic system is given by autonomous nonlinear differential equations, it is not easy to analyze its behavior. We simulate the model to demonstrate the existence of an equilibrium point and plot the motion of the dynamic system. We examine the effects of changes in weights given to the habit stock of children, the wife’s wage rate having negative impact on the propensity to have children, the wife weighing less the habit stock of leisure time, the wife’s habit stock of leisure time having negative impact on the husband’s propensity to use leisure time, the wife’s wage rate having negative impact on the husband’s propensity to use leisure time, woman’s human capital being improved, a rise in the total factor productivity, and the mother spending more time on each child fostering.
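
    The Solow core that the extended model builds on can be simulated in a few lines; the parameter values below are illustrative, not taken from the paper, and the gender time-distribution and habit-formation extensions are not reproduced.

        # Standard Solow core: capital-intensity dynamics
        # k' = s*k**alpha - (n + delta)*k, integrated to its steady state.
        # Parameter values are illustrative only.
        from scipy.integrate import solve_ivp

        s, alpha, n, delta = 0.25, 0.33, 0.01, 0.05

        def solow(t, k):
            return s * k**alpha - (n + delta) * k

        sol = solve_ivp(solow, (0, 400), [1.0], dense_output=True)
        k_star = (s / (n + delta)) ** (1 / (1 - alpha))  # analytic steady state
        print(sol.y[0, -1], k_star)                       # both ~ 8.4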

  8. Insight of the distribution and general characters of uranium deposits in the world (except France)

    International Nuclear Information System (INIS)

    Gangloff, A.

    1956-01-01

    It gives a broad insight into uranium deposits and the general characteristics of uranium deposits worldwide (except France). It reviews the mineralized areas of the main uranium-producing countries, with a geographic and geologic recall. Moreover, it brings together all the important prospecting results from countries which presented a report at the International Conference on the Peaceful Uses of Atomic Energy in Geneva (8-20 August 1955). All these countries are cited except France. It describes not only the payable deposits, as each deposit brings interesting indications for future prospecting and might also become payable in the future. It starts with the geological survey of the USA and Canada and the geographic description of their different uranium deposit sites, as both countries present the largest uranium resources. In the same way, geographic and geological surveys of South Africa, the Democratic Republic of Congo, Australia, India, Brazil, Argentina, Rhodesia, Mozambique, Sweden, Norway, the United Kingdom, Portugal, Yugoslavia, Italy, Austria and Switzerland are described. (M.P.)

  9. La narrazione dell’azione sociale: spunti dal Trattato di Vilfredo Pareto

    Directory of Open Access Journals (Sweden)

    Ilaria Riccioni

    2017-08-01

    Full Text Available Rereading the classics always involves a twofold operation: on the one hand, a return to reflections, rhythms and historical moments that often seem already superseded; on the other, the rediscovery of the origins of contemporary phenomena from points of view that outline their deep interconnections, no longer visible at the stage of development at which we observe them today. This greater clarity is perhaps due to the fact that every phenomenon is more clearly identifiable in its dawning phase than in its later stages, where its primary characteristics tend to dissolve into the dominant features of the present, getting lost in the everyday practices that conceal their origin. If sociology is a process of gaining knowledge of the reality of phenomena, a central distinction must be drawn between those sciences that schematize reality into formal, functional and functioning equations (the economic and normative systems) and the social sciences that deal with reality and its complexity, which as sciences must concern themselves not so much with what reality ought to be as with what reality is, with how it presents itself and how it manifests the desiring, deep movements of collective life beyond the system that manages its functioning. The point that Pareto seems to grasp with extreme lucidity is the need to overturn the role of economic logic in social organization, from a science that dictates reality to a science that proposes a scheme for managing it: economics attempts to dictate reality, but economics, from the modern Greek oikos, oikosgeneia (house and generation), the term used to define the family unit, is not in fact "reality", Pareto seems to tell us in several digressions, but rather the art and science of managing family and productive units. Reality remains in the shadows and can only be "approached" by a science that records it, and possibly …

  10. Pareto-optimal multi-objective dimensionality reduction deep auto-encoder for mammography classification.

    Science.gov (United States)

    Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan

    2017-07-01

    Feature reduction is an essential stage in computer aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives: MRE and mean classification error (MCE) for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy over the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective and optimizing for finding Pareto-optimal solutions, using evolutionary multi-objective optimization, results in producing more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.
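
    As a rough illustration of the Pareto-optimality criterion underlying the method (not the authors' NSGA-II implementation), the sketch below extracts the non-dominated (MRE, MCE) pairs from a set of hypothetical candidate networks; both objectives are minimized.

    ```python
    import numpy as np

    def pareto_front(points):
        """Return indices of non-dominated rows of `points` (all objectives minimized)."""
        n = len(points)
        dominated = np.zeros(n, dtype=bool)
        for i in range(n):
            for j in range(n):
                # j dominates i if it is no worse in both objectives
                # and strictly better in at least one.
                if np.all(points[j] <= points[i]) and np.any(points[j] < points[i]):
                    dominated[i] = True
                    break
        return np.where(~dominated)[0]

    # Hypothetical (MRE, MCE) values for five trained candidate auto-encoders.
    candidates = np.array([[0.10, 0.20], [0.08, 0.25], [0.15, 0.15],
                           [0.12, 0.22], [0.20, 0.10]])
    print("Pareto-optimal candidates:", pareto_front(candidates))
    ```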

  11. Multiobjective constraints for climate model parameter choices: Pragmatic Pareto fronts in CESM1

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2017-09-01

    Global climate models (GCMs) are examples of high-dimensional input-output systems, where model output is a function of many variables, and an update in model physics commonly improves performance in one objective function (i.e., measure of model performance) at the expense of degrading another. Here concepts from multiobjective optimization in the engineering literature are used to investigate parameter sensitivity and optimization in the face of such trade-offs. A metamodeling technique called cut high-dimensional model representation (cut-HDMR) is leveraged in the context of multiobjective optimization to improve GCM simulation of the tropical Pacific climate, focusing on seasonal precipitation, column water vapor, and skin temperature. An evolutionary algorithm is used to solve for Pareto fronts, which are surfaces in objective function space along which trade-offs in GCM performance occur. This approach allows the modeler to visualize trade-offs quickly and identify the physics at play. In some cases, Pareto fronts are small, implying that trade-offs are minimal, optimal parameter value choices are more straightforward, and the GCM is well-functioning. In all cases considered here, the control run was found not to be Pareto-optimal (i.e., not on the front), highlighting an opportunity for model improvement through objectively informed parameter selection. Taylor diagrams illustrate that these improvements occur primarily in field magnitude, not spatial correlation, and they show that specific parameter updates can improve fields fundamental to tropical moist processes—namely precipitation and skin temperature—without significantly impacting others. These results provide an example of how basic elements of multiobjective optimization can facilitate pragmatic GCM tuning processes.
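
    The statement that the control run is not Pareto-optimal amounts to a dominance check in objective-function space. Below is a minimal sketch under assumed, hypothetical objective values (e.g. RMSE in precipitation, column water vapor, and skin temperature), not CESM1 output.

    ```python
    import numpy as np

    def is_pareto_optimal(candidate, ensemble):
        """True if no ensemble member dominates `candidate` (all objectives minimized)."""
        better_equal = np.all(ensemble <= candidate, axis=1)
        strictly_better = np.any(ensemble < candidate, axis=1)
        return not np.any(better_equal & strictly_better)

    # Hypothetical objective values for a control run and three perturbed runs.
    control = np.array([1.2, 0.8, 0.9])
    perturbed = np.array([[1.1, 0.7, 0.85],   # dominates the control run
                          [1.3, 0.6, 1.00],
                          [1.0, 0.9, 0.95]])
    print("control on the Pareto front:", is_pareto_optimal(control, perturbed))
    ```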

  12. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    Science.gov (United States)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient tends to identify the underlying nonlinear system with expressions simpler than those of classical monolithic GP, and finally the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared with the efficiency of stand-alone GP, MGGP, and conventional multiple linear regression prediction models used as benchmarks, the proposed Pareto-optimal MA-MGGP model puts forward a parsimonious solution of noteworthy practical value. In addition, the approach allows the user to bring human insight into the problem, examining the evolved models and picking out the best-performing programs for further analysis.
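
    The data pre-processing ingredient is a plain moving-average filter; a minimal sketch follows (the window length and streamflow values are assumptions for illustration, not taken from the paper).

    ```python
    import numpy as np

    def moving_average(series, window=3):
        """Simple moving-average filter used to smooth the model input."""
        kernel = np.ones(window) / window
        return np.convolve(series, kernel, mode="valid")

    # Hypothetical daily streamflow values (m^3/s).
    q = np.array([12.0, 15.0, 14.0, 30.0, 22.0, 18.0, 16.0])
    print(moving_average(q))  # smoothed series fed to the MGGP stage
    ```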

  13. Characterization of distributions by conditional expectation of record values

    Directory of Open Access Journals (Sweden)

    A.H. Khan

    2016-01-01

    Full Text Available A family of continuous probability distributions has been characterized by two conditional expectations of record statistics, conditioned on a non-adjacent record value. Besides various deductions, this work extends the result of Lee [8], in which the Pareto distribution was characterized.
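
    As a concrete instance of this type of characterization (a standard property of records from a Pareto law, stated here as background rather than taken from the paper): if $X$ has survival function $\bar F(x) = (x/\sigma)^{-\alpha}$, then given the current upper record value $x$, the next record is again Pareto distributed with scale $x$, so

    $$E\left[X_{U(n+1)} \mid X_{U(n)} = x\right] = \frac{\alpha}{\alpha-1}\,x, \qquad \alpha > 1,$$

    and a conditional expectation proportional to $x$ in this way is the hallmark of the Pareto case in such characterization results.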

  14. Distributions of Journal Citations in Small Collections of Reading Research.

    Science.gov (United States)

    Mayes, Bea

    The distribution of reading-research citations was investigated in three populations of journals. The rule of Pareto-like distribution was confirmed as appropriate for determining the number of journals that would contribute half the citations in populations of 26 to 112 journals. In populations of 42 to 112 journals, 24% to 29% of the…

  15. Choosing the optimal Pareto composition of the charge material for the manufacture of composite blanks

    Science.gov (United States)

    Zalazinsky, A. G.; Kryuchkov, D. I.; Nesterenko, A. V.; Titov, V. G.

    2017-12-01

    The results of an experimental study of the mechanical properties of pressed and sintered briquettes are presented; the briquettes consist of powders obtained from a high-strength VT-22 titanium alloy by plasma spraying, with additions of PTM-1 titanium powder obtained by the hydride-calcium method and of PV-N70Yu30 nickel-aluminum alloy powder. The task addressed is the choice of an optimal charge composition for the composite material that provides the required mechanical characteristics and cost of semi-finished products and items. Pareto-optimal values for the composition of the composite material charge have been obtained.

  16. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.
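
    ParTI itself fits a minimal enclosing polytope and tests feature enrichment near its vertices; the sketch below is only a crude stand-in on synthetic data, taking convex-hull vertices of a two-dimensional principal-component projection as archetype candidates.

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull

    # Hypothetical expression matrix: 200 samples x 50 genes.
    rng = np.random.default_rng(1)
    X = rng.random((200, 50))

    # Project onto the two leading principal components, then take the
    # convex-hull vertices as rough archetype candidates.
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    proj = Xc @ vt[:2].T
    hull = ConvexHull(proj)
    print("candidate archetype sample indices:", hull.vertices)
    ```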

  17. Pareto law of the expenditure of a person in convenience stores

    Science.gov (United States)

    Mizuno, Takayuki; Toriyama, Masahiro; Terano, Takao; Takayasu, Misako

    2008-06-01

    We study the statistical laws of the expenditure of a person in convenience stores by analyzing around 100 million receipts. The density function of expenditure exhibits a fat tail that follows a power law. Using the Lorenz curve, the Gini coefficient is estimated to be 0.70; this implies that loyal customers contribute significantly to a store’s sales. We observe the Pareto principle where both the top 25% and 2% of the customers account for 80% and 25% of the store’s sales, respectively.
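
    The Gini coefficient reported above can be computed directly from the sorted expenditure vector, without constructing the Lorenz curve explicitly. The sketch below uses a synthetic Pareto-tailed sample rather than the receipt data, so the numbers will differ from the paper's 0.70 and 80% figures.

    ```python
    import numpy as np

    def gini(expenditures):
        """Sample Gini coefficient via the standard closed form on sorted data."""
        x = np.sort(np.asarray(expenditures, dtype=float))
        n = len(x)
        i = np.arange(1, n + 1)
        return (2.0 * np.sum(i * x) / (n * np.sum(x))) - (n + 1.0) / n

    # Synthetic heavy-tailed spending sample (Pareto tail, inverse-CDF draw).
    rng = np.random.default_rng(0)
    spend = (1.0 - rng.random(10_000)) ** (-1.0 / 1.5)

    spend_sorted = np.sort(spend)
    top25_share = spend_sorted[-len(spend_sorted) // 4:].sum() / spend_sorted.sum()
    print(f"Gini = {gini(spend):.2f}, top-25% customers' share = {top25_share:.2f}")
    ```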

  18. Seasonal Distributions of Global Ocean Chlorophyll and Nutrients: Analysis with a Coupled Ocean General Circulation Biogeochemical, and Radiative Model

    Science.gov (United States)

    Gregg, Watson W.

    1999-01-01

    A coupled general ocean circulation, biogeochemical, and radiative model was constructed to evaluate and understand the nature of seasonal variability of chlorophyll and nutrients in the global oceans. The model is driven by climatological meteorological conditions, cloud cover, and sea surface temperature. Biogeochemical processes in the model are determined from the influences of circulation and turbulence dynamics, irradiance availability, and the interactions among three functional phytoplankton groups (diatoms, chlorophytes, and picoplankton) and three nutrient groups (nitrate, ammonium, and silicate). Phytoplankton groups are initialized as homogeneous fields horizontally and vertically, and allowed to distribute themselves according to the prevailing conditions. Basin-scale model chlorophyll results are in very good agreement with CZCS pigments in virtually every global region. Seasonal variability observed in the CZCS is also well represented in the model. Synoptic scale (100-1000 km) comparisons of imagery are also in good conformance, although occasional departures are apparent. Agreement of nitrate distributions with in situ data is even better, including seasonal dynamics, except for the equatorial Atlantic. The good agreement of the model with satellite and in situ data sources indicates that the model dynamics realistically simulate phytoplankton and nutrient dynamics on synoptic scales. This is especially true given that initial conditions are homogeneous chlorophyll fields. The success of the model in producing a reasonable representation of chlorophyll and nutrient distributions and seasonal variability in the global oceans is attributed to the application of a generalized, process-driven approach as opposed to regional parameterization, and the existence of multiple phytoplankton groups with different physiological and physical properties. These factors enable the model to simultaneously represent the great diversity of physical, biological

  19. Multiple Criteria Decision Making by Generalized Data Envelopment Analysis Introducing Aspiration Level Method

    International Nuclear Information System (INIS)

    Yun, Yeboon; Arakawa, Masao; Hiroshi, Ishikawa; Nakayama, Hirotaka

    2002-01-01

    It has been shown for problems with two objective functions that genetic algorithms (GAs) are well suited to generating Pareto optimal solutions, and decision making can then be performed easily on the basis of the visualized Pareto optimal solutions. However, Pareto optimal solutions are difficult to visualize with GAs when the number of objective functions exceeds four. Hence, it is troublesome to grasp the trade-offs among many objective functions, and decision makers hesitate to choose a final solution from a large number of Pareto optimal solutions. In order to solve these problems, we suggest an aspiration level approach to the method using generalized data envelopment analysis (GDEA) and GAs. We show that the proposed method supports decision makers in choosing their desired solution from many Pareto optimal solutions. Furthermore, it will be seen that engineering design can be performed effectively by the proposed method, which generates several Pareto optimal solutions close to the aspiration level and makes trade-off analysis easy.
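
    The aspiration-level idea can be illustrated with the standard weighted Chebyshev achievement scalarizing function, which ranks Pareto solutions by closeness to the decision maker's aspiration point. This is a generic sketch, not the GDEA formulation itself, and all values are hypothetical.

    ```python
    import numpy as np

    def achievement(f, aspiration, weights):
        """Weighted Chebyshev achievement scalarizing function (minimization):
        smaller values mean the solution sits closer to the aspiration level."""
        return np.max(weights * (f - aspiration))

    # Hypothetical Pareto-optimal designs in a 3-objective space.
    front = np.array([[1.0, 4.0, 2.0],
                      [2.0, 2.0, 2.0],
                      [3.0, 1.0, 3.0]])
    aspiration = np.array([1.5, 2.5, 2.0])   # decision maker's aspiration level
    weights = np.array([1.0, 1.0, 1.0])

    scores = [achievement(f, aspiration, weights) for f in front]
    best = int(np.argmin(scores))
    print("solution closest to the aspiration level:", front[best])
    ```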

  20. Necessary and Sufficient Conditions for Pareto Optimality in Infinite Horizon Cooperative Differential Games - Replaced by CentER DP 2011-041

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2010-01-01

    In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for an N-player cooperative infinite-horizon differential game. Firstly, we write the problem of finding Pareto candidates as solving N constrained optimal control subproblems. We derive some