WorldWideScience

Sample records for weibull distributions

  1. A MULTIVARIATE WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Cheng Lee

    2010-07-01

    Full Text Available A multivariate survival function of Weibull Distribution is developed by expanding the theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian Matrix, and the general moment are derived.
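
    The record above does not reproduce the survival function itself. For orientation, a minimal sketch of a Lu-Bhattacharyya-type bivariate Weibull survival function is given below; the parameter names (theta1, theta2, beta1, beta2, delta) are illustrative and the record's multivariate extension may differ in detail.

        import math

        def bivariate_weibull_survival(x1, x2, theta1, theta2, beta1, beta2, delta):
            """Joint survival S(x1, x2) of a Lu-Bhattacharyya-type bivariate Weibull
            (a sketch, not the record's exact multivariate form); requires 0 < delta <= 1."""
            u = (x1 / theta1) ** (beta1 / delta)
            v = (x2 / theta2) ** (beta2 / delta)
            return math.exp(-((u + v) ** delta))

        # Setting x2 = 0 recovers the ordinary Weibull marginal exp(-(x1/theta1)**beta1).
        print(bivariate_weibull_survival(1.0, 0.0, 2.0, 3.0, 1.5, 2.0, 0.7))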

  2. Transmuted Complementary Weibull Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Ahmed Z. Afify

    2014-12-01

    Full Text Available This paper provides a new generalization of the complementary Weibull geometric distribution introduced by Tojeiro et al. (2014), using the quadratic rank transmutation map studied by Shaw and Buckley (2007). The new distribution is referred to as the transmuted complementary Weibull geometric distribution (TCWGD). The TCWGD includes as special cases the complementary Weibull geometric distribution (CWGD), the complementary exponential geometric distribution (CEGD), the Weibull distribution (WD) and the exponential distribution (ED). Various structural properties of the new distribution, including moments, quantiles, the moment generating function and the Rényi entropy, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the complementary Weibull geometric distribution.
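
    As a quick illustration of the quadratic rank transmutation map used above, here is a minimal sketch in Python; it applies the map to a plain Weibull CDF rather than to the complementary Weibull geometric CDF of the record, and the parameter values are made up.

        import numpy as np

        def transmuted_cdf(base_cdf, lam):
            """Quadratic rank transmutation map: G(x) = (1 + lam) * F(x) - lam * F(x)**2,
            valid for |lam| <= 1 (a sketch applied here to a generic base CDF)."""
            def G(x):
                F = base_cdf(x)
                return (1.0 + lam) * F - lam * F ** 2
            return G

        weibull_cdf = lambda x, k=1.5, c=2.0: 1.0 - np.exp(-(np.asarray(x) / c) ** k)
        G = transmuted_cdf(weibull_cdf, lam=0.4)
        print(G(np.array([0.5, 1.0, 2.0, 5.0])))   # increases monotonically toward 1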

  3. The Weibull distribution a handbook

    CERN Document Server

    Rinne, Horst

    2008-01-01

    The Most Comprehensive Book on the Subject. Chronicles the Development of the Weibull Distribution in Statistical Theory and Applied Statistics. Exploring one of the most important distributions in statistics, The Weibull Distribution: A Handbook focuses on its origin, statistical properties, and related distributions. The book also presents various approaches to estimate the parameters of the Weibull distribution under all possible situations of sampling data, as well as approaches to parameter and goodness-of-fit testing. Describes the Statistical Methods, Concepts, Theories, and Applications of T

  4. The Transmuted Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Faton Merovci

    2014-05-01

    Full Text Available A generalization of the generalized inverse Weibull distribution, the so-called transmuted generalized inverse Weibull distribution, is proposed and studied. We use the quadratic rank transmutation map (QRTM) to generate a flexible family of probability distributions, taking the generalized inverse Weibull distribution as the base distribution and introducing a new parameter that offers more distributional flexibility. Various structural properties, including explicit expressions for the moments, quantiles, and moment generating function of the new distribution, are derived. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to compare the flexibility of the transmuted version versus the generalized inverse Weibull distribution.

  5. Censored Weibull Distributed Data in Experimental Design

    OpenAIRE

    Støtvig, Jeanett Gunneklev

    2014-01-01

    An introduction to experimental design is given, and it is investigated how four methods handle Weibull distributed censored data, where the four methods are the quick and dirty method, the maximum likelihood method, single imputation and multiple imputation.
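
    For the maximum likelihood method mentioned in the record, a minimal right-censored Weibull fit could look like the sketch below; the data, censoring pattern and starting values are invented and none of the record's own analyses are reproduced.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import weibull_min

        # toy data: failure/censoring times with an event indicator (1 = failure, 0 = censored)
        t = np.array([2.1, 3.5, 4.0, 5.2, 6.8, 7.0, 7.0, 7.0])
        d = np.array([1, 1, 1, 1, 1, 0, 0, 0])

        def neg_loglik(params):
            k, scale = np.exp(params)                      # log-parametrized to stay positive
            ll = d * weibull_min.logpdf(t, k, scale=scale) \
               + (1 - d) * weibull_min.logsf(t, k, scale=scale)
            return -ll.sum()

        res = minimize(neg_loglik, x0=np.log([1.0, 5.0]), method="Nelder-Mead")
        print(np.exp(res.x))                               # MLE of (shape, scale)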

  6. A CLASS OF WEIGHTED WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Saman Shahbaz

    2010-07-01

    Full Text Available The weighted Weibull model is proposed following the method of Azzalini (1985). Basic properties of the distribution, including moments, generating function, hazard rate function and estimation of parameters, have been studied.

  7. Weibull Distributions for the Preterm Delivery

    Directory of Open Access Journals (Sweden)

    Kavitha, N

    2014-06-01

    Full Text Available The purpose of this study is to evaluate the levels of CRH during pregnancy by using Weibull distributions. The study also determines the rate of change in placental CRH and the level of maternal cortisol in preterm delivery by means of mathematical formulas.

  8. Transmuted New Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Shuaib Khan

    2017-06-01

    Full Text Available This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are formulated, such as the quantile function, Rényi entropy, mean deviations, moments, moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution by using reliability data.

  9. Modeling particle size distributions by the Weibull distribution function

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Zhigang (Rogers Tool Works, Rogers, AR (United States)); Patterson, B.R.; Turner, M.E. Jr (Univ. of Alabama, Birmingham, AL (United States))

    1993-10-01

    A method is proposed for modeling two- and three-dimensional particle size distributions using the Weibull distribution function. Experimental results show that, for tungsten particles in liquid phase sintered W-14Ni-6Fe, the experimental cumulative section size distributions were well fit by the Weibull probability function, which can also be used to compute the corresponding relative frequency distributions. Modeling the two-dimensional section size distributions facilitates the use of the Saltykov or other methods for unfolding three-dimensional (3-D) size distributions with minimal irregularities. Fitting the unfolded cumulative 3-D particle size distribution with the Weibull function enables computation of the statistical distribution parameters from the parameters of the fit Weibull function.
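
    The cumulative-fit step described above can be illustrated with a short sketch; the section sizes and cumulative fractions below are invented, and only the standard Weibull (Rosin-Rammler) CDF form is used.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_cdf(d, d0, m):
            """Cumulative (Rosin-Rammler/Weibull) fraction of particles finer than size d."""
            return 1.0 - np.exp(-(d / d0) ** m)

        # hypothetical section sizes (micrometres) and cumulative fractions
        size = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 8.0, 10.0])
        cum = np.array([0.05, 0.20, 0.42, 0.60, 0.82, 0.93, 0.98])

        (d0_hat, m_hat), _ = curve_fit(weibull_cdf, size, cum, p0=[4.0, 2.0])
        print(d0_hat, m_hat)   # fitted characteristic size and Weibull modulus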

  10. ZERODUR strength modeling with Weibull statistical distributions

    Science.gov (United States)

    Hartmann, Peter

    2016-07-01

    The decisive influence on the breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces, the essential questions are whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process; here only the depths of the micro cracks are relevant. In any case, the presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model traditionally used for the representation of such data sets. It is based on the weakest-link ansatz. The use of the two- or three-parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present, or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface, introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them, a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should also be taken into account. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces, the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three-parameter Weibull distribution. A differentiation based on the data set alone without preexisting information is possible but requires a
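
    The two- versus three-parameter choice discussed above amounts to fixing the threshold (location) parameter at zero or estimating it from the data; a minimal SciPy sketch with invented strength values:

        import numpy as np
        from scipy.stats import weibull_min

        strength = np.array([52., 58., 61., 64., 66., 70., 73., 75., 80., 85.])  # MPa, made up

        # two-parameter fit: threshold stress fixed at zero
        k2, loc2, s2 = weibull_min.fit(strength, floc=0)

        # three-parameter fit: threshold stress estimated from the data
        k3, loc3, s3 = weibull_min.fit(strength)

        print("2-parameter shape/scale:", k2, s2)
        print("3-parameter shape/threshold/scale:", k3, loc3, s3)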

  11. Using the Weibull distribution reliability, modeling and inference

    CERN Document Server

    McCool, John I

    2012-01-01

    Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution

  12. Distributed Fuzzy CFAR Detection for Weibull Clutter

    Science.gov (United States)

    Zaimbashi, Amir; Taban, Mohammad Reza; Nayebi, Mohammad Mehdi

    In distributed detection systems, restricting the output of the local decision to one bit certainly implies a substantial information loss. In this paper, we consider fuzzy detection, which uses a function called a membership function to map the observation space of each local detector to a value between 0 and 1, indicating the degree of assurance about the presence or absence of a signal. In this case, we examine the problem of distributed Maximum Likelihood (ML) and Order Statistic (OS) constant false alarm rate (CFAR) detection using fuzzy fusion rules such as “Algebraic Product” (AP), “Algebraic Sum” (AS), “Union” (Un) and “Intersection” (IS) in the fusion centre. For Weibull clutter, the expression of the membership function based on the ML or OS CFAR processors in the local detectors is also obtained. For comparison, we consider a binary distributed detector, which uses the Maximum Likelihood and Algebraic Product (MLAP) or Order Statistic and Algebraic Product (OSAP) CFAR processors as the local detectors. In homogeneous and non-homogeneous situations (multiple targets or clutter edge), the performances of the fuzzy and binary distributed detectors are analyzed and compared. The simulation results indicate the superior and robust performance of the distributed systems using fuzzy detection in homogeneous and non-homogeneous situations.

  13. Weibull model of multiplicity distribution in hadron-hadron collisions

    Science.gov (United States)

    Dash, Sadhana; Nandi, Basanta K.; Sett, Priyanka

    2016-06-01

    We introduce the use of the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes that involve fragmentation processes. This provides a natural connection to the available state-of-the-art models for multiparticle production in hadron-hadron collisions, which involve QCD parton fragmentation and hadronization. The Weibull distribution describes the multiplicity data at the most recent LHC energies better than the single negative binomial distribution.

  14. A Weibull distribution accrual failure detector for cloud computing.

    Science.gov (United States)

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component used to build high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
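
    The paper's detector itself is not reproduced here; the sketch below only illustrates the general accrual idea (in the style of the phi accrual detector) with a Weibull fit of heartbeat inter-arrival times, using invented data and window handling.

        import numpy as np
        from scipy.stats import weibull_min

        def suspicion_level(inter_arrivals, elapsed_since_last):
            """Accrual-style suspicion from a Weibull fit of heartbeat inter-arrivals
            (a generic sketch, not the Weibull Distribution Failure Detector itself).
            Larger values indicate the monitored process is more likely to have failed."""
            k, loc, scale = weibull_min.fit(inter_arrivals, floc=0)
            p_alive = weibull_min.sf(elapsed_since_last, k, loc=loc, scale=scale)
            return -np.log10(max(p_alive, 1e-300))

        history = np.array([0.98, 1.05, 1.10, 0.95, 1.20, 1.02, 1.08])  # seconds, made up
        print(suspicion_level(history, elapsed_since_last=3.0))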

  15. On Generalized Upper(k) Record Values From Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Jerin Paul

    2015-09-01

    Full Text Available In this paper we study the generalized upper(k) record values arising from the Weibull distribution. Expressions for the moments and product moments of these generalized upper(k) record values are derived. Some properties of generalized upper(k) record values which characterize the Weibull distribution have been established. Also, some distributional properties of generalized upper(k) record values arising from the Weibull distribution are considered and used to suggest an estimator for the shape parameter of the Weibull distribution. The location and scale parameters are estimated using the Best Linear Unbiased Estimation procedure. Prediction of a future record using the Best Linear Unbiased Predictor has been studied. A real-life data set is used to illustrate the results generated in this work.
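
    As a small computational illustration of upper k-record values (under the common convention that a new k-record occurs whenever an observation exceeds the current k-th largest value seen so far; conventions for the initial observations vary), consider the sketch below.

        import heapq

        def upper_k_records(seq, k):
            """Successive upper k-record values of seq (a sketch using one common
            convention; the paper's generalized upper(k) records are defined formally)."""
            heap, records = [], []
            for x in seq:
                if len(heap) < k:
                    heapq.heappush(heap, x)
                    if len(heap) == k:
                        records.append(heap[0])      # initial k-record value
                elif x > heap[0]:
                    heapq.heapreplace(heap, x)
                    records.append(heap[0])          # new k-th largest => new k-record
            return records

        print(upper_k_records([3, 7, 2, 9, 4, 11, 10, 15], k=2))   # [3, 7, 9, 10, 11]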

  16. Comparison of estimation methods for fitting Weibull distribution to ...

    African Journals Online (AJOL)

    Tersor

    JOURNAL OF RESEARCH IN FORESTRY, WILDLIFE AND ENVIRONMENT VOLUME 7, No.2 SEPTEMBER, 2015. ... method was more accurate in fitting the Weibull distribution to the natural stand. ... appropriate for mixed age group.

  17. Weibull model of Multiplicity Distribution in hadron-hadron collisions

    CERN Document Server

    Dash, Sadhana

    2014-01-01

    We introduce the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes involving fragmentation. This gives a natural connection to the available state-of-the-art models for multi-particle production in hadron-hadron collisions involving QCD parton fragmentation and hadronization.

  18. On the Weibull distribution for wind energy assessment

    DEFF Research Database (Denmark)

    Batchvarova, Ekaterina; Gryning, Sven-Erik

    2014-01-01

    The two parameter Weibull distribution is traditionally used to describe the long term fluctuations in the wind speed as part of the theoretical framework for wind energy assessment of wind farms. The Weibull distribution is described by a shape and a scale parameter. Here, based on recent long-term measurements performed by a wind lidar, the vertical profile of the shape parameter will be discussed for a sub-urban site, a coastal site and a marine site. The profile of the shape parameter was found to be substantially different over land and sea. A parameterization of the vertical behavior of the shape...
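
    For orientation, the shape and scale parameters of a wind-speed Weibull distribution are often estimated from the sample mean and standard deviation via the empirical (Justus-type) approximation; the sketch below uses synthetic wind speeds and is not part of the lidar analysis in the record.

        import numpy as np
        from math import gamma

        def weibull_k_c(wind_speeds):
            """Empirical-moment estimates of Weibull shape k and scale c
            (Justus approximation, valid roughly for 1 <= k <= 10; a sketch)."""
            v = np.asarray(wind_speeds, dtype=float)
            k = (v.std(ddof=1) / v.mean()) ** (-1.086)
            c = v.mean() / gamma(1.0 + 1.0 / k)
            return k, c

        speeds = np.random.default_rng(0).weibull(2.2, 10_000) * 8.0   # synthetic sample
        print(weibull_k_c(speeds))   # should recover roughly k ~ 2.2, c ~ 8.0 m/s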

  19. ASYMPTOTIC PROPERTIES OF MLE FOR WEIBULL DISTRIBUTION WITH GROUPED DATA

    Institute of Scientific and Technical Information of China (English)

    XUE Hongqi; SONG Lixin

    2002-01-01

    A grouped data model for the Weibull distribution is considered. Under mild conditions, the maximum likelihood estimators (MLE) are shown to be identifiable, strongly consistent, asymptotically normal, and to satisfy the law of the iterated logarithm. A Newton iteration algorithm is also considered, which converges to the unique solution of the likelihood equation. Moreover, we extend these results to a random case.
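
    A minimal version of the grouped-data likelihood underlying such results (a multinomial over interval probabilities) can be written as below; the bin edges and counts are invented and only the two-parameter Weibull case is shown.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import weibull_min

        edges = np.array([0., 2., 4., 6., 8., np.inf])   # grouping intervals (made up)
        counts = np.array([5, 18, 27, 14, 6])            # observed failures per interval

        def neg_loglik(params):
            k, scale = np.exp(params)                    # log-parametrized to stay positive
            p = np.diff(weibull_min.cdf(edges, k, scale=scale))   # interval probabilities
            return -np.sum(counts * np.log(np.clip(p, 1e-300, None)))

        res = minimize(neg_loglik, x0=np.log([1.5, 5.0]), method="Nelder-Mead")
        print(np.exp(res.x))                             # grouped-data MLE of (shape, scale)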

  20. Reliability Analysis of DOOF for Weibull Distribution

    Institute of Scientific and Technical Information of China (English)

    陈文华; 崔杰; 樊小燕; 卢献彪; 相平

    2003-01-01

    A hierarchical Bayesian method for estimating the failure probability under DOOF, taking the quasi-Beta distribution as the prior distribution, is proposed in this paper. The weighted Least Squares Estimate method was used to obtain the formula for computing reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of accelerated life test data for the connector.

  1. Packing fraction of particles with a Weibull size distribution

    Science.gov (United States)

    Brouwers, H. J. H.

    2016-07-01

    This paper addresses the void fraction of polydisperse particles with a Weibull (or Rosin-Rammler) size distribution. It is demonstrated that the governing parameters of this distribution can be uniquely related to those of the lognormal distribution. Hence, an existing closed-form expression that predicts the void fraction of particles with a lognormal size distribution can be transformed into an expression for Weibull distributions. Both expressions contain the contraction coefficient β. Like the monosized void fraction φ1, it is a physical parameter which depends only on the particles' shape and their state of compaction. Based on a consideration of the scaled binary void contraction, a linear relation for (1 - φ1)β as a function of φ1 is proposed, with proportionality constant B, depending on the state of compaction only. This is validated using computational and experimental packing data concerning random close and random loose packing arrangements. Finally, using this β, the closed-form analytical expression governing the void fraction of Weibull distributions is thoroughly compared with empirical data reported in the literature, and good agreement is found. Furthermore, the present analysis yields an algebraic equation relating the void fractions of monosized particles at different compaction states. This expression appears to be in good agreement with a broad collection of random close and random loose packing data.

  2. Reliability analysis of DOOF for Weibull distribution

    Institute of Scientific and Technical Information of China (English)

    陈文华; 崔杰; 樊晓燕; 卢献彪; 相平

    2003-01-01

    A hierarchical Bayesian method for estimating the failure probability pi under DOOF, taking the quasi-Beta distribution B(pi-1, 1, 1, b) as the prior distribution, is proposed in this paper. The weighted Least Squares Estimate method was used to obtain the formula for computing reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of accelerated life test data for the connector.

  3. ASYMPTOTIC PROPERTIES OF MLE FOR WEIBULL DISTRIBUTION WITH GROUPED DATA

    Institute of Scientific and Technical Information of China (English)

    XUE Hongqi; SONG Lixin

    2002-01-01

    A grouped data model for the Weibull distribution is considered. Under mild conditions, the maximum likelihood estimators (MLE) are shown to be identifiable, strongly consistent, asymptotically normal, and to satisfy the law of the iterated logarithm. A Newton iteration algorithm is also considered, which converges to the unique solution of the likelihood equation. Moreover, we extend these results to a random case.

  4. A New Approach for Parameter Estimation of Mixed Weibull Distribution:A Case Study in Spindle

    Institute of Scientific and Technical Information of China (English)

    Dongwei Gu; Zhiqiong Wang; Guixiang Shen; Yingzhi Zhang; Xilu Zhao

    2016-01-01

    In order to improve the accuracy and efficiency of the graphical method and maximum likelihood estimation (MLE) in mixed Weibull distribution parameter estimation, Graphical-GA, which combines the advantages of the graphical method and the genetic algorithm (GA), is proposed. Firstly, with the analysis of the Weibull probability paper (WPP), a mixed Weibull model is identified for data fitting. Secondly, observed values of the shape and scale parameters are obtained by the graphical method with least squares, and the parameters of the mixed Weibull are then optimized with the GA. Thirdly, a comparative analysis of the graphical method, piecewise Weibull and two-Weibull shows that the Graphical-GA mixed Weibull is the best. Finally, the spindle MTBF point estimate and interval estimate are obtained based on the mixed Weibull distribution. The results indicate that Graphical-GA is an effective improvement and that the evaluation of the spindle can provide a basis for design and reliability growth.

  5. Polynomial approximations of the Normal to Weibull Distribution transformation

    Directory of Open Access Journals (Sweden)

    Andrés Feijóo

    2014-09-01

    Full Text Available Some of the tools that are generally employed in power system analysis need to use approaches based on statistical distributions for simulating the cumulative behavior of the different system devices, for example the probabilistic load flow. The presence of wind farms in power systems has increased the use of Weibull and Rayleigh distributions among them. Not only the distributions themselves, but also satisfying certain constraints such as correlation between series of data or even autocorrelation, can be of importance in the simulation. Correlated Weibull or Rayleigh distributions can be obtained by transforming correlated Normal distributions, and it can be observed that certain statistical values such as the means and the standard deviations tend to be retained when operating such transformations, although why this happens is not evident. The objective of this paper is to analyse the consequences of using such transformations. The methodology consists of comparing the results obtained by means of a direct transformation with those obtained by means of approximations based on first and second degree polynomials. Simulations have been carried out with series of data which can be interpreted as wind speeds. The use of polynomial approximations gives accurate results in comparison with direct transformations and provides an approach that helps explain why the statistical values are retained during the transformations.
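
    The "direct transformation" that the polynomial approximations are compared against is the usual probability-integral transform applied margin by margin; a sketch with illustrative shape/scale values and a made-up correlation:

        import numpy as np
        from scipy.stats import norm, weibull_min

        rng = np.random.default_rng(1)

        # correlated standard Normal pair (correlation 0.8), mapped margin-wise to Weibull
        cov = [[1.0, 0.8], [0.8, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=50_000)

        k, c = 2.0, 9.0                                   # illustrative Weibull shape/scale
        w = weibull_min.ppf(norm.cdf(z), k, scale=c)      # w = F_W^{-1}(Phi(z)) per margin

        print(np.corrcoef(w[:, 0], w[:, 1])[0, 1])        # correlation is largely retained
        print(w.mean(axis=0), w.std(axis=0))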

  6. Closed form expressions for moments of the beta Weibull distribution

    Directory of Open Access Journals (Sweden)

    Gauss M Cordeiro

    2011-06-01

    Full Text Available The beta Weibull distribution was first introduced by Famoye et al. (2005) and studied by these authors and Lee et al. (2007). However, they do not give explicit expressions for the moments. In this article, we derive explicit closed form expressions for the moments of this distribution, which generalize results available in the literature for some sub-models. We also obtain expansions for the cumulative distribution function and Rényi entropy. Further, we discuss maximum likelihood estimation and provide formulae for the elements of the expected information matrix. We also demonstrate the usefulness of this distribution on a real data set.

  7. Discriminating between Weibull distributions and log-normal distributions emerging in branching processes

    Science.gov (United States)

    Goh, Segun; Kwon, H. W.; Choi, M. Y.

    2014-06-01

    We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.
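
    The maximum likelihood discrimination step can be illustrated by comparing the maximized log-likelihoods of the two candidate fits; the sketch below uses synthetic data and is not the paper's derivation.

        import numpy as np
        from scipy.stats import weibull_min, lognorm

        x = weibull_min.rvs(1.8, scale=3.0, size=2000, random_state=42)   # synthetic sample

        weib_params = weibull_min.fit(x, floc=0)
        logn_params = lognorm.fit(x, floc=0)

        ll_weibull = np.sum(weibull_min.logpdf(x, *weib_params))
        ll_lognorm = np.sum(lognorm.logpdf(x, *logn_params))

        # positive T favours the Weibull hypothesis, negative favours the log-normal
        T = ll_weibull - ll_lognorm
        print(T)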

  8. A Study on The Mixture of Exponentiated-Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Adel Tawfik Elshahat

    2016-12-01

    Full Text Available Mixtures of measures or distributions occur frequently in the theory and applications of probability and statistics. In the simplest case it may, for example, be reasonable to assume that one is dealing with the mixture in given proportions of a finite number of normal populations with different means or variances. The mixture parameter may also be denumerably infinite, as in the theory of sums of a random number of random variables, or continuous, as in the compound Poisson distribution. The use of finite mixture distributions, to control for unobserved heterogeneity, has become increasingly popular among those estimating dynamic discrete choice models. One of the barriers to using mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log likelihood function. In this thesis, the maximum likelihood estimators have been obtained for the parameters of the mixture of exponentiated Weibull distributions when a sample is available from a censoring scheme. The maximum likelihood estimators of the parameters and the asymptotic variance covariance matrix have also been obtained. A numerical illustration for these new results is given.

  9. Designing a Repetitive Group Sampling Plan for Weibull Distributed Processes

    Directory of Open Access Journals (Sweden)

    Aijun Yan

    2016-01-01

    Full Text Available Acceptance sampling plans are useful tools to determine whether the submitted lots should be accepted or rejected. An efficient and economic sampling plan is very desirable for the high quality levels required by the production processes. The process capability index CL is an important quality parameter to measure the product quality. Utilizing the relationship between the CL index and the nonconforming rate, a repetitive group sampling (RGS) plan based on the CL index is developed in this paper when the quality characteristic follows the Weibull distribution. The optimal plan parameters of the proposed RGS plan are determined by satisfying the commonly used producer’s risk and consumer’s risk at the same time by minimizing the average sample number (ASN), and then tabulated for different combinations of acceptance quality level (AQL) and limiting quality level (LQL). The results show that the proposed plan has better performance than the single sampling plan in terms of ASN. Finally, the proposed RGS plan is illustrated with an industrial example.

  10. Constant-step stress accelerated life test of VFD under Weibull distribution case

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian-ping; GENG Xin-min

    2005-01-01

    Constant-step stress accelerated life test of Vacuum Fluorescent Display (VFD) was conducted with increased cathode temperature. Statistical analysis was done by applying the Weibull distribution for describing the life, and the Least Square Method (LSM) for estimating the Weibull parameters. Self-designed special software was used to predict the VFD life. Numerical results showed that the average life of VFD is over 30000 h, that the VFD life follows a Weibull distribution, and that the life-stress relationship fully satisfies a linear Arrhenius equation. Accurate calculation of the key parameter enabled rapid estimation of VFD life.

  11. Exponentiated Weibull distribution approach based inflection S-shaped software reliability growth model

    Directory of Open Access Journals (Sweden)

    B.B. Sagar

    2016-09-01

    Full Text Available The aim of this paper is to estimate the number of defects in software and remove them successfully. This paper incorporates a Weibull distribution approach along with inflection S-shaped Software Reliability Growth Models (SRGM); in this combination, a two-parameter Weibull distribution methodology is used. The Relative Prediction Error (RPE) is calculated to assess the validity of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, which show that the proposed software reliability growth model provides better estimates for removing the defects. This paper presents a software reliability growth model that combines features of both the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, and it provides help to researchers and software industries in developing highly reliable software products.

  12. On modeling of lifetime data using two-parameter Gamma and Weibull distributions

    NARCIS (Netherlands)

    Shanker, Rama; Shukla, Kamlesh Kumar; Shanker, Ravi; Leonida, Tekie Asehun

    2016-01-01

    The analysis and modeling of lifetime data are crucial in almost all applied sciences including medicine, insurance, engineering, behavioral sciences and finance, amongst others. The main objective of this paper is to have a comparative study of two-parameter gamma and Weibull distributions for mode

  13. An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand

    Science.gov (United States)

    Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.

    2005-01-01

    An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…

  14. Bimodal versus Weibull wind speed distributions: an analysis of wind energy potential in La Venta, Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Jaramillo, O.A.; Borja, M.A.

    2004-07-01

    The International Standard IEC 61400-12 and other international recommendations suggest the use of the two-parameter Weibull probability distribution function (PDF) to estimate the Annual Energy Production (AEP) of a wind turbine. Most of the commercial software uses the unimodal Weibull PDF as the default option to carry out estimations of AEP, which, in turn, are used to optimise wind farm layouts. Furthermore, AEP is essential data for assessing the economic feasibility of a wind power project. However, in some regions of the world, the use of these widely adopted and recommended methods leads to incorrect results. This is the case for the region of La Ventosa in Mexico, where the frequency of the wind speed shows a bimodal distribution. In this work, mathematical formulations using a Weibull PDF and a bimodal distribution are established to compare the AEP, the capacity factor and the levelised production cost for a specific wind turbine. By combining one year of wind speed data with the hypothetical power performance of the Vestas V27-225 kW wind turbine, it was found that using the Weibull PDF underestimates AEP (and thus the capacity factor) by about 12%. (author)
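
    The AEP comparison rests on integrating a turbine power curve against the assumed wind-speed density; the sketch below uses a simplified, hypothetical power curve (not the measured V27-225 kW performance) and a two-parameter Weibull PDF.

        import numpy as np
        from scipy.stats import weibull_min

        def aep_weibull(k, c, rated_kw=225.0, v_in=3.5, v_rated=14.0, v_out=25.0):
            """Annual energy production (kWh) from a simplified power curve integrated
            against a Weibull wind-speed pdf (a sketch; cut-in/rated/cut-out values
            and the cubic ramp are illustrative assumptions)."""
            v = np.linspace(0.0, 30.0, 3001)
            power = np.where((v >= v_in) & (v < v_rated),
                             rated_kw * ((v - v_in) / (v_rated - v_in)) ** 3, 0.0)
            power = np.where((v >= v_rated) & (v <= v_out), rated_kw, power)
            return 8760.0 * np.trapz(power * weibull_min.pdf(v, k, scale=c), v)

        print(aep_weibull(k=2.0, c=9.0))   # kWh per year for this illustrative setup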

  15. Modified Weibull Distribution for Analyzing the Tensile Strength of Bamboo Fibers

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2014-12-01

    Full Text Available There is growing evidence that the standard Weibull strength distribution is not always accurate for describing the variability in tensile strength and its dependence on the gauge size of brittle fibers. In this work, a modified Weibull model incorporating the diameter variation of bamboo fiber is proposed to investigate the effect of fiber length and diameter on the tensile strength. Fiber strengths are obtained for lengths ranging from 20 to 60 mm and diameters ranging from 196.6 to 584.3 μm through tensile tests. It is shown that as the within-fiber diameter variation increases, the fracture strength of the bamboo fiber decreases. In addition, the accuracy of weak-link scaling predictions based on the standard and modified Weibull distributions is assessed, which indicates that the modified distribution provides better correlation with the experimental data than the standard model. The result highlights the accuracy of the modified Weibull model for characterizing the strength and predicting the size dependence of bamboo fiber.

  16. Transverse Momentum Distribution in Heavy Ion Collision using q-Weibull Formalism

    CERN Document Server

    Dash, Sadhana

    2016-01-01

    We have implemented the Tsallis q-statistics in the Weibull model of particle production known as the q-Weibull distribution to describe the transverse-momentum (pT ) distribution of the charged hadrons at mid-rapidity measured at RHIC and LHC energies. The model describes the data remarkably well for the entire pT range measured in nucleus-nucleus and nucleon-nucleon collisions. The proposed distribution is based on the non-extensive Tsallis q-statistics which replaces the usual thermal equilibrium assumption of hydrodynamical models. The parameters of the distribution can be related to the various aspects of complex dynamics associated with such collision process.

  17. Probabilistic analysis of glass elements with three-parameter Weibull distribution; Analisis probabilistico de elementos de vidrio recocido mediante una distribucion triparametrica Weibull

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-10-01

    Glass and ceramics present brittle behaviour, so a large scatter in the test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is not deterministic but probabilistic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used in fitting material strength results, although the method of using it is not always correct. Firstly, in this work, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was performed. Then, the finite element models made for each type of test, the adjustment of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, δ: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. Summarizing, this work aims to generalize the use of the three-parameter Weibull function in structural glass elements with stress distributions not analytically described, allowing the proposed probabilistic model to be applied to general loading distributions. (Author)

  18. Modelling diameter distribution of Tetraclinis articulata in Tunisia using normal and Weibull distributions with parameters depending on stand variables

    Directory of Open Access Journals (Sweden)

    Sghaier T

    2016-10-01

    Full Text Available The objective of this study was to evaluate the effectiveness of both the Normal and the two-parameter Weibull distributions in describing the diameter distribution of Tetraclinis articulata stands in north-east Tunisia. The parameters of the Weibull function were estimated using the moments method and maximum likelihood approaches. The data used in this study came from temporary plots. The three diameter distribution models were compared firstly by estimating the parameters of the distribution directly from individual tree measurements taken in each plot (parameter estimation method), and secondly by predicting the same parameters from stand variables (parameter prediction method). The comparison was based on bias, mean absolute error, mean square error and the Reynolds’ index error (as a percentage). On the basis of the parameter estimation method, the Normal distribution gave slightly better results, whereas the Weibull distribution with the maximum likelihood approach gave the best results for the parameter prediction method. Hence, in the latter case, the Weibull distribution with the maximum likelihood approach appears to be the most suitable for estimating the parameters and reducing the different comparison criteria for the distribution of trees by diameter class in Tetraclinis articulata forests in Tunisia.

  19. Comparison of Estimators for Exponentiated Inverted Weibull Distribution Based on Grouped Data

    OpenAIRE

    2014-01-01

    In many situations, instead of a complete sample, data are available only in grouped form. This paper presents estimation of population parameters for the exponentiated inverted Weibull distribution based on grouped data with equi- and unequi-spaced grouping. Several alternative estimation schemes, such as the method of maximum likelihood, least lines, least squares, minimum chi-square, and modified minimum chi-square, are considered. Since the different methods of estimation do not provide closed form solutions, a numerical procedure is applied.

  20. An EOQ inventory model for items with ramp type demand, three-parameter Weibull distribution deterioration and starting with shortage

    Directory of Open Access Journals (Sweden)

    Jain Sanjay

    2010-01-01

    Full Text Available In this paper an inventory model is developed with ramp type demand, starting with shortage, and three-parameter Weibull distribution deterioration. A brief analysis of the cost involved is carried out with an example.

  1. A spatial scan statistic for survival data based on Weibull distribution.

    Science.gov (United States)

    Bhatt, Vijaya; Tiwari, Neeraj

    2014-05-20

    The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions.

  2. Optimal Replenishment Policy for Weibull-Distributed Deteriorating Items with Trapezoidal Demand Rate and Partial Backlogging

    Directory of Open Access Journals (Sweden)

    Lianxia Zhao

    2016-01-01

    Full Text Available An inventory model for Weibull-distributed deteriorating items is considered in this paper so as to minimize the total cost per unit time. The model starts with shortage, allows partial backlogging, and has a trapezoidal demand rate. By analyzing the model, an efficient solution procedure is proposed to determine the optimal replenishment policy; the optimal order quantity and the average total cost are also obtained. Finally, numerical examples are provided to illustrate the theoretical results, and a sensitivity analysis of the major parameters with respect to the stability of the optimal solution is also carried out.

  3. Comparison of Estimators for Exponentiated Inverted Weibull Distribution Based on Grouped Data

    Directory of Open Access Journals (Sweden)

    Amal S. Hassan

    2014-04-01

    Full Text Available In many situations, instead of a complete sample, data are available only in grouped form. This paper presents estimation of population parameters for the exponentiated inverted Weibull distribution based on grouped data with equi- and unequi-spaced grouping. Several alternative estimation schemes, such as the method of maximum likelihood, least lines, least squares, minimum chi-square, and modified minimum chi-square, are considered. Since the different methods of estimation do not provide closed form solutions, a numerical procedure is applied. The root mean squared error of the resulting estimators is used as a comparison criterion to measure both the accuracy and the precision of each parameter estimate.

  4. Deterministic inventory model for items with Time varying demand, weibull distribution deterioration and shortages

    Directory of Open Access Journals (Sweden)

    Wu Kun-Shan

    2002-01-01

    Full Text Available In this paper, an EOQ inventory model is developed for an inventory depleted not only by time-varying demand but also by Weibull distribution deterioration, in which the inventory is permitted to start with shortages and end without shortages. A theory is developed to obtain the optimal solution of the problem; it is then illustrated with the aid of several numerical examples. Moreover, we also assume that the holding cost is a continuous, non-negative and non-decreasing function of time in order to extend the EOQ model. Finally, sensitivity of the optimal solution to changes in the values of different system parameters is also studied.

  5. The Transmuted Geometric-Weibull distribution: Properties, Characterizations and Regression Models

    Directory of Open Access Journals (Sweden)

    Zohdy M Nofal

    2017-06-01

    Full Text Available We propose a new lifetime model called the transmuted geometric-Weibull distribution. Some of its structural properties, including ordinary and incomplete moments, quantile and generating functions, probability weighted moments, Rényi and q-entropies and order statistics, are derived. The maximum likelihood method for estimating the model parameters is discussed and assessed by means of a Monte Carlo simulation study. A new location-scale regression model is introduced based on the proposed distribution. The new distribution is applied to two real data sets to illustrate its flexibility. Empirical results indicate that the proposed distribution can be an alternative to other lifetime models available in the literature for modeling real data in many areas.

  6. Charged particle multiplicity and transverse energy distribution using Weibull-Glauber approach in heavy-ion collisions

    CERN Document Server

    Behera, Nirbhay K; Naik, Bharati; Nandi, Basanta K; Pani, Tanmay

    2016-01-01

    The charged particle multiplicity distribution and the transverse energy distribution measured in heavy-ion collisions at top RHIC and LHC energies are described using a two-component model approach based on the convolution of the Monte Carlo Glauber model with the Weibull model for particle production. The model successfully describes the multiplicity and transverse energy distributions of minimum bias collision data for a wide range of energies. We also propose that the Weibull-Glauber model can be used to determine the centrality classes in heavy-ion collisions as an alternative to the conventional Negative Binomial distribution for particle production.

  7. Analysis of wind speed distributions: Wind distribution function derived from minimum cross entropy principles as better alternative to Weibull function

    Energy Technology Data Exchange (ETDEWEB)

    Kantar, Yeliz Mert; Usta, Ilhan [Department of Statistics, Anadolu University, Eskisehir 26470 (Turkey)

    2008-05-15

    In this study, the minimum cross entropy (MinxEnt) principle is applied for the first time to the wind energy field. This principle allows the inclusion of previous information on a wind speed distribution and covers the maximum entropy (MaxEnt) principle, which is also discussed by Li and Li and Ramirez as special cases in their wind power studies. The MinxEnt probability density functions (pdfs) derived from the MinxEnt principle are used to determine the diurnal, monthly, seasonal and annual wind speed distributions. A comparison between MinxEnt pdfs defined on the basis of the MinxEnt principle and the Weibull pdf on wind speed data, which are taken from different sources and measured in various regions, is conducted. The wind power densities of the considered regions obtained from Weibull and MinxEnt pdfs are also compared. The results indicate that the pdfs derived from the MinxEnt principle fit a variety of measured wind speed data better than the conventionally applied empirical Weibull pdf. Therefore, it is shown that the MinxEnt principle can be used as an alternative method to estimate both wind distribution and wind power accurately. (author)

  8. Bonus-Malus System with the Claim Frequency Distribution is Geometric and the Severity Distribution is Truncated Weibull

    Science.gov (United States)

    Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.

    2016-01-01

    A Bonus-Malus system is said to be optimal if it is financially balanced for insurance companies and fair for policyholders. Previous research about Bonus-Malus systems concerns the determination of the risk premium applied to all of the severity guaranteed by the insurance company. In fact, not all of the severity proposed by the policyholder may be covered by the insurance company. When the insurance company sets a maximum bound on the severity incurred, it is necessary to modify the model of the severity distribution into a bounded severity distribution. In this paper, an optimal Bonus-Malus system whose claim frequency component has a geometric distribution and whose severity component has a truncated Weibull distribution is discussed. The number of claims is considered to follow a Poisson distribution, and the expected number λ is exponentially distributed, so the number of claims has a geometric distribution. The severity for a given parameter θ is considered to have a truncated exponential distribution, and θ is modelled using the Levy distribution, so the severity has a truncated Weibull distribution.

  9. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-09-01

    Full Text Available Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher than basic conditional levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the assumed acceleration model is a log-linear model. Constant stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Some numerical methods, such as Laplace and Markov Chain Monte Carlo methods, are used to solve the complicated integrals.

  10. Optimal pricing and lot-sizing decisions under Weibull distribution deterioration and trade credit policy

    Directory of Open Access Journals (Sweden)

    Manna S.K.

    2008-01-01

    Full Text Available In this paper, we consider the problem of simultaneous determination of retail price and lot-size (RPLS under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profits when the credit period is greater than the replenishment cycle length. Sensitivity analysis of the solution to changes in the value of input parameters of the base example is also discussed.

  11. A Weibull distribution with power-law tails that describes the first passage time processes of foreign currency exchanges

    Science.gov (United States)

    Sazuka, Naoya; Inoue, Jun-Ichi

    2007-03-01

    A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorenz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate the average waiting time, which is an important measure for estimating the time customers wait until the next price change after they log in to their computer systems. By assuming that the first passage time distribution might change its shape from the Weibull to the power-law at some critical time, we evaluate the averaged waiting time by means of the renewal-reward theorem. We find that our correction of the tails of the distribution makes the averaged waiting time much closer to the value obtained from empirical data analysis. We also discuss the deviation from the estimated average waiting time by deriving the waiting time distribution directly. These results make us conclude that the first passage process of the foreign currency exchange rates is well described by a Weibull distribution with power-law tails.
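
    For reference, the Gini coefficient of a two-parameter Weibull distribution has the closed form G = 1 - 2**(-1/m), where m is the shape parameter; the quick numerical check below (with an illustrative m) is not the paper's empirical analysis.

        import numpy as np
        from scipy.stats import weibull_min

        m = 0.59                                    # illustrative Weibull shape parameter
        analytic = 1.0 - 2.0 ** (-1.0 / m)          # closed-form Gini coefficient

        # Monte Carlo estimate of the same Gini coefficient from a sorted sample
        x = np.sort(weibull_min.rvs(m, scale=1.0, size=200_000, random_state=7))
        n = x.size
        mc = 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1.0) / n

        print(analytic, mc)   # the two values should agree closely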

  12. Weibull Distribution for Estimating the Parameters and Application of Hilbert Transform in case of a Low Wind Speed at Kolaghat

    Directory of Open Access Journals (Sweden)

    P Bhattacharya

    2016-09-01

    Full Text Available The wind resource varies with the time of day and the season of the year, and even to some extent from year to year. Wind energy has inherent variability and hence it has been described by distribution functions. In this paper, we present some methods for estimating the Weibull parameters in the case of low wind speed characterization, namely the shape parameter (k) and the scale parameter (c), and we characterize the discrete wind data sample by the discrete Hilbert transform. The Weibull distribution is an important distribution, especially for reliability and maintainability analysis. Suitable values for both the shape and scale parameters of the Weibull distribution are important for selecting locations for installing wind turbine generators. The scale parameter of the Weibull distribution is also important for determining whether a wind farm is good or not. The use of the discrete Hilbert transform (DHT) for wind speed characterization also opens a new application area for the DHT beyond digital signal processing. In this paper, the discrete Hilbert transform has been applied to characterize wind sample data measured at the College of Engineering and Management, Kolaghat, East Midnapore, India in January 2011.

  13. Sequential Testing of Hypotheses Concerning the Reliability of a System Modeled by a Two-Parameter Weibull Distribution.

    Science.gov (United States)

    1981-12-01

    the variance of point estimators are given by Mendenhall and Scheaffer (Ref 17:269), for both biased and unbiased estimations. In addition to this...Weibull Distribution. Thesis, Wright-Patterson AFB, Ohio: Air Force Institute of Technology, December 1980. 17. Mendenhall, W. and R. L. Scheaffer

  14. Probabilistic Assessment of Earthquake Hazards: a Comparison among Gamma, Weibull, Generalized Exponential and Lognormal Distributions

    Science.gov (United States)

    Pasari, S.

    2013-05-01

    Earthquake recurrence interval is one of the important ingredients towards probabilistic seismic hazard assessment (PSHA) for any location. Weibull, gamma, generalized exponential and lognormal distributions are well-established probability models for this recurrence interval estimation. Moreover, these models share many important characteristics among themselves. In this paper, we aim to compare the effectiveness of these models in recurrence interval estimation and eventually in hazard analysis. To assess the appropriateness of these models, we use a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20°-32° N and 87°-100° E). The model parameters are estimated using the modified maximum likelihood estimator (MMLE). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e. 2012). Moreover, this study shows that the generalized exponential distribution fits the above data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.

  15. Modeling Seed Germination of Ricinus communis Using Hydrothermal Time Model Developed on the Basis of Weibull Distribution

    Directory of Open Access Journals (Sweden)

    H Akbari

    2016-02-01

    Full Text Available Introduction: Temperature and water potential are two of the most important environmental factors regulating seed germination. The germination response of a population of seeds to temperature and water potential can be described on the basis of a hydrothermal time (HTT) model. Regardless of the wide use of HTT models to simulate germination, little research has critically examined the assumption that the base water potential within these models is normally distributed. An alternative to the normal distribution that can fit a range of distribution types is the Weibull distribution. Using germination data of castor bean (Ricinus communis L.) over a range of water potentials and sub-optimal temperatures, we compared the utility of the normal and Weibull distributions in estimating the base water potential (ψb). The accuracy of their respective HTT models in predicting germination percentage across the sub-optimal temperature range was also examined. Materials and Methods: Castor bean seed germination was tested across a range of water potentials (0, -0.3, -0.6 and -0.9 MPa) at the sub-optimal range of temperature (from 10 to 35 ˚C, with 5 ˚C intervals). Osmotic solutions were prepared by dissolving polyethylene glycol 8000 in distilled water according to the Michel (1983) equation for a given temperature. Seed germination was tested on 4 replicates of 50 seeds in moist paper towels in the incubator. The HTT models based on the normal and Weibull distributions were fitted to data from all combinations of temperatures and water potentials using the PROC NLMIXED procedure in SAS. Results and Discussion: Based on both the normal and Weibull distribution functions, the hydrotime constant and base water potential for castor bean seed germination declined with increasing temperature. Decreasing values of the base water potential showed the greater need for water uptake for germination at lower temperatures, and the decreasing hydrotime constant indicated an increase

  16. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    Science.gov (United States)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another will change the value of the standard deviation in the opposite direction. With perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range

  17. Bayesian Analysis of the Survival Function and Failure Rate of Weibull Distribution with Censored Data

    Directory of Open Access Journals (Sweden)

    Chris Bambey Guure

    2012-01-01

    Full Text Available The survival function of the Weibull distribution determines the probability that a unit or an individual will survive beyond a certain specified time, while the failure rate is the rate at which a randomly selected individual known to be alive at time t will fail in the interval (t, t + Δt). The classical approach for estimating the survival function and the failure rate is the maximum likelihood method. In this study, we strive to determine the best method by comparing the classical maximum likelihood estimator against Bayesian estimators using an informative prior and a proposed data-dependent prior known as a generalised noninformative prior. The Bayesian estimation is considered under three loss functions. Due to the complexity in dealing with the integrals in the Bayesian estimator, Lindley's approximation procedure is employed to approximate the ratio of the integrals. For the purpose of comparison, the mean squared error (MSE) and the absolute bias are obtained. This study is conducted via simulation by utilising different sample sizes. We observed from the study that the generalised prior we assumed performed better than the others under the linear exponential loss function with respect to MSE and under the general entropy loss function with respect to absolute bias.
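
    A small, self-contained illustration (synthetic data, not the study's simulation) of the quantities being estimated: for a Weibull(k, c) lifetime the survival function is S(t) = exp(-(t/c)^k) and the failure rate is h(t) = (k/c)(t/c)^(k-1); the classical maximum likelihood fit used as the benchmark can be obtained with scipy.

        import numpy as np
        from scipy.stats import weibull_min

        rng = np.random.default_rng(1)
        data = weibull_min.rvs(c=1.8, scale=50.0, size=200, random_state=rng)  # c is the shape

        # Maximum likelihood estimates (location fixed at 0 for the 2-parameter model)
        shape_hat, loc, scale_hat = weibull_min.fit(data, floc=0)

        t = 30.0
        s_t = np.exp(-(t / scale_hat) ** shape_hat)                           # survival at t
        h_t = (shape_hat / scale_hat) * (t / scale_hat) ** (shape_hat - 1)    # failure rate at t
        print(f"shape={shape_hat:.2f}, scale={scale_hat:.1f}, S({t})={s_t:.3f}, h({t})={h_t:.4f}")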

  18. Average capacity for optical wireless communication systems over exponentiated Weibull distribution non-Kolmogorov turbulent channels.

    Science.gov (United States)

    Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng

    2014-06-20

    We model the average channel capacity of optical wireless communication systems for cases of weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L changes from 0 to 200 m, and decreases slowly or tends to a stable value once the propagation length L exceeds 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is greater in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels.

  19. STATISTICAL INFERENCE OF WEIBULL DISTRIBUTION FOR TAMPERED FAILURE RATE MODEL IN PROGRESSIVE STRESS ACCELERATED LIFE TESTING

    Institute of Scientific and Technical Information of China (English)

    WANG Ronghua; FEI Heliang

    2004-01-01

    In this note, the tampered failure rate model is generalized from the step-stress accelerated life testing setting to the progressive stress accelerated life testing for the first time. For the parametric setting where the scale parameter satisfying the equation of the inverse power law is Weibull, maximum likelihood estimation is investigated.

  20. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Directory of Open Access Journals (Sweden)

    Jose Javier Gorgoso-Varela

    2016-04-01

    Full Text Available Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass.

  1. A class of generalized beta distributions, Pareto power series and Weibull power series

    OpenAIRE

    Lemos de Morais, Alice

    2009-01-01

    In this dissertation we work with three classes of probability distributions: one already known in the literature, the class of Beta Generalized distributions (Beta-G), and two new classes introduced in this thesis, based on composing the Pareto and Weibull distributions with the class of discrete power series distributions. We give a general review of the Beta-G class and introduce a special case, the beta generalized logistic distribution of type IV (BGL(IV)). In...

  2. Bayesian Inference of a Finite Mixture of Inverse Weibull Distributions with an Application to Doubly Censoring Data

    Directory of Open Access Journals (Sweden)

    Navid Feroze

    2016-03-01

    Full Text Available The families of mixture distributions have a wide range of applications in different fields such as fisheries, agriculture, botany, economics, medicine, psychology, electrophoresis, finance, communication theory, geology and zoology. They provide the necessary flexibility to model failure distributions of components with multiple failure modes. Mostly, the Bayesian procedure for the estimation of parameters of mixture models is described under the scheme of Type-I censoring. In particular, the Bayesian analysis for mixture models under doubly censored samples has not yet been considered in the literature. The main objective of this paper is to develop the Bayes estimation of inverse Weibull mixture distributions under double censoring. The posterior estimation has been conducted under the assumption of gamma and inverse Levy priors, using the precautionary loss function and the weighted squared error loss function. The comparisons among the different estimators have been made based on analysis of simulated and real life data sets.

  3. Influence of the Determination Methods of K and C Parameters on the Ability of Weibull Distribution to Suitably Estimate Wind Potential and Electric Energy

    Directory of Open Access Journals (Sweden)

    Ruben M. Mouangue

    2014-05-01

    Full Text Available The modeling of the wind speed distribution is of great importance for the assessment of wind energy potential and the performance of wind energy conversion systems. In this paper, the choice between two methods for determining the Weibull parameters shows their influence on the performance of the Weibull distribution. Because of the high frequency of calm winds at the site of Ngaoundere airport, we characterize the wind potential using the Weibull distribution with parameters determined by the modified maximum likelihood method. This approach is compared to the Weibull distribution with parameters determined by the maximum likelihood method, and to the hybrid distribution, which is recommended for wind potential assessment of sites having a nonzero probability of calm. Using data provided by the ASECNA Weather Service (Agency for the Safety of Air Navigation in Africa and Madagascar), we evaluate the goodness of fit of the various fitted distributions to the wind speed data using Q-Q plots, Pearson's coefficient of correlation, the mean wind speed, the mean square error, the energy density and its relative error. It appears from the results that the accuracy of the Weibull distribution with parameters determined by the modified maximum likelihood method is higher than that of the others. This approach is then used to estimate the monthly and annual energy production of the site of Ngaoundere airport. The largest energy contribution is made in March, with 255.7 MWh. It also appears from the results that a wind turbine generator installed on this particular site would not operate for at least half of the time because of the high frequency of calms. For this kind of site, the modified maximum likelihood method proposed by Seguro and Lambert in 2000 is one of the best methods that can be used to determine the Weibull parameters.
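
    The abstract cites the modified maximum likelihood method of Seguro and Lambert (2000); a minimal sketch of that estimator applied to an illustrative binned wind speed frequency table with calms is given below. Only the nonzero-wind bins and the probability of nonzero wind enter the iteration, which is how calms are accounted for; the bin values are assumptions, not the Ngaoundere data.

        import numpy as np

        v = np.array([1, 3, 5, 7, 9, 11], dtype=float)      # bin-centre wind speeds (m/s)
        f = np.array([0.10, 0.20, 0.25, 0.15, 0.08, 0.02])  # relative frequency of each bin
        p_calm = 1.0 - f.sum()                              # remaining probability = calms
        p_wind = f.sum()                                     # P(v > 0)

        k = 2.0                                              # initial guess for the shape
        for _ in range(100):
            num = np.sum(v**k * np.log(v) * f)
            den = np.sum(v**k * f)
            k_new = (num / den - np.sum(np.log(v) * f) / p_wind) ** -1
            if abs(k_new - k) < 1e-6:
                k = k_new
                break
            k = k_new

        c = (np.sum(v**k * f) / p_wind) ** (1.0 / k)         # scale parameter (m/s)
        print(f"P(calm) = {p_calm:.2f}, k = {k:.3f}, c = {c:.2f} m/s")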

  4. PERFORMANCE ANALYSIS OF METHODS FOR ESTIMATING WEIBULL PARAMETERS FOR WIND SPEED DISTRIBUTION IN THE DISTRICT OF MAROUA

    Directory of Open Access Journals (Sweden)

    D. Kidmo Kaoga

    2014-12-01

    Full Text Available In this study, five numerical Weibull distribution methods, namely the maximum likelihood method, the modified maximum likelihood method (MLM), the energy pattern factor method (EPF), the graphical method (GM), and the empirical method (EM), were explored using hourly synoptic data collected from 1985 to 2013 in the district of Maroua in Cameroon. The performance analysis revealed that the MLM was the most accurate method, followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average, compared to -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM is recommended for estimating the scale and shape parameters for an accurate and efficient wind energy potential evaluation.
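
    For reference, two of the estimators compared in the study, the empirical method (EM) and the energy pattern factor method (EPF), have closed forms under their usual textbook definitions; the formulas and the synthetic speed sample below are not taken from the paper.

        import numpy as np
        from scipy.special import gamma

        rng = np.random.default_rng(2)
        v = rng.weibull(2.1, 5000) * 6.0          # synthetic hourly wind speeds (m/s)

        v_mean, v_std = v.mean(), v.std(ddof=1)

        # Empirical method: k from the coefficient of variation, c from the mean speed
        k_em = (v_std / v_mean) ** -1.086
        c_em = v_mean / gamma(1.0 + 1.0 / k_em)

        # Energy pattern factor method: Epf = <v^3> / <v>^3
        epf = np.mean(v**3) / v_mean**3
        k_epf = 1.0 + 3.69 / epf**2
        c_epf = v_mean / gamma(1.0 + 1.0 / k_epf)

        print(f"EM : k = {k_em:.2f}, c = {c_em:.2f} m/s")
        print(f"EPF: k = {k_epf:.2f}, c = {c_epf:.2f} m/s")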

  5. Improvement in mechanical properties of jute fibres through mild alkali treatment as demonstrated by utilisation of the Weibull distribution model.

    Science.gov (United States)

    Roy, Aparna; Chakraborty, Sumit; Kundu, Sarada Prasad; Basak, Ratan Kumar; Majumder, Subhasish Basu; Adhikari, Basudam

    2012-03-01

    Chemically modified jute fibres are potentially useful as natural reinforcement in composite materials. Jute fibres were treated with 0.25%-1.0% sodium hydroxide (NaOH) solution for 0.5-48 h. The hydrophilicity, surface morphology, crystallinity index, thermal and mechanical characteristics of untreated and alkali treated fibres were studied. The two-parameter Weibull distribution model was applied to deal with the variation in mechanical properties of the natural fibres. Alkali treatment enhanced the tensile strength and elongation at break by 82% and 45%, respectively, but decreased the hydrophilicity by 50.5% and the diameter of the fibres by 37%. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Simplified Analysis of the Electric Power Losses for On-Shore Wind Farms Considering Weibull Distribution Parameters

    Directory of Open Access Journals (Sweden)

    Antonio Colmenar-Santos

    2014-10-01

    Full Text Available Electric power losses are constantly present during the service life of wind farms and must be considered in the calculation of the income arising from selling the produced electricity. It is typical to estimate the electrical losses in the design stage as those occurring when the wind farm operates at rated power; nevertheless, it is necessary to determine a method for checking whether the actual losses meet the design requirements during the operation period. In this paper, we show that the electric losses at rated power should not be considered as a reference level, and a simple methodology is developed to analyse and forecast the actual losses in a set period as a function of the wind resource in that period, defined according to the Weibull distribution, and of the characteristics of the wind farm electrical infrastructure. This methodology provides a simple way to determine, in the design phase, and to check, during operation, the actual electricity losses.

  7. A Bayesian estimation on right censored survival data with mixture and non-mixture cured fraction model based on Beta-Weibull distribution

    Science.gov (United States)

    Yusuf, Madaki Umar; Bakar, Mohd. Rizam B. Abu

    2016-06-01

    Models for survival data that include the proportion of individuals who are not subject to the event under study are known as cure fraction models, or simply long-term survival models. The two most common models used to estimate the cure fraction are the mixture model and the non-mixture model. In this work, we present mixture and non-mixture cure fraction models for survival data based on the beta-Weibull distribution. This four-parameter distribution has been proposed as an alternative extension of the Weibull distribution in the analysis of lifetime data. This approach allows the inclusion of covariates in the models; the parameters were estimated under a Bayesian approach using Gibbs sampling methods.

  8. A deterministic inventory model for deteriorating items with selling price dependent demand and three-parameter Weibull distributed deterioration

    Directory of Open Access Journals (Sweden)

    Asoke Kumar Bhunia

    2014-06-01

    Full Text Available In this paper, an attempt is made to develop two inventory models for deteriorating items with variable demand dependent on the selling price and frequency of advertisement of the items. In the first model, shortages are not allowed, whereas in the second they are allowed and partially backlogged with a variable rate dependent on the duration of the waiting time up to the arrival of the next lot. In both models, the deterioration rate follows a three-parameter Weibull distribution and the transportation cost is considered explicitly for replenishing the order quantity. This cost depends on the lot size as well as on the distance from the source to the destination. The corresponding models have been formulated and solved. Two numerical examples have been considered to illustrate the results and the significant features of the results are discussed. Finally, based on these examples, the effects of different parameters on the initial stock level, shortage level (in the case of the second model only), and cycle length, along with the optimal profit, have been studied by sensitivity analyses, varying one parameter at a time while keeping the other parameters fixed.

  9. Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions

    Science.gov (United States)

    2015-12-01


  10. Dependence of Weibull distribution parameters on the CNR threshold in wind lidar data

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph;

    2015-01-01

    The increase in height and area swept by the blades of wind turbines that harvest energy from the air flow in the lower atmosphere have raised a need for better understanding of the structure of the profiles of the wind, its gusts and the monthly to annual long-term, statistical distribution...

  11. Characterization of the wind behavior in the Botucatu-SP region (Brazil) by the Weibull distribution

    Energy Technology Data Exchange (ETDEWEB)

    Gabriel Filho, Luis Roberto Almeida [Universidade Estadual Paulista (CE/UNESP), Tupa, SP (Brazil). Coordenacao de Estagio; Cremasco, Camila Pires [Faculdade de Tecnologia de Presidente Prudente, SP (Brazil); Seraphim, Odivaldo Jose [Universidade Estadual Paulista (FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Faculdade de Engenharia

    2008-07-01

    The wind behavior of a region can be described by frequency distributions that provide the information and characteristics needed for a possible deployment of wind energy harvesting in the region. These characteristics, such as the annual average speed, the variance and standard deviation of the recorded speeds, and the hourly mean wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn must be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. Once this function has been determined, all the wind characteristics mentioned above can be determined accurately. The objective of this work is to characterize the wind behavior in the region of Botucatu-SP and to determine the energy potential for the installation of wind turbines. For the development of the present research, a Young Wind Monitor anemometer from the Campbell company was installed at a height of 10 meters. The experiment was carried out at the Nucleus of Alternative and Renewable Energies (NEAR) of the Agricultural Energy Laboratory of the Department of Agricultural Engineering of UNESP, Faculty of Agronomic Sciences, Lageado Experimental Farm, located in the city of Botucatu-SP. The geographic location is defined by the coordinates 22 deg 51' South latitude and 48 deg 26' West longitude, at an average altitude of 786 meters above sea level. The analysis was carried out using wind speed records from September 2004 to September 2005. After the frequency distribution of the hourly mean wind speed had been determined, the associated Weibull function was fitted, making it possible to determine the annual mean wind speed (2.77 m/s), the standard deviation of the recorded speeds (0.55 m/s), and the
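
    Once k and c are known, the two summary quantities mentioned above follow directly from the Weibull model: the mean wind speed v_mean = c Γ(1 + 1/k) and the mean wind power density P/A = ½ ρ c³ Γ(1 + 3/k). The k, c and air density values in the snippet below are placeholders, not the Botucatu results.

        from math import gamma

        k, c = 2.0, 3.1          # hypothetical Weibull shape (-) and scale (m/s)
        rho = 1.225              # air density at sea level (kg/m^3)

        v_mean = c * gamma(1.0 + 1.0 / k)                          # mean wind speed (m/s)
        power_density = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)    # mean power density (W/m^2)
        print(f"mean speed = {v_mean:.2f} m/s, power density = {power_density:.1f} W/m^2")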

  12. Estimating Age Distributions of Base Flow in Watersheds Underlain by Single and Dual Porosity Formations Using Groundwater Transport Simulation and Weighted Weibull Functions

    Science.gov (United States)

    Sanford, W. E.

    2015-12-01

    Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. For both systems the two-parameter Weibull function nearly always produced a substantially

  13. Mathematical Model for the Effect of Ghrelin on Basal and GnRH-Induced FSH and LH Secretion in Normal Women by Using a Four-Variate Weibull Distribution

    Directory of Open Access Journals (Sweden)

    S. Lakshmi

    2016-12-01

    Full Text Available In this paper, we introduce the probability density function of a four-variate Weibull distribution. A multivariate survival function of the Weibull distribution is used for four variables. From the survival function, the probability density function and the cumulative probability function are derived. Ghrelin may affect reproductive function in animals and humans. In the application part, under the experimental conditions of an acute injection of ghrelin (1 g/kg) to normal women, basal and GnRH-induced LH and FSH secretion were not affected, suggesting that ghrelin does not play a major physiological role in gonadotrophin secretion in women. In the mathematical part, we found that the survival function of the curve decreases sharply in the mid-luteal phase compared with the other phases. The pdf of the curve is suppressed in the late follicular phase and increases at 7 min. The pdf for the early follicular phase of the control cycle increases from 4 min. The pdf curves for the early follicular phase with ghrelin administration and for the mid-luteal phase with ghrelin and GnRH also increase at 5 and 3 minutes, respectively.

  14. Assessment of the fracture strength distribution and the Weibull parameters of fibres from a bundle test

    Energy Technology Data Exchange (ETDEWEB)

    Lienkamp, M. (Technische Hochschule Darmstadt, Fachgebiet Physikalische Metallkunde, Fachbereich Materialwissenschaft (Germany)); Exner, H.E. (Technische Hochschule Darmstadt, Fachgebiet Physikalische Metallkunde, Fachbereich Materialwissenschaft (Germany))

    1993-04-01

    Present test methods used to determine the strength distribution of high performance fibres are either time consuming or not very reliable. A method is used which enables the derivation of the strength distribution function from one single tensile test. The load/elongation diagram of a bundle of fibres is taken from an elongation-controlled tensile test. From the ratio of the measured load to a fictive load, necessary to obtain an identical elongation in the bundle assuming all fibres are intact, the fraction of broken fibres for each point of the load/elongation diagram is determined. From this, the strength distribution function and the Weibull parameters of the fibres can be calculated. The application of this simple but very effective method is demonstrated for a schematic example and for three fibre materials (carbon, aramid and ceramic fibres). (orig.)

  15. The Weibull - log Weibull transition of interoccurrence times for synthetic and natural earthquakes

    CERN Document Server

    Hasumi, Tomohiro; Akimoto, Takuma; Aizawa, Yoji

    2008-01-01

    We have studied interoccurrence time distributions by analyzing a synthetic catalog and three natural catalogs of the Japan Meteorological Agency (JMA), the Southern California Earthquake Data Center (SCEDC), and the Taiwan Central Weather Bureau (TCWB), and revealed a universal feature of the interoccurrence time statistics, the Weibull - log Weibull transition. This transition reinforces the view that the interoccurrence time statistics possess both Weibull statistics and log-Weibull statistics. In this paper, the crossover magnitude from the superposition regime to the Weibull regime $m_c^2$ is found to be proportional to the plate velocity. In addition, we have found the region-independent relation $m_c^2/m_{max} = 0.54 \pm 0.004$.

  16. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi logarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.

  17. Particle Size Distribution Characteristics of Bond Impact Crushing Based on Weibull Distribution

    Institute of Scientific and Technical Information of China (English)

    蔡改贫; 郭进山; 夏刘洋

    2016-01-01

    To investigate the particle size distribution of limestone after impact crushing, impact crushing experiments on single limestone particles of different sizes were carried out at different pendulum impact angles on a Bond impact crushing test machine. The results show that the particle size of limestone after Bond impact crushing follows the Weibull distribution model. The cumulative mass probability of the broken particles increases with increasing impact energy, and the peak of the broken particles' cumulative mass probability density function decreases with increasing feed size. Once the impact energy exceeds a certain value, further increases in impact energy produce a mass gain in each size class that weakens gradually as the feed size increases. For a given feed size, the gain in fine particles becomes smaller as the impact energy increases, the peak of the broken particles' cumulative mass probability density function rises with increasing impact energy, and the width of the cumulative mass probability density function curve widens with increasing feed size.

  18. Weibull analyses of bacterial interaction forces measured using AFM

    NARCIS (Netherlands)

    van der Mei, Henderina; de Vries, Jacob; Busscher, Hendrik

    2010-01-01

    Statistically significant conclusions from interaction forces obtained by AFM are difficult to draw because of large data spreads. Weibull analysis, common in macroscopic bond-strength analyses, takes advantage of this spread to derive a Weibull distribution, yielding the probability of occurrence o

  19. An EOQ Model for Items with Weibull Distribution Deterioration Rate

    Institute of Scientific and Technical Information of China (English)

    王道平; 于俊娣; 李向阳

    2011-01-01

    For deteriorating items, the deterioration rate can be described by a Weibull distribution. Based on this assumption, a new economic order quantity (EOQ) model with time-varying demand and purchase prices is developed to analyze the effect of deterioration on inventory management. Numerical analysis and parameter sensitivity analysis carried out with this model show that an optimal solution exists and that the main parameters affect the optimal inventory control policy to different degrees.

  20. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning

    Directory of Open Access Journals (Sweden)

    Jinping Liu

    2016-06-01

    Full Text Available The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products with their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products, fabric textiles, are comprised of a large number of independent particles or stochastically stacking locally homogeneous fragments, whose analysis and understanding remains challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images’ spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers with complementary nature in the face of scarce labeled samples. Effectiveness of the proposed OPQI method was verified and compared in the field of automated rice quality grading with commonly-used methods and showed superior performance, which lays a foundation for the quality control of GP on assembly lines.

  1. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning

    Science.gov (United States)

    Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong

    2016-01-01

    The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products with their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products, fabric textiles, are comprised of a large number of independent particles or stochastically stacking locally homogeneous fragments, whose analysis and understanding remains challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution(WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images’ spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers with complementary nature in the face of scarce labeled samples. Effectiveness of the proposed OPQI method was verified and compared in the field of automated rice quality grading with commonly-used methods and showed superior performance, which lays a foundation for the quality control of GP on assembly lines. PMID:27367703

  2. Investigations on the Eigen-coordinates method for the 2-parameter Weibull distribution of wind speed

    Energy Technology Data Exchange (ETDEWEB)

    Toure, S. [Cocody Univ. (Ivory Coast). Lab. d' Energie Solaire

    2005-04-01

    The 2-parameter Weibull distribution is the hypothesis widely used in fitting studies of random series of wind speeds. Several procedures are used to find the set of the two fitting parameters k and c. In an experimental study, the fitting parameters were first determined by the regression method. The basic ideas of the Eigen-coordinates method were reported in previous works for the case of the 4-parameter Stauffer distribution. In the present paper, the new method is applied to identify the 2-parameter Weibull distribution. The differential equation was identified, and the study disclosed a linear relationship with two Eigen-coordinates. Two complementary errors ε_j and e_j were introduced as criteria to assess the goodness-of-fit of the distribution: ε_j was linked to the linear relationship, while e_j was used to test the goodness-of-fit between the observed and Weibull cumulative distribution functions. The fitting parameters were then determined using the Eigen-coordinates method. The results showed better reliability. (Author)
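
    The Eigen-coordinates equations themselves are not reproduced in the abstract; as a point of reference, the ordinary regression ("graphical") fit that the method is compared against linearizes the Weibull CDF F(v) = 1 - exp(-(v/c)^k) as ln(-ln(1 - F)) = k ln v - k ln c and applies least squares, as sketched below on a synthetic sample.

        import numpy as np

        rng = np.random.default_rng(3)
        v = np.sort(rng.weibull(2.3, 400) * 5.0)        # synthetic wind speed sample (m/s)
        n = v.size
        F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)     # median-rank plotting positions

        x = np.log(v)
        y = np.log(-np.log(1.0 - F))
        k, intercept = np.polyfit(x, y, 1)              # slope = k
        c = np.exp(-intercept / k)                      # intercept = -k ln c
        print(f"regression fit: k = {k:.2f}, c = {c:.2f} m/s")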

  3. determination of weibull parameters and analysis of wind power ...

    African Journals Online (AJOL)

    HOD

    Resulting from the analysis, the values of the average wind speed, the average daily wind power, the ... Keywords: Wind power potential, Energy production, Weibull distribution, Wind ... was added globally in 2015 indicating a 23.2% increase.

  4. Least Squares Best Fit Method for the Three Parameter Weibull Distribution: Analysis of Tensile and Bend Specimens with Volume or Surface Flaw Failure

    Science.gov (United States)

    Gross, Bernard

    1996-01-01

    Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.

  5. A Modified Cramer-von Mises and Anderson-Darling Test for the Weibull Distribution with Unknown Location and Scale Parameters.

    Science.gov (United States)

    1981-12-01

    The likelihood function is evaluated at x1, x2, ..., xn (20:303). For the Weibull pdf, the likelihood function can be represented by L = f(x1, x2, ..., xn; K, θ, C).

  6. MODELING FOR THE SHOCK RELIABILITY OF MEMS DEVICES BASED ON WEIBULL DISTRIBUTION

    Institute of Scientific and Technical Information of China (English)

    王永泉; 陈花玲; 赵建平; 朱子才

    2013-01-01

    A probabilistic approach for modeling and predicting the reliability of MEMS (micro-electro-mechanical systems) devices against mechanical failure is proposed. Starting from the size effect on the mechanical properties of microstructures, the Weibull probability distribution describing the uncertainty of the fracture strength of brittle materials is first presented. Then, for a typical MEMS surface micromachining process, characterized by chemical vapor deposition and sacrificial layer technology, an expression for the thermal residual stress of the deposited thin films is derived. On this basis, the fracture failure of a cantilevered polysilicon MEMS device under shock load is analyzed as a case study, and a reliability model of the device is established that incorporates its scale, process and load characteristics. Using test data on the mechanical properties of polysilicon reported in the relevant literature, the shock reliability of the device is calculated quantitatively. The analysis shows that typical polysilicon MEMS structures can withstand shocks as high as 10^3 g to 10^4 g (where g is the gravitational acceleration). It is also seen that MEMS reliability is influenced by a combination of interrelated factors, and that accurate reliability modeling and design depend to a large extent on extensive fundamental experimental data on material properties and behavior at the micro scale.

  7. Modelling diameter distributions of Quercus suber L. stands in “Los Alcornocales” Natural Park (Cádiz-Málaga, Spain) by using the two-parameter Weibull function

    Directory of Open Access Journals (Sweden)

    A. Calzado

    2013-04-01

    Full Text Available Aim of study: The aim of this work was to model diameter distributions of Quercus suber stands. The ultimate goal was to construct models enabling the development of more affordable forest inventory methods. This is the first study of this type on cork oak forests in the area. Area of study: “Los Alcornocales” Natural Park (Cádiz-Málaga, Spain). Material and methods: The diameter distributions of 100 permanent plots were modelled with the two-parameter Weibull function. Distribution parameters were fitted with the non-linear regression, maximum likelihood, moment and percentile-based methods. Goodness of fit with the different methods was compared in terms of the number of plots rejected by the Kolmogorov-Smirnov test, bias, mean square error and mean absolute error. The scale and shape parameters of the Weibull function were related to stand variables by using a parameter prediction model. Main results: The best fit was obtained with the non-linear regression approach, using as initial values those obtained by the maximum likelihood method; the percentage of rejections by the Kolmogorov-Smirnov test was 2% of the total number of cases. The scale parameter (b) was successfully modelled in terms of the quadratic mean diameter under cork (R2 adj = 0.99). The shape parameter (c) was modelled by using maximum diameter, minimum diameter and plot elevation (R2 adj = 0.40). Research highlights: The proposed diameter distribution model can be a highly useful tool for the inventorying and management of cork oak forests. Key words: maximum likelihood method; moment method; non-linear regression approach; parameter prediction model; percentile method; scale parameter; shape parameter.
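
    A hedged sketch of the basic building block behind this workflow, fitting the two-parameter Weibull function to the diameters of a single plot by maximum likelihood and checking the fit with the Kolmogorov-Smirnov test, is shown below; the moment, percentile and parameter-prediction steps are omitted, and the diameter sample is synthetic.

        import numpy as np
        from scipy.stats import weibull_min, kstest

        rng = np.random.default_rng(4)
        dbh = weibull_min.rvs(c=2.6, scale=28.0, size=120, random_state=rng)  # cm, under cork

        shape_c, loc, scale_b = weibull_min.fit(dbh, floc=0)   # 2-parameter fit (location = 0)
        stat, p_value = kstest(dbh, "weibull_min", args=(shape_c, 0, scale_b))
        print(f"c (shape) = {shape_c:.2f}, b (scale) = {scale_b:.1f} cm, KS p = {p_value:.3f}")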

  8. Experimental study of gear bending fatigue strength under random load according to three-parameter Weibull distribution

    Institute of Scientific and Technical Information of China (English)

    胡建军; 许洪斌; 高孝旺; 祖世华

    2012-01-01

    A random-amplitude fatigue load spectrum for the experiments was generated according to the three-parameter Weibull distribution, which gears commonly experience in transmission. A gear bending fatigue test under this random load was carried out on an MTS electro-hydraulic servo fatigue tester using the group testing method, and the S-N curve of gear bending strength under a three-parameter Weibull load spectrum with a specific coefficient of variation was obtained. The fatigue test results show that the gear's fatigue life under random load is far lower than that under constant load when the load follows a three-parameter Weibull random load spectrum. The theoretical fatigue limit of the gear under random load was predicted and compared with the test results, with which it agrees. Therefore, the bending fatigue strength of gears under random load can be deduced from the load ratio coefficient of the random load spectrum.

  9. A Weibull characterization for tensile fracture of multicomponent brittle fibers

    Science.gov (United States)

    Barrows, R. G.

    1977-01-01

    Necessary to the development and understanding of brittle fiber reinforced composites is a means to statistically describe fiber strength and strain-to-failure behavior. A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

  10. Empirical model based on Weibull distribution describing the destruction kinetics of natural microbiota in pineapple (Ananas comosus L.) puree during high-pressure processing.

    Science.gov (United States)

    Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas

    2015-10-15

    High pressure inactivation of natural microbiota, viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB), in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time of up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min, whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. A tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in the case of YM (β < 1), whereas β > 1 was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol⁻¹) and the activation volume (Va, mL·mol⁻¹) values were computed to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature, and δ decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min⁻¹) as a function of pressure (P, MPa) and temperature (T, K), including the dependencies of Ea and Va on P and T, respectively.
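
    The primary model referred to above is commonly written as log10(N/N0) = -(t/δ)^β, with δ (min) the scale parameter and β the shape parameter; β < 1 gives the tailing shape discussed for yeasts and molds. The fit below uses hypothetical survival data, not the study's counts.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_log_survival(t, delta, beta):
            # log10 survival ratio under the Weibull inactivation model
            return -(t / delta) ** beta

        t = np.array([1, 2, 5, 10, 15, 20], dtype=float)            # holding time (min)
        log_s = np.array([-0.3, -0.7, -1.1, -1.5, -1.8, -2.0])      # hypothetical log10(N/N0)

        (delta_hat, beta_hat), _ = curve_fit(weibull_log_survival, t, log_s, p0=(5.0, 1.0))
        print(f"delta = {delta_hat:.2f} min, beta = {beta_hat:.2f}")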

  11. Comparison of Parameter Estimation Methods for Transformer Weibull Lifetime Modelling

    Institute of Scientific and Technical Information of China (English)

    ZHOU Dan; LI Chengrong; WANG Zhongdong

    2013-01-01

    The two-parameter Weibull distribution is the most widely adopted lifetime model for power transformers. An appropriate parameter estimation method is essential to guarantee the accuracy of a derived Weibull lifetime model. Six popular parameter estimation methods (i.e. the maximum likelihood estimation method, two median rank regression methods, one regressing X on Y and the other regressing Y on X, the Kaplan-Meier method, the method based on the cumulative hazard plot, and Li's method) are reviewed and compared in order to find the optimal one for transformer Weibull lifetime modelling. The comparison took several different scenarios into consideration: 10 000 sets of lifetime data, each with a sample size of 40 to 1 000 and a censoring rate of 90%, were obtained by Monte-Carlo simulations for each scenario. The scale and shape parameters of the Weibull distribution estimated by the six methods, as well as their mean value, median value and 90% confidence band, are obtained. The cross comparison of these results reveals that, among the six methods, the maximum likelihood method is the best one, since it provides the most accurate Weibull parameters, i.e. parameters having the smallest bias in both mean and median values, as well as the shortest 90% confidence band. The maximum likelihood method is therefore recommended over the other methods for transformer Weibull lifetime modelling.
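
    A condensed sketch of this type of comparison (only two of the six methods, no censoring, a small number of repetitions, and assumed true parameters) is given below: maximum likelihood versus median rank regression for a two-parameter Weibull lifetime sample.

        import numpy as np
        from scipy.stats import weibull_min

        true_shape, true_scale, n, reps = 2.0, 100.0, 40, 200
        rng = np.random.default_rng(5)
        mle, mrr = [], []

        for _ in range(reps):
            sample = weibull_min.rvs(true_shape, scale=true_scale, size=n, random_state=rng)
            # maximum likelihood estimation
            k_ml, _, c_ml = weibull_min.fit(sample, floc=0)
            mle.append((k_ml, c_ml))
            # median rank regression: Benard's approximation for plotting positions
            t = np.sort(sample)
            F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
            slope, intercept = np.polyfit(np.log(t), np.log(-np.log(1 - F)), 1)
            mrr.append((slope, np.exp(-intercept / slope)))

        for name, est in (("MLE", np.array(mle)), ("MRR", np.array(mrr))):
            bias = est.mean(axis=0) - np.array([true_shape, true_scale])
            print(f"{name}: mean bias shape = {bias[0]:+.3f}, scale = {bias[1]:+.3f}")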

  12. SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL

    Directory of Open Access Journals (Sweden)

    Jenq-Daw Lee

    2008-07-01

    Full Text Available A trivariate survival function based on the semi-competing risks concept is proposed, in which a terminal event can only occur after the other events. The Stanford Heart Transplant data are reanalyzed using a trivariate Weibull distribution model with the proposed survival function.

  13. Uncertainty Evaluation of Weibull Estimators through Monte Carlo Simulation: Applications for Crack Initiation Testing

    Directory of Open Access Journals (Sweden)

    Jae Phil Park

    2016-06-01

    Full Text Available The typical experimental procedure for testing stress corrosion cracking initiation involves an interval-censored reliability test. Based on these test results, the parameters of a Weibull distribution, which is a widely accepted crack initiation model, can be estimated using maximum likelihood estimation or median rank regression. However, it is difficult to determine the appropriate number of test specimens and censoring intervals required to obtain sufficiently accurate Weibull estimators. In this study, we compare maximum likelihood estimation and median rank regression using a Monte Carlo simulation to examine the effects of the total number of specimens, test duration, censoring interval, and shape parameters of the true Weibull distribution on the estimator uncertainty. Finally, we provide the quantitative uncertainties of both Weibull estimators, compare them with the true Weibull parameters, and suggest proper experimental conditions for developing a probabilistic crack initiation model through crack initiation tests.

  14. Analytical and geometric characterization of the general methodology for determining Weibull distributions for the wind regime and its applications

    Directory of Open Access Journals (Sweden)

    Luís R. A Gabriel Filho

    2011-02-01

    Full Text Available The wind regime of a region can be described by frequency distributions that provide information and characteristics extremely necessary for a possible deployment of wind energy capture systems in the region and consequent applications in rural areas of remote regions. These characteristics, such as the annual mean speed, the variance of the recorded speeds and the hourly mean wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn should be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. The objective of this work is to characterize, analytically and geometrically, all the methodological procedures needed to perform a complete characterization of the wind regime of a region and its applications in the region of Botucatu-SP, aiming to determine the energy potential for the implementation of wind turbines. Thus, it was possible to establish theorems related to the way the wind regime is characterized, establishing an analytically concise methodology for defining the wind parameters of any region to be studied. For the development of this research, a CAMPBELL anemometer was used.

  15. Scaling Analysis of the Tensile Strength of Bamboo Fibers Using Weibull Statistics

    Directory of Open Access Journals (Sweden)

    Jiaxing Shao

    2013-01-01

    Full Text Available This study demonstrates the effect of weak-link scaling on the tensile strength of bamboo fibers. The proposed model considers the random nature of fiber strength, which is reflected by using a two-parameter Weibull distribution function. Tension tests were performed on samples that could be scaled in length. The size effects in fiber length on the strength were analyzed based on Weibull statistics. The results verify the use of Weibull parameters from specimen testing for predicting the strength distributions of fibers of longer gauge lengths.
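
    The weak-link scaling relation underlying this analysis is that, for a two-parameter Weibull fibre, the survival probability at gauge length L is S(σ; L) = exp(-(L/L0)(σ/σ0)^m), so the mean strength scales as σ0 (L0/L)^(1/m) Γ(1 + 1/m). The parameter values below are placeholders, not the bamboo-fibre results.

        from math import gamma

        m, sigma0, L0 = 3.5, 600.0, 10.0      # Weibull modulus, MPa, reference gauge length (mm)

        for L in (10.0, 20.0, 40.0):
            mean_strength = sigma0 * (L0 / L) ** (1.0 / m) * gamma(1.0 + 1.0 / m)
            print(f"L = {L:4.0f} mm -> predicted mean strength = {mean_strength:.0f} MPa")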

  16. An Extension to the Weibull Process Model

    Science.gov (United States)

    1981-11-01

    ...indicating its importance to applications. AN EXTENSION TO THE WEIBULL PROCESS MODEL. 1. INTRODUCTION: Recent papers by Bain and Engelhardt (1980) and Crow

  17. Weibull distribution for modeling drying of grapes and its application

    Institute of Scientific and Technical Information of China (English)

    白竣文; 王吉亮; 肖红伟; 巨浩羽; 刘嫣红; 高振江

    2013-01-01

    To explore the factors influencing the parameters of the Weibull distribution function and its application in drying, the drying of grapes under different drying methods (air-jet impingement drying, vacuum-pulsed drying), drying temperatures (50, 55, 60 and 65 °C) and blanching pretreatments (30, 60, 90, 120 s) was taken as the research object, and the Weibull distribution function was used to simulate and analyze the drying kinetics curves. The results show that the Weibull distribution function can simulate the drying process of grapes well under the experimental conditions. The scale parameter α is related to the drying temperature and decreases as the drying temperature increases; the shape parameter β is related to the drying method and the state of the material, but the drying temperature has little effect on β. The moisture diffusion coefficient Dcal of grapes during drying was calculated to be between 0.2982×10⁻⁹ and 2.7700×10⁻⁹ m²/s, and according to the Arrhenius equation the drying activation energies of hot-air drying and vacuum-pulsed drying were calculated to be 72.87 and 61.43 kJ/mol, respectively. The results provide a reference for the application of the Weibull distribution function to the grape drying process.
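
    The drying model referred to above is usually written as MR(t) = exp(-(t/α)^β), where MR is the moisture ratio, α (min) the scale parameter and β the shape parameter; a minimal curve-fitting sketch on a synthetic drying curve (not the grape data) is given below.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_drying(t, alpha, beta):
            # moisture ratio MR(t) under the Weibull drying model
            return np.exp(-(t / alpha) ** beta)

        t = np.array([10, 30, 60, 120, 180, 240, 300], dtype=float)   # drying time (min)
        mr = np.array([0.95, 0.82, 0.63, 0.38, 0.22, 0.13, 0.08])     # hypothetical moisture ratio

        (alpha_hat, beta_hat), _ = curve_fit(weibull_drying, t, mr, p0=(100.0, 1.0))
        print(f"alpha = {alpha_hat:.1f} min, beta = {beta_hat:.2f}")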

  18. Weibull Probability Model for Fracture Strength of Aluminium (1101)-Alumina Particle Reinforced Metal Matrix Composite

    Institute of Scientific and Technical Information of China (English)

    A.Suresh Babu; V.Jayabalan

    2009-01-01

    In recent times, conventional materials are being replaced by metal matrix composites (MMCs) due to their high specific strength and modulus. Strength reliability, one of the key factors restricting wider use of composite materials in various applications, is commonly characterized by the Weibull strength distribution function. In the present work, a statistical analysis of the strength data of a 15% volume alumina particle (mean size 15 μm) reinforced aluminum alloy (1101 grade alloy) fabricated by the stir casting method was carried out using the Weibull probability model. Twelve tension tests were performed according to ASTM B577 standards and, from the test data, the corresponding Weibull distribution was obtained. Finally, the reliability of the composite behavior in terms of its fracture strength was presented to ensure the reliability of composites for suitable applications. An important implication of the present study is that the Weibull distribution describes the experimentally measured strength data more appropriately.

  19. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    OpenAIRE

    Abul Kalam Azad; Mohammad Golam Rasul; Talal Yusaf

    2014-01-01

    The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM) were used to estimate the Weibull parameters and six statistical tools, name...

  20. Application of Two-Parameter Weibull Distribution in Nuclear Power Plant Data Processing

    Institute of Scientific and Technical Information of China (English)

    刘方亮; 刘井泉; 刘伟

    2011-01-01

    Equipment reliability data processing is the basis of reliability-centered maintenance (RCM) and life cycle management (LCM) in nuclear power plants. In practice, however, failure data processing faces problems such as small failure samples and data that are not independently distributed because of maintenance. To address these problems, a processing method combining the two-parameter Weibull distribution as the life model with a Bayesian method for small samples is proposed and validated using actual nuclear power plant operating data. The results show that the method offers better applicability and accuracy when dealing with small samples and with the repair and aging effects encountered in nuclear power plants.

  1. EOQ Model for Deteriorating Items with Weibull Deterioration Rate and Partial Backlogging

    Institute of Scientific and Technical Information of China (English)

    王道平; 于俊娣; 李向阳

    2011-01-01

    For deteriorating items, the deterioration rate can be described by a three-parameter Weibull distribution, which is closer to reality. Based on this assumption, and on an EOQ model with time-varying demand and purchase price, a new economic order quantity (EOQ) model is constructed that also accounts for partial backlogging of shortages and the time value of money in inventory management of deteriorating items. Numerical simulation and sensitivity analysis of the main parameters, carried out with the mathematical software Matlab, show that an optimal solution exists and that the main parameters affect the optimal inventory control policy to different degrees. The time value of money has a greater effect on the net present value of the total inventory cost than partial backlogging does, so the time value of money deserves more attention when formulating a scientific inventory policy.

  2. Bias in the Weibull Strength Estimation of a SiC Fiber for the Small Gauge Length Case

    Science.gov (United States)

    Morimoto, Tetsuya; Nakagawa, Satoshi; Ogihara, Shinji

    It is known that the single-modal Weibull model describes the size effect of brittle fiber tensile strength well. However, for some ceramic fibers it has been reported that the single-modal Weibull model gives biased estimates of the gauge length dependence. One hypothesis for the bias is that the density of critical defects is very small, so that the fracture probability of small gauge length samples is distributed in a discrete manner, which makes the Weibull parameters dependent on the gauge length. Tyranno ZMI Si-Zr-C-O fiber was selected as an example fiber. Tensile tests were performed at several gauge lengths, and the derived Weibull parameters showed a dependence on the gauge length. Fracture surfaces were observed with SEM and classified into characteristic fracture patterns. The percentage of each fracture pattern was also found to depend on the gauge length. This may be an important factor in the dependence of the Weibull parameters on the gauge length.

  3. NEW DOCTORAL DEGREE Parameter estimation problem in the Weibull model

    OpenAIRE

    Marković, Darija

    2009-01-01

    In this dissertation we consider the problem of the existence of best parameters in the Weibull model, one of the most widely used statistical models in reliability theory and life data theory. Particular attention is given to a 3-parameter Weibull model. We have listed some of the many applications of this model. We have described some of the classical methods for estimating parameters of the Weibull model, two graphical methods (Weibull probability plot and hazard plot), and two analyt...

  4. Probability Density Function Characterization for Aggregated Large-Scale Wind Power Based on Weibull Mixtures

    Directory of Open Access Journals (Sweden)

    Emilio Gómez-Lázaro

    2016-02-01

    Full Text Available The Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. With this aim, the present paper focuses on Weibull mixtures to characterize the probability density function (PDF) of aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values and wind power curtailment.

  5. Weibull-k Revisited: “Tall” Profiles and Height Variation of Wind Statistics

    DEFF Research Database (Denmark)

    Kelly, Mark C.; Troen, Ib; Ejsing Jørgensen, Hans

    2014-01-01

    with height is less understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter (k......-shape parameter. Further, an alternate model for the vertical profile of Weibull shape parameter is made, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol, 1989, Vol. 47, 85–110), and connecting with a newly-corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen...... (European Wind Atlas, 1989, Risø National Laboratory, Roskilde). Comparing the models for Weibull-k profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion...

  6. Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data

    Science.gov (United States)

    Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

    2012-01-01

    A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.

  7. Probability Analysis of Wind Speed for the Cilacap Coast Applying the Weibull and Rayleigh Distributions

    Directory of Open Access Journals (Sweden)

    Wahyu Widiyanto

    2013-06-01

    Full Text Available Wind characteristics, especially the probability of occurrence of wind speeds, have mostly been studied in relation to the availability of wind energy in an area. In relation to coastal structures, however, they are still rarely addressed in the literature, particularly in Indonesia. This article therefore studies the probability distributions commonly used in wind energy analysis, i.e. the Weibull and Rayleigh distributions. The distributions are applied to analyze wind data for the Cilacap coast. The wind data, covering two years (2009-2011), were obtained from the Board of Meteorology, Climatology and Geophysics, Cilacap branch. The mean, variance and standard deviation are computed to calculate the shape factor (k) and scale factor (c), which are required to construct the Weibull and Rayleigh distribution functions. For this region, the results show that the wind speed probabilities follow the Weibull and Rayleigh functions fairly well. The shape parameter obtained is k = 3.26, while the scale parameters are c = 3.64 for Weibull and Cr = 2.44 for Rayleigh. A value of k ≥ 3 indicates that the region has regular and steady wind. The mean wind speed is 3.3 m/s.
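    As an illustration of the moment-based procedure mentioned in this record, the short sketch below (generic Python, not the paper's code; the synthetic wind record and its parameters are assumptions) recovers the Weibull shape factor k from the coefficient of variation and the scale factor c from the mean via the gamma function.

```python
# Minimal sketch: empirical/moment estimates of Weibull k and c from the
# mean and standard deviation of a wind-speed record (synthetic data).
import numpy as np
from scipy.special import gamma

def weibull_from_mean_std(mean, std):
    k = (std / mean) ** -1.086          # widely used closed-form approximation
    c = mean / gamma(1.0 + 1.0 / k)     # follows from E[V] = c * Gamma(1 + 1/k)
    return k, c

# Synthetic hourly wind speeds in m/s; a real study would use measured data.
rng = np.random.default_rng(0)
speeds = rng.weibull(3.2, size=8760) * 3.6
k_hat, c_hat = weibull_from_mean_std(speeds.mean(), speeds.std(ddof=1))
print(f"k = {k_hat:.2f}, c = {c_hat:.2f} m/s")
```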

  8. Modeling root reinforcement using root-failure Weibull survival function

    Directory of Open Access Journals (Sweden)

    M. Schwarz

    2013-03-01

    Full Text Available Root networks contribute to slope stability through complicated interactions that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, the quantification of root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Despite considerable advances in root reinforcement modeling, some important aspects remain neglected. In this study we address in particular the role of root strength variability in the mechanical behavior of a root bundle. Many factors may contribute to the variability of root mechanical properties, even within a single diameter class. This work presents a new approach for quantifying root reinforcement that considers the variability of the mechanical properties of each root diameter class. Using data from laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field datasets, the parameters of the Weibull distribution may be considered constant, with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle, with important implications when considering different approaches to slope stability calculation. Sensitivity analysis shows that the calibration of the tensile force and the elasticity of the roots, as well as the root distribution, are the most important components. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models. The realistic quantification of root

  9. Statistical Analysis of Wind Power Density Based on the Weibull and Rayleigh Models of Selected Site in Malaysia

    Directory of Open Access Journals (Sweden)

    Aliashim Albani

    2014-02-01

    Full Text Available The demand for electricity in Malaysia is growing in tandem with its Gross Domestic Product (GDP) growth. Malaysia is going to need even more energy as it strives to grow towards a high-income economy. Malaysia has taken steps to explore renewable energy (RE), including wind energy, as an alternative source for generating electricity. In the present study, the wind energy potential of the site is statistically analyzed based on one year of measured hourly time-series wind speed data. Wind data were obtained from the Malaysian Meteorological Department (MMD) weather stations at nine selected sites in Malaysia. The Weibull and Rayleigh distribution functions were determined and generated using MATLAB. Both the Weibull and Rayleigh models are fitted and compared to the field data probability distributions of the year 2011. The analysis shows that the Weibull distribution fits the field data better than the Rayleigh distribution for the whole year 2011. The wind power density of every site has been studied based on the Weibull and Rayleigh functions. The Weibull distribution gives a good approximation for the estimation of wind power density in Malaysia.

  10. Moment series for the coefficient of variation in Weibull sampling

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, K.O.; Shenton, L.R.

    1981-01-01

    For the 2-parameter Weibull distribution function F(t) = 1 - exp[-(t/b)^c], t > 0, with c and b positive, a moment estimator c* for c is the solution of the equation Γ(1 + 2/c*)/Γ²(1 + 1/c*) = 1 + v*², where v* is the coefficient of variation in the form √(m₂)/m₁', m₁' being the sample mean and m₂ the sample second central moment (it is trivial in the present context to replace m₂ by the variance). One approach to the moments of c* (Bowman and Shenton, 1981) is to set up moment series for the scale-free v*. The series are apparently divergent and summation algorithms are essential; we consider methods due to Levin (1973) and one introduced by ourselves (Bowman and Shenton, 1976).
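    The moment estimator described above lends itself to a direct numerical implementation. The sketch below (an illustration under assumed data, not Bowman and Shenton's code) solves Γ(1 + 2/c*)/Γ²(1 + 1/c*) = 1 + v*² for c* by root finding on a synthetic Weibull sample.

```python
# Minimal sketch: moment estimator c* obtained by solving the gamma-ratio
# equation for the squared coefficient of variation of the sample.
import numpy as np
from scipy.special import gammaln
from scipy.optimize import brentq

def moment_estimate_c(sample):
    m1 = sample.mean()
    m2 = ((sample - m1) ** 2).mean()                 # second central moment
    v2 = m2 / m1 ** 2                                # squared coefficient of variation
    f = lambda c: np.exp(gammaln(1 + 2 / c) - 2 * gammaln(1 + 1 / c)) - (1 + v2)
    return brentq(f, 0.1, 50.0)                      # bracket assumed wide enough

rng = np.random.default_rng(1)
data = rng.weibull(2.5, size=500) * 4.0              # synthetic Weibull(c=2.5, b=4) sample
print(f"moment estimate c* = {moment_estimate_c(data):.2f}")
```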

  11. Weibull-Based Design Methodology for Rotating Structures in Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Erwin V. Zaretsky

    2003-01-01

    Full Text Available The NASA Energy-Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue or low-cycle fatigue. Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine's Weibull slope increases, the predicted life decreases. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine-maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391; 20,652; and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9408 to 24,911 hr.
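    To make the series-system logic behind such L5 and L0.1 figures concrete, the sketch below (hypothetical component Weibull slopes and characteristic lives, not the E3-Engine values) multiplies component Weibull survival functions and solves for the system life at a target probability of survival.

```python
# Minimal sketch: system life for a series of Weibull-distributed components,
# i.e. the time at which the product of component survivals hits a target.
import numpy as np
from scipy.optimize import brentq

# (Weibull slope beta, characteristic life eta in hours) for hypothetical components.
components = [(3.0, 60000.0), (6.0, 45000.0), (9.0, 30000.0)]

def system_survival(t):
    return np.prod([np.exp(-(t / eta) ** beta) for beta, eta in components])

def system_life(reliability):
    """Life t at which the series-system probability of survival equals `reliability`."""
    return brentq(lambda t: system_survival(t) - reliability, 1.0, 1.0e6)

print(f"L0.1 (99.9% survival): {system_life(0.999):.0f} hr")
print(f"L5   (95% survival):   {system_life(0.95):.0f} hr")
```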

  12. On Weibull's Spectrum of Non-relativistic Energetic Particles at IP Shocks: Observations and Theoretical Interpretation

    Science.gov (United States)

    Pallocchia, G.; Laurenza, M.; Consolini, G.

    2017-03-01

    Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.

  13. Weibull statistical analysis of tensile strength of vascular bundle in inner layer of moso bamboo culm in molecular parasitology and vector biology.

    Science.gov (United States)

    Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen

    2014-07-01

    Bamboo is a composite material with a radial gradient of structure, but the vascular bundles in the inner layer are evenly distributed. The objective is to determine the regular size pattern and to carry out a Weibull statistical analysis of the tensile strength of the vascular bundles in the inner layer of Moso bamboo. The size and shape of the vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm2. A statistical evaluation of the tensile strength of the vascular bundles was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121 and an accurate reliability assessment of the vascular bundles is obtained.

  14. Bayesian Estimation and Prediction for Flexible Weibull Model under Type-II Censoring Scheme

    Directory of Open Access Journals (Sweden)

    Sanjay Kumar Singh

    2013-01-01

    Full Text Available We have developed the Bayesian estimation procedure for the flexible Weibull distribution under a Type-II censoring scheme assuming Jeffreys' scale-invariant (noninformative) and gamma (informative) priors for the model parameters. Interval estimation for the model parameters has been performed through normal approximation, bootstrap, and highest posterior density (HPD) procedures. Further, we have also derived the predictive posteriors and the corresponding predictive survival functions for future observations based on Type-II censored data from the flexible Weibull distribution. Since the predictive posteriors are not in closed form, we propose to use Markov chain Monte Carlo (MCMC) methods to approximate the posteriors of interest. The performance of the Bayes estimators has also been compared with the classical estimators of the model parameters through a Monte Carlo simulation study. A real data set representing the time between failures of secondary reactor pumps has been analysed for illustration purposes.

  15. Investigation of Weibull statistics in fracture analysis of cast aluminum

    Science.gov (United States)

    Holland, F. A., Jr.; Zaretsky, E. V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

  17. Biological implications of the Weibull and Gompertz models of aging.

    Science.gov (United States)

    Ricklefs, Robert E; Scheuerlein, Alex

    2002-02-01

    Gompertz and Weibull functions imply contrasting biological causes of demographic aging. The terms describing increasing mortality with age are multiplicative and additive, respectively, which could result from an increase in the vulnerability of individuals to extrinsic causes in the Gompertz model and the predominance of intrinsic causes at older ages in the Weibull model. Experiments that manipulate extrinsic mortality can distinguish these biological models. To facilitate analyses of experimental data, we defined a single index for the rate of aging (omega) for the Weibull and Gompertz functions. Each function described the increase in aging-related mortality in simulated ages at death reasonably well. However, in contrast to the Weibull omega(W), the Gompertz omega(G) was sensitive to variation in the initial mortality rate independently of aging-related mortality. Comparisons between wild and captive populations appear to support the intrinsic-causes model for birds, but give mixed support for both models in mammals.

  18. Lifetime assessment by intermittent inspection under the mixture Weibull power law model with application to XLPE cables.

    Science.gov (United States)

    Hirose, H

    1997-01-01

    This paper proposes a new treatment for electrical insulation degradation. Some types of insulation which have been used under various circumstances are considered to degrade at various rates in accordance with their stress circumstances. The cross-linked polyethylene (XLPE) insulated cables inspected by major Japanese electric companies clearly indicate such phenomena. By assuming that the inspected specimen is sampled from one of the clustered groups, a mixed degradation model can be constructed. Since the degradation of the insulation under common circumstances is considered to follow a Weibull distribution, a mixture model and a Weibull power law can be combined; this is called the mixture Weibull power law model. By applying maximum likelihood estimation of the newly proposed model to Japanese 22 and 33 kV insulation class cables, the cables are clustered into a certain number of groups using the AIC and the generalized likelihood ratio test method. The reliability of the cables at specified years is then assessed.

  19. Determination of Reliability Index and Weibull Modulus as a Measure of Hypereutectic Silumins Survival

    Directory of Open Access Journals (Sweden)

    J. Szymszal

    2007-07-01

    Full Text Available The first part of the study describes the methods used to determine the Weibull modulus and the related reliability index of hypereutectic silumins containing about 17% Si, intended for the manufacture of high-duty castings for automotive and aviation applications. The second part of the study discusses the importance of chemical composition, including additions of 3% Cu, 1.5% Ni and 1.5% Mg, while in the third part attention is focussed on the effect of process history, including mould type (sand or metal) as well as the inoculation process and heat treatment (solutioning and ageing) applied to the cast AlSi17Cu3Mg1,5Ni1,5 alloy, on the shape of the Weibull distribution function and the reliability index calculated for the tensile strength Rm of the investigated alloys.

  20. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    Directory of Open Access Journals (Sweden)

    Abul Kalam Azad

    2014-05-01

    Full Text Available The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fits of the measured and calculated wind speed data are assessed to justify the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson's rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
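    Once k and c are fixed by one of the estimation methods above, the capacity factor reduces to a numerical integral of the turbine power curve against the Weibull density. The sketch below (an idealized per-unit power curve and assumed k, c values, not the study's turbine or data) carries out that trapezoidal sum.

```python
# Minimal sketch: capacity factor as the trapezoidal integral of an idealized
# power curve weighted by the fitted Weibull wind-speed density.
import numpy as np

def weibull_pdf(v, k, c):
    return (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)

def power_curve(v, cut_in=3.0, rated=12.0, cut_out=25.0, p_rated=1.0):
    """Idealized per-unit power curve: cubic ramp between cut-in and rated speed."""
    ramp = p_rated * (v ** 3 - cut_in ** 3) / (rated ** 3 - cut_in ** 3)
    p = np.where((v >= cut_in) & (v < rated), ramp, 0.0)
    return np.where((v >= rated) & (v <= cut_out), p_rated, p)

k, c = 2.0, 6.5                                  # assumed Weibull parameters
v = np.linspace(0.0, 30.0, 3001)
y = power_curve(v) * weibull_pdf(v, k, c)
capacity_factor = float(np.sum((y[1:] + y[:-1]) * np.diff(v)) / 2.0)   # trapezoidal sum
print(f"capacity factor = {capacity_factor:.3f}")
```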

  1. An inventory model for generalized weibull deteriorating items with price dependent demand and permissible delay in payments under inflation

    OpenAIRE

    SINGH, S. P.; G.C.Panda

    2015-01-01

    This paper develops an inventory model for items that deteriorate at a generalized Weibull distributed rate when demand for the items depends on the selling price. Shortages are not allowed and price inflation is taken into consideration over a finite planning horizon. A brief analysis of the costs involved is carried out theoretically.

  2. An inventory model for generalized weibull deteriorating items with price dependent demand and permissible delay in payments under inflation

    Directory of Open Access Journals (Sweden)

    S.P.Singh

    2015-09-01

    Full Text Available This paper develops an inventory model for items that deteriorate at a generalized Weibull distributed rate when demand for the items depends on the selling price. Shortages are not allowed and price inflation is taken into consideration over a finite planning horizon. A brief analysis of the costs involved is carried out theoretically.

  3. A Weibull multi-state model for the dependence of progression-free survival and overall survival.

    Science.gov (United States)

    Li, Yimei; Zhang, Qiang

    2015-07-30

    In oncology clinical trials, overall survival, time to progression, and progression-free survival are three commonly used endpoints. Empirical correlations among them have been published for different cancers, but statistical models describing the dependence structures are limited. Recently, Fleischer et al. proposed a statistical model that is mathematically tractable and shows some flexibility to describe the dependencies in a realistic way, based on the assumption of exponential distributions. This paper aims to extend their model to the more flexible Weibull distribution. We derived theoretical correlations among different survival outcomes, as well as the distribution of overall survival induced by the model. Model parameters were estimated by the maximum likelihood method and the goodness of fit was assessed by plotting estimated versus observed survival curves for overall survival. We applied the method to three cancer clinical trials. In the non-small-cell lung cancer trial, both the exponential and the Weibull models provided an adequate fit to the data, and the estimated correlations were very similar under both models. In the prostate cancer trial and the laryngeal cancer trial, the Weibull model exhibited advantages over the exponential model and yielded larger estimated correlations. Simulations suggested that the proposed Weibull model is robust for data generated from a range of distributions.

  4. EFFECT OF NANOPOWDER ADDITION ON THE FLEXURAL STRENGTH OF ALUMINA CERAMIC - A WEIBULL MODEL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Daidong Guo

    2016-05-01

    Full Text Available Alumina ceramics were prepared either with micrometer-sized alumina powder (MAP) or with the addition of nanometer-sized alumina powder (NAP). The density, crystalline phase, flexural strength and the fracture surface of the two ceramics were measured and compared. Emphasis has been put on the influence of nanopowder addition on the flexural strength of Al₂O₃ ceramic. The analysis based on the Weibull distribution model suggests the distribution of the flexural strength of the NAP ceramic is more concentrated than that of the MAP ceramic. Therefore, the NAP ceramics will be more stable and reliable in real applications.

  5. Effects of Cracking Test Conditions on Estimation Uncertainty for Weibull Parameters Considering Time-Dependent Censoring Interval

    Directory of Open Access Journals (Sweden)

    Jae Phil Park

    2016-12-01

    Full Text Available It is extremely difficult to predict the initiation time of cracking due to a large time spread in most cracking experiments. Thus, probabilistic models, such as the Weibull distribution, are usually employed to model the initiation time of cracking. Therefore, the parameters of the Weibull distribution are estimated from data collected from a cracking test. However, although the development of a reliable cracking model under ideal experimental conditions (e.g., a large number of specimens and narrow censoring intervals could be achieved in principle, it is not straightforward to quantitatively assess the effects of the ideal experimental conditions on model estimation uncertainty. The present study investigated the effects of key experimental conditions, including the time-dependent effect of the censoring interval length, on the estimation uncertainties of the Weibull parameters through Monte Carlo simulations. The simulation results provided quantified estimation uncertainties of Weibull parameters in various cracking test conditions. Hence, it is expected that the results of this study can offer some insight for experimenters developing a probabilistic crack initiation model by performing experiments.
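    The Monte Carlo idea described in this record can be sketched in a few lines. The code below (an assumed inspection schedule and assumed true parameters, not the paper's simulation) draws Weibull crack-initiation times, keeps only the inspection interval in which each crack is found, refits the Weibull parameters from the interval-censored data, and repeats to expose the estimation scatter.

```python
# Minimal sketch: Monte Carlo study of Weibull parameter uncertainty under
# interval censoring imposed by a periodic inspection schedule.
import numpy as np
from scipy.optimize import minimize

def interval_censored_weibull_fit(lo, hi):
    """Maximize the interval-censored Weibull likelihood; hi may be np.inf
    for specimens still uncracked at the last inspection."""
    def nll(theta):
        k, c = np.exp(theta)                                  # keep both parameters positive
        cdf = lambda t: 1.0 - np.exp(-(t / c) ** k)
        hi_finite = np.where(np.isinf(hi), lo, hi)            # placeholder where right-censored
        prob = np.where(np.isinf(hi), 1.0 - cdf(lo), cdf(hi_finite) - cdf(lo))
        return -np.sum(np.log(np.clip(prob, 1e-300, None)))
    start = np.log([1.5, np.mean(lo) + 1.0])
    return np.exp(minimize(nll, start, method="Nelder-Mead").x)

rng = np.random.default_rng(2)
true_k, true_c, n_specimens, dt = 2.0, 500.0, 20, 100.0       # dt = hours between inspections
inspections = np.arange(0.0, 2000.0 + dt, dt)

estimates = []
for _ in range(200):                                          # Monte Carlo repetitions
    t = rng.weibull(true_k, n_specimens) * true_c             # latent crack-initiation times
    idx = np.searchsorted(inspections, t)                     # first inspection at or after t
    lo = inspections[idx - 1]                                 # last inspection before cracking
    hi = np.where(idx < len(inspections),
                  inspections[np.minimum(idx, len(inspections) - 1)], np.inf)
    estimates.append(interval_censored_weibull_fit(lo, hi))

k_hat, c_hat = np.array(estimates).T
print(f"shape k: mean {k_hat.mean():.2f}, sd {k_hat.std():.2f}")
print(f"scale c: mean {c_hat.mean():.0f}, sd {c_hat.std():.0f}")
```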

  6. An Approach to Determine the Weibull Parameters and Wind Power Analysis of Saint Martin’s Island, Bangladesh

    Directory of Open Access Journals (Sweden)

    Islam Khandaker Dahirul

    2016-01-01

    Full Text Available This paper explores wind speed distribution using the Weibull probability distribution and Rayleigh distribution methods, which are proven to provide accurate and efficient estimation of energy output for wind energy conversion systems. The two Weibull parameters (the shape and scale parameters k and c, respectively) and the scale parameter of the Rayleigh distribution have been determined based on hourly time-series wind speed data recorded from October 2014 to October 2015 at Saint Martin's island, Bangladesh. This research examines three numerical methods, namely the Graphical Method (GM), Empirical Method (EM) and Energy Pattern Factor method (EPF), to estimate the Weibull parameters. The Rayleigh distribution method has also been analyzed throughout the study. The results reveal that the Graphical method, followed by the Empirical method and the Energy Pattern Factor method, was the most accurate and efficient way of determining the values of k and c to approximate the wind speed distribution in terms of the estimated power error, while the Rayleigh distribution gives the largest power error. The potential for wind energy development on Saint Martin's island, Bangladesh, as found from the data analysis, is explained in this paper.

  7. Analysis of tensile bond strengths using Weibull statistics.

    Science.gov (United States)

    Burrow, Michael F; Thomas, David; Swain, Mike V; Tyas, Martin J

    2004-09-01

    Tensile strength tests of restorative resins bonded to dentin, and the resultant strengths of interfaces between the two, exhibit wide variability. Many variables can affect test results, including specimen preparation and storage, test rig design and experimental technique. However, the more fundamental source of variability, that associated with the brittle nature of the materials, has received little attention. This paper analyzes results from micro-tensile tests on unfilled resins and adhesive bonds between restorative resin composite and dentin in terms of reliability using the Weibull probability of failure method. Results for the tensile strengths of Scotchbond Multipurpose Adhesive (3M) and Clearfil LB Bond (Kuraray) bonding resins showed Weibull moduli (m) of 6.17 (95% confidence interval, 5.25-7.19) and 5.01 (95% confidence interval, 4.23-5.8). Analysis of results for micro-tensile tests on bond strengths to dentin gave moduli between 1.81 (Clearfil Liner Bond 2V) and 4.99 (Gluma One Bond, Kulzer). Material systems with m in this range do not have a well-defined strength. The Weibull approach also enables the size dependence of the strength to be estimated. An example where the bonding area was changed from 3.1 to 1.1 mm diameter is shown. Weibull analysis provides a method for determining the reliability of strength measurements in the analysis of data from bond strength and tensile tests on dental restorative materials.
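    The Weibull probability-of-failure analysis used in this record follows a standard recipe that is easy to reproduce. The sketch below (illustrative strength numbers, not the paper's measurements) assigns median-rank failure probabilities to ranked strengths and reads the Weibull modulus off the slope of ln(-ln(1-P)) versus ln(strength).

```python
# Minimal sketch: Weibull modulus and characteristic strength by median-rank
# linear regression of ranked strength data.
import numpy as np

def weibull_modulus(strengths):
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    p = (np.arange(1, n + 1) - 0.3) / (n + 0.4)        # Benard's median-rank estimate
    x, y = np.log(s), np.log(-np.log(1.0 - p))
    m, intercept = np.polyfit(x, y, 1)                 # slope = Weibull modulus
    sigma0 = np.exp(-intercept / m)                    # characteristic strength
    return m, sigma0

# Hypothetical micro-tensile bond strengths in MPa (illustrative numbers only).
data = [18.2, 22.5, 25.1, 27.9, 30.4, 31.8, 34.0, 36.5, 39.2, 44.7]
m, sigma0 = weibull_modulus(data)
print(f"Weibull modulus m = {m:.2f}, characteristic strength = {sigma0:.1f} MPa")
```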

  8. Based on Weibull Information Fusion Analysis Semiconductors Quality the Key Technology of Manufacturing Execution Systems Reliability

    Science.gov (United States)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    The qualified rate of semiconductor materials and products is directly related to manufacturing costs and to the survival of the enterprise. A dynamic reliability growth analysis method is applied to study the reliability growth of a manufacturing execution system and thereby improve product quality. Referring to the classical Duane model assumptions and the TGP tracking growth forecast programming model, a Weibull distribution model is established from the failure data. Combining the median-rank (average rank) method with linear regression and least squares estimation, Weibull information fusion reliability growth curves are fitted. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimates, and analysis of the failure data shows that the method is essentially consistent with the test and evaluation modeling process of a practical instance. The median-rank method in statistics is used to determine the distribution function of a random variable, which is a good way to handle problems of complex systems such as limited sample size. Therefore this method has great engineering application value.

  9. Performance Improvement in Spatially Multiplexed MIMO Systems over Weibull-Gamma Fading Channel

    Science.gov (United States)

    Tiwari, Keerti; Saini, Davinder S.; Bhooshan, Sunil V.

    2016-11-01

    In multiple-input multiple-output (MIMO) systems, spatial demultiplexing at the receiver has its own significance, and several detection techniques have been investigated. There is a tradeoff between computational complexity and optimal performance in most of the detection techniques. One detection technique which gives improved performance at an acceptable level of complexity is ordered successive interference cancellation (OSIC) with minimum mean square error (MMSE). Optimal performance can be achieved by maximum likelihood (ML) detection, but at a higher complexity level. Therefore, MMSE-OSIC with candidates (OSIC2) detection is recommended as a solution. In this paper, spatially multiplexed (SM) MIMO systems are considered to evaluate error performance with different detection techniques such as MMSE-OSIC, ML and MMSE-OSIC2 in a composite fading, i.e. Weibull-gamma (WG), environment. In the WG distribution, the Weibull and gamma distributions represent multipath and shadowing effects, respectively. Simulation results illustrate that the MMSE-OSIC2 detection technique gives improved symbol error rate (SER) performance, similar to ML performance, while its complexity level approaches that of MMSE-OSIC.

  10. Weibull Effective Area for Hertzian Ring Crack Initiation Stress

    Energy Technology Data Exchange (ETDEWEB)

    Jadaan, Osama M. [University of Wisconsin, Platteville; Wereszczak, Andrew A [ORNL; Johanns, Kurt E [ORNL

    2011-01-01

    Spherical or Hertzian indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently-high, surface-located, radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to be dependent on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area too will be affected by those parameters. However, the estimations of a maximum radial tensile stress and Weibull effective area are difficult to obtain because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. Circumventing this, the Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction; namely, (1) the classical, frictionless, Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or where the coefficient of friction is infinite.

  11. Regional flood frequency analysis based on a Weibull model: Part 1. Estimation and asymptotic variances

    Science.gov (United States)

    Heo, Jun-Haeng; Boes, D. C.; Salas, J. D.

    2001-02-01

    Parameter estimation in a regional flood frequency setting, based on a Weibull model, is revisited. A two parameter Weibull distribution at each site, with common shape parameter over sites that is rationalized by a flood index assumption, and with independence in space and time, is assumed. The estimation techniques of method of moments and method of probability weighted moments are studied by proposing a family of estimators for each technique and deriving the asymptotic variance of each estimator. Then a single estimator and its asymptotic variance for each technique, suggested by trying to minimize the asymptotic variance over the family of estimators, is obtained. These asymptotic variances are compared to the Cramer-Rao Lower Bound, which is known to be the asymptotic variance of the maximum likelihood estimator. A companion paper considers the application of this model and these estimation techniques to a real data set. It includes a simulation study designed to indicate the sample size required for compatibility of the asymptotic results to fixed sample sizes.

  12. A comparison of estimation methods for fitting Weibull, Johnson's SB and beta functions to Pinus pinaster, Pinus radiata and Pinus sylvestris stands in northwest Spain

    Energy Technology Data Exchange (ETDEWEB)

    Gorgoseo, J. J.; Rojo, A.; Camara-Obregon, A.; Dieguez-Aranda, U.

    2012-07-01

    The purpose of this study was to compare the accuracy of the Weibull, Johnson's SB and beta distributions, fitted with some of the most usual methods and with different fixed values for the location parameters, for describing diameter distributions in even-aged stands of Pinus pinaster, Pinus radiata and Pinus sylvestris in northwest Spain. A total of 155 permanent plots in Pinus sylvestris stands throughout Galicia, 183 plots in Pinus pinaster stands throughout Galicia and Asturias and 325 plots in Pinus radiata stands in both regions were measured to describe the diameter distributions. Parameters of the Weibull function were estimated by the moments and maximum likelihood approaches, those of Johnson's SB function by conditional maximum likelihood and by Knoebel and Burkhart's method, and those of the beta function by the method based on the moments of the distribution. The beta and Johnson's SB functions were slightly superior to the Weibull function for Pinus pinaster stands; the Johnson's SB and beta functions were more accurate in the best fits for Pinus radiata stands, and the best results of the Weibull and Johnson's SB functions were slightly superior to the beta function for Pinus sylvestris stands. However, all three functions are suitable for these stands given an appropriate value of the location parameter and an appropriate parameter estimation method. (Author) 44 refs.

  13. An Approach to Determine the Weibull Parameters for Wind Energy Analysis: The Case of Galicia (Spain)

    Directory of Open Access Journals (Sweden)

    Camilo Carrillo

    2014-04-01

    Full Text Available The Weibull probability density function (PDF) has mostly been used to fit wind speed distributions for wind energy applications. The goodness of fit of the results depends on the estimation method that was used and the wind type of the analyzed area. In this paper, a study on a particular area (Galicia) was performed to test the performance of several fitting methods. The goodness of fit was evaluated by well-known indicators that use the wind speed or the available wind power density. However, energy production must be a critical parameter in wind energy applications. Hence, a fitting method that accounts for the power density distribution is proposed. To highlight the usefulness of this method, indicators that use energy production values are also presented.

  14. Prediction and reconstruction of future and missing unobservable modified Weibull lifetime based on generalized order statistics

    Directory of Open Access Journals (Sweden)

    Amany E. Aly

    2016-04-01

    Full Text Available When a system consists of independent components of the same type, appropriate actions may be taken as soon as a portion of them have failed. It is, therefore, important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples with real data sets are analyzed.

  15. On the Performance Analysis of Digital Communications over Weibull-Gamma Channels

    KAUST Repository

    Ansari, Imran Shafique

    2015-05-01

    In this work, the performance analysis of digital communications over a composite Weibull-Gamma (WG) multipath-fading and shadowing channel is presented, wherein the WG distribution is appropriate for modeling fading environments when multipath is superimposed on shadowing. More specifically, in this work, exact closed-form expressions are derived for the probability density function, the cumulative distribution function, the moment generating function, and the moments of a composite WG channel. Capitalizing on these results, new exact closed-form expressions are offered for the outage probability, the higher-order amount of fading, the average error rate for binary and M-ary modulation schemes, and the ergodic capacity under various types of transmission policies, mostly in terms of Meijer's G function. These new analytical results were also verified via computer-based Monte-Carlo simulation results. © 2015 IEEE.

  16. Influence of the Testing Gage Length on the Strength, Young's Modulus and Weibull Modulus of Carbon Fibres and Glass Fibres

    Directory of Open Access Journals (Sweden)

    Luiz Claudio Pardini

    2002-10-01

    Full Text Available Carbon fibres and glass fibres are reinforcements for advanced composites, and fibre strength is the most influential factor on the strength of the composites. They are essentially brittle and fail with very little reduction in cross section. Composites made with these fibres are characterized by a high strength/density ratio, and their properties are intrinsically related to their microstructure, i.e., the amount and orientation of the fibres, surface treatment, among other factors. Processing parameters play an important role in the fibre mechanical behaviour (strength and modulus). Cracks, voids and impurities in the case of glass fibres, and fibrillar misalignments in the case of carbon fibres, are created during processing. Such inhomogeneities give rise to an appreciable scatter in properties. The most widely used statistical tool that deals with this characteristic variability in properties is the Weibull distribution. The present work investigates the influence of the testing gage length on the strength, Young's modulus and Weibull modulus of carbon fibres and glass fibres. The Young's modulus is calculated by two methods: (i) ASTM D 3379M, and (ii) the interaction between testing equipment and specimen. The first method resulted in a Young's modulus of 183 GPa for carbon fibre and 76 GPa for glass fibre. The second method gave a Young's modulus of 250 GPa for carbon fibre and 50 GPa for glass fibre. These differences revealed how the specimen/testing machine interaction can affect the Young's modulus calculation. The Weibull modulus can be a tool to evaluate the homogeneity of the fibres' properties and is a good quality control parameter during processing. In the range of specimen gage lengths tested, the Weibull modulus is about 3.30 for carbon fibre and about 5.65 for glass fibre, which indicates that, for the batch of fibres tested, the glass fibre is more uniform in properties.

  17. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    Science.gov (United States)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic, not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to the size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.
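    The Monte Carlo exercise described above can be imitated with standard tools. The sketch below (assumed Weibull slope and characteristic life, not the NASA test data) repeatedly draws fatigue-life samples of different sizes, refits a two-parameter Weibull to each sample, and reports how widely the resulting L10 estimates scatter.

```python
# Minimal sketch: scatter of the estimated L10 life (90% probability of survival)
# as a function of the number of specimens drawn from a Weibull parent population.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(3)
true_shape, true_scale = 2.0, 1.0e6          # assumed Weibull slope and characteristic life (cycles)
true_l10 = weibull_min.ppf(0.10, true_shape, scale=true_scale)

for n in (10, 30, 100):                       # candidate test population sizes
    l10_estimates = []
    for _ in range(500):                      # Monte Carlo trials
        lives = weibull_min.rvs(true_shape, scale=true_scale, size=n, random_state=rng)
        shape_hat, _, scale_hat = weibull_min.fit(lives, floc=0)   # location fixed at zero
        l10_estimates.append(weibull_min.ppf(0.10, shape_hat, scale=scale_hat))
    lo, hi = np.percentile(l10_estimates, [5, 95])
    print(f"n={n:3d}: true L10={true_l10:.3g}, 90% of estimates in [{lo:.3g}, {hi:.3g}]")
```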

  18. The effect of ignoring individual heterogeneity in Weibull log-normal sire frailty models.

    Science.gov (United States)

    Damgaard, L H; Korsgaard, I R; Simonsen, J; Dalsgaard, O; Andersen, A H

    2006-06-01

    The objective of this study was, by means of simulation, to quantify the effect of ignoring individual heterogeneity in Weibull sire frailty models on parameter estimates and to address the consequences for genetic inferences. Three simulation studies were evaluated, which included 3 levels of individual heterogeneity combined with 4 levels of censoring (0, 25, 50, or 75%). Data were simulated according to balanced half-sib designs using Weibull log-normal animal frailty models with a normally distributed residual effect on the log-frailty scale. The 12 data sets were analyzed with 2 models: the sire model, equivalent to the animal model used to generate the data (complete sire model), and a corresponding model in which individual heterogeneity in log-frailty was neglected (incomplete sire model). Parameter estimates were obtained from a Bayesian analysis using Gibbs sampling, and also from the software Survival Kit for the incomplete sire model. For the incomplete sire model, the Monte Carlo and Survival Kit parameter estimates were similar. This study established that when unobserved individual heterogeneity was ignored, the parameter estimates that included sire effects were biased toward zero by an amount that depended in magnitude on the level of censoring and the size of the ignored individual heterogeneity. Despite the biased parameter estimates, the ranking of sires, measured by the rank correlations between true and estimated sire effects, was unaffected. In comparison, parameter estimates obtained using complete sire models were consistent with the true values used to simulate the data. Thus, in this study, several issues of concern were demonstrated for the incomplete sire model.

  19. Determining the parameters of Weibull function to estimate the wind power potential in conditions of limited source meteorological data

    Science.gov (United States)

    Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.

    2017-04-01

    We studied the information basis for the assessment of wind power potential on the territory of Russia. We described the methodology to determine the parameters of the Weibull function, which reflects the density of distribution of probabilities of wind flow speeds at a defined basic height above the surface of the earth using the available data on the average speed at this height and its repetition by gradations. The application of the least square method for determining these parameters, unlike the use of graphical methods, allows performing a statistical assessment of the results of approximation of empirical histograms by the Weibull formula. On the basis of the computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes at different heights, the range of parameter variation of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of changes on the shape of speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified the methodology of determining the speed parameters of the Weibull function at other heights using the parameter computations for this function at a basic height, which is known or defined by the average speed of wind flow, or the roughness coefficient of the geological substrate. We gave examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia in conditions of deficiency of source meteorological data. The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of repeatability of the wind flow speeds in the presence of a wide assortment of wind turbines with different ranges of wind-wheel axis heights and various performance characteristics in the global market; as a result, this methodology can become a powerful tool for

  20. Flexural strength and Weibull analysis of a microhybrid and a nanofill composite evaluated by 3- and 4-point bending tests.

    Science.gov (United States)

    Rodrigues, Sinval A; Ferracane, Jack L; Della Bona, Alvaro

    2008-03-01

    The aim of the present study was to evaluate the flexural strength and the Weibull modulus of a microhybrid and a nanofill composite by means of 3- and 4-point bending tests. Thirty specimens of Filtek Z250 (3M/ESPE) and Filtek Supreme (3M/ESPE) were prepared for each test according to the ISO 4049/2000 specification. After 24h in distilled water at 37 degrees C the specimens were submitted to 3- and 4-point bending tests using a universal testing machine DL2000 (EMIC) with a crosshead speed of 1 mm/min. Flexural strength data were calculated and submitted to Student's t-test (alpha=0.05) and Weibull statistics. The fractured surfaces were analyzed based on fractographic principles. The two composites had equivalent strength in both test methods. However, the test designs significantly affected the flexural strength of the microhybrid and the nanofill composites. Weibull modulus (m) of Supreme was similar with both tests, while for Z250, a higher m was observed with the 3-point bending test. Critical flaws were most often associated with the specimen's surface (up to 90%) and were characterized as surface scratches/grooves, non-uniform distribution of phases, inclusions and voids. Flexural strength as measured by the 3-point bending test is higher than by the 4-point bending test, due to the smaller flaw containing area involved in the former. Despite the large difference in average filler size between the composites, the volume fraction of the filler in both materials is similar, which was probably the reason for similar mean flexural strength values and fracture behavior.

  1. Maximum Likelihood Estimates of Parameters in Various Types of Distribution Fitted to Important Data Cases.

    OpenAIRE

    Hirose, Hideo

    1998-01-01

    TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...

  3. Expectation maximization-based likelihood inference for flexible cure rate models with Weibull lifetimes.

    Science.gov (United States)

    Balakrishnan, Narayanaswamy; Pal, Suvra

    2016-08-01

    Recently, a flexible cure rate survival model has been developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored and expectation maximization algorithm can be used in this case to efficiently estimate the model parameters based on right censored data. In this paper, we consider the competing cause scenario and assuming the time-to-event to follow the Weibull distribution, we derive the necessary steps of the expectation maximization algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with a real data on cancer recurrence.
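    As a much-simplified companion to the model described above, the sketch below implements only the Weibull ingredient: maximum likelihood estimation of Weibull parameters from right-censored times. The cure-rate layer (the Conway-Maxwell-Poisson number of competing causes) and the EM loop are deliberately omitted, and all data and starting values are assumptions.

```python
# Minimal sketch: maximum likelihood for a two-parameter Weibull with right
# censoring (events contribute the density, censored times the survival function).
import numpy as np
from scipy.optimize import minimize

def censored_weibull_mle(times, observed):
    """times: follow-up times; observed: 1 if the event occurred, 0 if right-censored."""
    times = np.asarray(times, float)
    observed = np.asarray(observed, int)
    def nll(theta):
        k, lam = np.exp(theta)                                   # keep parameters positive
        log_f = np.log(k / lam) + (k - 1) * np.log(times / lam) - (times / lam) ** k
        log_s = -(times / lam) ** k                              # log survival function
        return -np.sum(observed * log_f + (1 - observed) * log_s)
    res = minimize(nll, x0=[0.0, np.log(times.mean())], method="Nelder-Mead")
    return tuple(np.exp(res.x))

rng = np.random.default_rng(4)
event_times = rng.weibull(1.5, 300) * 10.0                       # latent event times
censor_times = rng.uniform(0.0, 15.0, 300)                       # administrative censoring
times = np.minimum(event_times, censor_times)
observed = (event_times <= censor_times).astype(int)
print(censored_weibull_mle(times, observed))                     # (shape, scale) estimates
```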

  4. Production-inventory Management model for a weibull deteriorating item with linear demand and shortages

    OpenAIRE

    Gobinda Chandra Panda; Pravat Kumar Sukla

    2013-01-01

    Background: Physical decay or deterioration of goods in stock is an important feature of real inventory systems. Material and methods: In the present paper, we discuss a production inventory model for a Weibull deteriorating item over a finite planning horizon with a linearly time-varying demand rate and a uniform production rate, allowing shortages, which are completely backlogged. Results and conclusions: A production inventory model is developed for a Weibull deteriorating...

  5. Comparison of Bayesian and Classical Analysis of Weibull Regression Model: A Simulation Study

    Directory of Open Access Journals (Sweden)

    İmran KURT ÖMÜRLÜ

    2011-01-01

    Full Text Available Objective: The purpose of this study was to compare the performance of the classical Weibull Regression Model (WRM) and the Bayesian-WRM under varying conditions using Monte Carlo simulations. Material and Methods: Data were generated and analysed with both the classical WRM and the Bayesian-WRM under varying informative priors and sample sizes using our simulation algorithm. In the simulation studies, sample sizes of n=50, 100 and 250 were used, and informative prior values using a normal prior distribution were selected for b1. For each situation, 1000 simulations were performed. Results: Bayesian-WRM with a proper informative prior showed good performance with very little bias. It was found that the bias of Bayesian-WRM increased as the priors became less reliable, for all sample sizes. Furthermore, given proper priors, Bayesian-WRM obtained predictions with smaller standard errors than the classical WRM in both small and large samples. Conclusion: In this simulation study, Bayesian-WRM showed better performance than the classical method when subjective data analysis was performed by taking expert opinion and historical knowledge about the parameters into account. Consequently, Bayesian-WRM should be preferred when reliable informative priors exist; otherwise, the classical WRM should be preferred.

  6. How to do a Weibull statistical analysis of flexural strength data: application to AlON, diamond, zinc selenide, and zinc sulfide

    Science.gov (United States)

    Klein, Claude A.; Miller, Richard P.

    2001-09-01

    For the purpose of assessing the strength of engineering ceramics, it is common practice to interpret the measured stresses at fracture in the light of a semi-empirical expression derived from Weibull's theory of brittle fracture, i.e., ln[-ln(1-P)] = -m ln(σ_N) + m ln(σ), where P is the cumulative failure probability, σ is the applied tensile stress, m is the Weibull modulus, and σ_N is the nominal strength. The nominal strength σ_N, however, does not represent a true measure because it depends not only on the test method but also on the size of the volume or the surface subjected to tensile stresses. In this paper we intend to first clarify issues relating to the application of Weibull's theory of fracture and then make use of the theory to assess the results of equibiaxial flexure testing that was carried out on polycrystalline infrared-transmitting materials. These materials are brittle ceramics, which most frequently fail as a consequence of tensile stresses acting on surface flaws. Since equibiaxial flexure testing is the preferred method of measuring the strength of optical ceramics, we propose to formulate the failure-probability equation in terms of a characteristic strength, σ_C, for biaxial loadings, i.e., P = 1 - exp{-π (r_0/cm)^2 [Γ(1+1/m)]^m (σ/σ_C)^m}, where r_0 is the radius of the loading ring (in centimeters) and Γ(z) designates the gamma function. A Weibull statistical analysis of equibiaxial strength data thus amounts to obtaining the parameters m and σ_C, which is best done by directly fitting estimated P_i vs σ_i data to the failure-probability equation; this procedure avoids distorting the distribution through logarithmic linearization and can be implemented by performing a non-linear bivariate regression. Concentric-ring fracture testing performed on five sets of Raytran materials validates the procedure in the sense that the two-parameter model appears to describe the experimental failure
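
    As a rough illustration of the fitting procedure described above, the sketch below fits estimated P_i versus σ_i data directly to the biaxial failure-probability equation by non-linear regression. It is not the authors' code; the ring radius, the stress values, and the starting guesses are made-up.

```python
# Minimal sketch: direct non-linear fit of P = 1 - exp{-pi*(r0/cm)^2 *
# [Gamma(1+1/m)]^m * (sigma/sigma_C)^m} to estimated P_i vs sigma_i data.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gamma

R0_CM = 0.5                       # loading-ring radius in cm (assumed value)

def failure_probability(sigma, m, sigma_c):
    return 1.0 - np.exp(-np.pi * R0_CM**2
                        * gamma(1.0 + 1.0 / m)**m
                        * (sigma / sigma_c)**m)

# Illustrative fracture stresses in MPa (not measured data)
stress = np.sort(np.array([210., 232., 238., 251., 259., 266., 274., 288., 301., 322.]))
n = stress.size
p_est = (np.arange(1, n + 1) - 0.5) / n        # simple estimator of the failure probability P_i

(m_hat, sigma_c_hat), _ = curve_fit(failure_probability, stress, p_est, p0=[10.0, 300.0])
print(f"Weibull modulus m = {m_hat:.2f}, characteristic strength = {sigma_c_hat:.1f} MPa")
```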

  7. Optimization of Weibull deteriorating items inventory model under the effect of price and time dependent demand with partial backlogging

    Indian Academy of Sciences (India)

    SHIV KUMAR; ABHAY KUMAR SINGH; MANOJ KUMAR PATEL

    2016-09-01

    In this study, we have discussed the development of an inventory model in which the deterioration rate of the item follows a two-parameter Weibull distribution and demand depends on both selling price and time, since not only the selling price but also time is a crucial factor in driving market demand and affecting the overall finances. In the present model, shortages are allowed and are partially backlogged. The optimum inventory level, the optimal cycle length, and expressions for the profit function under various cost considerations are obtained using differential equations. These are illustrated graphically with the help of numerical examples. A sensitivity analysis of the parameter values has been performed to study their effect on the inventory optimization.

  8. Calculation of Wind Speeds for Return Period Using Weibull Parameter: A Case Study of Hanbit NPP Area

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jongk Uk; Lee, Kwan Hee; Kim, Sung Il; Yook, Dae Sik; Ahn, Sang Myeon [KINS, Daejeon (Korea, Republic of)

    2016-05-15

    Evaluation of the meteorological characteristics at a nuclear power plant and in the surrounding area should be performed when determining the site suitability for safe operation of the plant. Under unexpected emergency conditions, meteorological information on the site area is important to provide the basis for estimating the environmental impacts of radioactive materials released in gaseous effluents during an accident. Among this information, wind speed and direction are important meteorological factors for the safety analysis of the nuclear power plant area. Wind characteristics were analyzed for the Hanbit NPP area. It was found that the Weibull parameters k and c vary from 2.56 to 4.77 and from 4.53 to 6.79, respectively, for the directional wind speed distributions. The maximum wind frequency was from the NE and the minimum from the NNW.
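
    A minimal sketch of how Weibull shape and scale values such as the k and c ranges above can be estimated from a wind-speed sample, and of how a return-period speed can be read off the fitted quantile function. The data are synthetic, the two-parameter fit fixes the location at zero, and the return-period calculation naively treats hourly speeds as independent.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
wind = weibull_min.rvs(c=3.0, scale=6.0, size=8760, random_state=rng)  # synthetic hourly speeds (m/s)

k, _, c = weibull_min.fit(wind, floc=0)                         # two-parameter fit (location fixed at zero)
v_50yr = weibull_min.ppf(1.0 - 1.0 / (50 * 8760), k, scale=c)   # crude 50-year-return speed
print(f"k = {k:.2f}, c = {c:.2f} m/s, 50-year return speed ~ {v_50yr:.1f} m/s")
```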

  9. Estimating the creep strain to failure of PP at different load levels based on short term tests and Weibull characterization

    Directory of Open Access Journals (Sweden)

    L. M. Vas

    2012-12-01

    Full Text Available The short and long term creep behavior is one of the most important properties of polymers used for engineering applications. In order to study this kind of behavior in PP, tensile and short term creep measurements were performed and analyzed using a long term creep estimation method based on short term tensile and creep tests performed at room temperature, viscoelastic behavior, and variable transformations. Applying Weibull distribution based approximations to the measured curves, predictions for the creep strain to failure as a function of the creep load were determined, and the parameters were found by fitting the measurements. The upper, mean, and lower estimates, as well as the confidence interval for the means, give designers a basis for calculations at arbitrary creep load levels.

  10. The Weibull functional form for the energetic particle spectrum at interplanetary shock waves

    Science.gov (United States)

    Laurenza, M.; Consolini, G.; Storini, M.; Pallocchia, G.; Damiani, A.

    2016-11-01

    Transient interplanetary shock waves are often associated with high energy particle enhancements, which are called energetic storm particle (ESP) events. Here we present a case study of an ESP event, recorded by the SEPT, LET and HET instruments onboard the STEREO B spacecraft, on 3 October 2011, in a wide energy range from 0.1 MeV to ∼ 30 MeV. The obtained particle spectrum is found to be reproduced by a Weibull-like shape. Moreover, we show that the Weibull spectrum can be theoretically derived as the asymptotic steady state solution of the diffusion loss equation by assuming anomalous diffusion for particle velocity. The Weibull parameters obtained from the particle observations and the power spectral density of the turbulent fluctuations in the shock region support this scenario and suggest that stochastic acceleration can contribute significantly to the acceleration of highly energetic particles at collisionless shock waves.

  11. Weibull approximation of LiDAR waveforms for estimating the beam attenuation coefficient.

    Science.gov (United States)

    Montes-Hugo, Martin A; Vuorenkoski, Anni K; Dalgleish, Fraser R; Ouyang, Bing

    2016-10-03

    Tank experiments were performed at different water turbidities to examine relationships between the beam attenuation coefficient (c) and Weibull shape parameters derived from LiDAR waveforms measured with the Fine Structure Underwater LiDAR (FSUIL). Optical inversions were made at 532 nm, within a c range of 0.045-1.52 m^-1, and based on a LiDAR system having two fields of view (15 and 75.7 mrad) and two linear polarizations. Consistently, the Weibull scale parameter, or P2, showed the strongest covariation with c and was a more accurate proxy with respect to the LiDAR attenuation coefficient.

  12. Optimization of a small passive wind turbine based on mixed Weibull-turbulence statistics of wind

    OpenAIRE

    2008-01-01

    A "low cost full passive structure" for a wind turbine system is proposed. The efficiency of such a device can be obtained only if the design parameters are mutually adapted through an optimization design approach. An original wind profile generation process mixing Weibull and turbulence statistics is presented. The optimization results are compared with those obtained from a particular but typical time cycle of wind speed.

  13. Weibull statistics effective area and volume in the ball-on-ring testing method

    DEFF Research Database (Denmark)

    Frandsen, Henrik Lund

    2014-01-01

    The ball-on-ring method, together with other biaxial bending methods, is often used for measuring the strength of plates of brittle materials, because machining defects are remote from the high stresses causing the failure of the specimens. In order to scale the measured Weibull strength...

  14. Expectation Maximization Algorithm for Box-Cox Transformation Cure Rate Model and Assessment of Model Mis-specification under Weibull Lifetimes.

    Science.gov (United States)

    Pal, Suvra; Balakrishnan, N

    2017-05-16

    In this paper, we develop likelihood inference based on the expectation maximization (EM) algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model mis-specification on the estimate of the cure rate. Finally, we analyze a well-known data set on melanoma with the model and the inferential method developed here.

  15. Weibull Parameters Estimation Based on Physics of Failure Model

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and the Miner’s rule. A threshold model is used...... distribution. Methods from structural reliability analysis are used to model the uncertainties and to assess the reliability for fatigue failure. Maximum Likelihood and Least Square estimation techniques are used to estimate fatigue life distribution parameters....

  16. On the gap between an empirical distribution and an exponential distribution of waiting times for price changes in a financial market

    CERN Document Server

    Sazuka, N

    2006-01-01

    We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high frequency financial data support that trades in financial markets do not follow a Poisson process and the waiting times between trades are not exponentially distributed. Here we show that our data are well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, we quantitatively evaluate how far the empirical data are from an exponential distribution using a Weibull fit. Finally, we discuss a phase transition between a Weibull law and a power law in the asymptotic long-waiting-time regime.
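
    A small sketch of the kind of comparison described above: fit both a Weibull and an exponential distribution to a set of waiting times and compare their log-likelihoods. The data are synthetic, not the tick data analyzed in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
waits = stats.weibull_min.rvs(0.6, scale=10.0, size=3000, random_state=rng)  # synthetic waiting times (s)

shape, _, scale = stats.weibull_min.fit(waits, floc=0)    # Weibull fit with location fixed at zero
mean_wait = waits.mean()                                  # MLE of the exponential mean

ll_weibull = stats.weibull_min.logpdf(waits, shape, scale=scale).sum()
ll_expon = stats.expon.logpdf(waits, scale=mean_wait).sum()
print(f"Weibull shape = {shape:.2f}; log-likelihood gap (Weibull - exponential) = {ll_weibull - ll_expon:.1f}")
```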

  17. The effect of core material, veneering porcelain, and fabrication technique on the biaxial flexural strength and weibull analysis of selected dental ceramics.

    Science.gov (United States)

    Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean

    2012-07-01

    The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain as follows: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and 1.5 mm core/veneer group: 0.8 mm core with 0.7 mm corresponding veneering porcelain applied with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p ≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p < 0.05) than the corresponding monolithic core subgroups; within the ZirCAD group, subgroup 0.8C-0.7VL had significantly lower flexural strength (p = 0.004) than subgroup 0.8C-0.7VP. Nonetheless, both veneered ZirCAD groups showed greater flexural strength than the monolithic Empress and e.max groups, regardless of core thickness and fabrication techniques. Comparing fabrication techniques, Empress Esthetic/CAD and e.max Press/CAD had similar biaxial flexural strength (p = 0.28 for the Empress pair; p = 0

  18. An EOQ model with time dependent Weibull deterioration and ramp type demand

    Directory of Open Access Journals (Sweden)

    Chaitanya Kumar Tripathy

    2011-04-01

    Full Text Available This paper presents an order level inventory system with time dependent Weibull deterioration and a ramp type demand rate, where both production and demand are time dependent. The proposed model considers the economic order quantity under two different cases. The implementation of the proposed model is illustrated using some numerical examples. Sensitivity analysis is performed to show the effect of changes in the parameters on the optimum solution.

  19. ESTIMATION OF WEIBULL PARAMETERS USING A RANDOMIZED NEIGHBORHOOD SEARCH FOR THE SEVERITY OF FIRE ACCIDENTS

    Directory of Open Access Journals (Sweden)

    Soontorn Boonta

    2013-01-01

    Full Text Available In this study, we applied Randomized Neighborhood Search (RNS) to estimate the Weibull parameters describing the severity of fire accidents; the data were provided by the Thai Reinsurance Public Co., Ltd. We compared this technique with other frequently used techniques, namely the Maximum Likelihood Estimator (MLE), the Method of Moments (MOM), the Least Squares Method (LSM) and the Weighted Least Squares Method (WLSM), and found that RNS estimates the parameters more accurately than MLE, MOM, LSM or WLSM.
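
    The least-squares estimators mentioned above work on the linearized Weibull plot; a minimal unweighted version is sketched below with made-up severity data. The median-rank plotting positions are one common choice, not necessarily the one used in the study.

```python
import numpy as np

def weibull_lsm(sample):
    """Least-squares estimate of the Weibull (shape, scale) from a Weibull probability plot."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    p = (np.arange(1, n + 1) - 0.3) / (n + 0.4)      # median-rank plotting positions
    y = np.log(-np.log(1.0 - p))                     # ln(-ln(1-F)) = shape*ln(x) - shape*ln(scale)
    slope, intercept = np.polyfit(np.log(x), y, 1)
    return slope, np.exp(-intercept / slope)         # (shape, scale)

severities = [1.2, 0.4, 3.1, 0.9, 2.2, 5.4, 0.7, 1.8, 2.9, 4.3]   # illustrative loss severities
print(weibull_lsm(severities))
```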

  20. An EOQ Model for Three parameter Weibull Deteriorating Item with Partial Backlogging

    Directory of Open Access Journals (Sweden)

    L.M. Pradhan

    2013-03-01

    Full Text Available Background: Business organisations are facing a lot of competition these days. To withstand the competition and remain in the front row, an enterprise should have an optimum, profitable plan for its business. In recent years researchers have developed various inventory models for deteriorating items considering various practical situations. Partial backlogging is a relatively new concept introduced in developing models for Weibull deteriorating items. Methodology: In this paper an inventory model has been developed considering three-parameter Weibull deterioration of a single item with partial backlogging. Here the demand rate is considered to be constant and the lead time is zero. During the stock-out period the backlogging rate is variable and depends on the length of the waiting time for the next replenishment. Results and conclusion: The optimal order quantity and the total variable cost during a cycle have been derived for the proposed inventory model considering a three-parameter Weibull deteriorating item with partial backlogging. The results obtained in this paper are illustrated with the help of a numerical example and sensitivity analysis.

  1. Valuation of European derivatives with a mixture of Weibull distributions (Valoración de derivados europeos con mixtura de distribuciones Weibull)

    Directory of Open Access Journals (Sweden)

    Andrés Mauricio Molina

    2015-07-01

    Full Text Available The Black-Scholes model for pricing European options is widely used in the market because it is easy to implement. However, it becomes imprecise for assets whose dynamics do not follow a lognormal distribution, so new distributions are needed to price options written on different underlying assets. Several researchers have worked on new derivative pricing formulas assuming different distributions either for the price of the underlying asset or for its return. This article presents two pricing formulas: one modifies the formula based on the two-parameter Weibull distribution proposed by Savickas (2002) by adding two new parameters (scale and location), and the other assumes that the asset distribution is a mixture of Weibull distributions. Comparisons of these models with existing ones, such as Black-Scholes and the Savickas model with a simple Weibull distribution, are also presented.

  2. Effect on Stratum Gradient Frequency Distribution of Landslides in the Three Gorges Area of Northeast Chongqing

    Institute of Scientific and Technical Information of China (English)

    FAN Xiaoyi; QIAO Jianping

    2006-01-01

    Landslide data were analyzed for the Three Gorges Area of northeast Chongqing. The results showed that the frequency distributions of landslide gradients followed the Weibull probability density function. The landslide hazard ratios of the gradients were obtained from the Weibull cumulative probability distribution function for the different geological units. The landslide hazard ratios of the different geological units did not agree with the variance of the landslide gradients, although they were approximately consistent in the Jurassic strata. The results indicate that the Weibull distribution can quantitatively evaluate the landslide hazard ratios of the gradients of the different strata in the Three Gorges Area.

  3. Weakest-Link Scaling and Finite Size Effects on Recurrence Times Distribution

    CERN Document Server

    Hristopulos, Dionissios T; Kaniadakis, Giorgio

    2013-01-01

    Tectonic earthquakes result from the fracturing of the Earth's crust due to the loading induced by the motion of the tectonic plates. Hence, the statistical laws of earthquakes must be intimately connected to the statistical laws of fracture. The Weibull distribution is a commonly used model of earthquake recurrence times (ERT). Nevertheless, deviations from Weibull scaling have been observed in ERT data and in fracture experiments on quasi-brittle materials. We propose that the weakest-link-scaling theory for finite-size systems leads to the kappa-Weibull function, which implies a power-law tail for the ERT distribution. We show that the ERT hazard rate function decreases linearly after a waiting time which is proportional to the system size (in terms of representative volume elements) raised to the inverse of the Weibull modulus. We also demonstrate that the kappa-Weibull can be applied to strongly correlated systems by means of simulations of a fiber bundle model.

  4. Breeding biology of muscovy duck (Cairina moschata) under natural incubation: the use of the weibull function and a beta-binomial model to predict nest hatchability

    Science.gov (United States)

    Harun; Draisma; Frankena; Veeneklaas; Van Kampen M

    1999-05-07

    In this paper we tested the Weibull function and beta-binomial distribution to analyse and predict nest hatchability, using empirical data on hatchability in Muscovy duck (Cairina moschata) eggs under natural incubation (932 successfully incubated nests and 11 822 eggs). The estimated parameters of the Weibull function and beta-binomial model were compared with the logistic regression analysis. The maximum likelihood estimation of the parameters was used to quantify simultaneously the influence of the nesting behaviour and the duration of the reproduction cycle on hatchability. The estimated parameters showed that the hatchability was not affected in natural dump nests, but in artificial dump nests and in nests with non-term eggs the hatchability was reduced by 10 and 25%, respectively. Similar results were obtained using logistic regression. Both models provided a satisfactory description of the observed data set, but the beta-binomial model proved to have more parameters with practical and biological meaningful interpretations, because this model is able to quantify and incorporate the unexplained variation in a single parameter theta (which is a variance measure). Copyright 1999 Academic Press.

  5. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    Science.gov (United States)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
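
    A toy sketch of the first modeling step described above, i.e. shaping a daily maximum pollen potential with a Weibull probability density function. The shape, scale, and seasonal total below are invented for illustration; the operational model combines this envelope with multiple regression on weather variables.

```python
import numpy as np
from scipy.stats import weibull_min

days = np.arange(0, 120)            # days since the assumed season onset
shape, scale = 2.2, 45.0            # assumed Weibull parameters of the pollen season
seasonal_total = 5000.0             # assumed cumulative seasonal pollen count

# Daily maximum potential pollen, shaped by the Weibull density and scaled to the seasonal total
potential = seasonal_total * weibull_min.pdf(days, shape, scale=scale)
print(f"peak potential {potential.max():.1f} on day {days[potential.argmax()]}")
```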

  7. Weibull model as a tool for assessment of operation reliability in a sewage treatment plant in Niepołomice

    Directory of Open Access Journals (Sweden)

    Ewa Wąsik

    2016-06-01

    Full Text Available The article presents the reliability of a municipal sewage treatment plant serving the Niepołomice Industrial Zone area. The analysis is based on five pollution indicators: BOD5, CODCr, total suspended solids, total nitrogen and total phosphorus. Samples of treated sewage were collected once a month in the period from January 2011 to December 2013. The paper presents an analysis of the removal effectiveness of the individual indicators and identifies their basic statistical characteristics. The study showed that the wastewater treatment plant in Niepołomice is characterized by high pollutant removal efficiency, with mean effectiveness of 98.8% for BOD5, 97.0% for CODCr, 97.3% for total suspended solids, 88.6% for total nitrogen, and 97.0% for total phosphorus. The reliability forecast for the treatment plant, calculated from the distribution of the indicators in the treated wastewater using the Weibull model, showed that the facility meets the removal requirements for 365 days a year in the case of BOD5, CODCr, suspended solids and total phosphorus, and for 336 days a year in the case of total nitrogen.
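
    A minimal sketch of the Weibull reliability calculation implied above: given a Weibull model fitted to an effluent indicator, the probability of staying below a permit limit translates into an expected number of compliant days per year. All numbers here are assumed, not taken from the Niepołomice data.

```python
from scipy.stats import weibull_min

shape, scale = 1.8, 6.0          # assumed Weibull fit of effluent BOD5 (mg/l)
limit = 25.0                     # assumed permit limit (mg/l)

reliability = weibull_min.cdf(limit, shape, scale=scale)   # P(concentration <= limit)
print(f"reliability = {reliability:.4f}, expected compliant days per year = {365 * reliability:.0f}")
```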

  8. Some challenges of wind modelling for modern wind turbines: The Weibull distribution

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekatarina; Floors, Rogier;

    2012-01-01

    Wind power assessments, as well as forecasts of wind energy production, are key issues in wind energy and grid related studies. However, the hub height of today’s wind turbines is well above the surface layer. Wind profile studies based on mast data show that the wind profile above the surface layer depends on the planetary boundary layer (PBL) structure and height, thus parameters that are not accounted for in today’s traditionally applied flow simulation models and parameterizations. Here we report on one year of measurements of the wind profile performed by use of a long range wind lidar (WSL 70) up to a height of 600 meters with 50 meters resolution. The lidar is located at a flat coastal site. The applicability of the WRF model to predict some of the important parameters for wind energy has been investigated. In this presentation, some general results on the ability of WRF to predict the wind profile...

  9. Some challenges of wind modelling for modern wind turbines: The Weibull distribution

    OpenAIRE

    Gryning, Sven-Erik; Batchvarova, Ekatarina; Floors, Rogier; Pena Diaz, Alfredo

    2012-01-01

    Wind power assessments, as well as forecasts of wind energy production, are key issues in wind energy and grid related studies. However, the hub height of today’s wind turbines is well above the surface layer. Wind profile studies based on mast data show that the wind profile above the surface layer depends on the planetary boundary layer (PBL) structure and height, thus parameters that are not accounted for in today’s traditionally applied flow simulation models and parameterizations. Here we r...

  10. Survival Analysis of Patients with Breast Cancer using Weibull Parametric Model.

    Science.gov (United States)

    Baghestani, Ahmad Reza; Moghaddam, Sahar Saeedi; Majd, Hamid Alavi; Akbari, Mohammad Esmaeil; Nafissi, Nahid; Gohari, Kimiya

    2015-01-01

    The Cox model is known as one of the most frequently-used methods for analyzing survival data. However, in some situations parametric methods may provide better estimates. In this study, a Weibull parametric model was employed to assess possible prognostic factors that may affect the survival of patients with breast cancer. We studied 438 patients with breast cancer who visited and were treated at the Cancer Research Center in Shahid Beheshti University of Medical Sciences during 1992 to 2012; the patients were followed up until October 2014. Patients or family members were contacted via telephone calls to confirm whether they were still alive. Clinical, pathological, and biological variables as potential prognostic factors were entered in univariate and multivariate analyses. The log-rank test and the Weibull parametric model with a forward approach, respectively, were used for univariate and multivariate analyses. All analyses were performed using STATA version 11. A P-value lower than 0.05 was defined as significant. On univariate analysis, age at diagnosis, level of education, type of surgery, lymph node status, tumor size, stage, histologic grade, estrogen receptor, progesterone receptor, and lymphovascular invasion had a statistically significant effect on survival time. On multivariate analysis, lymph node status, stage, histologic grade, and lymphovascular invasion were statistically significant. The one-year overall survival rate was 98%. Based on these data and using Weibull parametric model with a forward approach, we found out that patients with lymphovascular invasion were at 2.13 times greater risk of death due to breast cancer.
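
    A compact sketch of fitting a Weibull survival model to right-censored times by maximum likelihood. The data are synthetic; the actual analysis used STATA and included prognostic covariates, which are omitted here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
true_times = rng.weibull(1.3, 300) * 60.0          # synthetic survival times (months)
censor_times = rng.uniform(0.0, 80.0, 300)         # synthetic administrative censoring times
time = np.minimum(true_times, censor_times)
event = (true_times <= censor_times).astype(float) # 1 = death observed, 0 = right-censored

def neg_loglik(params):
    k, lam = np.exp(params)                        # optimize on the log scale to keep both positive
    z = (time / lam) ** k
    log_hazard = np.log(k) - np.log(lam) + (k - 1) * (np.log(time) - np.log(lam))
    return -(event * log_hazard - z).sum()         # log f = log h + log S, with log S = -z

res = minimize(neg_loglik, x0=[0.0, np.log(time.mean())])
k_hat, lam_hat = np.exp(res.x)
print(f"shape = {k_hat:.2f}, scale = {lam_hat:.1f} months")
```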

  11. Weibull analysis and flexural strength of hot-pressed core and veneered ceramic structures.

    Science.gov (United States)

    Bona, Alvaro Della; Anusavice, Kenneth J; DeHoff, Paul H

    2003-11-01

    To test the hypothesis that the Weibull moduli of single- and multilayer ceramics are controlled primarily by the structural reliability of the core ceramic. Methods: Seven groups of 20 bar specimens (25 x 4 x 1.2 mm) were made from the following materials: (1) IPS Empress--a hot-pressed (HP) leucite-based core ceramic; (2) IPS Empress2--a HP lithia-based core ceramic; (3 and 7) Evision--a HP lithia-based core ceramic (ES); (4) IPS Empress2 body--a glass veneer; (5) ES (1.1 mm thick) plus a glaze layer (0.1 mm); and (6) ES (0.8 mm thick) plus veneer (0.3 mm) and glaze (0.1 mm). Each specimen was subjected to four-point flexure loading at a cross-head speed of 0.5 mm/min while immersed in distilled water at 37 degrees C, except for Group 7, which was tested in a dry environment. Failure loads were recorded and the fracture surfaces were examined using SEM. ANOVA and Duncan's multiple range test were used for statistical analysis. No significant differences were found between the mean flexural strength values of Groups 2, 3, 5, and 6 or between Groups 1 and 4 (p > 0.05). However, significant differences were found between dry (Group 7) and wet (Groups 1-6) conditions. Glazing had no significant effect on the flexural strength or Weibull modulus. The strength and Weibull modulus of the ES ceramic were similar to those of Groups 5 and 6. The structural reliability of a veneered core ceramic is controlled primarily by that of the core ceramic.

  12. Channel capacity and digital modulation schemes in correlated Weibull fading channels with nonidentical statistics

    Institute of Scientific and Technical Information of China (English)

    Xiao Hailin; Nie Zaiping; Yang Shiwen

    2007-01-01

    Novel closed-form expressions are presented for the average channel capacity of dual selection diversity, as well as for the bit-error rate (BER) of several coherent and noncoherent digital modulation schemes, in correlated Weibull fading channels with nonidentical statistics. The results are expressed in terms of Meijer's G-function, which can be easily evaluated numerically. Simulation results are presented to validate the proposed theoretical analysis and to examine the effects of the fading severity on the quantities concerned.

  13. Compressed data separation via dual frames based split-analysis with Weibull matrices

    Institute of Scientific and Technical Information of China (English)

    CAI Yun; LI Song

    2013-01-01

    In this paper, we consider the data separation problem, where the original signal is composed of two distinct subcomponents, via a dual-frame-based split-analysis approach. We show that the two distinct subcomponents, which are sparse in two different general frames respectively, can be exactly recovered with high probability when the measurement matrix is a Weibull random matrix (not Gaussian) and the two frames satisfy a mutual coherence property. Our result may be significant for analysing the split-analysis model for data separation.

  14. Parameter estimation for stochastic diffusion process with drift proportional to Weibull density function

    OpenAIRE

    Hammou Elotmany; M'Hamed Eddahbi

    2015-01-01

    Hammou El-otmany, M'hamed Eddahbi, Faculté des Sciences et Techniques Marrakech-Maroc, Laboratoire de méthodes stochastiques appliquées à la finance et actuariat (LaMsaFA). Abstract: In the present paper we propose a new stochastic diffusion process with drift proportional to the Weibull density function, defined as X_ε = x, dX_t = [γ t (1 - t^(γ+1)) - t^γ X_t] dt + σ X_t dB_t, t > 0, with parameters γ > 0 and σ...

  15. The effect of ignoring individual heterogeneity in Weibull log-normal sire frailty models

    DEFF Research Database (Denmark)

    Damgaard, Lars Holm; Korsgaard, Inge Riis; Simonsen, J;

    2006-01-01

    The objective of this study was, by means of simulation, to quantify the effect of ignoring individual heterogeneity in Weibull sire frailty models on parameter estimates and to address the consequences for genetic inferences. Three simulation studies were evaluated, which included 3 levels...... the software Survival Kit for the incomplete sire model. For the incomplete sire model, the Monte Carlo and Survival Kit parameter estimates were similar. This study established that when unobserved individual heterogeneity was ignored, the parameter estimates that included sire effects were biased toward zero...

  16. Durability and Weibull Characteristics of Lithium Disilicate Crowns Bonded on Abutments with Knife-Edge and Large Chamfer Finish Lines after Cyclic Loading.

    Science.gov (United States)

    Cortellini, Davide; Canale, Angelo; Souza, Rodrigo O A; Campos, Fernanda; Lima, Julia C; Özcan, Mutlu

    2015-12-01

    The aim of this study was to evaluate the durability of lithium disilicate crowns bonded on abutments prepared with two types of finish lines after long-term cyclic loading. Pressed lithium disilicate all-ceramic molar crowns were bonded (Variolink II) to epoxy abutments (height: 5.5 mm, Ø: 7.5 mm, conicity: 6°) (N = 20) with either knife-edge (KE) or large chamfer (LC) finish lines. Each assembly was submitted to cyclic loading (1,200,000×; 200 N; 1 Hz) in water and then tested until fracture in a universal testing machine (1 mm/min). Failure types were classified and further evaluated under stereomicroscope and SEM. The data (N) were analyzed using one-way ANOVA. Weibull distribution values including the Weibull modulus (m), characteristic strength (σ0), probability of failure at 5% (σ0.05) and 1% (σ0.01), and correlation coefficient were calculated. Type of finish line did not significantly influence the mean fracture strength of pressed ceramic crowns (KE: 1655 ± 353 N; LC: 1618 ± 263 N) (p = 0.7898). The Weibull distribution presented a lower shape value (m) for KE (m = 5.48; CI: 3.5 to 8.6) compared to LC (m = 7.68; CI: 5.2 to 11.3). Characteristic strengths (σ0) (KE: 1784.9 N; LC: 1712.1 N) were higher than the loads at 5% probability of failure (σ0.05) (KE: 1038.1 N; LC: 1163.4 N), followed by those at 1% (σ0.01) (KE: 771 N; LC: 941.1 N), with a correlation coefficient of 0.966 for KE and 0.924 for LC. Type V failures (severe fracture of the crown and/or tooth) were more common in both groups. SEM findings showed that fractures occurred mainly from the cement/ceramic interface at the occlusal side of the crowns. Lithium disilicate ceramic crowns bonded onto abutment teeth with KE preparation resulted in similar fracture strength to those bonded on abutments with LC finish line. Pressed lithium disilicate ceramic crowns may not require invasive finish line preparations since finish line type did not impair the strength after aging conditions. © 2015 by the American College of
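
    The σ0.05 and σ0.01 values quoted above follow directly from the two-parameter Weibull model, since the load at failure probability p is σ0·(-ln(1-p))^(1/m). The short sketch below reproduces the reported KE numbers from m and the characteristic strength; the function name is just illustrative.

```python
import math

def load_at_failure_probability(p, m, sigma0):
    """Load at which the Weibull failure probability equals p (two-parameter model)."""
    return sigma0 * (-math.log(1.0 - p)) ** (1.0 / m)

# KE group: m = 5.48, characteristic strength 1784.9 N (values from the abstract)
for p in (0.05, 0.01):
    print(p, round(load_at_failure_probability(p, 5.48, 1784.9), 1))   # ~1038 N and ~771 N
```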

  17. Strength Distribution Analysis of Typical Staple Fibers

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The strength of a staple fiber is an important property for yarns and fabrics. Usually there are variations in individual fiber strength, and this will affect the final strength of yarns and fabrics. In this study, the Weibull distribution function is used to analyze the strength distribution of various staple fibers. The strengths of wool, silk, cotton, flax, acrylic, polyester, glass, aramid and carbon fiber are tested. It is found that the strengths of cotton, polyester, glass, aramid and carbon fiber fit well with the two-factor Weibull distribution, while those of wool and silk fit the three-factor Weibull distribution. However, the strength distribution of flax cannot be expressed convincingly by either the two- or three-factor Weibull distribution.
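
    A small sketch contrasting a two-parameter fit (location fixed at zero) with a three-parameter fit (location free), in the spirit of the comparison above. The strength values are illustrative, not the measured fiber data.

```python
import numpy as np
from scipy.stats import weibull_min

strength = np.array([2.1, 2.6, 2.8, 3.0, 3.2, 3.3, 3.5, 3.7, 4.0, 4.4])  # illustrative strengths (GPa)

two_par = weibull_min.fit(strength, floc=0)     # two-parameter form: (shape, loc=0, scale)
three_par = weibull_min.fit(strength)           # three-parameter form: location estimated as well
print("2-parameter (shape, loc, scale):", np.round(two_par, 3))
print("3-parameter (shape, loc, scale):", np.round(three_par, 3))
```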

  18. An EOQ Model for Time Dependent Weibull Deterioration with Linear Demand and Shortages

    Directory of Open Access Journals (Sweden)

    Umakanta Mishra

    2012-06-01

    Full Text Available Background: The study of the control and maintenance of production inventories of deteriorating items, with and without shortages, has recently grown in importance. The effect of deterioration is very important in many inventory systems. Deterioration is defined as decay or damage such that the item cannot be used for its original purpose. Methods: In this article, order level inventory models have been developed for deteriorating items with linear demand and Weibull deterioration. In developing the models we have assumed that the production rate and the demand rate are time dependent, and that the unit production cost is inversely proportional to demand. The inventory-production system has two-parameter Weibull deterioration. Results and conclusions: Two models have been developed, one without shortages and one with shortages, where the shortages are completely backlogged. The objective is to develop an optimal policy that minimizes the total average cost. A sensitivity analysis has been carried out to show the effect of changes in the parameters on the optimum total average cost.

  19. Transferability of Charpy Absorbed Energy to Fracture Toughness Based on Weibull Stress Criterion

    Institute of Scientific and Technical Information of China (English)

    Hongyang JING; Lianyong XU; Lixing HUO; Fumiyoshi Minami

    2005-01-01

    The relationship between the Charpy absorbed energy and the fracture toughness obtained by the crack tip opening displacement (CTOD) method was analyzed based on the Weibull stress criterion. The Charpy absorbed energy and the fracture toughness were measured for SN490B steel in the ductile-brittle transition temperature region. For the instrumented Charpy impact test, the curves of load and loading-point displacement against time were recorded. The critical Weibull stress was taken as the fracture-controlling parameter; according to the local approach, it is not affected by the specimen configuration or the loading pattern. The parameters controlling brittle fracture are obtained from the Charpy absorbed energy results, and the fracture toughness for the compact tension (CT) specimen is then predicted. The predicted results are found to be in good agreement with the experimental ones. The fracture toughness can thus be evaluated from the Charpy absorbed energy, because the local approach gives a good description of brittle fracture whether the Charpy impact specimen or the CT specimen is used for the given material.

  20. Application of Weibull Statistical Theory in Concrete's Parameter Curve Model

    Institute of Scientific and Technical Information of China (English)

    潘青松; 彭刚; 胡伟华; 徐鑫

    2015-01-01

    In order to understand the mechanical properties of concrete under different loading rates, uniaxial compression tests were conducted on cubic concrete specimens (strength grade C15, side length 150 mm) at loading rates of 10^-5/s, 10^-4/s, 10^-3/s and 5×10^-3/s, using a micro-computer controlled electro-hydraulic servo static and dynamic multifunctional triaxial apparatus. The compressive strength, the deformation, and the parameters of the complete stress-strain curve model based on the modified Weibull statistical theory were studied and analyzed for concrete under uniaxial compression at the different loading rates. The results show that the modified Weibull statistical model fits the complete stress-strain curves of the concrete specimens under different loading rates well. The strength hardening properties can be characterized by the values of parameters m and E in the Weibull constitutive model, and the strain softening behavior can be expressed by parameter c in the Weibull constitutive model.

  1. Progressive failure site generation in AlGaN/GaN high electron mobility transistors under OFF-state stress: Weibull statistics and temperature dependence

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Huarui, E-mail: huarui.sun@bristol.ac.uk; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin [Center for Device Thermography and Reliability (CDTR), H. H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom)

    2015-01-26

    Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.

  2. Foam-forming properties of Ilex paraguariensis (mate) saponin: foamability and foam lifetime analysis by Weibull equation

    Directory of Open Access Journals (Sweden)

    Janine Treter

    2010-01-01

    Full Text Available Saponins are natural soaplike foam-forming compounds widely used in foods, cosmetic and pharmaceutical preparations. In this work the foamability and foam lifetime of foams obtained from Ilex paraguariensis unripe fruits were analyzed. Polysorbate 80 and sodium dodecyl sulfate were used as reference surfactants. Aiming at a better understanding of the data, a linearized 4-parameter Weibull function was proposed. The mate hydroethanolic extract (ME) and a mate saponin enriched fraction (MSF) afforded foamability and foam lifetime comparable to the synthetic surfactants. The linearization of the Weibull equation allowed the statistical comparison of foam decay curves, improving on former mathematical approaches.

  3. Estimation of the operational reliability determined with Weibull modulus based on the abrasive wear in a cylinder-piston ring system

    Directory of Open Access Journals (Sweden)

    J. Piątkowski

    2012-12-01

    Full Text Available Purpose: The main purpose of the study was to determine a methodology for estimating operational reliability based on the statistical results of abrasive wear testing. Design/methodology/approach: For the research, a traditional tribological system was chosen, i.e. a friction pair of AlSi17CuNiMg silumin in contact with spheroidal graphite cast iron of grade EN-GJN-200, under conditions of dry friction. This system was chosen based on the mechanical cooperation between the cylinder (silumin) and the piston rings (spheroidal graphite cast iron) in conventional internal combustion piston engines with spark ignition. Findings: Using the material parameters of the cylinder and piston rings, the nominal losses qualifying the cylinder for repair and the maximum weight losses before seizure occurs were determined. Based on the theoretical number of engine revolutions to repair and the stress acting on the cylinder bearing surface, the maximum distance that the motor vehicle can travel before seizure of the cylinder occurs was calculated. These results were the basis for a statistical analysis carried out with the Weibull modulus, the end result of which was an estimate of the material reliability (the survival probability of the tribological system) and the determination of a pre-operation warranty period for the tribological system. Research limitations/implications: The analysis of the Weibull distribution modulus used to estimate the reliability of the tribological cylinder-ring system enabled the determination of an approximate theoretical failure-free running time of the combustion engine. Originality/value: The results are valuable statistical data, and the methodology proposed in this paper can be used to determine the theoretical life time of a combustion engine.

  4. PERFORMANCE EVALUATION OF OSSO-CFAR WITH BINARY INTEGRATION IN WEIBULL BACKGROUND

    Institute of Scientific and Technical Information of China (English)

    Meng Xiangwei

    2013-01-01

    The performance of the Ordered-Statistic Smallest Of (OSSO) Constant False Alarm Rate (CFAR) detector with binary integration in a Weibull background with known shape parameter is analyzed, for the cases where the processor operates in a homogeneous background and in non-homogeneous situations caused by multiple targets and clutter edges. Analytical models of this scheme for the performance evaluation are given. It is shown that the OSSO-CFAR with binary integration can greatly improve the detection performance with respect to the single-pulse processing case. As the clutter background becomes spiky, a high threshold S in the binary integration (S/M) is required in order to obtain good detection performance in a homogeneous background. Moreover, the false alarm performance of the OSSO-CFAR with binary integration is more sensitive to changes in the shape parameter or power level of the clutter background.

  5. Modeling the reliability and maintenance costs of wind turbines using Weibull analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)

    1996-12-31

    A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
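
    A back-of-the-envelope sketch of the kind of calculation such a Weibull reliability model supports: the survival probability over a maintenance interval and a crude expected repair cost. The shape, characteristic life, and cost below are invented, not values from the paper.

```python
import math

beta, eta = 1.4, 9.0          # assumed Weibull shape and characteristic life (years) of a subsystem
repair_cost = 12000.0         # assumed cost per failure event (USD)

def reliability(t):
    """Probability that the component survives to time t (years)."""
    return math.exp(-(t / eta) ** beta)

interval = 2.0                # maintenance interval (years)
p_fail = 1.0 - reliability(interval)
print(f"R({interval} y) = {reliability(interval):.3f}, expected cost over the interval ~ {p_fail * repair_cost:.0f} USD")
```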

  6. A study of optimization problem for amplify-and-forward relaying over weibull fading channels

    KAUST Repository

    Ikki, Salama Said

    2010-09-01

    This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location with fixed power allocation, and joint optimization of the PA and relay location under total transmit power constraint, in order to minimize the outage probability and average error probability at high signal-to-noise ratios (SNR). Analytical results are validated by numerical simulations and comparisons between the different optimization schemes and their performance are provided. Results show that optimum PA brings only coding gain, while optimum relay location yields, in addition to the latter, diversity gains as well. Also, joint optimization improves both, the diversity gain and coding gain. Furthermore, results illustrate that the analyzed adaptive algorithms outperform uniform schemes. ©2010 IEEE.

  7. Performance of Statistical Control Charts with Bilateral Limits of Probability to Monitor Processes Weibull in Maintenance

    Directory of Open Access Journals (Sweden)

    Quintana Alicia Esther

    2015-01-01

    Full Text Available Manufacturing to optimal quality standards is underpinned by, among other essential pillars, the high reliability of equipment and systems. Maintenance Engineering is responsible for the planning, control and continuous improvement of critical equipment through approaches such as Six Sigma, which draws on numerous statistical tools, notably statistical process control charts. While their first applications were in production, other designs have emerged to adapt to new needs, such as monitoring equipment and systems in the manufacturing environment. The time between failures usually fits an exponential or Weibull model. The t chart and the adjusted t chart, with probabilistic control limits, are suitable alternatives for monitoring the mean time between failures. Unfortunately, it is difficult to find publications applying them to Weibull models, which are very useful in contexts such as maintenance. In addition, the literature limits the study of their performance to the analysis of the standard metric, the average run length, thus giving a partial view. The aim of this paper is to explore the performance of the t chart and the adjusted t chart using three metrics, two of them unconventional. To do this, it incorporates the concept of lateral variability, in its left and right variability forms. A more precise characterization of the behavior of these charts makes it possible to understand the conditions under which they are suitable: if the main objective of monitoring is to detect deterioration, the adjusted t chart is recommended. On the other hand, when the priority is to detect improvements, the t chart without adjustment is the best choice. However, the response speed of both charts is highly variable from run to run.
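
    One way to read the "bilateral limits of probability" above for a Weibull time-between-failures chart is to place the control limits at tail quantiles of the in-control distribution. The sketch below assumes that construction and uses invented in-control parameters, not the charts' exact adjustment.

```python
from scipy.stats import weibull_min

shape, scale = 1.6, 120.0           # assumed in-control Weibull model of time between failures (hours)
alpha = 0.0027                      # conventional overall false-alarm probability

lcl = weibull_min.ppf(alpha / 2, shape, scale=scale)        # lower probability limit
cl = weibull_min.ppf(0.5, shape, scale=scale)               # center line at the median
ucl = weibull_min.ppf(1 - alpha / 2, shape, scale=scale)    # upper probability limit
print(f"LCL = {lcl:.1f} h, CL = {cl:.1f} h, UCL = {ucl:.1f} h")
```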

  8. Development of a Weibull model of cleavage fracture toughness for shallow flaws in reactor pressure vessel material

    Energy Technology Data Exchange (ETDEWEB)

    Bass, B.R.; Williams, P.T.; McAfee, W.J.; Pugh, C.E. [Oak Ridge National Lab., Heavy-Section Steel Technology Program, Oak Ridge, TN (United States)

    2001-07-01

    A primary objective of the United States Nuclear Regulatory Commission (USNRC)-sponsored Heavy-Section Steel Technology (HSST) Program is to develop and validate technology applicable to quantitative assessments of fracture prevention margins in nuclear reactor pressure vessels (RPVs) containing flaws and subjected to service-induced material toughness degradation. This paper describes an experimental/analytical program for the development of a Weibull statistical model of cleavage fracture toughness for applications to shallow surface-breaking and embedded flaws in RPV materials subjected to multi-axial loading conditions. The experimental part includes both material characterization testing and larger fracture toughness experiments conducted using a special-purpose cruciform beam specimen developed by Oak Ridge National Laboratory for applying biaxial loads to shallow cracks. Test materials (pressure vessel steels) included plate product forms (conforming to ASTM A533 Grade B Class 1 specifications) and shell segments procured from a pressurized-water reactor vessel intended for a nuclear power plant. Results from tests performed on cruciform specimens demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower-transition temperature region. A local approach methodology based on a three-parameter Weibull model was developed to correlate these experimentally-observed biaxial effects on fracture toughness. The Weibull model, combined with a new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, is shown to provide a scaling mechanism between uniaxial and biaxial loading states for 2-dimensional flaws located in the A533-B plate material. The Weibull stress density was introduced as a metric for identifying regions along a semi-elliptical flaw front that have a higher probability of cleavage initiation. Cumulative

  9. Modelo de projeção por classe diamétrica para florestas nativas: enfoque na função probabilística de Weibull (Projection model by diameter class for native forests: focus on the Weibull probability function)

    Directory of Open Access Journals (Sweden)

    Rodrigo Geroni Mendes Nascimento

    2012-06-01

    Full Text Available

    In 1979 the technique of modeling diameter distributions by probabilistic functions was first applied by Hyink & Moser to forecasting growth and production of uneven-aged, heterogeneous forests. However, few studies today use this method for planning production in these forests, because the operational feasibility of the technique is not well known. This paper therefore presents a review of the characteristics that allow the modeling of growth and yield by diameter class, highlighting the importance of the dynamics of recruitment, mortality and survival, as well as the population attributes related to the modeling of the Weibull distribution, together with the specific statistics used in modeling yield by this method.

    doi: 10.4336/2012.pfb.32.70.93

  10. The distribution of first-passage times and durations in FOREX and future markets

    Science.gov (United States)

    Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico

    2009-07-01

    Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The viewpoint of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter t_max, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: the Weibull distribution with a power-law tail. This distribution bridges the gap between theoretical and empirical results more effectively than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying them to the analysis of the BTP future and calculating the average waiting
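
    For a fitted Weibull, the moments needed for renewal-theory quantities are available in closed form; for instance, the mean waiting time seen by a randomly arriving observer of a stationary renewal process is E[T^2]/(2E[T]). A tiny sketch with assumed parameters (not the fitted FOREX values):

```python
from math import gamma

shape, scale = 0.58, 45.0                    # assumed Weibull fit of intertrade durations (seconds)
m1 = scale * gamma(1 + 1 / shape)            # E[T]
m2 = scale**2 * gamma(1 + 2 / shape)         # E[T^2]
observer_wait = m2 / (2 * m1)                # mean waiting time of a random observer (renewal theory)
print(f"E[T] = {m1:.1f} s, observer mean waiting time = {observer_wait:.1f} s")
```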

  11. Area and Flux Distributions of Active Regions, Sunspot Groups, and Sunspots: A Multi-Database Study

    CERN Document Server

    Muñoz-Jaramillo, Andrés; Windmueller, John C; Amouzou, Ernest C; Longcope, Dana W; Tlatov, Andrey G; Nagovitsyn, Yury A; Pevtsov, Alexei A; Chapman, Gary A; Cookson, Angela M; Yeates, Anthony R; Watson, Fraser T; Balmaceda, Laura A; DeLuca, Edward E; Martens, Petrus C H

    2014-01-01

    In this work we take advantage of eleven different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions -- where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) $10^{21}$Mx ($10^{22}$Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behaviour of a power-law distribution (when extended into smaller fluxes), making our results compatible with the results of Parnell et al. (200...
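
    As a rough illustration of such a composite model, the sketch below evaluates a linear combination of a Weibull and a log-normal probability density in Python; the mixing weight and all shape/scale values are invented for demonstration and are not the fitted values from this study.

      # Illustrative composite flux density: w * Weibull + (1 - w) * lognormal.
      # All parameter values below are assumptions for demonstration only.
      import numpy as np
      from scipy.stats import weibull_min, lognorm

      def composite_pdf(flux, w=0.6, wb_shape=0.5, wb_scale=2e20,
                        ln_sigma=1.0, ln_scale=3e21):
          """Linear combination of a Weibull and a lognormal PDF (flux in Mx)."""
          return (w * weibull_min.pdf(flux, wb_shape, scale=wb_scale)
                  + (1.0 - w) * lognorm.pdf(flux, ln_sigma, scale=ln_scale))

      flux = np.logspace(19, 23, 5)   # 1e19 ... 1e23 Mx
      print(composite_pdf(flux))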

  12. Linear vs. piecewise Weibull model for genetic evaluation of sires for longevity in Simmental cattle

    Directory of Open Access Journals (Sweden)

    Nikola Raguž

    2014-09-01

    Full Text Available This study was focused on genetic evaluation of longevity in Croatian Simmental cattle using linear and survival models. The main objective was to create a genetic model that is most appropriate to describe the longevity data. Survival analysis, using a piecewise Weibull proportional hazards model, used all information on the length of productive life, including censored as well as uncensored observations. Linear models considered culled animals only. The relative milk production within herd had the highest impact on cows’ longevity. In the comparison of estimated genetic parameters among methods, survival analysis yielded a higher heritability value (0.075) than the linear sire (0.037) and linear animal (0.056) models. When linear models were used, the genetic trend of Simmental bulls for longevity was slightly increasing over the years, unlike the decreasing trend in the case of the survival analysis methodology. The average reliability of bulls’ breeding values was higher in the case of survival analysis. The rank correlations between survival-analysis and linear-model breeding values of bulls for longevity ranged between 0.44 and 0.46, implying large differences in the ranking of sires.

  13. Modeling AIDS survival after initiation of antiretroviral treatment by Weibull models with changepoints

    Directory of Open Access Journals (Sweden)

    Yiannoutsos Constantin T

    2009-06-01

    Full Text Available Abstract Background Mortality of HIV-infected patients initiating antiretroviral therapy in the developing world is very high immediately after the start of ART therapy and drops sharply thereafter. It is necessary to use models of survival time that reflect this change. Methods In this endeavor, parametric models with changepoints such as Weibull models can be useful in order to explicitly model the underlying failure process, even in the case where abrupt changes in the mortality rate are present. Estimation of the temporal location of possible mortality changepoints has important implications on the effective management of these patients. We briefly describe these models and apply them to the case of estimating survival among HIV-infected patients who are initiating antiretroviral therapy in a care and treatment programme in sub-Saharan Africa. Results As a first reported data-driven estimate of the existence and location of early mortality changepoints after antiretroviral therapy initiation, we show that there is an early change in risk of death at three months, followed by an intermediate risk period lasting up to 10 months after therapy. Conclusion By explicitly modelling the underlying abrupt changes in mortality risk after initiation of antiretroviral therapy we are able to estimate their number and location in a rigorous, data-driven manner. The existence of a high early risk of death after initiation of antiretroviral therapy and the determination of its duration has direct implications for the optimal management of patients initiating therapy in this setting.

  14. The impact of Weibull data and autocorrelation on the performance of the Shewhart and exponentially weighted moving average control charts

    Directory of Open Access Journals (Sweden)

    Gary Black

    2011-01-01

    Full Text Available Many real-world processes generate autocorrelated and/or Weibull data. In such cases, the independence and/or normality assumptions underlying the Shewhart and EWMA control charts are invalid. Although data transformations exist, such tools would not normally be understood or employed by naive practitioners. Thus, the question arises, “What are the effects on robustness whenever these charts are used in such applications?” Consequently, this paper examines and compares the performance of these two control charts when the problem (the model) is subjected to autocorrelated and/or Weibull data. A variety of conditions are investigated, related to the magnitudes of the process shift, the autocorrelation coefficient and the Weibull shape parameter. Results indicate that the EWMA chart outperforms the Shewhart in 62% of the cases, particularly those cases with low to moderate autocorrelation effects. The Shewhart chart outperforms the EWMA chart in 35% of the cases, particularly those cases with high autocorrelation and zero or high process shift effects.
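
    A minimal sketch of the comparison described above, assuming skewed (Weibull) observations with an upward mean shift: an individuals (Shewhart) chart and an EWMA chart are applied to the same simulated data. The shape parameter, shift size, smoothing constant and three-sigma control limits are illustrative assumptions, not the settings studied in the paper.

      # Sketch: Shewhart individuals chart vs EWMA chart on Weibull data.
      import numpy as np

      rng = np.random.default_rng(1)
      shape = 1.5
      phase1 = rng.weibull(shape, 200)          # in-control Weibull observations
      phase2 = rng.weibull(shape, 50) + 1.0     # process mean shifted upward

      mu, sigma = phase1.mean(), phase1.std(ddof=1)
      ucl_sh = mu + 3 * sigma                   # Shewhart individuals limit

      lam = 0.2                                 # EWMA smoothing constant
      ucl_ew = mu + 3 * sigma * np.sqrt(lam / (2 - lam))

      z, ewma_signal = mu, None
      for i, x in enumerate(phase2):
          z = lam * x + (1 - lam) * z           # EWMA recursion
          if z > ucl_ew and ewma_signal is None:
              ewma_signal = i

      sh = np.flatnonzero(phase2 > ucl_sh)
      print("first Shewhart signal:", sh[0] if sh.size else None)
      print("first EWMA signal:    ", ewma_signal)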

  15. Studies on Some Inventory Model for Deteriorating Items with Weibull Replinishment and Generalised Pareto Decay Having Demands as Function of on Hand Inventory

    Directory of Open Access Journals (Sweden)

    A Lakshmana Rao

    2015-02-01

    Full Text Available Inventory models play an important role in determining the optimal ordering and pricing policies. Much work has been reported in the literature regarding inventory models with finite or infinite replenishment. But in many practical situations the replenishment is governed by random factors such as procurement, transportation, environmental conditions, availability of raw material, etc. Hence, it is necessary to develop inventory models with random replenishment. In this paper, an EPQ model for deteriorating items is developed and analyzed with the assumption that the replenishment is random and follows a Weibull distribution. It is further assumed that the life time of a commodity is random and follows a generalized Pareto distribution, and that demand is a function of on-hand inventory. Using the differential equations, the instantaneous state of inventory is derived. With suitable cost considerations, the total cost function is obtained. By minimizing the total cost function, the optimal ordering policies are derived. Through numerical illustrations, a sensitivity analysis is carried out. The sensitivity analysis of the model reveals that the random replenishment has significant influence on the ordering and pricing policies of the model. This model also includes some of the earlier models as particular cases for specific values of the parameters.

  16. A New Lifetime Distribution with Bathtube and Unimodal Hazard Function

    Science.gov (United States)

    Barriga, Gladys D. C.; Louzada-Neto, Francisco; Cancho, Vicente G.

    2008-11-01

    In this paper we propose a new lifetime distribution which accommodates bathtub-shaped, unimodal, increasing and decreasing hazard functions. Some particular cases are derived, including the standard Weibull distribution. Maximum likelihood estimation is used to estimate the three parameters present in the model. The methodology is illustrated with a real data set on industrial devices under a life test.

  17. On the q-type distributions

    Science.gov (United States)

    Nadarajah, Saralees; Kotz, Samuel

    2007-04-01

    Various q-type distributions have appeared in the physics literature in recent years, see e.g. L.C. Malacarne, R.S. Mendes, E. K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65, (2002) 017106. S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, in xxx.lanl.gov-physics/0411111. U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361, (2006) 215. S. Picoli, Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688. A.M.C. de Souza, C. Tsallis, Student's t- and r- distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as or particular cases of what has been known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the type considered in the works cited above.

  18. Fitting the empirical distribution of intertrade durations

    Science.gov (United States)

    Politi, Mauro; Scalas, Enrico

    2008-03-01

    Based on the analysis of a tick-by-tick data set used in the previous work by one of the authors (DJIA stocks traded at NYSE in October 1999), in this paper, we reject the hypothesis that tails of the empirical intertrade distribution are described by a power law. We further argue that the Tsallis q-exponentials are a viable tool for fitting and describing the unconditional distribution of empirical intertrade durations and they compare well to the Weibull distribution.

  19. An EOQ model for three parameter Weibull deterioration with permissible delay in payments and associated salvage value

    Directory of Open Access Journals (Sweden)

    L. M. Pradhan

    2012-01-01

    Full Text Available This paper deals with the development of an inventory model for Weibull deteriorating items with constant demand when delay in payments is allowed to the retailer to settle the account against the purchases made. Shortages are not allowed and a salvage value is associated with the deteriorated units. In this paper, we consider two cases: payment within the permissible time, and payment after the expiry of the permissible time with interest. Numerical examples are provided to illustrate our results. A sensitivity analysis is carried out to analyze the effect on the optimal solution of changing one parameter at a time.

  20. Wind speed analysis in La Ventosa, Mexico: a bimodal probability distribution case

    Energy Technology Data Exchange (ETDEWEB)

    Jaramillo, O.A.; Borja, M.A. [Energias No Convencionales, Morelos (Mexico). Instituto de Investigaciones Electricas

    2004-08-01

    The statistical characteristics of the wind speed in La Ventosa, Oaxaca, Mexico, have been analyzed by using wind speed data recorded by Instituto de Investigaciones Electricas (IIE). By grouping the observations annually, seasonally and by wind direction, we show that the wind speed distribution, with calms included, is not represented by the typical two-parameter Weibull function. A mathematical formulation using a bimodal Weibull and Weibull probability distribution function (PDF) has been developed to analyse the wind speed frequency distribution in that region. The model developed here can be applied to similar regions where the wind speed distribution presents a bimodal PDF. The two-parameter Weibull wind speed distribution must not be generalised, since it is not accurate enough to represent some wind regimes, such as the case of La Ventosa, Mexico. The analysis of wind data shows that computing the capacity factor for wind power plants to be installed in La Ventosa must be carried out by means of a bimodal PDF instead of the typical Weibull PDF. Otherwise, the capacity factor will be underestimated. (author)
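
    A minimal sketch of fitting a bimodal Weibull and Weibull probability distribution function by maximum likelihood; the synthetic wind speeds, starting values and bounds below are assumptions standing in for the measured La Ventosa data.

      # Sketch: maximum-likelihood fit of a two-component Weibull mixture
      # (bimodal wind regime) to synthetic wind speed data.
      import numpy as np
      from scipy.stats import weibull_min
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      v = np.concatenate([4.0 * rng.weibull(2.0, 600),    # calm regime
                          11.0 * rng.weibull(3.0, 400)])  # strong-wind regime

      def nll(p):
          w, k1, c1, k2, c2 = p
          pdf = (w * weibull_min.pdf(v, k1, scale=c1)
                 + (1.0 - w) * weibull_min.pdf(v, k2, scale=c2))
          return -np.sum(np.log(pdf + 1e-300))

      bounds = [(0.01, 0.99), (0.5, 10.0), (0.1, 30.0), (0.5, 10.0), (0.1, 30.0)]
      res = minimize(nll, x0=[0.5, 2.0, 3.0, 3.0, 10.0],
                     bounds=bounds, method="L-BFGS-B")
      print("w, k1, c1, k2, c2:", np.round(res.x, 2))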

  1. Specimen type and size effects on lithium hydride tensile strength distributions

    Energy Technology Data Exchange (ETDEWEB)

    Oakes, Jr, R E

    1991-12-01

    Weibull's two-parameter statistical-distribution function is used to account for the effects of specimen size and loading differences on strength distributions of lithium hydride. Three distinctly differing uniaxial specimen types (i.e., an elliptical-transition pure tensile specimen, an internally pressurized ring tensile, and two sizes of four-point-flexure specimens) are shown to provide different strength distributions as expected, because of their differing sizes and modes of loading. After separation of strengths into volumetric- and surface-initiated failure distributions, the Weibull characteristic strength parameters for the higher-strength tests associated with internal fracture initiations are shown to vary as predicted by the effective specimen volume Weibull relationship. Lower-strength results correlate with the effective area to a much lesser degree, probably because of the limited number of surface-related failures and the different machining methods used to prepare the specimens. The strength distribution from the fourth specimen type, the predominantly equibiaxially stressed disk-flexure specimen, is well below that predicted by the two-parameter Weibull-derived effective volume or surface area relations. The two-parameter Weibull model cannot account for the increased failure probability associated with multiaxial stress fields. Derivations of effective volume and area relationships for those specimens for which none were found in the literature, the elliptical-transition tensile, the ring tensile, and the disk flexure (including the outer region), are also included.
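
    The effective-volume relationship referred to above can be written compactly: for two specimen geometries with effective volumes V1 and V2 and a common Weibull modulus m, equal failure probability implies sigma_1 / sigma_2 = (V2 / V1)^(1/m). The snippet below simply evaluates this ratio for assumed values of m and the volume ratio.

      # Weibull effective-volume size effect: for equal failure probability,
      #   sigma_1 / sigma_2 = (V_2 / V_1) ** (1 / m)
      # The modulus m and volume ratio below are illustrative assumptions.
      def strength_ratio(v1, v2, m):
          return (v2 / v1) ** (1.0 / m)

      print(f"sigma_1 / sigma_2 = {strength_ratio(v1=1.0, v2=10.0, m=8.0):.2f}")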

  2. Use of MinMaxEnt distributions defined on basis of MaxEnt method in wind power study

    Energy Technology Data Exchange (ETDEWEB)

    Shamilov, Aladdin; Kantar, Yeliz Mert; Usta, Ilhan [Department of Statistics, Anadolu University, Eskisehir 26470 (Turkey)

    2008-04-15

    Knowledge of the wind speed distribution is important information needed in the evaluation of wind power potential. Several statistical distributions have been used to study wind data. The Weibull distribution is the most popular due to its ability to fit most accurately the variety of wind speed data measured at different geographical locations throughout the world. Recently, maximum entropy (MaxEnt) distributions based on the maximum entropy method have been widely used to determine wind speed distributions. Li and Li used the MaxEnt distribution for the first time in the wind energy field and proposed a theoretical approach to determine the distribution of wind speed data analytically. Ramirez and Carta discussed the use of wind probability distributions derived from the maximum entropy principle in the analysis of wind energy. In this study, MinMaxEnt distributions defined on the basis of the MaxEnt method are introduced and applied to find wind distribution and wind power density. A comparison of the MinMaxEnt and Weibull distributions on wind speed data taken from different sources and measured in various regions is conducted. The wind power densities of the considered regions obtained from the Weibull and MinMaxEnt distributions are also compared. The results indicate that the MinMaxEnt distributions describe the wind speed distributions and wind power density better than the widely used Weibull distribution. Therefore, MinMaxEnt distributions can be used to estimate wind distributions and wind power potential. (author)
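
    For reference, the mean wind power density implied by a two-parameter Weibull wind speed distribution has the closed form P/A = 0.5 * rho * c^3 * Gamma(1 + 3/k); the sketch below evaluates it for assumed values of the shape k, scale c and air density rho, which are not taken from the study.

      # Mean wind power density of a Weibull(k, c) wind speed distribution:
      #   P/A = 0.5 * rho * c**3 * Gamma(1 + 3/k)   [W/m^2]
      # k, c and rho below are illustrative assumptions.
      from math import gamma

      def weibull_power_density(k, c, rho=1.225):
          return 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)

      print(round(weibull_power_density(k=2.0, c=7.0), 1), "W/m^2")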

  3. Approximation of the breast height diameter distribution of two-cohort stands by mixture models I Parameter estimation

    Science.gov (United States)

    Rafal Podlaski; Francis A. Roesch

    2013-01-01

    The study assessed the usefulness of various methods for choosing the initial values for the numerical procedures for estimating the parameters of mixture distributions and analysed a variety of mixture models to approximate empirical diameter at breast height (dbh) distributions. Two-component mixtures of either the Weibull distribution or the gamma distribution were...

  4. Influence of Hydrophilic Polymers on the β Factor in Weibull Equation Applied to the Release Kinetics of a Biologically Active Complex of Aesculus hippocastanum

    Directory of Open Access Journals (Sweden)

    Justyna Kobryń

    2017-01-01

    Full Text Available Escin, a triterpenoid saponin complex of biological origin, exhibits significant clinical activity in chronic venous insufficiency, skin inflammation, epidermal abrasions, allergic dermatitis, and acute impact injuries, especially in topical application. The aim of the study is the comparison of various hydrogel formulations as carriers for a horse chestnut seed extract (EH). Methylcellulose (MC), two polyacrylic acid derivatives (PA1 and PA2), and polyacrylate crosspolymer 11 (PC-11) were employed. The release rates of EH were examined and a comparison with the Weibull model equation was performed. Application of MC as the carrier in the hydrogel preparation resulted in a fast release rate of EH, whereas in the case of the hydrogel formulated with PC-11 the release was rather prolonged. The applied Weibull function fitted the experimental data best. Based on the estimated shape parameter β of the Weibull equation, the systems under study released the active compound according to Fickian diffusion.
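
    A hedged sketch of fitting the Weibull release equation, F(t) = 1 - exp(-a * t^beta), to cumulative fraction-released data so that the shape parameter beta can be inspected; the time points and release fractions below are invented placeholders, not the EH measurements.

      # Sketch: fit of the Weibull release equation F(t) = 1 - exp(-a * t**beta)
      # to cumulative fraction-released data (synthetic placeholder values).
      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_release(t, a, beta):
          return 1.0 - np.exp(-a * t**beta)

      t = np.array([0.25, 0.5, 1, 2, 4, 6, 8])                      # h (assumed)
      frac = np.array([0.18, 0.26, 0.36, 0.49, 0.64, 0.73, 0.79])   # assumed

      (a, beta), _ = curve_fit(weibull_release, t, frac, p0=[0.4, 0.6])
      print(f"a = {a:.3f}, beta = {beta:.3f}")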

  5. Temperature rise analysis of heating module for sodium-sulfur battery based on Weibull fitting

    Institute of Scientific and Technical Information of China (English)

    张建平; 韩熠; 刘宇; 朱群志

    2015-01-01

    In order to analyze the temperature rise of the heating module for a sodium-sulfur battery, a theoretical model of the heating module and a fitting model of the experimental temperature data were established on the basis of the 3D transient heat conduction equation and the Weibull function, respectively. The temperature rise process and the transient temperature distribution of the heating module for the sodium-sulfur battery were numerically simulated, and the effects of the Weibull parameters on the temperature rise curve were further investigated. The results indicate that the Weibull fitting model could accurately describe the temperature rise process of the heating module with high reliability, and that the temperature rise rate inside the whole heating module presents a nonlinearly decreasing trend with the increase of time, as well as with the distance from the module center. Furthermore, the shape and scale parameters dominate the efficiency of the sectional temperature rise and the overall one, respectively, providing a technical reference for the optimal design of heating modules for sodium-sulfur batteries and other heating devices.

  6. Statistical distribution of rainfall in Uttarakhand, India

    Science.gov (United States)

    Kumar, Vikram; Shanu; Jahangeer

    2017-07-01

    Understanding rainfall is an important issue for Uttarakhand, India, which has varied topography; extreme rainfall there causes quick runoff that threatens the structural and functional safety of large structures and other natural resources. In this study, an attempt has been made to determine the best-fit distribution of the annual series of rainfall data for the period 1991-2002 for 13 districts of Uttarakhand. Candidate distributions, namely Chi-squared, Chi-squared (2P), exponential, exponential (2P), gamma, gamma (3P), generalized extreme value (GEV), log-Pearson 3, Weibull, and Weibull (3P), were fitted. Comparisons of the candidate distributions were based on goodness-of-fit tests such as Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared. Results showed that the Weibull distribution performed best in 46% of the districts, while the second-best distributions were Chi-squared (2P) and log-Pearson. The results of this study would be useful to water resource engineers, policy makers and planners for agricultural development and the conservation of natural resources in Uttarakhand.
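
    A minimal sketch of this kind of best-fit screening: a few candidate distributions are fitted to an annual rainfall sample and ranked by their Kolmogorov-Smirnov statistic. The rainfall values are invented placeholders, not the Uttarakhand records, and only a subset of the candidate families is shown.

      # Sketch: rank candidate distributions for annual rainfall by the K-S statistic.
      import numpy as np
      from scipy import stats

      rain = np.array([980, 1120, 1340, 1510, 1275, 1430,
                       1650, 1190, 1080, 1390, 1560, 1240])   # mm, assumed

      candidates = {
          "weibull_min": stats.weibull_min,
          "gamma": stats.gamma,
          "genextreme": stats.genextreme,
          "lognorm": stats.lognorm,
      }

      results = []
      for name, dist in candidates.items():
          params = dist.fit(rain)                       # maximum-likelihood fit
          ks = stats.kstest(rain, name, args=params)    # K-S goodness of fit
          results.append((ks.statistic, name))

      for stat, name in sorted(results):
          print(f"{name:12s}  D = {stat:.3f}")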

  7. Postharvest evaluation and shelf-life estimation of fresh guava using the Weibull model

    Directory of Open Access Journals (Sweden)

    Carlos García Mogollón

    2010-07-01

    Full Text Available Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf-life due to inadequate storage and packing conditions. In this work the shelf-life of fresh guava was estimated using the probabilistic Weibull model, and fruit quality was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out over 15 days with guavas of the red regional variety. A completely randomized design with a factorial arrangement of three factors was used: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C, 85-90% relative humidity) and refrigeration (9±2 °C, 85-90% relative humidity); and two packaging types, a polystyrene tray with PVC plastic film and aluminium foil. A structured three-point degree-of-satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf-life of fresh guava based on the fit criteria and the acceptance and failure confidence limits. During the storage period it was observed that time, temperature and packaging type had a statistically significant effect (P

  8. Reliability prediction of I&C cable insulation materials by DSC and Weibull theory for probabilistic safety assessment of NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Santhosh, T.V., E-mail: santutv@barc.gov.in [Reactor Safety Division, Bhabha Atomic Research Centre (India); Gopika, V. [Reactor Safety Division, Bhabha Atomic Research Centre (India); Ghosh, A.K. [Raja Ramanna Fellow, Department of Atomic Energy (India); Fernandes, B.G. [Department of Electrical Engineering, Indian Institute of Technology Bombay (India); Dubey, K.A. [Radiation Technology Development Division, Bhabha Atomic Research Centre (India)

    2016-01-15

    Highlights: • An approach for time-dependent reliability prediction of I&C cable insulation materials for use in the PSA of NPPs has been developed based on OIT and OITp measurements and Weibull theory. • OITs were determined from the measured OITp based on fundamental thermodynamic principles, and the correlations obtained from DSC and FTIR are in good agreement with the EAB. • SEM of thermally aged and irradiated samples of insulation materials was performed to support the degradation behaviour observed from OIT and EAB measurements. • The proposed methodology has been illustrated with accelerated thermal and radiation ageing data on low-voltage cables used in NPPs for I&C applications. • The time-dependent reliability predicted from the OIT based on Weibull theory will be useful in incorporating cable ageing into the PSA of NPPs. - Abstract: Instrumentation and control (I&C) cables used in nuclear power plants (NPPs) are exposed to various deteriorative environmental effects during their operational lifetime. These factors, consisting of long-term irradiation and elevated temperature, eventually result in insulation degradation. Monitoring of the actual state of the cable insulation and the prediction of its residual service life consist of measuring properties that are directly proportional to the functionality of the cables (usually, elongation at break is used as the critical parameter). Although several condition monitoring (CM) and life estimation techniques are available, there is currently no standard methodology or approach for incorporating cable ageing effects into the probabilistic safety assessment (PSA) of NPPs. In view of this, accelerated thermal and radiation ageing of I&C cable insulation materials has been carried out and the degradation due to thermal and radiation ageing has been assessed using oxidation induction time (OIT) and oxidation induction temperature (OITp) measurements by differential scanning

  9. An eoq model for weibull deteriorating item with ramp type demand and salvage value under trade credit system

    Directory of Open Access Journals (Sweden)

    Lalit Mohan Pradhan

    2014-03-01

    Full Text Available Background: In the present competitive business scenario researchers have developed various inventory models for deteriorating items considering various practical situations for better inventory control. Permissible delay in payments with various demands and deteriorations is a relatively new concept introduced in developing such inventory models. These models are very useful for both the consumers and the manufacturer. Methods: In the present work an inventory model has been developed for a three-parameter Weibull deteriorating item with ramp type demand and salvage value under a trade credit system. Here we have considered a single item for developing the model. Results and conclusion: Optimal order quantity, optimal cycle time and total variable cost during a cycle have been derived for the proposed inventory model. The results obtained in this paper have been illustrated with the help of numerical examples and sensitivity analysis.

  10. WEIBULL MULTIPLICATIVE MODEL AND MACHINE LEARNING MODELS FOR FULL-AUTOMATIC DARK-SPOT DETECTION FROM SAR IMAGES

    Directory of Open Access Journals (Sweden)

    A. Taravat

    2013-09-01

    Full Text Available As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (which offer a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The new approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05 % and 95.20 % were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection on a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective. The proposed approach can be applied to future spaceborne SAR images.

  11. Modelling diameter distributions of two-cohort forest stands with various proportions of dominant species: a two-component mixture model approach.

    Science.gov (United States)

    Rafal Podlaski; Francis Roesch

    2014-01-01

    In recent years finite-mixture models have been employed to approximate and model empirical diameter at breast height (DBH) distributions. We used two-component mixtures of either the Weibull distribution or the gamma distribution for describing the DBH distributions of mixed-species, two-cohort forest stands, to analyse the relationships between the DBH components,...

  12. A bivariate limiting distribution of tumor latency time.

    Science.gov (United States)

    Rachev, S T; Wu, C; Yakovlev AYu

    1995-06-01

    The model of radiation carcinogenesis, proposed earlier by Klebanov, Rachev, and Yakovlev [8] substantiates the employment of limiting forms of the latent time distribution at high dose values. Such distributions arise within the random minima framework, the two-parameter Weibull distribution being a special case. This model, in its present form, does not allow for carcinogenesis at multiple sites. As shown in the present paper, a natural two-dimensional generalization of the model appears in the form of a Weibull-Marshall-Olkin distribution. Similarly, the study of a randomized version of the model based on the negative binomial minima scheme results in a bivariate Pareto-Marshall-Olkin distribution. In the latter case, an estimate for the rate of convergence to the limiting distribution is given.

  13. A meta-analysis of estimates of the AIDS incubation distribution.

    Science.gov (United States)

    Cooley, P C; Myers, L E; Hamill, D N

    1996-06-01

    Information from 12 studies is combined to estimate the AIDS incubation distribution with greater precision than is possible from a single study. The analysis uses a hierarchy of parametric models based on a four-parameter generalized F distribution. This general model contains four standard two-parameter distributions as special cases. The cases are the Weibull, gamma, log-logistic, and lognormal distributions. These four special cases subsume three distinct asymptotic hazard behaviors. As time increases beyond the median of approximately 10 years, the hazard can increase to infinity (Weibull), can plateau at some constant level (gamma), or can decrease to zero (log-logistic and lognormal). The Weibull, gamma, and log-logistic distributions, which represent the three distinct asymptotic hazard behaviors, all fit the data as well as the generalized F distribution at the 25 percent significance level. Hence, we conclude that incubation data is still too limited to ascertain the specific hazard assumption that should be utilized in studies of the AIDS epidemic. Accordingly, efforts to model the AIDS epidemic (e.g., back-calculation approaches) should allow the incubation distribution to take several forms to adequately represent HIV estimation uncertainty. It is recommended that, at a minimum, the specific Weibull, gamma and log-logistic distributions estimated in this meta-analysis should all be used in modeling the AIDS epidemic, to reflect this uncertainty.
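
    The three asymptotic hazard behaviours described above can be checked numerically: the sketch below evaluates h(t) = f(t)/S(t) at increasing times for a Weibull with shape greater than one (rising hazard), a gamma (hazard plateauing at a constant), and a lognormal and log-logistic (hazard eventually declining). The parameter values are arbitrary illustrative assumptions, not the fitted incubation-time parameters.

      # Numerical check of asymptotic hazard behaviour h(t) = pdf(t) / sf(t)
      # for Weibull, gamma, lognormal and log-logistic (scipy's "fisk").
      import numpy as np
      from scipy import stats

      dists = {
          "weibull (k=1.5)": stats.weibull_min(1.5, scale=10),
          "gamma (a=2)":     stats.gamma(2, scale=5),
          "lognormal (s=1)": stats.lognorm(1, scale=10),
          "log-logistic":    stats.fisk(2, scale=10),
      }

      t = np.array([10.0, 30.0, 100.0, 300.0])
      for name, d in dists.items():
          hazard = d.pdf(t) / d.sf(t)
          print(f"{name:16s}", np.round(hazard, 4))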

  14. Flaw strength distributions and statistical parameters for ceramic fibers: The normal distribution

    Science.gov (United States)

    R'Mili, M.; Godin, N.; Lamon, J.

    2012-05-01

    The present paper investigates large sets of ceramic fibre failure strengths (500 to 1000 data) produced using tensile tests on tows that contained either 500 or 1000 filaments. The probability density function was determined through acoustic emission monitoring which allowed detection and counting of filament fractures. The statistical distribution of filament strengths was described using the normal distribution. The Weibull equation was then fitted to this normal distribution for estimation of statistical parameters. A perfect agreement between both distributions was obtained, and a quite negligible scatter in statistical parameters was observed, as opposed to the wide variability that is reported in the literature. Thus it was concluded that flaw strengths are distributed normally and that the statistical parameters that were derived are the true ones. In a second step, the conventional method of estimation of Weibull parameters was applied to these sets of data and, then, to subsets selected randomly. The influence of other factors involved in the conventional method of determination of statistical parameters is discussed. It is demonstrated that selection of specimens, sample size, and method of construction of so-called Weibull plots are responsible for statistical parameters variability.
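
    For contrast with the approach above, the conventional estimation route is a Weibull plot: order the strengths, assign median-rank-style probabilities, and regress ln(-ln(1-F)) on ln(sigma), the slope being the Weibull modulus. The sketch below applies this to synthetic strengths standing in for the filament data; the true modulus and characteristic strength are assumptions.

      # Sketch: conventional Weibull-plot estimation of the modulus m and
      # characteristic strength sigma_0 from a strength sample (synthetic data).
      import numpy as np

      rng = np.random.default_rng(3)
      m_true, s0_true = 5.0, 2.0                      # GPa, assumed values
      strengths = s0_true * rng.weibull(m_true, 500)

      s = np.sort(strengths)
      n = s.size
      F = (np.arange(1, n + 1) - 0.5) / n             # median-rank style estimator
      x = np.log(s)
      y = np.log(-np.log(1.0 - F))

      m_hat, intercept = np.polyfit(x, y, 1)          # slope = Weibull modulus
      sigma0_hat = np.exp(-intercept / m_hat)
      print(f"m = {m_hat:.2f}, sigma_0 = {sigma0_hat:.2f} GPa")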

  15. Statistical Analysis of a Weibull Extension with Bathtub-Shaped Failure Rate Function

    Directory of Open Access Journals (Sweden)

    Ronghua Wang

    2014-01-01

    Full Text Available We consider the parameter inference for a two-parameter life distribution with bathtub-shaped or increasing failure rate function. We present the point and interval estimations for the parameter of interest based on type-II censored samples. Through intensive Monte-Carlo simulations, we assess the performance of the proposed estimation methods by a comparison of precision. Example applications are demonstrated for the efficiency of the methods.

  16. GIS supported survey of potentials of wind power. Application of the Weibull distribution for site identification in the community Schesslitz, Bavaria; GIS-gestuetzte Erhebung von Windkraftpotenzialen. Anwendung der Weibull-Verteilung zur Standortfindung in der Gemeinde Schesslitz, Bayern

    Energy Technology Data Exchange (ETDEWEB)

    Nuehlen, Jochen

    2010-07-15

    By means of the legislative framework, the use of wind power is anchored in the economic and ecological development of Germany. The development of the utilization of wind energy on inland locations requires a detailed survey of potential with results as realistic as possible. As part of the Climate Alliance Bamberg (Federal Republic of Germany) a potential analysis of all renewable energies is to be created. Under this aspect, the author of the contribution under consideration reports on a GIS-based analysis of the wind power. An exemplary demonstration of the results is done at the community Schesslitz (Bavaria, Federal Republic of Germany). Two different methods for the analysis of the potential of wind energy are presented.

  17. Approximation of the breast height diameter distribution of two-cohort stands by mixture models III Kernel density estimators vs mixture models

    Science.gov (United States)

    Rafal Podlaski; Francis A. Roesch

    2014-01-01

    Two-component mixtures of either the Weibull distribution or the gamma distribution and the kernel density estimator were used for describing the diameter at breast height (dbh) empirical distributions of two-cohort stands. The data consisted of study plots from the Świętokrzyski National Park (central Poland) and areas close to and including the North Carolina section...

  18. Approximation of the breast height diameter distribution of two-cohort stands by mixture models II Goodness-of-fit tests

    Science.gov (United States)

    Rafal Podlaski; Francis .A. Roesch

    2013-01-01

    The goals of this study are (1) to analyse the accuracy of the approximation of empirical distributions of diameter at breast height (dbh) using two-component mixtures of either the Weibull distribution or the gamma distribution in two-cohort stands, and (2) to discuss the procedure of choosing goodness-of-fit tests. The study plots were...

  19. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when X1 and X2 follow a general bivariate distribution. Such a distribution includes the bivariate compound Weibull, bivariate compound Gompertz, and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.
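
    A hedged illustration of the stress-strength quantity R = P(X1 < X2): the sketch below estimates it by Monte Carlo for two independent Weibull variables. Independence and the chosen shape/scale values are simplifying assumptions made only for demonstration; the paper itself treats dependent bivariate compound distributions.

      # Monte Carlo estimate of R = P(X1 < X2) for independent Weibull variables
      # (independence and the parameter values are illustrative assumptions).
      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000
      x1 = 8.0 * rng.weibull(2.5, n)     # "stress"
      x2 = 11.0 * rng.weibull(3.0, n)    # "strength"

      r_hat = np.mean(x1 < x2)
      se = np.sqrt(r_hat * (1 - r_hat) / n)
      print(f"R = {r_hat:.4f} +/- {1.96 * se:.4f}")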

  20. Adaptive Weibull Multiplicative Model and Multilayer Perceptron Neural Networks for Dark-Spot Detection from SAR Imagery

    Directory of Open Access Journals (Sweden)

    Alireza Taravat

    2014-12-01

    Full Text Available Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse coupled neural networks. The presented approach overcomes the non-adaptive WMM filter setting parameters by developing an adaptive WMM model, which is a step towards fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images which contained dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective, whereas the non-adaptive WMM & pulse coupled neural network (PCNN) model generates poor accuracies.

  1. Weibull modulus of hardness, bend strength, and tensile strength of Ni−Ta−Co−X metallic glass ribbons

    Energy Technology Data Exchange (ETDEWEB)

    Neilson, Henry J., E-mail: hjn2@case.edu [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States); Petersen, Alex S.; Cheung, Andrew M.; Poon, S. Joseph; Shiflet, Gary J. [University of Virginia, 395 McCormick Road, P.O. Box 400745, Charlottesville, VA 22904 (United States); Widom, Mike [Carnegie Mellon University, 5000 Forbes Avenue, Wean Hall 3325, Pittsburgh, PA 15213 (United States); Lewandowski, John J. [Case Western Reserve University, 10900 Euclid Ave, Cleveland, OH (United States)

    2015-05-14

    In this study, the variations in mechanical properties of Ni-Co-Ta-based metallic glasses have been analyzed. Three different chemistries of metallic glass ribbons were analyzed: Ni45Ta35Co20, Ni40Ta35Co20Nb5, and Ni30Ta35Co30Nb5. These alloys possess very high density (approximately 12.5 g/cm³) and very high strength (e.g. >3 GPa). Differential scanning calorimetry (DSC) and x-ray diffraction (XRD) were used to characterize the amorphicity of the ribbons. Mechanical properties were measured via a combination of Vickers hardness, bending strength, and tensile strength for each chemistry. At least 50 tests were conducted for each chemistry and each test technique in order to quantify the variability of properties using both 2- and 3-parameter Weibull statistics. The variability in properties and their source(s) were compared to that of other engineering materials, while the nature of deformation via shear bands as well as fracture surface features have been determined using scanning electron microscopy (SEM). Toughness, the role of defects, and volume effects are also discussed.

  2. Compositional Analyses and Shelf-Life Modeling of Njangsa (Ricinodendron heudelotii) Seed Oil Using the Weibull Hazard Analysis.

    Science.gov (United States)

    Abaidoo-Ayin, Harold K; Boakye, Prince G; Jones, Kerby C; Wyatt, Victor T; Besong, Samuel A; Lumor, Stephen E

    2017-08-01

    This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content of which alpha eleostearic acid (α-ESA), an unusual conjugated linoleic acid was the most prevalent (about 52%). Linoleic acid was also present in appreciable amounts (approximately 34%). Our investigations also indicated that the acid-catalyzed transesterification of NSO resulted in lower yields of α-ESA methyl esters, due to isomerization, a phenomenon which was not observed under basic conditions. The triacylglycerol (TAG) profile analysis showed the presence of at least 1 α-ESA fatty acid chain in more than 95% of the oil's TAGs. Shelf-life was determined by the Weibull Hazard Sensory Method, where the end of shelf-life was defined as the time at which 50% of panelists found the flavor of NSO to be unacceptable. This was determined as 21 wk. Our findings therefore support the potential commercial viability of NSO as an important source of physiologically beneficial PUFAs. © 2017 Institute of Food Technologists®.
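
    As a point of reference for the Weibull hazard approach used above, once a shape beta and scale eta have been fitted to the rejection times, the time at which 50% of panelists reject the product is the Weibull median, t50 = eta * (ln 2)^(1/beta). The parameter values in the snippet are assumptions chosen only so that t50 comes out near 21 weeks, purely for illustration.

      # Weibull median (time at 50% rejection): t50 = eta * (ln 2)**(1/beta).
      # beta and eta below are assumed values chosen only for illustration.
      from math import log

      def weibull_median(beta, eta):
          return eta * log(2.0) ** (1.0 / beta)

      print(f"t50 = {weibull_median(beta=2.5, eta=24.3):.1f} weeks")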

  3. Flexural and bond strengths of relined denture polymers assessed by four-point bending tests and Weibull analysis.

    Science.gov (United States)

    Polyzois, Gregory L; Lagouvardos, Panagiotis E; Frangou, Maria J

    2012-06-01

    The aim of this study was to (1) investigate the flexural strengths of three denture resins i.e. heat, photopolymerised and microwaved and how it was affected by relining with auto- and visible light-polymerised hard reliners, (2) investigate the bond strengths between denture resins and hard reliners and (3) interpret the results of both tests by utilising Weibull analysis. Specimens (65 × 10 × 2.5 mm) from denture resins, relined and bonded combinations were tested using a four-point bending test in a universal testing machine and a crosshead speed of 5 mm/min. Ten specimens for each bulk resin and denture resin-reliner combination for a total of 150 were tested. Statistical analysis indicated significant differences between bulk materials (p < 0.001) and between reliners (p < 0.001) for flexural and bond strength tests. was concluded that (1) the four-point flexural strength was different between the denture base materials, (2) flexure strength between bulk and relined or between relined with autopolymerised and photopolymerised bases was different, (3) flexural strength among relined denture bases was different and (4) bond strengths among relined denture bases were different. © 2011 The Gerodontology Society and John Wiley & Sons A/S.

  4. Two-parameter logistic and Weibull equations provide better fits to survival data from isogenic populations of Caenorhabditis elegans in axenic culture than does the Gompertz model.

    Science.gov (United States)

    Vanfleteren, J R; De Vreese, A; Braeckman, B P

    1998-11-01

    We have fitted Gompertz, Weibull, and two- and three-parameter logistic equations to survival data obtained from 77 cohorts of Caenorhabditis elegans in axenic culture. Statistical analysis showed that the fitting ability was in the order: three-parameter logistic > two-parameter logistic = Weibull > Gompertz. Pooled data were better fit by the logistic equations, which tended to perform equally well as population size increased, suggesting that the third parameter is likely to be biologically irrelevant. Considering restraints imposed by the small population sizes used, we simply conclude that the two-parameter logistic and Weibull mortality models for axenically grown C. elegans generally provided good fits to the data, whereas the Gompertz model was inappropriate in many cases. The survival curves of several short- and long-lived mutant strains could be predicted by adjusting only the logistic curve parameter that defines mean life span. We conclude that life expectancy is genetically determined; the life span-altering mutations reported in this study define a novel mean life span, but do not appear to fundamentally alter the aging process.

  5. Recurrent frequency-size distribution of characteristic events

    Directory of Open Access Journals (Sweden)

    S. G. Abaimov

    2009-04-01

    Full Text Available Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values ranging from 1.6 to 2.2 with the corresponding aperiodicities CV of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
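
    The link between the Weibull exponent β and the aperiodicity (coefficient of variation) CV quoted above follows from the Weibull moments, CV = sqrt(Gamma(1+2/β) - Gamma(1+1/β)^2) / Gamma(1+1/β); the short check below gives roughly 0.64 and 0.48 for β = 1.6 and 2.2, consistent with the reported range.

      # Coefficient of variation (aperiodicity) of a Weibull distribution
      # as a function of its shape exponent beta.
      from math import gamma, sqrt

      def weibull_cv(beta):
          g1 = gamma(1.0 + 1.0 / beta)
          g2 = gamma(1.0 + 2.0 / beta)
          return sqrt(g2 - g1 * g1) / g1

      for beta in (1.6, 2.2):
          print(f"beta = {beta}: CV = {weibull_cv(beta):.2f}")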

  6. Impact of Blending on Strength Distribution of Ambient Cured Metakaolin and Palm Oil Fuel Ash Based Geopolymer Mortar

    Directory of Open Access Journals (Sweden)

    Taliat Ola Yusuf

    2014-01-01

    Full Text Available This paper investigates the influence of blending metakaolin with silica-rich palm oil fuel ash (POFA) on the strength distribution of geopolymer mortar. The broadness of the strength distribution of quasi-brittle to brittle materials depends strongly on the existence of flaws such as voids, microcracks, and impurities in the material. Blending of materials containing alumina and silica with the objective of improving the performance of geopolymer makes comprehensive characterization necessary. The Weibull distribution is used to study the strength distribution and the reliability of geopolymer mortar specimens prepared from 100% metakaolin, and with 50% and 70% palm oil fuel ash replacement, cured under ambient conditions. Mortar prisms and cubes were used to test the materials in flexure and compression, respectively, at 28 days, and the results were analyzed using the Weibull distribution. In flexure, the Weibull modulus increased with POFA replacement, indicating reduced broadness of the strength distribution from an increased homogeneity of the material. The modulus, however, decreased with increasing replacement of POFA in the specimens tested under compression. It is concluded that the Weibull distribution is suitable for analyses of the blended geopolymer system. While the porous microstructure is mainly responsible for flexural failure, heterogeneity of reaction relics is responsible for the compression failure.

  7. INPUT MODELLING USING STATISTICAL DISTRIBUTIONS AND ARENA SOFTWARE

    Directory of Open Access Journals (Sweden)

    Elena Iuliana GINGU (BOTEANU)

    2015-05-01

    Full Text Available The paper presents a method of properly choosing the probability distributions for failure times in a flexible manufacturing system. Several well-known distributions often provide good approximations in practice. The commonly used continuous distributions are: Uniform, Triangular, Beta, Normal, Lognormal, Weibull, and Exponential. This article studies how to use the Input Analyzer in the Arena simulation language to fit probability distributions to data, or to evaluate how well a particular distribution fits. The objective was to select the most appropriate statistical distributions and to estimate the parameter values of failure times for each machine of a real manufacturing line.

  8. A mixture of exponentials distribution for a simple and precise assessment of the volcanic hazard

    Directory of Open Access Journals (Sweden)

    A. T. Mendoza-Rosas

    2009-03-01

    Full Text Available The assessment of volcanic hazard is the first step for disaster mitigation. The distribution of repose periods between eruptions provides important information about the probability of new eruptions occurring within given time intervals. The quality of the probability estimate, i.e., of the hazard assessment, depends on the capacity of the chosen statistical model to describe the actual distribution of the repose times. In this work, we use a mixture of exponentials distribution, namely the sum of exponential distributions characterized by the different eruption occurrence rates that may be recognized by inspecting the cumulative number of eruptions with time in specific VEI (Volcanic Explosivity Index) categories. The most striking property of an exponential mixture density is that the shape of the density function is flexible in a way similar to the frequently used Weibull distribution, matching long-tailed distributions and allowing clustering and time dependence of the eruption sequence, with distribution parameters that can be readily obtained from the observed occurrence rates. Thus, the mixture of exponentials turns out to be more precise and much easier to apply than the Weibull distribution. We recommend the use of a mixture-of-exponentials distribution when regimes with well-defined eruption rates can be identified in the cumulative series of events. As an example, we apply the mixture of exponential distributions to the repose-time sequences between explosive eruptions of the Colima and Popocatépetl volcanoes, México, and compare the results obtained with the Weibull and other distributions.

  9. The Probability Density Functions to Diameter Distributions for Scots Pine Oriental Beech and Mixed Stands

    Directory of Open Access Journals (Sweden)

    Aydın Kahriman

    2011-11-01

    Full Text Available Determining the diameter distribution of a stand and its relations with stand age, site index, density and mixture percentage is very important both biologically and economically. The two-parameter Weibull, three-parameter Weibull, two-parameter Gamma, three-parameter Gamma, Beta, two-parameter Lognormal, three-parameter Lognormal, Normal, and Johnson SB probability density functions were used to determine the diameter distributions. This study aimed to compare the performance of these functions in describing different diameter distributions and to identify the most successful one. The data were obtained from 162 temporary sample plots measured in Scots pine and Oriental beech mixed stands in the Black Sea Region. The results show that the four-parameter Johnson SB function is the most successful at describing the diameter distributions for both Scots pine and Oriental beech, based on error index values calculated from the difference between observed and predicted diameter distributions.

  10. Distribution of crushing strength of tablets

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2002-01-01

    as a material constant. However, the estimation of this parameter is laborious and subject to estimation problems. It is shown that the Weibull modulus is inherently connected to the coefficient of variation and that the information obtained from the modulus is unclear. The distribution of crushing strength...... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated. Copyright © 2002 Elsevier Science B.V....

  11. Can Data Recognize Its Parent Distribution?

    Energy Technology Data Exchange (ETDEWEB)

    A.W. Marshall; J.C. Meza; and I. Olkin

    1999-05-01

    This study is concerned with model selection of lifetime and survival distributions arising in engineering reliability or in the medical sciences. We compare various distributions, including the gamma, Weibull and lognormal, with a new distribution called geometric extreme exponential. Except for the lognormal distribution, the other three distributions all have the exponential distribution as a special case. A Monte Carlo simulation was performed to determine sample sizes for which survival distributions can distinguish data generated by their own families. Two decision methods are used: maximum likelihood and Kolmogorov distance. Neither method is uniformly best. The probability of correct selection with more than one alternative shows some surprising results when the choices are close to the exponential distribution.

  12. Modeling diameter distribution of the broadleaved-Korean pine mixed forest on Changbai Mountains of China

    Institute of Scientific and Technical Information of China (English)

    WANG Shunzhong; DAI Limin; LIU Guohua; YUAN Jianqiong; ZHANG Hengmin; WANG Qingli

    2006-01-01

    The broadleaved-Korean pine mixed forest is a native vegetation type in the Changbai Mountains, northeast China. Probability density functions, including the normal, negative exponential, Weibull and finite mixture distributions, were used to describe the diameter distributions of the species groups and of the entire stand. There is a strong correlation between the parameters and mean DBH, except for the shape parameters of the mixture distribution. The diameter distributions of the species groups and of the entire stand followed not a negative exponential but normal and "S"-shaped distributions. The mixture function described the diameter distributions better than the normal and Weibull functions. The location parameter affected the estimated frequency in the first diameter class when it was larger than the lower limit of that class.

  13. Application of the Weibull model to the heat inactivation of Pseudomonas aeruginosa at different temperatures

    Directory of Open Access Journals (Sweden)

    Gloria Bueno-García

    2011-01-01

    Full Text Available Under the established experimental conditions it is shown that the survival curves of Pseudomonas aeruginosa followed non-linear kinetics, with a rapid initial drop in the cell count followed by a tail caused by a decrease in the inactivation rate. The Weibull model accurately described the inactivation kinetics. The statistical parameters that best explain the observed frequency were estimated: mean, variance and skewness coefficient. For Pseudomonas aeruginosa the b value depends on the temperature, while the n value is independent of it. The Weibull distribution model was able to predict the heating time needed to inactivate eight log10 cycles and to estimate the equivalent heating time for the same proportion of surviving P. aeruginosa at other temperatures.
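
    A minimal sketch of the Weibull inactivation model used above, log10(N/N0) = -b * t^n, fitted to synthetic survival-ratio data; the heating time for an eight-log10 reduction then follows as t = (8/b)^(1/n). The data points are placeholders, not the P. aeruginosa measurements.

      # Sketch: Weibull-type inactivation model log10(N/N0) = -b * t**n,
      # fitted to synthetic survival-ratio data; then the 8-log reduction time.
      import numpy as np
      from scipy.optimize import curve_fit

      def log_survival(t, b, n):
          return -b * t**n

      t = np.array([0.5, 1, 2, 4, 6, 8, 10])                        # min (assumed)
      logS = np.array([-0.9, -1.4, -2.1, -3.2, -4.0, -4.7, -5.3])   # assumed

      (b, n), _ = curve_fit(log_survival, t, logS, p0=[1.0, 0.6])
      t_8log = (8.0 / b) ** (1.0 / n)
      print(f"b = {b:.2f}, n = {n:.2f}, time for 8 log10 cycles = {t_8log:.1f} min")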

  14. Use of the Weibull function to evaluate seedling emergence of Albizia lebbeck (L.) Benth.

    Directory of Open Access Journals (Sweden)

    Marlen Navarro

    Full Text Available To assess the vigour of Albizia lebbeck seeds by evaluating seedling emergence with the modified Weibull function, sowing was carried out under three environmental conditions and at different seed storage times. The design was completely randomized with a factorial arrangement. Analysis of variance was performed for the parameters M (maximum cumulative emergence), k (emergence rate) and Z (lag before the onset of emergence) of the modified Weibull function. From six months after the start of storage (44.1%), a sharp loss in the percentage M was observed in the nursery (A), with slight variations in the cabin (C) compared with A and B (shade structure). The dispersion range of the parameter k was 0.4-2.6, 0.29-1.9 and 0.5-1.4% emergence d-1 for the evaluations carried out in A, B and C, respectively. From the analysis of Z it was concluded that the time to onset of emergence, regardless of the sowing environment, lay between 3.0 and 7.3 days after sowing. In the full-sun nursery, in the evaluation at 6 months after the start of storage, the best results for the biological parameters of the Weibull equation were obtained, and the overall analysis indicated a high degree of vigour in A. lebbeck seeds compared with the remaining evaluations.

  15. Do Insect Populations Die at Constant Rates as They Become Older? Contrasting Demographic Failure Kinetics with Respect to Temperature According to the Weibull Model.

    Directory of Open Access Journals (Sweden)

    Petros Damos

    Full Text Available Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and to assess the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death by estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, in contrast to the extreme temperatures at which each individual drifts toward death in a linear fashion and has a constant chance of passing away. Moreover, the slope of the hazard rates shifts towards a constant initial rate, a pattern demonstrated by systems that are not wearing out (e.g. non-aging), since failure, or death, is then a random event independent of time. This finding may appear surprising because, traditionally, it was mostly thought, as a rule, that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in relation to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling-off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that was applied. Moreover, semi log- probability hazard
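
    A minimal sketch of the contrast described above, assuming illustrative parameter values: the Weibull hazard rate accelerates with age when the shape parameter exceeds 1 and is constant (age-independent) when it equals 1.

      # Illustrative sketch (not the authors' code): the Weibull hazard rate
      # h(t) = (k/lam) * (t/lam)**(k-1) accelerates with age for shape k > 1
      # and is constant (no aging) for k = 1.
      import numpy as np

      def weibull_hazard(t, shape, scale):
          """Instantaneous mortality (failure) rate of a Weibull(shape, scale) model."""
          return (shape / scale) * (t / scale) ** (shape - 1.0)

      ages = np.linspace(1, 30, 6)
      print("aging population   (k=2.5):", np.round(weibull_hazard(ages, 2.5, 20.0), 3))
      print("constant-risk case (k=1.0):", np.round(weibull_hazard(ages, 1.0, 20.0), 3))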

  16. Life Cycle Management for Components in Nuclear Power Plant Based on Weibull Process

    Institute of Scientific and Technical Information of China (English)

    王大林; 赵博

    2013-01-01

    In most nuclear power plants, equipment life cycle management, including the mean replacement time and the reliable life, is determined by fitting a distribution to historical replacement data. This method is widely used, but it ignores the influence of repairs and cumulative operating time on equipment life. To address this problem, and taking into account the operating characteristics of components in nuclear power plants, a new approach to data processing and life determination is proposed in which expert experience and failure-rate indicators are combined through a Weibull process. A worked example shows that the new approach overcomes shortcomings of the original method, such as incomplete use of the information contained in the field data and the risk of erroneous assumptions.

  17. A study of inflation effects on an EOQ model for Weibull deteriorating/ameliorating items with ramp type of demand and shortages

    Directory of Open Access Journals (Sweden)

    Valliathal M.

    2013-01-01

    Full Text Available This paper deals with the effects of inflation and time discounting on an inventory model with a general ramp-type demand rate, a time-dependent (Weibull) deterioration rate and partial backlogging of unsatisfied demand. The model is studied under a replenishment policy starting with shortages, under two different types of backlogging rates, and a comparative study of the two is also provided. We then use computer software (MATLAB) to find the optimal replenishment policies. The duration of positive inventory level is taken as the decision variable to minimize the total cost of the proposed system. Numerical examples are given to illustrate the solution procedure. Finally, the sensitivity of the optimal solution to changes in the values of different system parameters is also studied.

  18. Evaluation of the reliability of Si3N4-Al2O3-CTR2O3 ceramics through Weibull analysis

    Directory of Open Access Journals (Sweden)

    Santos Claudinei dos

    2003-01-01

    Full Text Available The objective of this work has been to compare the reliability of two Si3N4 ceramics, with Y2O3/Al2O3 or CTR2O3/Al2O3 mixtures as additives, in regard to their 4-point bending strength, and to confirm the potential of the rare earth oxide mixture, CTR2O3, produced at FAENQUIL, as an alternative, low-cost sintering additive to pure Y2O3 in the sintering of Si3N4 ceramics. The oxide mixture CTR2O3 is a solid solution formed mainly by Y2O3, Er2O3, Yb2O3 and Dy2O3 with other minor constituents and is obtained at a cost of only 20% of pure Y2O3. Samples were sintered by a gas pressure sintering process at 1900 °C under a nitrogen pressure of 1.5 MPa and an isothermal holding time of 2 h. The obtained materials were characterized by their relative density, phase composition and bending strength. Weibull analysis was used to describe the reliability of these materials. Both materials presented relative densities higher than 99.5% t.d., β-Si3N4 and Y3Al5O12 (YAG) as crystalline phases, and bending strengths higher than 650 MPa, thus demonstrating similar behaviors regarding their physical, chemical and mechanical characteristics. The statistical analysis of their strength also showed similar results for both materials, with Weibull moduli m of about 15 and characteristic stress values σ0 of about 700 MPa. These results confirmed the possibility of using the rare earth oxide mixture, CTR2O3, as a sintering additive for high performance Si3N4 ceramics, without prejudice to the mechanical properties when compared to Si3N4 ceramics sintered with pure Y2O3.
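
    The Weibull modulus quoted above is conventionally estimated from a linearised plot of the strength data; the sketch below shows that standard estimation recipe with invented strength values, not the data of this study.

      # Illustrative sketch (strengths are invented): estimate the Weibull modulus m
      # and characteristic strength sigma_0 from bending strengths via the linearised
      # form ln(-ln(1 - F)) = m*ln(sigma) - m*ln(sigma_0), with median-rank
      # plotting positions F_i = (i - 0.3)/(n + 0.4).
      import numpy as np

      strengths = np.sort(np.array([612.0, 645.0, 660.0, 678.0, 690.0,
                                    702.0, 715.0, 730.0, 748.0, 775.0]))  # MPa
      n = len(strengths)
      F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

      y = np.log(-np.log(1.0 - F))
      x = np.log(strengths)
      m, intercept = np.polyfit(x, y, 1)       # slope = Weibull modulus
      sigma_0 = np.exp(-intercept / m)         # characteristic strength (63.2 % quantile)
      print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma_0:.0f} MPa")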

  19. Long-Term Profiles of Wind and Weibull Distribution Parameters up to 600 m in a Rural Coastal and an Inland Suburban Area

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph

    2014-01-01

    ... at a flat rural coastal site in western Denmark and at an inland suburban area near Hamburg in Germany. Simulations with the weather research and forecasting numerical model were carried out in both forecast and analysis configurations. The scatter between measured and modelled wind speeds expressed by the root-mean-square error was about 10 % lower for the analysis compared to the forecast simulations. At the rural coastal site, the observed mean wind speeds above 60 m were underestimated by both the analysis and forecast model runs. For the inland suburban area, the mean wind speed is overestimated...

  20. A Study on the Effect of Nudging on Long-Term Boundary Layer Profiles of Wind and Weibull Distribution Parameters in a Rural Coastal Area

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier

    2013-01-01

    the scatter between the simulated and measured wind speeds, expressed by the root-mean-square error, by about 20% between altitudes of 100 and 500 m. The root-mean-square error was nearly constant with height for the nudged case (~2.2 m s−1) and slightly increased with height for the nonnudged one, reaching 2...

  1. Weibull Wind-Speed Distribution Parameters Derived from a Combination of Wind-Lidar and Tall-Mast Measurements Over Land, Coastal and Marine Sites

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Floors, Rogier Ralph; Peña, Alfredo

    2016-01-01

    Wind-speed observations from tall towers are used in combination with observations up to 600 m in altitude from a Doppler wind lidar to study the long-term conditions over suburban (Hamburg), rural coastal (Høvsøre) and marine (FINO3) sites. The variability in the wind field among the sites is ex...

  2. A Model Based on Environmental Factors for Diameter Distribution in Black Wattle in Brazil

    Science.gov (United States)

    Sanquetta, Carlos Roberto; Behling, Alexandre; Dalla Corte, Ana Paula; Péllico Netto, Sylvio; Rodrigues, Aurelio Lourenço; Simon, Augusto Arlindo

    2014-01-01

    This article discusses the dynamics of a diameter distribution in stands of black wattle throughout its growth cycle using the Weibull probability density function. Moreover, the parameters of this distribution were related to environmental variables from meteorological data and the surface soil horizon, with the aim of finding a model for the diameter distribution whose coefficients are related to the environmental variables. We found that the diameter distribution of the stand changes only slightly over time and that the estimators of the Weibull function are correlated with various environmental variables, with accumulated rainfall foremost among them. Thus, a model was obtained in which the estimators of the Weibull function are dependent on rainfall. Such a function can have important applications, such as in simulating growth potential in regions where historical growth data is lacking, as well as the behavior of the stand under different environmental conditions. The model can also be used to project growth in diameter, based on the rainfall affecting the forest over a certain time period. PMID:24932909

  3. A model based on environmental factors for diameter distribution in black wattle in Brazil.

    Science.gov (United States)

    Sanquetta, Carlos Roberto; Behling, Alexandre; Dalla Corte, Ana Paula; Péllico Netto, Sylvio; Rodrigues, Aurelio Lourenço; Simon, Augusto Arlindo

    2014-01-01

    This article discusses the dynamics of a diameter distribution in stands of black wattle throughout its growth cycle using the Weibull probability density function. Moreover, the parameters of this distribution were related to environmental variables from meteorological data and the surface soil horizon, with the aim of finding a model for the diameter distribution whose coefficients are related to the environmental variables. We found that the diameter distribution of the stand changes only slightly over time and that the estimators of the Weibull function are correlated with various environmental variables, with accumulated rainfall foremost among them. Thus, a model was obtained in which the estimators of the Weibull function are dependent on rainfall. Such a function can have important applications, such as in simulating growth potential in regions where historical growth data is lacking, as well as the behavior of the stand under different environmental conditions. The model can also be used to project growth in diameter, based on the rainfall affecting the forest over a certain time period.
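
    A hedged sketch of the general idea, with invented coefficients: a Weibull diameter density whose scale parameter is driven by an environmental covariate such as accumulated rainfall.

      # Hedged sketch: a Weibull diameter distribution whose scale parameter is a
      # linear function of a hypothetical accumulated-rainfall covariate, loosely
      # mimicking the parameter-prediction idea of the study (coefficients invented).
      import numpy as np
      from scipy import stats

      def diameter_pdf(d, rainfall_mm, shape=3.0, a=4.0, b=0.004):
          """Weibull pdf of diameters; scale = a + b * rainfall (illustrative only)."""
          scale = a + b * rainfall_mm
          return stats.weibull_min.pdf(d, shape, loc=0, scale=scale)

      diameters = np.array([5.0, 10.0, 15.0, 20.0])   # cm, assumed values
      print(np.round(diameter_pdf(diameters, rainfall_mm=1200.0), 4))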

  4. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    a stochastic variable is modelled by an asymmetrical density function. For lognormal, Gumbel and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical application the behaviour...... is probably rather infrequent. A simple example is shown as illustration and to exemplify that for second-order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent....

  5. Postharvest evaluation and shelf-life estimation of fresh guava using the Weibull model

    Directory of Open Access Journals (Sweden)

    García Mogollón Carlos

    2010-09-01

    Full Text Available

    Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life as a result of inadequate storage and handling conditions. In this work the shelf life of fresh guava was estimated using the probabilistic Weibull model, and fruit quality was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out over 15 days with guavas of the regional red variety. A completely randomized design with a factorial arrangement of three factors was used: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and relative humidity (RH) between 85 and 90%) and refrigerated (9±2 °C and RH of 85-90%); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. A three-point structured satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava based on the fitting criteria and the acceptance and failure confidence limits. During the storage period, the factors time, temperature and type of packaging had a statistically significant effect (P < 0.05) on the equivalent diameter, sphericity, apparent specific mass, total soluble solids, pH, acidity and sensory evaluation of the fruit. The product can be consumed as fresh fruit for up to ten days of storage at ambient temperature and a maximum of fifteen days in refrigerated storage.

  6. Postharvest evaluation and shelf-life estimation of fresh guava using the Weibull model

    Directory of Open Access Journals (Sweden)

    Carlos García Mogollón

    2010-07-01

    Full Text Available Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life as a result of inadequate storage and handling conditions. In this work the shelf life of fresh guava was estimated using the probabilistic Weibull model, and fruit quality was evaluated during storage under different temperature and packaging conditions. The postharvest evaluation was carried out over 15 days with guavas of the regional red variety. A completely randomized design with a factorial arrangement of three factors was used: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and relative humidity (RH) between 85 and 90%) and refrigerated (9±2 °C and RH of 85-90%); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. A three-point structured satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava based on the fitting criteria and the acceptance and failure confidence limits. During the storage period, the factors time, temperature and type of packaging had a statistically significant effect (P < 0.05) on the equivalent diameter, sphericity, apparent specific mass, total soluble solids, pH, acidity and sensory evaluation of the fruit. The product can be consumed as fresh fruit for up to ten days of storage at ambient temperature and a maximum of fifteen days in refrigerated storage.

  7. Polynomial probability distribution estimation using the method of moments.

    Science.gov (United States)

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
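
    A minimal sketch of the moment-matching idea under simplifying assumptions (a chosen support [a, b], a fixed polynomial degree and a simulated sample): the polynomial coefficients are obtained by solving the linear system that equates the model's raw moments to the sample moments.

      # Hedged sketch of the method-of-moments idea: approximate an unknown pdf on
      # [a, b] by sum_i c_i * x**i whose first N raw moments equal the sample moments.
      import numpy as np

      def polynomial_pdf_coeffs(sample, degree, a, b):
          moments = [np.mean(sample ** k) for k in range(degree + 1)]   # m_0 = 1, m_1, ...
          # A[k, i] = integral_a^b x**(k+i) dx, so that A @ c reproduces the moments.
          A = np.array([[(b ** (k + i + 1) - a ** (k + i + 1)) / (k + i + 1)
                         for i in range(degree + 1)] for k in range(degree + 1)])
          return np.linalg.solve(A, moments)

      rng = np.random.default_rng(1)
      sample = rng.weibull(2.0, 5000)                 # assumed sample for illustration
      coeffs = polynomial_pdf_coeffs(sample, degree=4, a=0.0, b=sample.max())
      print(np.round(coeffs, 4))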

  8. Handbook of exponential and related distributions for engineers and scientists

    CERN Document Server

    Pal, Nabendu; Lim, Wooi K

    2005-01-01

    The normal distribution is widely known and used by scientists and engineers. However, there are many cases when the normal distribution is not appropriate, due to the data being skewed. Rather than leaving you to search through journal articles, advanced theoretical monographs, or introductory texts for alternative distributions, the Handbook of Exponential and Related Distributions for Engineers and Scientists provides a concise, carefully selected presentation of the properties and principles of selected distributions that are most useful for application in the sciences and engineering.The book begins with all the basic mathematical and statistical background necessary to select the correct distribution to model real-world data sets. This includes inference, decision theory, and computational aspects including the popular Bootstrap method. The authors then examine four skewed distributions in detail: exponential, gamma, Weibull, and extreme value. For each one, they discuss general properties and applicabi...

  9. A New Insight into the Earthquake Recurrence Studies from the Three-parameter Generalized Exponential Distributions

    Science.gov (United States)

    Pasari, S.; Kundu, D.; Dikshit, O.

    2012-12-01

    Earthquake recurrence interval is one of the important ingredients towards probabilistic seismic hazard assessment (PSHA) for any location. Exponential, gamma, Weibull and lognormal distributions are well-established probability models in this recurrence interval estimation. However, they have certain shortcomings too. Thus, it is imperative to search for alternative, more sophisticated distributions. In this paper, we introduce a three-parameter (location, scale and shape) exponentiated exponential distribution and investigate the scope of this distribution as an alternative to the afore-mentioned distributions in earthquake recurrence studies. This distribution is a particular member of the exponentiated Weibull family. Despite its complicated form, it is widely accepted in medical and biological applications. Furthermore, it shares many physical properties with the gamma and Weibull families. Unlike the gamma distribution, the hazard function of the generalized exponential distribution can be easily computed even if the shape parameter is not an integer. To contemplate the plausibility of this model, a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20-32 deg N and 87-100 deg E) has been used. The model parameters are estimated using the maximum likelihood estimator (MLE) and the method of moments estimator (MOME). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e. 2012). Moreover, this study shows that the generalized exponential distribution fits the above data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
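
    An illustrative sketch of the three-parameter exponentiated (generalized) exponential model and the conditional probability of an event within a given window; the parameter values are placeholders, not the estimates reported in the study.

      # Hedged sketch: CDF F(x) = (1 - exp(-(x - mu)/sigma))**alpha for x > mu, and the
      # conditional probability of an event within `window` years given `elapsed` years
      # since the last event.  alpha, mu, sigma below are assumed placeholder values.
      import numpy as np

      def gexp_cdf(x, alpha, mu, sigma):
          z = np.clip((np.asarray(x, dtype=float) - mu) / sigma, 0.0, None)
          return (1.0 - np.exp(-z)) ** alpha

      def conditional_prob(elapsed, window, alpha, mu, sigma):
          F_t = gexp_cdf(elapsed, alpha, mu, sigma)
          F_tw = gexp_cdf(elapsed + window, alpha, mu, sigma)
          return (F_tw - F_t) / (1.0 - F_t)

      print(conditional_prob(elapsed=17.0, window=10.0, alpha=1.8, mu=0.0, sigma=8.0))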

  10. Some properties of generalized gamma distribution

    Directory of Open Access Journals (Sweden)

    Morteza Khodabin

    2010-03-01

    Full Text Available In this paper, the generalized gamma (GG) distribution, a flexible distribution in the statistical literature that has the exponential, gamma and Weibull distributions as subfamilies and the lognormal as a limiting distribution, is introduced. The power and logarithmic moments of this family are defined. A new moment estimation method for the parameters of the GG family, based on its characterization, is presented; this method is compared with the MLE method in the gamma subfamily for small and large sample sizes. We also study the entropy representation of the GG family and its estimation. In addition, Kullback-Leibler discrimination and the Akaike and Bayesian information criteria are discussed. In brief, this paper presents a general review of important properties of the GG family.

  11. Analysis of the strength of synthetic marble beams using the Weibull statistical distribution

    OpenAIRE

    Rabahi, Ricardo Fouad

    2011-01-01

    In the present work, Weibull statistical analysis was used as a tool to evaluate the flexural strength of both pure synthetic marble and glass-fibre-reinforced synthetic marble. One of the objectives of the work is to observe the behaviour of the Weibull modulus and of the mechanical strength as chopped glass fibre is introduced into the composition of the synthetic marble. Its influence on the mechanical strength is investigated, taking into consideration the gains...

  12. SMALL-SCALE AND GLOBAL DYNAMOS AND THE AREA AND FLUX DISTRIBUTIONS OF ACTIVE REGIONS, SUNSPOT GROUPS, AND SUNSPOTS: A MULTI-DATABASE STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Senkpeil, Ryan R. [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Tlatov, Andrey G. [Kislovodsk Mountain Astronomical Station of the Pulkovo Observatory, Kislovodsk 357700 (Russian Federation); Nagovitsyn, Yury A. [Pulkovo Astronomical Observatory, Russian Academy of Sciences, St. Petersburg 196140 (Russian Federation); Pevtsov, Alexei A. [National Solar Observatory, Sunspot, NM 88349 (United States); Chapman, Gary A.; Cookson, Angela M. [San Fernando Observatory, Department of Physics and Astronomy, California State University Northridge, Northridge, CA 91330 (United States); Yeates, Anthony R. [Department of Mathematical Sciences, Durham University, South Road, Durham DH1 3LE (United Kingdom); Watson, Fraser T. [National Solar Observatory, Tucson, AZ 85719 (United States); Balmaceda, Laura A. [Institute for Astronomical, Terrestrial and Space Sciences (ICATE-CONICET), San Juan (Argentina); DeLuca, Edward E. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States); Martens, Petrus C. H., E-mail: munoz@solar.physics.montana.edu [Department of Physics and Astronomy, Georgia State University, Atlanta, GA 30303 (United States)

    2015-02-10

    In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up by a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10^21 Mx (10^22 Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection)

  13. Evaluations of the apparent activation energy distribution function for the non-isothermal reduction of nickel oxide nano-powders

    Directory of Open Access Journals (Sweden)

    Bojan Ž. Janković

    2014-02-01

    Full Text Available The differential conversion curves of the non-isothermal reduction process of NiO by hydrogen are fitted by the Weibull (non-isothermal) probability density function (Wpdf) over a wide range of the degree of conversion (α = 0.06 - 0.96). It was established that the Weibull distribution parameters (β and η) show different dependences on the heating rate of the system (vh): the shape parameter (β) varies linearly and the scale parameter (η) exponentially. Model-independent values of the apparent activation energy were calculated using Friedman's isoconversional method. It was found that the calculated apparent activation energy depends on the degree of conversion, α, and shows a constant value in the range 0.20 ≤ α ≤ 0.60 (Ea = 90.8 kJ mol-1). Knowing the Weibull distribution function of reaction times, it is possible to determine the density distribution function of apparent activation energies at different heating rates. It was established that changes in the symmetry of the density distribution functions may indicate deviations from the simple crystallization kinetics expressed by the Johnson-Mehl-Avrami (JMA) model, and this behaviour is probably due to a more complex transformation process, such as one described by the two-parameter autocatalytic Šesták-Berggren model.

  14. Probability density functions for description of diameter distribution in thinned stands of Tectona grandis

    Directory of Open Access Journals (Sweden)

    Julianne de Castro Oliveira

    2012-06-01

    Full Text Available The objective of this study was to evaluate the effectiveness of the fatigue life, Frechet, Gamma, Generalized Gamma, Generalized Logistic, Log-logistic, Nakagami, Beta, Burr, Dagum, Weibull and Hyperbolic distributions in describing the diameter distribution in teak stands subjected to thinning at different ages. Data used in this study originated from 238 rectangular permanent plots 490 m2 in size, installed in stands of Tectona grandis L. f. in Mato Grosso state, Brazil. The plots were measured at ages 34, 43, 55, 68, 81, 82, 92, 104, 105, 120, 134 and 145 months on average. Thinning was done on two occasions: the first was systematic at age 81 months, with a basal area intensity of 36%, while the second was selective at age 104 months on average and removed poorer trees, reducing basal area by 30%. Fittings were assessed by the Kolmogorov-Smirnov goodness-of-fit test. The Log-logistic (3P), Burr (3P), Hyperbolic (3P), Burr (4P), Weibull (3P), Hyperbolic (2P), Fatigue Life (3P) and Nakagami functions provided more satisfactory values for the K-S test than the more commonly used Weibull function.

  15. The New Kumaraswamy Kumaraswamy Family of Generalized Distributions with Application

    Directory of Open Access Journals (Sweden)

    Mohamed Ali Ahmed

    2015-08-01

    Full Text Available Finding the best-fitting distribution for a data set is an important practical problem, so it is useful to introduce new families of distributions that fit more cases, or fit better, than existing ones. In this paper, a new generating family of generalized distributions, the Kumaraswamy-Kumaraswamy (KW-KW) family, is presented. Four important common families of distributions are illustrated as special cases of the KW-KW family. Moments, probability weighted moments, the moment generating function, the quantile function, the median, the mean deviation, order statistics and moments of order statistics are obtained. Parameter estimates and the variance-covariance matrix are computed using the maximum likelihood method. A real data set is used to illustrate the potential of the KW-KW Weibull distribution (derived from the KW-KW family) compared with other distributions.

  16. The stochastic distribution of available coefficient of friction on quarry tiles for human locomotion.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    The available coefficient of friction (ACOF) for human locomotion is the maximum coefficient of friction that can be supported without a slip at the shoe and floor interface. A statistical model was introduced to estimate the probability of slip by comparing the ACOF with the required coefficient of friction, assuming that both coefficients have stochastic distributions. This paper presents an investigation of the stochastic distributions of the ACOF of quarry tiles under dry, water and glycerol conditions. One hundred friction measurements were performed on a walkway under the surface conditions of dry, water and 45% glycerol concentration. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF appears to fit the normal and log-normal distributions better than the Weibull distribution for the water and glycerol conditions. However, no match was found between the distribution of ACOF under the dry condition and any of the three continuous distributions evaluated. Based on limited data, a normal distribution might be more appropriate due to its simplicity, practicality and familiarity among the three distributions evaluated.
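
    A hedged sketch of the underlying slip-probability idea: if both the available and the required coefficient of friction are treated as random variables, the probability of a slip is P(ACOF < RCOF), estimated here by simulation with assumed log-normal parameters.

      # Hedged sketch (all parameters assumed, not the paper's measurements):
      # estimate P(slip) = P(ACOF < RCOF) by Monte Carlo simulation.
      import numpy as np

      rng = np.random.default_rng(7)
      acof = rng.lognormal(mean=np.log(0.45), sigma=0.15, size=100_000)   # available COF
      rcof = rng.lognormal(mean=np.log(0.25), sigma=0.20, size=100_000)   # required COF
      print("estimated slip probability:", np.mean(acof < rcof))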

  17. Nonlinear Fitting Method of Long-Term Distributions for Statistical Analysis of Extreme Negative Surge Elevations

    Institute of Scientific and Technical Information of China (English)

    DONG Sheng; LI Fengli; JIAO Guiying

    2003-01-01

    Hydrologic frequency analysis plays an important role in coastal and ocean engineering for structural design and disaster prevention in coastal areas. This paper proposes a Nonlinear Least Squares Method (NLSM), which estimates the three unknown parameters of the Weibull distribution simultaneously by an iteration method. Statistical test shows that the NLSM fits each data sample well. The effects of different parameter-fitting methods, distribution models, and threshold values are also discussed in the statistical analysis of storm set-down elevation. The best-fitting probability distribution is given and the corresponding return values are estimated for engineering design.
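
    An illustrative sketch of a nonlinear least-squares fit of the three-parameter Weibull CDF to empirical non-exceedance probabilities; the sample values and plotting positions are assumptions, and this is not the paper's NLSM implementation.

      # Hedged sketch: fit F(x) = 1 - exp(-((x - loc)/scale)**shape) to empirical
      # plotting-position probabilities by nonlinear least squares.
      import numpy as np
      from scipy.optimize import curve_fit

      def weibull3_cdf(x, shape, loc, scale):
          return 1.0 - np.exp(-np.clip((x - loc) / scale, 0.0, None) ** shape)

      extremes = np.sort(np.array([1.2, 1.5, 1.7, 2.0, 2.3, 2.6, 2.8, 3.1, 3.5, 4.0]))  # assumed data
      n = len(extremes)
      prob = (np.arange(1, n + 1) - 0.3) / (n + 0.4)      # median plotting positions
      params, _ = curve_fit(weibull3_cdf, extremes, prob, p0=[1.5, 1.0, 1.0])
      print("shape, loc, scale =", np.round(params, 3))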

  18. Optimal power flow for distribution networks with distributed generation

    Directory of Open Access Journals (Sweden)

    Radosavljević Jordan

    2015-01-01

    Full Text Available This paper presents a genetic algorithm (GA) based approach for the solution of the optimal power flow (OPF) in distribution networks with distributed generation (DG) units, including fuel cells, micro turbines, diesel generators, photovoltaic systems and wind turbines. The OPF is formulated as a nonlinear multi-objective optimization problem with equality and inequality constraints. Due to the stochastic nature of the energy produced from renewable sources, i.e. wind turbines and photovoltaic systems, as well as load uncertainties, a probabilistic algorithm is introduced in the OPF analysis. The Weibull and normal distributions are employed to model the input random variables, namely the wind speed, solar irradiance and load power. The 2m+1 point estimate method and the Gram-Charlier expansion theory are used to obtain the statistical moments and the probability density functions (PDFs) of the OPF results. The proposed approach is examined and tested on a modified IEEE 34 node test feeder with five different integrated DG units. The obtained results prove the efficiency of the proposed approach to solve both deterministic and probabilistic OPF problems for different forms of the multi-objective function. As such, it can serve as a useful decision-making support tool for distribution network operators. [Project of the Ministry of Science of the Republic of Serbia, no. TR33046]

  19. Stand diameter distribution modelling and prediction based on Richards function.

    Directory of Open Access Journals (Sweden)

    Ai-guo Duan

    Full Text Available The objective of this study was to introduce application of the Richards equation on modelling and prediction of stand diameter distribution. The long-term repeated measurement data sets, consisted of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in the southern China, were used. Also, 150 stands were used as fitting data, the other 159 stands were used for testing. Nonlinear regression method (NRM) or maximum likelihood estimates method (MLEM) were applied to estimate the parameters of models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) R distribution presented a more accurate simulation than three-parametric Weibull function; (2) the parameters p, q and r of R distribution proved to be its scale, location and shape parameters, and have a deep relationship with stand characteristics, which means the parameters of R distribution have good theoretical interpretation; (3) the ordinate of inflection point of R distribution has significant relativity with its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed diameter distributions of unknown stands can be well estimated by applying R distribution based on PRM or the combination of PPM and PRM under the condition that only quadratic mean DBH or plus stand age are known, and the non-rejection rates were near 80%, which are higher than the 72.33% non-rejection rate of three-parametric Weibull function based on the combination of PPM and PRM.

  20. Stand diameter distribution modelling and prediction based on Richards function.

    Science.gov (United States)

    Duan, Ai-guo; Zhang, Jian-guo; Zhang, Xiong-qing; He, Cai-yun

    2013-01-01

    The objective of this study was to introduce application of the Richards equation on modelling and prediction of stand diameter distribution. The long-term repeated measurement data sets, consisted of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in the southern China, were used. Also, 150 stands were used as fitting data, the other 159 stands were used for testing. Nonlinear regression method (NRM) or maximum likelihood estimates method (MLEM) were applied to estimate the parameters of models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) R distribution presented a more accurate simulation than three-parametric Weibull function; (2) the parameters p, q and r of R distribution proved to be its scale, location and shape parameters, and have a deep relationship with stand characteristics, which means the parameters of R distribution have good theoretical interpretation; (3) the ordinate of inflection point of R distribution has significant relativity with its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed diameter distributions of unknown stands can be well estimated by applying R distribution based on PRM or the combination of PPM and PRM under the condition that only quadratic mean DBH or plus stand age are known, and the non-rejection rates were near 80%, which are higher than the 72.33% non-rejection rate of three-parametric Weibull function based on the combination of PPM and PRM.

  1. Raindrop size distribution: Fitting performance of common theoretical models

    Science.gov (United States)

    Adirosi, E.; Volpi, E.; Lombardo, F.; Baldini, L.

    2016-10-01

    Modelling raindrop size distribution (DSD) is a fundamental issue to connect remote sensing observations with reliable precipitation products for hydrological applications. To date, various standard probability distributions have been proposed to build DSD models. Relevant questions to ask indeed are how often and how well such models fit empirical data, given that the advances in both data availability and technology used to estimate DSDs have allowed many of the deficiencies of early analyses to be mitigated. Therefore, we present a comprehensive follow-up of a previous study on the comparison of statistical fitting of three common DSD models against 2D-Video Distrometer (2DVD) data, which are unique in that the size of individual drops is determined accurately. Using the maximum likelihood method, we fit models based on lognormal, gamma and Weibull distributions to more than 42,000 1-minute drop-by-drop records taken from the field campaigns of the NASA Ground Validation program of the Global Precipitation Measurement (GPM) mission. In order to check the adequacy of the models for the measured data, we investigate the goodness of fit of each distribution using the Kolmogorov-Smirnov test. Then, we apply a specific model selection technique to evaluate the relative quality of each model. Results show that the gamma distribution has the lowest KS rejection rate, while the Weibull distribution is the most frequently rejected. Ranking for each minute the statistical models that pass the KS test, it can be argued that the probability distributions whose tails are exponentially bounded, i.e. light-tailed distributions, seem to be adequate to model the natural variability of DSDs. However, in line with our previous study, we also found that frequency distributions of empirical DSDs could be heavy-tailed in a number of cases, which may result in severe uncertainty in estimating statistical moments and bulk variables.

  2. Spatial Patterns of Wind Speed Distributions in Switzerland

    CERN Document Server

    Laib, Mohamed

    2016-01-01

    This paper presents an initial exploration of high frequency records of extreme wind speed in two steps. The first consists in finding the most suitable extreme-value distribution for 120 measuring stations in Switzerland, by comparing three known distributions: Weibull, Gamma, and Generalized extreme value. This comparison serves as a basis for the second step, which applies spatial modelling using an Extreme Learning Machine. The aim is to model the distribution parameters by employing a high-dimensional input space of topographical information. Knowledge of the probability distribution gives comprehensive information and a global overview of the wind phenomenon. Through this study, a flexible and simple modelling approach is presented, which can be generalized to almost any extreme environmental data for risk assessment and for modelling renewable energy.

  3. Computer routines for probability distributions, random numbers, and related functions

    Science.gov (United States)

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  4. Background Noise Distribution before and after High-Resolution Processing in Ship-borne Radar

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhong

    2005-01-01

    When a high-resolution algorithm is applied in ship-borne radar, the algorithm's nonlinearity and the distributional characteristics of the background before high-resolution processing determine the distributional characteristics of the background clutter after processing, and hence the subsequent detector design. Because the background noise before high-resolution processing has physical significance, a statistical model of the first-order Bragg lines and the second-order components of sea clutter is put forward. Statistical verification of actually measured data using higher-order cumulants then leads to the conclusion that the background noise before high-resolution processing in ship-borne radar conforms to a normal distribution. The nonlinearity of the high-resolution algorithm determines that the background noise after high-resolution processing conforms to a non-normal distribution. Non-normally distributed clutter mainly includes Weibull, lognormal and K clutter; Rayleigh clutter can be seen as a special case of Weibull clutter. These clutter types have different statistical characteristics and can be discriminated by clutter-characteristic recognition. The distribution after high-resolution processing is determined by an improved minimum-entropy clutter-characteristic recognition method based on the AIC rule, namely a two-parameter domain scanning method, which has a higher recognition rate. It is verified that the background noise after high-resolution processing by pre-whitened MUSIC conforms to a lognormal distribution.

  5. A multiplicative process for generating the rank-order distribution of UK election results

    CERN Document Server

    Fenner, Trevor; Loizou, George

    2016-01-01

    Human dynamics and sociophysics suggest statistical models that may explain and provide us with a better understanding of social phenomena. Here we propose a generative multiplicative decrease model that gives rise to a rank-order distribution and allows us to analyse the results of the last three UK parliamentary elections. We provide empirical evidence that the additive Weibull distribution, which can be generated from our model, is a close fit to the electoral data, offering a novel interpretation of the recent election results.

  6. Modeling of speed distribution for mixed bicycle traffic flow

    Directory of Open Access Journals (Sweden)

    Cheng Xu

    2015-11-01

    Full Text Available Speed is a fundamental measure of traffic performance for highway systems. Many results exist for the speed characteristics of motorized vehicles. In this article, we study the speed distribution of mixed bicycle traffic, which has largely been ignored in the past. Field speed data were collected in Hangzhou, China, at different survey sites, traffic conditions, and percentages of electric bicycles. The statistics of the field data show that the overall mean speed of electric bicycles is 17.09 km/h, 3.63 km/h faster and 27.0% higher than that of regular bicycles. Normal, log-normal, gamma, and Weibull distribution models were used for testing the speed data. The results of the goodness-of-fit hypothesis tests imply that the log-normal and Weibull models fit the field data very well. Then, relationships between mean speed and the proportion of electric bicycles were proposed using linear regression models, and the mean speed for purely electric bicycles or purely regular bicycles can be obtained. The findings of this article will provide effective help for the safety and traffic management of mixed bicycle traffic.

  7. Generalized Extreme Value Distribution Models for the Assessment of Seasonal Wind Energy Potential of Debuncha, Cameroon

    Directory of Open Access Journals (Sweden)

    Nkongho Ayuketang Arreyndip

    2016-01-01

    Full Text Available The method of the generalized extreme value family of distributions (Weibull, Gumbel, and Frechet) is employed for the first time to assess the wind energy potential of Debuncha, South-West Cameroon, and to study the variation of energy over the seasons at this site. The 29-year (1983-2013) average daily wind speed data over Debuncha (excluding the years 1992 and 1994 owing to missing values) is obtained from NASA satellite data through the RETScreen software tool provided by CANMET Canada. The data is partitioned into min-monthly, mean-monthly, and max-monthly data and fitted, using the maximum likelihood method, to the two-parameter Weibull, Gumbel, and Frechet distributions for the purpose of determining the best fit to be used for assessing the wind energy potential at this site. The respective shape and scale parameters are estimated. By making use of the P values of the Kolmogorov-Smirnov statistic (K-S) and standard error (s.e.) analysis, the results show that the Frechet distribution best fits the min-monthly, mean-monthly, and max-monthly data compared to the Weibull and Gumbel distributions. Wind speed distributions and wind power densities of both the wet and dry seasons are compared. The results show that the wind power density in the wet season is higher than in the dry season. The wind speeds at this site seem quite low; maximum wind speeds are between 3.1 and 4.2 m/s, which is below the cut-in wind speed of many modern turbines (6-10 m/s). However, we recommend the installation of low cut-in wind turbines like the Savonius or Aircon (10 kW) for stand-alone low-energy needs.
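
    For a two-parameter Weibull wind-speed fit, the mean wind power density has a well-known closed form; since the study ultimately prefers the Frechet fit, the sketch below is only the conventional Weibull benchmark with assumed shape, scale and air density values.

      # Illustrative sketch: mean wind power density 0.5 * rho * c**3 * Gamma(1 + 3/k)
      # for a two-parameter Weibull fit (k = shape, c = scale in m/s); values assumed.
      from math import gamma

      def weibull_power_density(k, c, rho=1.225):
          """Mean wind power density in W/m^2 for Weibull shape k and scale c."""
          return 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)

      print(weibull_power_density(k=2.0, c=3.8))   # a low-wind site: a few tens of W/m^2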

  8. Estimation method for random sonic fatigue life of thin-walled structure of a combustor liner based on stress probability distribution

    Institute of Scientific and Technical Information of China (English)

    SHA Yun-dong; GUO Xiao-peng; LIAO Lian-fang; XIE Li-juan

    2011-01-01

    As to the sonic fatigue problem of an aero-engine combustor liner structure under random acoustic loadings, an effective method for predicting the fatigue life of a structure under random loadings was studied. First, the probability distribution of the Von Mises stress of a thin-walled structure under random loadings was studied; the analysis suggests that the probability density function of the Von Mises stress process approximately follows a two-parameter Weibull distribution. Formulas for calculating the Weibull parameters are given. Based on Miner's linear damage theory, a method for predicting the random sonic fatigue life from the stress probability density was developed, and a model for fatigue life prediction was constructed. As an example, an aero-engine combustor liner structure was considered. The power spectral density (PSD) of the vibrational stress response was calculated using a coupled FEM/BEM (finite element method/boundary element method) model, and the fatigue life was estimated using the constructed model. Considering the influence of the wide frequency band, the calculated results were then modified. Comparative analysis shows that the sonic fatigue life estimates for the combustor liner structure obtained with the Weibull distribution of Von Mises stress are somewhat more conservative than those obtained with the Dirlik distribution. The results show that the methods presented in this paper are practical for the random fatigue life analysis of aeronautical thin-walled structures.
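
    A heavily hedged sketch of the Miner-rule calculation when the stress-amplitude peaks follow a two-parameter Weibull distribution and the S-N curve is of Basquin form N(s) = A*s^(-m); every number below is a placeholder, and this is not the coupled FEM/BEM model of the paper.

      # Hedged sketch: with a Basquin S-N curve N(s) = A * s**(-m), the expected
      # Miner damage per stress cycle is E[s**m]/A = scale**m * Gamma(1 + m/shape) / A
      # when stress amplitudes are Weibull(shape, scale).  All inputs are placeholders.
      from math import gamma

      def sonic_fatigue_life_seconds(nu0, shape, scale, m, A):
          """nu0: peak rate [1/s]; shape, scale: Weibull stress parameters [MPa]."""
          damage_per_cycle = scale ** m * gamma(1.0 + m / shape) / A
          return 1.0 / (nu0 * damage_per_cycle)

      print(sonic_fatigue_life_seconds(nu0=350.0, shape=2.0, scale=40.0, m=4.0, A=1.0e15))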

  9. Distribution of Earthquake Interevent Times in Northeast India and Adjoining Regions

    Science.gov (United States)

    Pasari, Sumanta; Dikshit, Onkar

    2015-10-01

    This study analyzes earthquake interoccurrence times of northeast India and its vicinity using eleven probability distributions, namely the exponential, Frechet, gamma, generalized exponential, inverse Gaussian, Levy, lognormal, Maxwell, Pareto, Rayleigh, and Weibull distributions. Parameters of these distributions are estimated by the method of maximum likelihood estimation, and their respective asymptotic variances as well as confidence bounds are calculated using Fisher information matrices. Three model selection criteria, namely the Chi-square criterion, the maximum likelihood criterion, and the Kolmogorov-Smirnov minimum distance criterion, are used to compare model suitability for the present earthquake catalog (Yadav et al. in Pure Appl Geophys 167:1331-1342, 2010). It is observed that the gamma, generalized exponential, and Weibull distributions provide the best fit, the exponential, Frechet, inverse Gaussian, and lognormal distributions provide intermediate fits, and the rest, namely the Levy, Maxwell, Pareto, and Rayleigh distributions, fit poorly to the present data. The conditional probabilities for a future earthquake and related conditional probability curves are presented towards the end of this article.

  10. Calculation of the reliability of integral rotary Stirling cryocoolers using the Weibull distribution

    Institute of Scientific and Technical Information of China (English)

    罗高乔; 范仙红; 何世安

    2011-01-01

    This article describes the calculation procedure of the Weibull distribution and analyses reliability test data for Thales integral rotary Stirling cryocoolers. The results of the numerical (analytical) Weibull reliability calculation are compared with those of the graphical estimation method. The reliability calculation process and the determination of the acceleration factor for the Thales integral rotary Stirling cryocooler are summarized, providing a reference for reliability test plans and life calculation methods for similar domestic products.
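
    An illustrative sketch (not Thales data) of the two quantities the record refers to: the Weibull reliability function R(t) = exp(-(t/η)^β) and a simple mapping of accelerated test time to equivalent field time through an assumed acceleration factor.

      # Hedged sketch: Weibull reliability and an assumed acceleration factor AF that
      # maps accelerated test hours to equivalent field hours (all values placeholders).
      import numpy as np

      def weibull_reliability(t_hours, beta, eta_hours):
          return np.exp(-(t_hours / eta_hours) ** beta)

      AF = 3.0            # assumed acceleration factor
      t_test = 5000.0     # accelerated test duration [h]
      print("R at equivalent field time:",
            weibull_reliability(AF * t_test, beta=1.3, eta_hours=30000.0))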

  11. Use of the Pearson type V, Weibull and hyperbolic functions for modelling diameter distributions

    Directory of Open Access Journals (Sweden)

    Daniel Henrique Breda Binoti

    2013-09-01

    Full Text Available The objective of this study was to evaluate the efficiency of the log-Pearson type V function for describing the diameter structure of even-aged eucalyptus stands, as well as to propose a diameter distribution model using this function. The modelling performed with the log-Pearson type V function was compared with modelling using the Weibull and hyperbolic functions. For this purpose, data from permanent eucalyptus plots located in the centre-west region of the state of Minas Gerais were used. The Pearson type V function was tested in three different configurations: with three and with two parameters, and with the location parameter replaced by the minimum diameter of the plot. The fit of the functions to the data was verified by applying the Kolmogorov-Smirnov (K-S) test. All fits showed adherence to the data according to the K-S test. The Weibull and hyperbolic functions performed better than the Pearson type V function.

  12. Analysis of distribution of critical current of bent-damaged Bi2223 composite tape

    Energy Technology Data Exchange (ETDEWEB)

    Ochiai, S; Okuda, H; Hojo, M [Graduate School of Engineering, Kyoto University, Yoshida, Sakyo-ku, Kyoto 606- 8501 (Japan); Sugano, M [Graduate School of Engineering, Kyoto University, Kyoto-Daigaku Katsura, Nishikyo-ku, Kyoto 615-8530 (Japan); Osamura, K [Research Institute for Applied Sciences, Sakyo-ku, Kyoto 606-8202 (Japan); Kuroda, T; Kumakura, H; Kitaguchi, H; Itoh, K; Wada, H, E-mail: shojiro.ochiai@materials.mbox.media.kyoto-u.ac.jp [National Institute for Materials Science, 1-2-1, Sengen, Tsukuba, Ibaraki 305-0047 (Japan)

    2011-10-29

    Distributions of critical current of damaged Bi2223 tape specimens bent by 0.6, 0.8 and 1.0% were investigated analytically with a modelling approach based on the correlation of damage evolution with the distribution of critical current. It was revealed that the distribution of critical current is described by a three-parameter Weibull distribution function through the distribution of the tensile damage strain of the Bi2223 filaments that determines the damage front in the bent composite tape. It was also shown that the measured distribution of critical current values can be reproduced successfully by a Monte Carlo simulation using the distributions of tensile damage strain of the filaments and the original critical current.
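
    A minimal sketch of the Monte Carlo idea: drawing critical-current values from a three-parameter Weibull distribution and summarising the spread; the shape, location and scale used here are placeholders, not the fitted values of the study.

      # Hedged sketch: sample critical currents from a three-parameter Weibull
      # distribution (placeholder parameters) and summarise the resulting spread.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      ic_samples = stats.weibull_min.rvs(c=5.0, loc=60.0, scale=25.0,
                                         size=10_000, random_state=rng)
      print("mean Ic = %.1f A, 5th-95th percentile = %.1f-%.1f A"
            % (ic_samples.mean(), *np.percentile(ic_samples, [5, 95])))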

  13. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on the statistical modeling of the distributions of the air pollution index (API) and its sub-indexes observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2) and particulate matter (PM10). Four probability distributions are considered, namely the log-normal, exponential, Gamma and Weibull, in the search for the best-fitting distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict in criterion results for selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of air pollutant data in Kuala Lumpur.

  14. A NEW FAMILY OF GENERALIZED GAMMA DISTRIBUTION AND ITS APPLICATION

    Directory of Open Access Journals (Sweden)

    Satsayamon Suksaengrakcharoen

    2014-01-01

    Full Text Available The mixture distribution is one of the most important ways to obtain new probability distributions in applied probability and several research areas. For this reason, we have been looking for a more flexible alternative for lifetime data. Therefore, a new mixed distribution, namely the Mixture Generalized Gamma (MGG) distribution, obtained by mixing a generalized gamma distribution and a length-biased generalized gamma distribution, is introduced. The MGG distribution is capable of modeling bathtub-shaped hazard rates and contains special sub-models, namely the exponential, length-biased exponential, generalized gamma, length-biased gamma and length-biased generalized gamma distributions. We present some useful properties of the MGG distribution such as the mean, variance, skewness, kurtosis and hazard rate. Parameter estimation is also implemented using the maximum likelihood method. The application of the MGG distribution is illustrated with a real data set. The results demonstrate that the MGG distribution provides a more consistent and flexible framework for the fitted values than a number of important lifetime distributions, including the generalized gamma, the length-biased generalized gamma and the three-parameter Weibull distributions.

  15. Modelling Wind for Wind Farm Layout Optimization Using Joint Distribution of Wind Speed and Wind Direction

    Directory of Open Access Journals (Sweden)

    Ju Feng

    2015-04-01

    Full Text Available Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distributions of wind speed and wind direction, which is based on the parameters of sector-wise Weibull distributions and interpolations between direction sectors. It is applied to the wind measurement data at Horns Rev and three different joint distributions are obtained, which all fit the measurement data quite well in terms of the coefficient of determination R². Then, the best of these joint distributions is used in the layout optimization of the Horns Rev 1 wind farm and the choice of bin sizes for wind speed and wind direction is also investigated. It is found that the choice of bin size for wind direction is especially critical for layout optimization, and a recommended choice of bin sizes for wind speed and wind direction is finally presented.
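
    The sketch below illustrates the general idea, not the paper's exact scheme: sector-wise Weibull scale and shape parameters and sector frequencies (all values invented) are interpolated between sector centres to give a joint density of wind speed and direction.

```python
# Hedged sketch: joint density f(v, theta) built from sector-wise Weibull
# parameters by circular linear interpolation between sector centres.
import numpy as np

sector_centres = np.arange(0.0, 360.0, 30.0)          # 12 sectors, degrees
A = np.array([8.5, 8.9, 9.3, 9.8, 10.2, 10.6, 11.0, 10.7, 10.1, 9.6, 9.1, 8.7])  # scale, m/s (illustrative)
k = np.array([2.1, 2.2, 2.3, 2.3, 2.4, 2.4, 2.5, 2.4, 2.3, 2.2, 2.2, 2.1])       # shape (illustrative)
freq = np.full(12, 1.0 / 12)                          # sector-wise relative frequency

def interp_circular(theta, centres, values):
    """Linear interpolation on the circle (period 360 degrees)."""
    ext_c = np.concatenate([centres, [centres[0] + 360.0]])
    ext_v = np.concatenate([values, [values[0]]])
    return np.interp(theta % 360.0, ext_c, ext_v)

def joint_pdf(v, theta_deg):
    a = interp_circular(theta_deg, sector_centres, A)
    kk = interp_circular(theta_deg, sector_centres, k)
    w = interp_circular(theta_deg, sector_centres, freq) / (360.0 / len(sector_centres))
    weibull = (kk / a) * (v / a) ** (kk - 1) * np.exp(-(v / a) ** kk)
    return w * weibull            # density per (m/s x degree)

print(joint_pdf(9.0, 255.0))
```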

  16. Modeling Size-number Distributions of Seeds for Use in Soil Bank Studies

    Institute of Scientific and Technical Information of China (English)

    Hugo Casco; Alexandra Soveral Dias; Luís Silva Dias

    2008-01-01

    Knowledge of soil seed banks is essential to understand the dynamics of plant populations and communities and would greatly benefit from the integration of existing knowledge on ecological correlations of seed size and shape. The present study aims to establish a feasible and meaningful method to describe size-number distributions of seeds in multi-species situations. For that purpose, size-number distributions of seeds with known length, width and thickness were determined by sequential sieving. The most appropriate combination of sieves and seed dimensions was established, and the adequacy of the power function and the Weibull model to describe size-number distributions of spherical, non-spherical, and all seeds was investigated. We found that the geometric mean of seed length, width and thickness was the most adequate size estimator, providing shape-independent measures of seed volume directly related to the sieve mesh side, and that both the power function and the Weibull model provide high-quality descriptions of size-number distributions of spherical, non-spherical, and all seeds. We also found that, in spite of its slightly lower accuracy, the power function is, at this stage, a more trustworthy model for characterizing size-number distributions of seeds in soil banks, because in some Weibull equations the estimates of the scale parameter were not acceptable.

  17. Statistical Evidence for the Preference of Frailty Distributions with Regularly-Varying-at-Zero Densities

    DEFF Research Database (Denmark)

    Missov, Trifon I.; Schöley, Jonas

    Missov and Finkelstein (2011) prove an Abelian and its corresponding Tauberian theorem regarding distributions for modeling unobserved heterogeneity in fixed-frailty mixture models. The main property of such distributions is the regular variation at zero of their densities. According to this criterion, admissible distributions are, for example, the gamma, the beta, the truncated normal, the log-logistic and the Weibull, while distributions like the log-normal and the inverse Gaussian do not satisfy this condition. In this article we show that models with admissible frailty distributions and a Gompertz baseline provide a better fit to adult human mortality data than the corresponding models with non-admissible frailty distributions. We implement estimation procedures for mixture models with a Gompertz baseline and frailty that follows a gamma, truncated normal, log-normal, or inverse Gaussian distribution.

  18. Use of the Pearson type V, Weibull and hyperbolic functions for modelling diameter distributions

    OpenAIRE

    Daniel Henrique Breda Binoti; Mayra Luiza Marques da Silva Binoti; Helio Garcia Leite

    2013-01-01

    The objective of this study was to evaluate the efficiency of the log-Pearson type V function for describing the diameter structure of even-aged eucalypt stands, as well as to propose a diameter distribution model using this function. The modelling with the log-Pearson type V function was compared with modelling using the Weibull and hyperbolic functions. For this purpose, data from permanent eucalypt plots located in the central-west region of the state of Minas Gerais were used. The func...

  19. Grey Correlation Analysis on Main Factors Influencing the Shape of Blasted Stockpile Based on Weibull Model

    Institute of Scientific and Technical Information of China (English)

    韩亮; 毕文广; 李红江; 师相; 李柄成

    2011-01-01

    There are many factors influencing the shape of the blasted stockpile in an open-pit mine. In order to find the main influencing factors, grey correlation theory is introduced. However, it is difficult to analyse the stockpile shape directly with grey correlation theory because the shape cannot be expressed by numerical parameters. In this paper, a Weibull model is introduced to fit the measured shape curves of blasted stockpiles, which quantifies the shape parameters. Forty-nine examples from the Heidaigou open-pit coal mine were analysed with grey correlation theory, the correlation sequence of the factors influencing the blasted stockpile shape was obtained, and the results were analysed. The research provides guidance for optimizing the design of the blasted stockpile shape in open-pit mines.

  20. Modelling Wind for Wind Farm Layout Optimization Using Joint Distribution of Wind Speed and Wind Direction

    DEFF Research Database (Denmark)

    Feng, Ju; Shen, Wen Zhong

    2015-01-01

    Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distributions of wind speed and wind direction, based on the parameters of sector-wise Weibull distributions and interpolations between direction sectors. It is applied to the wind measurement data at Horns Rev and three different joint distributions are obtained, which all fit the measurement data quite well in terms of the coefficient of determination R². Then, the best of these joint distributions is used in the layout optimization of the Horns Rev 1 wind farm and the choice of bin sizes for wind speed and wind direction is also investigated. It is found that the choice of bin size for wind direction is especially critical for layout optimization, and the recommended choice of bin sizes for wind speed and wind direction is finally presented.

  1. A comprehensive study of distribution laws for the fragments of the Košice meteorite

    CERN Document Server

    Gritsevich, Maria; Kohout, Tomáš; Tóth, Juraj; Peltoniemi, Jouni; Turchak, Leonid; Virtanen, Jenni

    2014-01-01

    In this study, we conduct a detailed analysis of the Košice meteorite fall (February 28, 2010), in order to derive a reliable law describing the mass distribution among the recovered fragments. In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, were analyzed. Bimodal Weibull, bimodal Grady and bimodal lognormal distributions are found to be the most appropriate for describing the Košice fragmentation process. Based on the assumption of bimodal lognormal, bimodal Grady, bimodal sequential and bimodal Weibull fragmentation distributions, we suggest that, prior to further extensive fragmentation in the lower atmosphere, the Košice meteoroid was initially represented by two independent pieces with cumulative residual masses of approximately 2 kg and 9 kg respectively. The smaller piece produced about 2 kg of multiple lightweight meteorite fragments with the mean around 12 g. The larger one resulted in 9 kg of meteorite fragments, recovered on the ground, including the...

  2. Statistical Distribution of Fatigue Life for Cast TiAl Alloy

    Directory of Open Access Journals (Sweden)

    WAN Wenjuan

    2016-08-01

    Full Text Available The statistical distribution of fatigue life data and its controls for cast Ti-47.5Al-2.5V-1.0Cr-0.2Zr (atom fraction/%) alloy were investigated. Fatigue tests were carried out as load-controlled rotating bending fatigue tests (R=-1) performed at a frequency of 100 Hz at 750 ℃ in air. The fracture mechanism was analyzed by observing the fracture surface morphologies with a scanning electron microscope, and the obtained fatigue life data were analyzed by Weibull statistics. The results show that the fatigue life data present a remarkable scatter, ranging from 10³ to 10⁶ cycles, and are distributed mainly in the short- and long-life regimes. The reason for this is that the fatigue crack initiators differ between specimens. The crack initiators for short-life specimens are caused by shrinkage porosity, and for long-life specimens by bridged porosity interfaces and soft-oriented lamellar interfaces. Based on the observations of the fracture surfaces, a two-parameter Weibull distribution model for the fatigue life data can be used for the prediction of fatigue life at a given failure probability. It is also shown that shrinkage porosity has the most detrimental effect on fatigue life.
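
    For illustration, a two-parameter Weibull fit of this kind and the corresponding life at a chosen failure probability can be obtained as in the sketch below; the life values used are placeholders, not the paper's data.

```python
# Sketch: two-parameter Weibull fit to fatigue lives (cycles) and the predicted
# life at a 10 % failure probability.  The sample is invented for illustration.
import numpy as np
from scipy.stats import weibull_min

lives = np.array([3.2e3, 8.1e3, 1.5e4, 4.0e4, 9.5e4, 2.1e5, 5.5e5, 9.8e5])  # cycles
shape, loc, scale = weibull_min.fit(lives, floc=0)      # location fixed at zero

p_fail = 0.10                                           # chosen failure probability
life_at_p = weibull_min.ppf(p_fail, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.3g} cycles, N(P_f=10%)={life_at_p:.3g} cycles")
```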

  3. Analysis and Modelling of Extreme Wind Speed Distributions in Complex Mountainous Regions

    Science.gov (United States)

    Laib, Mohamed; Kanevski, Mikhail

    2016-04-01

    Modelling of wind speed distributions in complex mountainous regions is an important and challenging problem which interests many scientists from several fields. In the present research, high-frequency (10 min) Swiss wind speed monitoring data (IDAWEB service, Meteosuisse) are analysed and modelled with different parametric distributions (Weibull, GEV, Gamma, etc.) using the maximum likelihood method. In total, 111 stations placed in different geomorphological units and at different altitudes (from 203 to 3580 meters) are studied. This information is then used for training machine learning algorithms (Extreme Learning Machines, Support Vector Machines) to predict the distribution at new places, potentially useful for aeolian energy generation. An important part of the research deals with the construction and application of a high-dimensional input feature space generated from a digital elevation model. A comprehensive study was carried out using a feature selection approach to get the best model for the prediction. The main results are presented as spatial patterns of the distributions' parameters.

  4. Selection of statistical distributions for prediction of steam generator tube degradation

    Energy Technology Data Exchange (ETDEWEB)

    Stavropoulos, K.D.; Gorman, J.A. [Dominion Engr., Inc., McLean, VA (United States); Staehle, R.W. [Univ. of Minnesota, Minneapolis, MN (United States); Welty, C.S. Jr. [Electric Power Research Institute, Palo Alto, CA (United States)

    1992-12-31

    This paper presents the first part of a project directed at developing methods for characterizing and predicting the progression of degradation of PWR steam generator tubes. This first part covers the evaluation of statistical distributions for use in such analyses. The data used in the evaluation of statistical distributions included data for primary water stress corrosion cracking (PWSCC) at roll transitions and U-bends, and intergranular attack/stress corrosion cracking (IGA/SCC) at tube sheet and tube support plate crevices. Laboratory data for PWSCC of reverse U-bends were also used. The review of statistical distributions indicated that the Weibull distribution provides an easy to use and effective method. Another statistical function, the log-normal, was found to provide essentially equivalent results. Two parameter fits, without an initiation time, were found to provide the most reliable predictions.

  5. Estimating the Distribution of the Incubation Periods of Human Avian Influenza A(H7N9) Virus Infections

    Science.gov (United States)

    Virlogeux, Victor; Li, Ming; Tsang, Tim K.; Feng, Luzhao; Fang, Vicky J.; Jiang, Hui; Wu, Peng; Zheng, Jiandong; Lau, Eric H. Y.; Cao, Yu; Qin, Ying; Liao, Qiaohong; Yu, Hongjie; Cowling, Benjamin J.

    2015-01-01

    A novel avian influenza virus, influenza A(H7N9), emerged in China in early 2013 and caused severe disease in humans, with infections occurring most frequently after recent exposure to live poultry. The distribution of A(H7N9) incubation periods is of interest to epidemiologists and public health officials, but estimation of the distribution is complicated by interval censoring of exposures. Imputation of the midpoint of intervals was used in some early studies, resulting in estimated mean incubation times of approximately 5 days. In this study, we estimated the incubation period distribution of human influenza A(H7N9) infections using exposure data available for 229 patients with laboratory-confirmed A(H7N9) infection from mainland China. A nonparametric model (Turnbull) and several parametric models accounting for the interval censoring in some exposures were fitted to the data. For the best-fitting parametric model (Weibull), the mean incubation period was 3.4 days (95% confidence interval: 3.0, 3.7) and the variance was 2.9 days; results were very similar for the nonparametric Turnbull estimate. Under the Weibull model, the 95th percentile of the incubation period distribution was 6.5 days (95% confidence interval: 5.9, 7.1). The midpoint approximation for interval-censored exposures led to overestimation of the mean incubation period. Public health observation of potentially exposed persons for 7 days after exposure would be appropriate. PMID:26409239
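
    A minimal sketch of a Weibull fit to interval-censored exposure windows is given below: each window [a, b] contributes F(b) − F(a) to the likelihood, which is maximised numerically. The exposure windows shown are invented for illustration; the study's data are not reproduced.

```python
# Sketch: maximum likelihood Weibull fit to interval-censored incubation periods.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# hypothetical exposure windows [a_i, b_i] in days
intervals = np.array([[1.0, 4.0], [2.0, 5.0], [0.5, 3.0], [3.0, 6.0],
                      [1.5, 4.5], [2.0, 7.0], [0.5, 2.5], [2.5, 6.5]])

def neg_log_lik(params):
    shape, scale = np.exp(params)                        # optimise on the log scale
    prob = (weibull_min.cdf(intervals[:, 1], shape, scale=scale)
            - weibull_min.cdf(intervals[:, 0], shape, scale=scale))
    return -np.sum(np.log(np.clip(prob, 1e-300, None)))

res = minimize(neg_log_lik, x0=np.log([1.5, 3.0]), method="Nelder-Mead")
shape, scale = np.exp(res.x)
mean_incubation = weibull_min.mean(shape, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.2f} d, mean={mean_incubation:.2f} d")
```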

  6. EVALUATION OF PROCEDURES FOR THE OPTIMAL FIT OF ALL WEIBULL 3P PARAMETERS TO MODEL THE HORIZONTAL STRUCTURE OF Pinus taeda PLANTATIONS

    Directory of Open Access Journals (Sweden)

    O. S. Vallejos-Barra

    2009-01-01

    Full Text Available A set of computational procedures was developed to estimate the three parameters of the Weibull 3P probability density function. In addition, the optimal parameters of this function were estimated and evaluated with the developed procedures in order to model the diameters at breast height of Pinus taeda trees. The trees were measured for eight years in six plots for each of the five planting densities considered. The Weibull 3P parameters were estimated by four alternative methods: maximum likelihood, moments, percentiles and a hybrid method. The optimization procedures sought to minimize both the error index and the statistics of the goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, Kuiper, Cramer-von Mises and Watson. This research produced four main results. First, the parameter estimation methods and the plantation age affected the value of the location parameter. Second, 45% of the location parameter values were negative; in these cases a highly significant linear relationship was found between the location, scale and shape parameters, so the effect of a negative location parameter was compensated by the values of the other parameters. Third, the percentile and maximum likelihood methods produced the smallest and largest values of the location parameter, respectively. Fourth, the greatest fitting accuracy was achieved with the percentile and moment estimation methods; the greatest accuracy under the Anderson-Darling test was associated with the method of moments, and under the remaining goodness-of-fit tests with the percentile method.
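
    As a simple illustration of the three-parameter fitting problem (maximum likelihood only; the percentile, moment and hybrid estimators compared in the paper are not reproduced), a sketch with simulated DBH data:

```python
# Sketch: three-parameter Weibull (shape, location, scale) fitted to a simulated
# DBH sample by maximum likelihood, with a two-parameter fit for comparison.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
dbh = weibull_min.rvs(2.2, loc=4.0, scale=12.0, size=500, random_state=rng)  # cm

shape3, loc3, scale3 = weibull_min.fit(dbh)            # three-parameter ML fit
shape2, loc2, scale2 = weibull_min.fit(dbh, floc=0)    # two-parameter fit (location fixed)

print(f"3P: shape={shape3:.2f}, location={loc3:.2f} cm, scale={scale3:.2f} cm")
print(f"2P: shape={shape2:.2f}, scale={scale2:.2f} cm")
```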

  7. Bayesian estimation of generalized exponential distribution under noninformative priors

    Science.gov (United States)

    Moala, Fernando Antonio; Achcar, Jorge Alberto; Tomazella, Vera Lúcia Damasceno

    2012-10-01

    The generalized exponential distribution, proposed by Gupta and Kundu (1999), is a good alternative to standard lifetime distributions such as the exponential, Weibull or gamma. Several authors have considered the problem of Bayesian estimation of the parameters of the generalized exponential distribution, assuming independent gamma priors and other informative priors. In this paper, we consider a Bayesian analysis of the generalized exponential distribution assuming conventional noninformative prior distributions, such as the Jeffreys and reference priors, to estimate the parameters. These priors are compared with independent gamma priors for both parameters. The comparison is carried out by examining the frequentist coverage probabilities of Bayesian credible intervals. We show that the maximal data information prior implies an improper posterior distribution for the parameters of a generalized exponential distribution. It is also shown that the choice of the parameter of interest is very important for the reference prior; different choices lead to different reference priors in this case. Numerical inference is illustrated for the parameters by considering data sets of different sizes and using MCMC (Markov chain Monte Carlo) methods.
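
    A hedged sketch of the computational side: a random-walk Metropolis sampler for the generalized exponential likelihood under vague independent gamma priors; the Jeffreys, reference and maximal data information priors discussed in the paper would replace the log-prior term, and the data are simulated.

```python
# Sketch: random-walk Metropolis for the generalized exponential distribution
# f(x; a, l) = a*l*(1 - exp(-l*x))**(a-1) * exp(-l*x), with vague gamma priors.
import numpy as np

rng = np.random.default_rng(3)
x = rng.gamma(2.0, 1.5, size=100)          # placeholder lifetime data

def log_post(a, l):
    if a <= 0 or l <= 0:
        return -np.inf
    loglik = np.sum(np.log(a) + np.log(l) + (a - 1) * np.log1p(-np.exp(-l * x)) - l * x)
    logprior = (0.01 - 1) * np.log(a) - 0.01 * a + (0.01 - 1) * np.log(l) - 0.01 * l
    return loglik + logprior

samples, cur = [], np.array([1.0, 1.0])
cur_lp = log_post(*cur)
for _ in range(20000):
    prop = cur + rng.normal(0, 0.1, size=2)
    prop_lp = log_post(*prop)
    if np.log(rng.uniform()) < prop_lp - cur_lp:   # Metropolis acceptance step
        cur, cur_lp = prop, prop_lp
    samples.append(cur.copy())

post = np.array(samples[5000:])                    # drop burn-in
print("posterior means (alpha, lambda):", post.mean(axis=0))
```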

  8. Application-dependent Probability Distributions for Offshore Wind Speeds

    Science.gov (United States)

    Morgan, E. C.; Lackner, M.; Vogel, R. M.; Baise, L. G.

    2010-12-01

    The higher wind speeds of the offshore environment make it an attractive setting for future wind farms. With sparser field measurements, the theoretical probability distribution of short-term wind speeds becomes more important in estimating values such as average power output and fatigue load. While previous studies typically compare the accuracy of probability distributions using R², we show that validation based on this metric is not consistent with validation based on engineering parameters of interest, namely turbine power output and extreme wind speed. Thus, in order to make the most accurate estimates possible, the probability distribution that an engineer picks to characterize wind speeds should depend on the design parameter of interest. We introduce the Kappa and Wakeby probability distribution functions to wind speed modeling, and show that these two distributions, along with the Biweibull distribution, fit wind speed samples better than the more widely accepted Weibull and Rayleigh distributions based on R². Additionally, out of the 14 probability distributions we examine, the Kappa and Wakeby give the most accurate and least biased estimates of turbine power output. The fact that the 2-parameter Lognormal distribution estimates extreme wind speeds (i.e. fits the upper tail of wind speed distributions) with least error indicates that no single distribution performs satisfactorily for all applications. Our use of a large dataset composed of 178 buoys (totaling ~72 million 10-minute wind speed observations) makes these findings highly significant, both in terms of large sample size and broad geographical distribution across various wind regimes. (Figure: boxplots of R² from the fit of each of the 14 distributions to the 178 buoy wind speed samples; distributions are ranked from left to right by ascending median R², with the Biweibull having the median closest to 1.)
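
    The sketch below shows how a design parameter such as expected power output depends on the fitted distribution: an idealised power curve is integrated against a Weibull density standing in for the paper's Kappa/Wakeby fits; all turbine and distribution parameters are invented.

```python
# Sketch: expected turbine power output = power curve integrated against the
# fitted wind-speed density (here a Weibull as a stand-in distribution).
import numpy as np
from scipy.stats import weibull_min
from scipy.integrate import quad

shape, scale = 2.0, 9.0                                  # illustrative Weibull fit (m/s)
v_in, v_rated, v_out, p_rated = 3.0, 12.0, 25.0, 5.0e6   # cut-in, rated, cut-out, W

def power_curve(v):
    if v < v_in or v > v_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    return p_rated * ((v - v_in) / (v_rated - v_in)) ** 3   # simple cubic ramp

expected_power, _ = quad(lambda v: power_curve(v) * weibull_min.pdf(v, shape, scale=scale),
                         0.0, 40.0, points=[v_in, v_rated, v_out])
print(f"expected power output ~ {expected_power / 1e6:.2f} MW")
```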

  9. Reliability Estimation for the Weibull Log-linear Accelerated Life Testing Model

    Institute of Scientific and Technical Information of China (English)

    刘婷

    2011-01-01

    Accelerated life tests, in which more than one stress is often involved, have become widely used in today's industries. A log-linear acceleration model is proposed to describe the quantitative relation between multiple stresses and product lifetime, and Weibull log-linear accelerated life models are established. Because the log-likelihood function is highly nonlinear and non-monotonic, a genetic algorithm is adopted to obtain the maximum likelihood estimates of the acceleration model parameters, which allows reliability assessment and lifetime prediction under various stresses and the conversion of reliability characteristics between different stress levels. Finally, a simulation example illustrates the validity of the proposed method.
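
    A minimal sketch of the model, assuming a log-linear scale parameter in two stresses and a common Weibull shape; the likelihood is maximised here with a Nelder-Mead search rather than the genetic algorithm used in the paper, and the test data are invented.

```python
# Sketch: Weibull log-linear acceleration model with scale
# eta(S) = exp(g0 + g1*ln(S1) + g2*ln(S2)) and common shape m.
import numpy as np
from scipy.optimize import minimize

# columns: stress1, stress2, observed life (hypothetical complete data)
data = np.array([[1.2, 40.0, 800.0], [1.2, 60.0, 450.0], [1.5, 40.0, 300.0],
                 [1.5, 60.0, 150.0], [1.8, 40.0, 120.0], [1.8, 60.0, 60.0]])

def neg_log_lik(theta):
    m = np.exp(theta[0])                     # shape > 0
    g0, g1, g2 = theta[1:]
    eta = np.exp(g0 + g1 * np.log(data[:, 0]) + g2 * np.log(data[:, 1]))
    t = data[:, 2]
    return -np.sum(np.log(m / eta) + (m - 1) * np.log(t / eta) - (t / eta) ** m)

res = minimize(neg_log_lik, x0=[0.5, 14.0, -3.0, -2.0], method="Nelder-Mead")
print("shape m =", np.exp(res.x[0]), "log-linear coefficients =", res.x[1:])
```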

  10. Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Procaccia, H.; Villain, B. [Electricite de France (EDF), 93 - Saint-Denis (France); Clarotti, C.A. [ENEA, Casaccia (Italy)

    1996-12-31

    EDF and ENEA carried out a joint research program to develop the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma - their parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; first- and second-order moments of the posterior distributions can be computed. In particular the covariance gives important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of the PWR steam generator system is presented. (authors). 10 refs.

  11. The Maximum Likelihood Method for Obtaining the p-S-N Curve When Fatigue Life Follows a Weibull Distribution

    Institute of Scientific and Technical Information of China (English)

    苏彦江

    2002-01-01

    For the case in which fatigue life follows a Weibull distribution, the maximum likelihood method for fitting the p-S-N curve equation of a material is studied, and the problem of solving the equation parameters is transformed into finding the minimum point of a multivariate nonlinear function. Fatigue test results for nodular cast iron are used to illustrate the engineering application of the method, and the p-S-N curves of nodular cast iron in the medium-life region are given.

  12. Study on Reliability Growth for Weibull Distributed Products under Zero-Failure Data

    Institute of Scientific and Technical Information of China (English)

    冯静; 刘琦; 周经伦; 沙基昌

    2004-01-01

    A new reliability growth analysis method based on a modified likelihood function is proposed, which solves the problem of reliability growth analysis under zero-failure data, a case that existing reliability growth models based on failure counts cannot handle. The method first uses a data-pooling technique and the modified likelihood function to obtain the reliability level at the end of each development stage, and then uses the Duane model to describe the reliability growth law by which the instantaneous failure rate of the system decreases as development testing is extended. Finally, a simulation example demonstrates the effectiveness of the method.

  13. The Fiducial Inference for Repaired-as-Old Weibull Distributed Systems

    Institute of Scientific and Technical Information of China (English)

    于丹; 闫霞; 李国英

    2004-01-01

    For Weibull-type equipment, a class of repaired-as-old models is considered. Expressions for the availability function at an arbitrary time are derived for two cases: when the repair time is a constant, and when it is a random variable taking values at a finite number of points. Based on complete data from life tests of the equipment, the fiducial distribution of the Weibull reliability at an arbitrary time is given, from which point estimates and lower confidence limits of the availability are obtained. Finally, a simulation study is carried out; the results show that for small samples (n ≤ 10) the average accuracy of the lower confidence limit is acceptable but its stability is poor, while for sample sizes n ≥ 15 the performance is good.

  14. Probability distribution of surface wind speed induced by convective adjustment on Venus

    Science.gov (United States)

    Yamamoto, Masaru

    2017-03-01

    The influence of convective adjustment on the spatial structure of Venusian surface wind and the probability distribution of its wind speed is investigated using an idealized weather research and forecasting model. When the initially uniform wind is much weaker than the convective wind, patches of both prograde and retrograde winds with scales of a few kilometers are formed during active convective adjustment. After the active convective adjustment, because the small-scale convective cells and their related vertical momentum fluxes dissipate quickly, the large-scale (>4 km) prograde and retrograde wind patches remain on the surface and in the longitude-height cross-section. This suggests the coexistence of local prograde and retrograde flows, which may correspond to those observed by Pioneer Venus below 10 km altitude. The probability distributions of surface wind speed V during the convective adjustment have a similar form in different simulations, with a sharp peak around ∼0.1 m s⁻¹ and a bulge developing on the flank of the probability distribution. This flank bulge is associated with the most active convection, which has a probability distribution peaking at a wind speed 1.5 times greater than the Weibull fitting parameter c during the convective adjustment. The Weibull distribution P(>V) = exp[-(V/c)^k] with the best-estimate coefficients of Lorenz (2016) is reproduced during convective adjustments induced by a potential energy of ∼7 × 10⁷ J m⁻², calculated from the difference in total potential energy between the initially unstable and neutral states. The maximum magnitude of the vertical convective heat flux is proportional to the potential energy of the convective adjustment in the experiments in which the initial unstable-layer thickness is varied. The present work suggests that convective adjustment is a promising process for producing the observed wind structure, occasionally generating surface winds of ∼1 m s⁻¹ and retrograde wind patches.

  15. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

    Full Text Available Interesting properties and propositions in many branches of science, such as economics, have been obtained based on the property that the cumulative distribution function of a random variable is a concave function. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice-differentiable real-valued function g whose domain is an interval on the extended real line. The function g is said to be log-concave on the interval (a,b) if ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b), i.e. to (ln(g))'' being non-positive there. Previous work has obtained log-concavity for distributions such as the normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, established log-concavity for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. The same has been done for the Burr family and for each distribution that belongs to the Burr family. Also, log-concavity results for distributions such as generalized gamma distributions, Feller-Pareto distributions, generalized inverse Gaussian distributions and generalized log-normal distributions have been obtained.
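
    As a small numerical check of the criterion for the Weibull case (the density is log-concave exactly when the shape parameter is at least one), the sketch below inspects the sign of the second differences of the log-density on a grid.

```python
# Sketch: check log-concavity of the Weibull density for several shape values k
# by the sign of the second difference of ln f on a grid.
import numpy as np
from scipy.stats import weibull_min

x = np.linspace(0.1, 5.0, 400)
for k in (0.7, 1.0, 2.5):
    logf = weibull_min.logpdf(x, k)
    second_diff = np.diff(logf, n=2)
    print(f"k={k}: log-concave on the grid -> {bool(np.all(second_diff <= 1e-10))}")
```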

  16. A New Function for Modelling Diameter Frequency Distribution in the Tropical Rain Forest of Xishuangbanna, Southwest of China

    Institute of Scientific and Technical Information of China (English)

    Lu Yuanchang; Lei Xiangdong; Jiang Lei

    2003-01-01

    Permanent plots in the montane tropical rain forests of Xishuangbanna, southwest China, were established, and different empirical models, based on the 1992 observation data from these plots, were built to model diameter frequency distributions. The focus of this study is the prediction accuracy of the stem number in the larger diameter classes, which, from the viewpoint of forest management, is much more important than that of the smaller trees and must be adequately considered in the modelling and estimation. There are three traditional ways of modelling the diameter frequency distribution: the negative exponential function model, the limiting line function model, and the Weibull distribution model. In this study, a new model, named the logarithmic J-shape function, was tested together with the others and was found to be a more suitable model for modelling work in these tropical forests.

  17. Distributions of personal VOC exposures: a population-based analysis.

    Science.gov (United States)

    Jia, Chunrong; D'Souza, Jennifer; Batterman, Stuart

    2008-10-01

    Information regarding the distribution of volatile organic compound (VOC) concentrations and exposures is scarce, and there have been few, if any, studies using population-based samples from which representative estimates can be derived. This study characterizes distributions of personal exposures to ten different VOCs in the U.S. measured in the 1999-2000 National Health and Nutrition Examination Survey (NHANES). Personal VOC exposures were collected for 669 individuals over 2-3 days, and measurements were weighted to derive national-level statistics. Four common exposure sources were identified using factor analyses: gasoline vapor and vehicle exhaust, methyl tert-butyl ether (MTBE) as a gasoline additive, tap water disinfection products, and household cleaning products. Benzene, toluene, ethyl benzene, xylenes, chloroform, and tetrachloroethene were fit to log-normal distributions with reasonably good agreement to observations. 1,4-Dichlorobenzene and trichloroethene were fit to Pareto distributions, and MTBE to a Weibull distribution, but agreement was poor. However, distributions that attempt to match all of the VOC exposure data can lead to incorrect conclusions regarding the level and frequency of the higher exposures. Maximum Gumbel distributions gave generally good fits to extrema; however, they could not fully represent the highest exposures of the NHANES measurements. The analysis suggests that complete models for the distribution of VOC exposures require an approach that combines standard and extreme value distributions, and that carefully identifies outliers. This is the first study to provide national-level and representative statistics regarding these VOC exposures, and its results have important implications for risk assessment and probabilistic analyses.
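
    A sketch of the combined approach suggested by the analysis, using simulated exposures rather than NHANES data: a log-normal fit for the bulk of the values and a Gumbel (maximum) fit for block maxima.

```python
# Sketch: bulk log-normal fit plus Gumbel fit to block maxima of simulated exposures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
exposure = rng.lognormal(mean=0.5, sigma=1.0, size=(100, 50))   # 100 groups x 50 persons

bulk_fit = stats.lognorm.fit(exposure.ravel(), floc=0)          # bulk distribution
maxima_fit = stats.gumbel_r.fit(exposure.max(axis=1))           # distribution of group maxima

print("log-normal (shape, loc, scale):", np.round(bulk_fit, 3))
print("Gumbel for maxima (loc, scale):", np.round(maxima_fit, 3))
```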

  18. Use of the gamma distribution to represent monthly rainfall in Africa for drought monitoring applications

    Science.gov (United States)

    Husak, Gregory J.; Michaelsen, Joel C.; Funk, Christopher C.

    2007-01-01

    Evaluating a range of scenarios that accurately reflect precipitation variability is critical for water resource applications. Inputs to these applications can be provided using location- and interval-specific probability distributions. These distributions make it possible to estimate the likelihood of rainfall being within a specified range. In this paper, we demonstrate the feasibility of fitting cell-by-cell probability distributions to grids of monthly interpolated, continent-wide data. Future work will then detail applications of these grids to improved satellite-remote sensing of drought and interpretations of probabilistic climate outlook forum forecasts. The gamma distribution is well suited to these applications because it is fairly familiar to African scientists, and capable of representing a variety of distribution shapes. This study tests the goodness-of-fit using the Kolmogorov–Smirnov (KS) test, and compares these results against another distribution commonly used in rainfall events, the Weibull. The gamma distribution is suitable for roughly 98% of the locations over all months. The techniques and results presented in this study provide a foundation for use of the gamma distribution to generate drivers for various rain-related models. These models are used as decision support tools for the management of water and agricultural resources as well as food reserves by providing decision makers with ways to evaluate the likelihood of various rainfall accumulations and assess different scenarios in Africa. 

  19. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.

  20. Effect of particle size and distribution of the sizing agent on the carbon fibers surface and interfacial shear strength (IFSS) of its composites

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, R.L. [Open Project of State Key Laboratory Breeding Base for Mining Disaster Prevention and Control, Shandong University of Science and Technology (China); School of Materials Science and Engineering, Shandong University of Science and Technology, 266590 Qingdao (China); Liu, Y. [School of Materials Science and Engineering, Shandong University of Science and Technology, 266590 Qingdao (China); Huang, Y.D., E-mail: rlzhit@126.com [School of Chemical Engineering and Technology, State Key laboratory of Urban Water Resource and Environment Department of Applied Chemistry, Harbin Institute of Technology, 150001 Harbin (China); Liu, L. [School of Chemical Engineering and Technology, State Key laboratory of Urban Water Resource and Environment Department of Applied Chemistry, Harbin Institute of Technology, 150001 Harbin (China)

    2013-12-15

    The effect of the particle size and distribution of the sizing agent on the performance of carbon fibers and carbon fiber composites has been investigated. Atomic force microscopy (AFM) and scanning electron microscopy (SEM) were used to characterize the carbon fiber surface topographies. At the same time, the single-fiber strength and its Weibull distribution were also studied in order to investigate the effect of the coatings on the fibers. The interfacial shear strength and hygrothermal aging of the carbon fiber/epoxy resin composites were also measured. The results indicate that the particle size and distribution are important for improving the carbon fiber surface and the performance of its composites. Different particle sizes and distributions of the sizing agent contribute differently to the wetting performance of the carbon fibers. The fibers sized with P-2 had a higher IFSS value and better hygrothermal aging resistance.
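
    The single-fibre Weibull analysis referred to here is commonly done with a probability plot; the sketch below (with invented strength values, not the authors' data) estimates the Weibull modulus and characteristic strength from the linearised empirical distribution.

```python
# Sketch: Weibull probability plot for single-fibre strengths.  The linearised
# form is ln(-ln(1-F)) = m*ln(s) - m*ln(s0), so the slope gives the modulus m.
import numpy as np

strength = np.sort(np.array([2.8, 3.1, 3.3, 3.5, 3.6, 3.8, 3.9, 4.1, 4.3, 4.6]))  # GPa
n = strength.size
F = (np.arange(1, n + 1) - 0.5) / n                     # median-rank style plotting positions

y = np.log(-np.log(1.0 - F))
x = np.log(strength)
m, intercept = np.polyfit(x, y, 1)                      # slope = Weibull modulus
s0 = np.exp(-intercept / m)                             # characteristic strength

print(f"Weibull modulus m = {m:.2f}, characteristic strength = {s0:.2f} GPa")
```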

  1. Distribution Structures

    NARCIS (Netherlands)

    Friedrich, H.; Tavasszy, L.A.; Davydenko, I.

    2013-01-01

    Distribution structures are important elements of the freight transportation system. Goods are routed via warehouses on their way from production to consumption. This chapter discusses drivers behind these structures, logistics decisions connected to distribution structures on the micro level, and

  2. Distributed Knight

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Damm, Christian Heide

    2005-01-01

    An extension of Knight (2005) that supports distributed synchronous collaboration, implemented using type-based publish/subscribe.

  3. Effect of serum lipid level change on 10-year coronary heart risk distribution estimated by means of seven different coronary risk scores during one-year treatment.

    Science.gov (United States)

    Kojić, Nevena Eremić; Derić, Mirjana; Dejanović, Jadranka

    2014-01-01

    This study was done in order to evaluate the effect of serum levels of total cholesterol, triglycerides, low-density lipoprotein cholesterol and high-density lipoprotein cholesterol on changes in the 10-year coronary heart disease risk distribution. The study included 110 subjects of both genders (71 female and 39 male), aged 29 to 73, treated at the Outpatient Department of Atherosclerosis Prevention, Centre for Laboratory Medicine, Clinical Centre Vojvodina. The 10-year coronary heart disease risk was estimated at the first examination and after one year of treatment by means of the Framingham, PROCAM and SCORE coronary risk scores and their modifications (Framingham Adult Treatment Panel III, Framingham Weibull, PROCAM NS and PROCAM Cox Hazards). Age, gender, systolic and diastolic blood pressure, smoking, positive family history and left ventricular hypertrophy are risk factors involved in the estimation of coronary heart disease risk besides the lipid parameters. There were no significant differences in nutritional status, smoking habits, or systolic and diastolic pressure, and no development of diabetes mellitus or cardiovascular incidents during the one-year follow-up. However, there was a significant reduction in cholesterol level, together with a significant reduction in the estimated 10-year risk and risk category under the Framingham, Framingham ATP III, Framingham Weibull and SCORE scores, compared with the risk at the beginning of the study. Our results show that the correction of lipid levels after one year of treatment leads to a significant redistribution of the 10-year coronary heart disease risk estimated by means of seven different coronary risk scores. This should encourage patients and doctors to persist with preventive measures.

  4. Distributed computing

    CERN Document Server

    Van Renesse, R

    1991-01-01

    This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.

  5. A relationship between statistical time to breakdown distributions and pre-breakdown negative differential resistance at nanometric scale

    Energy Technology Data Exchange (ETDEWEB)

    Foissac, R. [STMicroelectronics, 850 rue Jean Monnet, 38926 Crolles Cedex (France); Univ. Grenoble Alpes, LTM, F-38000 Grenoble, France and CNRS, LTM, F-38000 Grenoble (France); Blonkowski, S.; Delcroix, P. [STMicroelectronics, 850 rue Jean Monnet, 38926 Crolles Cedex (France); Kogelschatz, M. [Univ. Grenoble Alpes, LTM, F-38000 Grenoble, France and CNRS, LTM, F-38000 Grenoble (France)

    2014-07-14

    Using ultra-high-vacuum conductive atomic force microscopy (C-AFM), current-voltage pre-breakdown negative differential resistance (NDR) characteristics are measured together with the time-dependent dielectric breakdown (TDDB) distributions of Si/SiON (1.4 and 2.6 nm thick). These experimental characteristics are systematically compared. The NDR effect is modelled by conductive filament growth. It is shown that the scale factor of the Weibull TDDB statistical distribution is proportional to the growth rate of an individual filament and hence has the same dependence on the electric field. The proportionality factor is a power law of the ratio between the surfaces of the C-AFM tip and the filament's top. Moreover, it was found that, for the high fields used in these experiments, the TDDB acceleration factor, like the growth rate characteristic, is proportional to the Zener tunnelling probability. These observations are discussed in the framework of possible breakdown or forming mechanisms.

  6. Particle size distributions and the sequential fragmentation/transport theory applied to volcanic ash

    Energy Technology Data Exchange (ETDEWEB)

    Wohletz, K.H. (Earth and Space Science Division Los Alamos National Laboratory, New Mexico (USA)); Sheridan, M.F. (Department of Geology, Arizona State University, Tempe (USA)); Brown, W.K. (Math/Science Division, Lassen College, Susanville, California (USA))

    1989-11-10

    The assumption that distributions of mass versus size interval for fragmented materials fit the log normal distribution is empirically based and has historical roots in the late 19th century. Other often used distributions (e.g., Rosin-Rammler, Weibull) are also empirical and have the general form for mass per size interval: n(l) = k l^α exp(-l^β), where n(l) represents the number of particles of diameter l, l is the normalized particle diameter, and k, α, and β are constants. We describe and extend the sequential fragmentation distribution to include transport effects upon observed volcanic ash size distributions. The sequential fragmentation/transport (SFT) distribution is also of the above mathematical form, but it has a physical basis rather than an empirical one. The SFT model applies to a particle-mass distribution formed by a sequence of fragmentation (comminution) and transport (size sorting) events acting upon an initial mass m′: n(x, m) = C ∫∫ n(x′, m′) p(ξ) dx′ dm′, where x′ denotes spatial location along a linear axis, C is a constant, and integration is performed over distance from an origin to the sample location and over mass limits from 0 to m.

  7. Best fitting distributions for the standard duration annual maximum precipitations in the Aegean Region

    Directory of Open Access Journals (Sweden)

    Halil Karahan

    2013-03-01

    Full Text Available Knowing the properties of precipitation, the primary input to water resources, such as amount, duration, intensity, and spatial and temporal variation, is required for planning, design, construction and operation studies in various sectors such as water resources, agriculture, urbanization, drainage, flood control and transportation. For the practices mentioned, reliable and realistic estimations based on existing observations should be made, and the first step of making a reliable estimation is to test the reliability of the existing observations. In this study, the Kolmogorov-Smirnov, Anderson-Darling and Chi-square goodness-of-fit tests were applied to determine which distribution the measured standard-duration maximum precipitation values (for the years 1929-2005) fit at the meteorological stations operated by the Turkish State Meteorological Service (DMİ) located in the city and town centres of the Aegean Region. While all the observations fit the GEV distribution according to the Anderson-Darling test, short-, medium- and long-duration precipitation observations generally fit the GEV, Gamma and Log-normal distributions according to the Kolmogorov-Smirnov and Chi-square tests. To determine the parameters of the chosen probability distributions, the maximum likelihood (LN2, LN3, EXP2, Gamma3), probability-weighted moments (LP3, Gamma2), L-moments (GEV) and least squares (Weibull2) methods were used for the different distributions.

  8. Statistical analysis of multilook polarimetric SAR data and terrain classification with adaptive distribution

    Science.gov (United States)

    Liu, Guoqing; Huang, ShunJi; Torre, Andrea; Rubertone, Franco S.

    1995-11-01

    This paper deals with the analysis of the statistical properties of multi-look processed polarimetric SAR data. Based on the assumption that a multi-look polarimetric measurement is the product of a Gamma-distributed texture variable and a Wishart-distributed polarimetric speckle variable, it is shown that the multi-look polarimetric measurement from a nonhomogeneous region obeys a generalized K-distribution. In order to validate this statistical model, two of its variants, the multi-look intensity and amplitude K-distributions, are compared with histograms of observed multi-look SAR data for three terrain types (ocean, forest-like and city regions) and with four empirical distribution models: Gaussian, log-normal, gamma and Weibull. A qualitative relation between the degree of nonhomogeneity of a textured scene and the best-fitting statistical model is then empirically established. Finally, a classifier with adaptive distributions, guided by the order parameter of the texture distribution estimated with local statistics, is introduced to perform terrain classification; experimental results with both multi-look fully polarimetric data and multi-look single-channel intensity/amplitude data indicate its effectiveness.

  9. Strength distribution of fatigue crack initiation sites in an Al-Li alloy

    Science.gov (United States)

    Zhai, T.

    2006-10-01

    The stress-number of cycles to failure (S-N) curves were measured along the short-transverse (S) and rolling (L) directions of a hot-cross-rolled AA 8090 Al-Li alloy plate (45-mm thick). The alloy was solution heat treated, quenched in water, strained by 6 pct, and peak aged. Fatigue tests were carried out in four-point bend at room temperature, 20 Hz, R=0.1, in air. It was found that the fatigue limits in the S and L directions were 147 and 197 MPa, respectively. The crack population on the surface of a sample at failure increased with the applied stress level and was found to be a Weibull function of the applied maximum stress in this alloy. The strength distribution of fatigue weakest links, where cracks were initiated, was derived from the Weibull function determined by the experimental data. The fatigue weakest-link density was defined as the crack population per unit area at a stress level close to the ultimate tensile stress and can be regarded as a materials property. The density and strength distribution of fatigue weakest links were found to be markedly different between the L and S directions, accounting for the difference in fatigue limit between the directions in this alloy. They were also found to be different between S-L and S-T samples, and between L-T and L-S samples of this alloy, which could not be revealed by the corresponding S-N curves measured. These differences were due to the anisotropy of the microstructures in different directions in this alloy.

  10. Modelling diameter distributions of two-cohort forest stands with various proportions of dominant species: a two-component mixture model approach.

    Science.gov (United States)

    Podlaski, Rafał; Roesch, Francis A

    2014-03-01

    In recent years finite-mixture models have been employed to approximate and model empirical diameter at breast height (DBH) distributions. We used two-component mixtures of either the Weibull distribution or the gamma distribution to describe the DBH distributions of mixed-species, two-cohort forest stands, to analyse the relationships between the DBH components, age cohorts and dominant species, and to assess the significance of differences between the mixture distributions and the kernel density estimates. The data consisted of plots from the Świętokrzyski National Park (Central Poland) and areas close to and including the North Carolina section of the Great Smoky Mountains National Park (USA; southern Appalachians). The fit of the mixture Weibull model to empirical DBH distributions had a precision similar to that of the mixture gamma model; a slightly less accurate estimate was obtained with the kernel density estimator. Generally, in the two-cohort, two-storied, multi-species stands in the southern Appalachians, the two-component DBH structure was associated with age cohort and dominant species: the first DBH component of the mixture model was associated with the first dominant species (sp1) occurring in the young age cohort (e.g., sweetgum, eastern hemlock) and, to a lesser degree, the second DBH component was associated with the second dominant species (sp2) occurring in the old age cohort (e.g., loblolly pine, red maple). In the two-cohort, partly multilayered stands in the Świętokrzyski National Park, the DBH structure was usually associated only with the age cohorts (the two dominant species often occurred in both the young and old age cohorts). When empirical DBH distributions representing stands of complex structure are approximated using mixture models, the convergence of the estimation process is often significantly dependent on the starting strategies. Depending on the number of DBHs measured, three methods for choosing the initial values are recommended: min.k/max.k, 0.5/1.5/mean
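
    A minimal sketch of fitting a two-component Weibull mixture to a DBH sample by direct maximisation of the mixture log-likelihood; the paper's estimation details, starting strategies and data are not reproduced.

```python
# Sketch: two-component Weibull mixture fitted to simulated DBH data by
# maximising the mixture log-likelihood with a Nelder-Mead search.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(5)
dbh = np.concatenate([weibull_min.rvs(2.5, scale=12.0, size=300, random_state=rng),
                      weibull_min.rvs(4.0, scale=40.0, size=200, random_state=rng)])

def neg_log_lik(theta):
    w = 1.0 / (1.0 + np.exp(-theta[0]))                 # mixing weight in (0, 1)
    k1, s1, k2, s2 = np.exp(theta[1:])                  # positive shapes and scales
    pdf = (w * weibull_min.pdf(dbh, k1, scale=s1)
           + (1.0 - w) * weibull_min.pdf(dbh, k2, scale=s2))
    return -np.sum(np.log(np.clip(pdf, 1e-300, None)))

x0 = [0.0, np.log(2.0), np.log(10.0), np.log(3.0), np.log(35.0)]
res = minimize(neg_log_lik, x0=x0, method="Nelder-Mead",
               options={"maxiter": 5000, "maxfev": 10000})

w = 1.0 / (1.0 + np.exp(-res.x[0]))
k1, s1, k2, s2 = np.exp(res.x[1:])
print(f"weight={w:.2f}, comp1 (k={k1:.2f}, s={s1:.1f}), comp2 (k={k2:.2f}, s={s2:.1f})")
```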

  11. Statistical distributions

    CERN Document Server

    Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.

    2010-01-01

    A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re

  12. A global survey on the seasonal variation of the marginal distribution of daily precipitation

    Science.gov (United States)

    Papalexiou, Simon Michael; Koutsoyiannis, Demetris

    2016-08-01

    To characterize the seasonal variation of the marginal distribution of daily precipitation, it is important to find which statistical characteristics of daily precipitation actually vary the most from month-to-month and which could be regarded to be invariant. Relevant to the latter issue is the question whether there is a single model capable to describe effectively the nonzero daily precipitation for every month worldwide. To study these questions we introduce and apply a novel test for seasonal variation (SV-Test) and explore the performance of two flexible distributions in a massive analysis of approximately 170,000 monthly daily precipitation records at more than 14,000 stations from all over the globe. The analysis indicates that: (a) the shape characteristics of the marginal distribution of daily precipitation, generally, vary over the months, (b) commonly used distributions such as the Exponential, Gamma, Weibull, Lognormal, and the Pareto, are incapable to describe "universally" the daily precipitation, (c) exponential-tail distributions like the Exponential, mixed Exponentials or the Gamma can severely underestimate the magnitude of extreme events and thus may be a wrong choice, and (d) the Burr type XII and the Generalized Gamma distributions are two good models, with the latter performing exceptionally well.

  13. Bayesian joint modeling of longitudinal measurements and time-to-event data using robust distributions.

    Science.gov (United States)

    Baghfalaki, T; Ganjali, M; Hashemi, R

    2014-01-01

    Distributional assumptions of most of the existing methods for the joint modeling of longitudinal measurements and time-to-event data do not allow the incorporation of outlier robustness. In this article, we develop and implement a joint model of longitudinal and time-to-event data using a family of powerful distributions for robust analysis known as normal/independent distributions. These distributions include univariate and multivariate versions of Student's t, the slash, and the contaminated normal distributions. The proposed model implements a linear mixed effects model under a normal/independent distribution assumption for both the random effects and the residuals of the longitudinal process. For the time-to-event process a parametric proportional hazards model with a Weibull baseline hazard is used. Also, a Bayesian approach using the Markov chain Monte Carlo method is adopted for parameter estimation. Simulation studies are performed to investigate the performance of the proposed method in the presence and absence of outliers. The proposed methods are also applied to a real AIDS clinical trial, with the aim of comparing the efficiency and safety of two antiretroviral drugs, where CD4 count measurements are gathered as longitudinal outcomes. In these data, time to death or dropout is considered as the time-to-event outcome of interest. Different model structures are developed for analyzing these data sets, with model selection performed using the deviance information criterion (DIC), the expected Akaike information criterion (EAIC), and the expected Bayesian information criterion (EBIC).

  14. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David J.; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan L.

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.

  15. Distribution center

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    A distribution center is a logistics link whose main function is to fulfill physical distribution. Generally speaking, it is a large and highly automated center destined to receive goods from various plants and suppliers, take orders, fill them efficiently, and deliver goods to customers as quickly as possible.

  16. Modeling human mortality using mixtures of bathtub shaped failure distributions.

    Science.gov (United States)

    Bebbington, Mark; Lai, Chin-Diew; Zitikis, Ricardas

    2007-04-07

    Aging and mortality are usually modeled by the Gompertz-Makeham distribution, where the mortality rate accelerates with age in adult humans. The resulting parameters are interpreted as the frailty and the decrease in vitality with age. This fits well to life data from 'westernized' societies, where the data are accurate, of high resolution, and show the effects of high-quality post-natal care. We show, however, that when the data are of lower resolution and contain considerable structure in the infant mortality, the fit can be poor. Moreover, the Gompertz-Makeham distribution is consistent with neither the force of natural selection nor the recently identified 'late life mortality deceleration'. Although actuarial models such as the Heligman-Pollard distribution can, in theory, achieve an improved fit, the lack of a closed form for the survival function makes fitting extremely arduous, and the biological interpretation can be lacking. We show that a mixture, assigning mortality to exogenous or endogenous causes and using the reduced additive and flexible Weibull distributions, models human mortality well over the entire life span. The components of the mixture are asymptotically consistent with the reliability and biological theories of aging. The relative simplicity of the mixture distribution makes feasible a technique where the curvature functions of the corresponding survival and hazard rate functions are used to identify the beginning and the end of various life phases, such as infant mortality, the end of the force of natural selection, and late-life mortality deceleration. We illustrate our results with a comparative analysis of Canadian and Indonesian mortality data.

  17. MAIL DISTRIBUTION

    CERN Multimedia

    Mail Office

    2001-01-01

    PLEASE NOTE: changed schedule for mail distribution on 21 December 2001: afternoon delivery will be one hour earlier than usual, and delivery to LHC sites will take place in the late morning. Thank you for your understanding.

  18. Stable distributions

    CERN Document Server

    Janson, Svante

    2011-01-01

    We give some explicit calculations for stable distributions and convergence to them, mainly based on less explicit results in Feller (1971). The main purpose is to provide ourselves with easy reference to explicit formulas. (There are no new results.)

  19. Spatial distribution

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger

    2008-01-01

    Living organisms are distributed over the entire surface of the planet. The distribution of the individuals of each species is not random; on the contrary, they are strongly dependent on the biology and ecology of the species, and vary over different spatial scale. The structure of whole...... populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns......, depending on the nature of intraspecific interactions between them: while the individuals of some species repel each other and partition the available area, others form groups of varying size, determined by the fitness of each group member. The spatial distribution pattern of individuals again strongly...

  20. Distributed optimality

    OpenAIRE

    Trommer, Jochen

    2005-01-01

    In this dissertation I propose a synthesis (Distributed Optimality, DO) of Optimality Theory and a derivational, morphological approach, Distributed Morphology (DM; Halle & Marantz, 1993). By integrating OT into DM, it becomes possible to trace phenomena that DM captures through language-specific rules or features of lexical entries back to the interaction of violable, universal constraints. Conversely, DM also makes two substantial...

  1. Distributed Subtyping

    OpenAIRE

    Baehni, Sébastien; Barreto, Joao; Guerraoui, Rachid

    2006-01-01

    One of the most frequent operations in object-oriented programs is the "instanceof" test, also called the "subtyping" test or the "type inclusion" test. This test determines if a given object is an instance of some type. Surprisingly, despite a lot of research on distributed object-oriented languages and systems, almost no work has been devoted to the implementation of this test in a distributed environment. This paper presents the first algorithm to implement the "subtyping" test on an obje...

  2. Three-parameter discontinuous distributions for hydrological samples with zero values

    Science.gov (United States)

    Weglarczyk, Stanislaw; Strupczewski, Witold G.; Singh, Vijay P.

    2005-10-01

    A consistent approach to the frequency analysis of hydrologic data in arid and semiarid regions, i.e. the data series containing several zero values (e.g. monthly precipitation in dry seasons, annual peak flow discharges, etc.), requires using discontinuous probability distribution functions. Such an approach has received relatively limited attention. Along the lines of physically based models, the extensions of the Muskingum-based models to three parameter forms are considered. Using 44 peak flow series from the USGS data bank, the fitting ability of four three-parameter models was investigated: (1) the Dirac delta combined with Gamma distribution; (2) the Dirac delta combined with two-parameter generalized Pareto distribution; (3) the Dirac delta combined with two-parameter Weibull (DWe) distribution; (4) the kinematic diffusion with one additional parameter that controls the probability of the zero event (KD3). The goodness of fit of the models was assessed and compared both by evaluation of discrepancies between the results of both estimation methods (i.e. the method of moments (MOM) and the maximum likelihood method (MLM)) and using the log of likelihood function as a criterion. In most cases, the DWe distribution with MLM-estimated parameters showed the best fit of all the three-parameter models.
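    A minimal sketch (not the authors' code) of the Dirac delta combined with two-parameter Weibull (DWe) idea: the fraction of zero values gives the point mass at zero, and the nonzero values are fitted by maximum likelihood; the series below is hypothetical.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Hypothetical series containing zero values (e.g. monthly precipitation in dry seasons, mm).
      series = np.concatenate([np.zeros(30),
                               stats.weibull_min.rvs(1.4, scale=25.0, size=120, random_state=rng)])

      p_zero = np.mean(series == 0.0)                                       # Dirac-delta weight at zero
      shape, loc, scale = stats.weibull_min.fit(series[series > 0], floc=0)  # Weibull MLE for nonzero part

      def dwe_cdf(x):
          # F(x) = p0 + (1 - p0) * Weibull CDF, for x >= 0
          return p_zero + (1.0 - p_zero) * stats.weibull_min.cdf(x, shape, loc=0, scale=scale)

      print(f"p0 = {p_zero:.2f}, shape = {shape:.2f}, scale = {scale:.1f}")
      print(f"P(X <= 10) = {dwe_cdf(10.0):.3f}")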

  3. Geographic location, network patterns and population distribution of rural settlements in Greece

    Science.gov (United States)

    Asimakopoulos, Avraam; Mogios, Emmanuel; Xenikos, Dimitrios G.

    2016-10-01

    Our work addresses the problem of how social networks are embedded in space, by studying the spread of human population over complex geomorphological terrain. We focus on villages or small cities of up to a few thousand inhabitants located in mountainous areas in Greece. This terrain presents a familiar tree-like structure of valleys and land plateaus. Cities are found more often at lower altitudes and exhibit a preference for southern orientation. Furthermore, the population generally avoids flat land plateaus and river beds, preferring locations slightly uphill, away from the plateau edge. Despite the location diversity regarding geomorphological parameters, we find certain quantitative norms when we examine location and population distributions relative to the (man-made) transportation network. In particular, settlements at radial distance ℓ away from road network junctions have the same mean altitude, practically independent of ℓ ranging from a few meters to 10 km. Similarly, the distribution of the settlement population at any given ℓ is the same for all ℓ. Finally, the cumulative distribution of the number of rural cities n(ℓ) is fitted to the Weibull distribution, suggesting that human decisions for creating settlements could be paralleled to mechanisms typically attributed to this particular statistical distribution.

  4. Nonparametric Fine Tuning of Mixtures: Application to Non-Life Insurance Claims Distribution Estimation

    Science.gov (United States)

    Sardet, Laure; Patilea, Valentin

    When pricing a specific insurance premium, the actuary needs to evaluate the claims cost distribution for the warranty. Traditional actuarial methods use parametric specifications to model the claims distribution, such as the lognormal, Weibull and Pareto laws. Mixtures of such distributions improve the flexibility of the parametric approach and seem to be quite well-adapted to capture the skewness, the long tails as well as the unobserved heterogeneity among the claims. In this paper, instead of looking for a finely tuned mixture with many components, we choose a parsimonious mixture model, typically a two- or three-component mixture. Next, we use the mixture cumulative distribution function (CDF) to transform data into the unit interval where we apply a beta-kernel smoothing procedure. A bandwidth rule adapted to our methodology is proposed. Finally, the beta-kernel density estimate is back-transformed to recover an estimate of the original claims density. The beta-kernel smoothing provides an automatic fine-tuning of the parsimonious mixture and thus avoids inference in more complex mixture models with many parameters. We investigate the empirical performance of the new method in the estimation of the quantiles with simulated nonnegative data and the quantiles of the individual claims distribution in a non-life insurance application.

  5. Particle size distributions by transmission electron microscopy: an interlaboratory comparison case study.

    Science.gov (United States)

    Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A

    2013-11-01

    This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin-Rammler-Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a
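    As a rough sketch of the reference-model fitting described above (not the protocol itself), one can fit a lognormal CDF to a cumulative number-based diameter distribution by non-linear regression and report relative standard errors of the fitted parameters; the diameters below are simulated, not RM8012 measurements.

      import numpy as np
      from scipy import stats
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(2)
      d = rng.lognormal(mean=np.log(27.6), sigma=0.08, size=400)  # hypothetical diameters (nm)

      d_sorted = np.sort(d)
      cum = np.arange(1, d.size + 1) / d.size                     # cumulative number-based distribution

      def lognormal_cdf(x, mu, sigma):
          return stats.lognorm.cdf(x, s=sigma, scale=np.exp(mu))

      popt, pcov = curve_fit(lognormal_cdf, d_sorted, cum, p0=[np.log(30.0), 0.1])
      rse = np.sqrt(np.diag(pcov)) / np.abs(popt)                 # relative standard errors of the fit
      print(f"mu = {popt[0]:.3f} (RSE {rse[0]:.1%}), sigma = {popt[1]:.3f} (RSE {rse[1]:.1%})")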

  6. Fuel distribution

    Energy Technology Data Exchange (ETDEWEB)

    Tison, R.R.; Baker, N.R.; Blazek, C.F.

    1979-07-01

    Distribution of fuel is considered from a supply point to the secondary conversion sites and ultimate end users. All distribution is intracity with the maximum distance between the supply point and end-use site generally considered to be 15 mi. The fuels discussed are: coal or coal-like solids, methanol, No. 2 fuel oil, No. 6 fuel oil, high-Btu gas, medium-Btu gas, and low-Btu gas. Although the fuel state, i.e., gas, liquid, etc., can have a major impact on the distribution system, the source of these fuels (e.g., naturally-occurring or coal-derived) does not. Single-source, single-termination point and single-source, multi-termination point systems for liquid, gaseous, and solid fuel distribution are considered. Transport modes and the fuels associated with each mode are: by truck - coal, methanol, No. 2 fuel oil, and No. 6 fuel oil; and by pipeline - coal, methane, No. 2 fuel oil, No. 6 oil, high-Btu gas, medium-Btu gas, and low-Btu gas. Data provided for each distribution system include component makeup and initial costs.

  7. Damage Distributions

    DEFF Research Database (Denmark)

    Lützen, Marie

    2001-01-01

    the damage location, the damage sizes and the main particulars of the struck vessel. From the numerical simulation and the analyse of the damage statistics it is found that the current formulation from the IMO SLF 43/3/2 can be used as basis for determination of the p-, r-, and v-factors. Expressions...... and methods of calculation have been discussed. The damage distributions for the different vessels have been compared and analyses regarding relations between damage parameters and main particulars have been performed. The damage statistics collected in work package 1 have been analysed for relations between...... for the distribution of the non-dimensional damage location, the non-dimensional damage length and the non-dimensional penetrations have been derived. These distributions have been used as basis for a proposal for the p- and r-factors. Two proposals for the v-factor have been performed using the damage statistics...

  8. Distributed creativity

    DEFF Research Database (Denmark)

    Glaveanu, Vlad Petre

    used within the literature and yet it has the potential to revolutionise the way we think about creativity, from how we define and measure it to what we can practically do to foster and develop creativity. Drawing on cultural psychology, ecological psychology and advances in cognitive science......This book challenges the standard view that creativity comes only from within an individual by arguing that creativity also exists ‘outside’ of the mind or more precisely, that the human mind extends through the means of action into the world. The notion of ‘distributed creativity’ is not commonly......, this book offers a basic framework for the study of distributed creativity that considers three main dimensions of creative work: sociality, materiality and temporality. Starting from the premise that creativity is distributed between people, between people and objects and across time, the book reviews...

  9. Distributed Logics

    Science.gov (United States)

    2014-10-03

    that must be woven into proofs of security statements. ...can be removed without damaging the logic. For all propositional letters p, E1. p ⊃ [r] p From now on, a distributed logic contains at least the...a ∈ x iff 〈h〉 ∈ x. These same definitions work for the canonical relation R for r : h y k where now a ∈ MA(k), [r] a, 〈r〉 a ∈ MA(h), x ∈ CF(h), and

  10. Not all nonnormal distributions are created equal: Improved theoretical and measurement precision.

    Science.gov (United States)

    Joo, Harry; Aguinis, Herman; Bradley, Kyle J

    2017-07-01

    We offer a four-category taxonomy of individual output distributions (i.e., distributions of cumulative results): (1) pure power law; (2) lognormal; (3) exponential tail (including exponential and power law with an exponential cutoff); and (4) symmetric or potentially symmetric (including normal, Poisson, and Weibull). The four categories are uniquely associated with mutually exclusive generative mechanisms: self-organized criticality, proportionate differentiation, incremental differentiation, and homogenization. We then introduce distribution pitting, a falsification-based method for comparing distributions to assess how well each one fits a given data set. In doing so, we also introduce decision rules to determine the likely dominant shape and generative mechanism among many that may operate concurrently. Next, we implement distribution pitting using 229 samples of individual output for several occupations (e.g., movie directors, writers, musicians, athletes, bank tellers, call center employees, grocery checkers, electrical fixture assemblers, and wirers). Results suggest that for 75% of our samples, exponential tail distributions and their generative mechanism (i.e., incremental differentiation) likely constitute the dominant distribution shape and explanation of nonnormally distributed individual output. This finding challenges past conclusions indicating the pervasiveness of other types of distributions and their generative mechanisms. Our results further contribute to theory by offering premises about the link between past and future individual output. For future research, our taxonomy and methodology can be used to pit distributions of other variables (e.g., organizational citizenship behaviors). Finally, we offer practical insights on how to increase overall individual output and produce more top performers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
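    A much simplified sketch of the comparison idea (not the paper's distribution-pitting decision rules): fit several candidate distributions by maximum likelihood and rank them with an information criterion; the individual-output sample below is synthetic.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      output = rng.lognormal(mean=1.0, sigma=0.9, size=1000)  # hypothetical individual-output data

      candidates = {"lognormal": stats.lognorm, "exponential": stats.expon,
                    "weibull": stats.weibull_min, "normal": stats.norm}

      aic = {}
      for name, dist in candidates.items():
          params = dist.fit(output)                           # maximum-likelihood fit
          loglik = np.sum(dist.logpdf(output, *params))
          aic[name] = 2 * len(params) - 2 * loglik            # smaller AIC indicates a better fit

      for name in sorted(aic, key=aic.get):
          print(f"{name:12s} AIC = {aic[name]:10.1f}")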

  11. Distributed Computing.

    Science.gov (United States)

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  12. Age Dating Fluvial Sediment Storage Reservoirs to Construct Sediment Waiting Time Distributions

    Science.gov (United States)

    Skalak, K.; Pizzuto, J. E.; Benthem, A.; Karwan, D. L.; Mahan, S.

    2015-12-01

    Suspended sediment transport is an important geomorphic process that can often control the transport of nutrients and contaminants. The time a particle spends in storage remains a critical knowledge gap in understanding particle trajectories through landscapes. We dated floodplain deposits in South River, VA, using fallout radionuclides (Pb-210, Cs-137), optically stimulated luminescence (OSL), and radiocarbon dating to determine sediment ages and construct sediment waiting time distributions. We have a total of 14 age dates in two eroding banks. We combine these age dates with a well-constrained history of mercury concentrations on suspended sediment in the river from an industrial release. Ages from fallout radionuclides document sedimentation from the early 1900s to the present, and agree with the history of mercury contamination. OSL dates span approximately 200 to 17,000 years old. We performed a standard Weibull analysis of nonexceedance to construct a waiting time distribution of floodplain sediment for the South River. The mean waiting time for floodplain sediment is 2930 years, while the median is approximately 710 years. When the floodplain waiting time distribution is combined with the waiting time distribution for in-channel sediment storage (available from previous studies), the mean waiting time shifts to approximately 680 years, suggesting that quantifying sediment waiting times for both channel and floodplain storage is critical in advancing knowledge of particle trajectories through watersheds.
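    A minimal sketch of a Weibull (plotting-position) analysis of nonexceedance for dated deposits, using hypothetical ages in place of the South River data:

      import numpy as np

      # Hypothetical floodplain sediment ages (years) from Pb-210/Cs-137, OSL and radiocarbon dating.
      ages = np.sort(np.array([60.0, 85.0, 120.0, 300.0, 520.0, 710.0, 950.0,
                               1800.0, 2600.0, 4300.0, 6800.0, 9500.0, 14000.0, 17000.0]))

      n = ages.size
      p_nonexceed = np.arange(1, n + 1) / (n + 1.0)  # Weibull plotting positions i/(n+1)

      for age, p in zip(ages, p_nonexceed):
          print(f"age {age:7.0f} yr   P(T <= age) = {p:.3f}")
      print(f"mean waiting time ~ {ages.mean():.0f} yr, median ~ {np.median(ages):.0f} yr")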

  13. MAIL DISTRIBUTION

    CERN Multimedia

    J. Ferguson

    2002-01-01

    Following discussions with the mail contractor and Mail Service personnel, an agreement has been reached which permits deliveries to each distribution point to be maintained, while still achieving a large proportion of the planned budget reduction in 2002. As a result, the service will revert to its previous level throughout the Laboratory as rapidly as possible. Outgoing mail will be collected from a single collection point at the end of each corridor. Further discussions are currently in progress between ST, SPL and AS divisions on the possibility of an integrated distribution service for internal mail, stores items and small parcels, which could lead to additional savings from 2003 onwards, without affecting service levels. J. Ferguson AS Division

  14. Quasihomogeneous distributions

    CERN Document Server

    von Grudzinski, O

    1991-01-01

    This is a systematic exposition of the basics of the theory of quasihomogeneous (in particular, homogeneous) functions and distributions (generalized functions). A major theme is the method of taking quasihomogeneous averages. It serves as the central tool for the study of the solvability of quasihomogeneous multiplication equations and of quasihomogeneous partial differential equations with constant coefficients. Necessary and sufficient conditions for solvability are given. Several examples are treated in detail, among them the heat and the Schrödinger equation. The final chapter is devoted to quasihomogeneous wave front sets and their application to the description of singularities of quasihomogeneous distributions, in particular to quasihomogeneous fundamental solutions of the heat and of the Schrödinger equation.

  15. Distributed Games

    OpenAIRE

    Dov Monderer; Moshe Tennenholtz

    1997-01-01

    The Internet exhibits forms of interactions which are not captured by existing models in economics, artificial intelligence and game theory. New models are needed to deal with these multi-agent interactions. In this paper we present a new model--distributed games. In such a model each player controls a number of agents which participate in asynchronous parallel multi-agent interactions (games). The agents jointly and strategically control the level of information monitoring by broadcasting m...

  16. Distributed scheduling

    OpenAIRE

    Toptal, Ayşegül

    1999-01-01

    Ankara : Department of Industrial Engineering and the Institute of Engineering and Science of Bilkent Univ., 1999. Thesis (Master's) -- Bilkent University, 1999. Includes bibliographical references. Distributed Scheduling (DS) is a new paradigm that enables the local decision-makers to make their own schedules by considering local objectives and constraints within the boundaries and the overall objective of the whole system. Local schedules from different parts of the system are...

  17. Influence of Flexible Crop Yield Distribution on Crop Insurance Premium Rate——A Case Study on Cotton Insurance in Three Areas of Xinjiang

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Based on a theoretical analysis of crop risk and premium rate setting, we take the case of premium rate setting for insurance on cotton yield in Shache County, Shaya County and Aksu City of Xinjiang. Using parametric methods and insurance actuarial techniques, we select the optimal model for risk fitting of cotton yield in the three areas, compare the premium rates calculated under four risk distribution assumptions for cotton yield with the rational premium rate, and analyze the impact of the risk distribution of crop yield on premium rate setting. The empirical results show that the Logistic distribution is the optimal distribution for risk fitting of cotton yield in the three areas; the rational net premium rate of cotton insurance in the three areas is 7.62%, 6.32% and 4.96%, respectively; and there are errors in premium rate setting under the normal, normalized skew and Weibull distribution assumptions, ranging from 0.2 to 8 percentage points. This indicates that the selection of the yield risk distribution model directly affects the accuracy of crop premium rate setting, and that the key to accurate premium rate setting lies in the correct selection of the yield risk distribution model.
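    A simplified Monte-Carlo illustration (not the paper's actuarial procedure) of how the assumed yield distribution changes the net premium rate, taken here as the expected shortfall below a guaranteed yield divided by that guaranteed yield; all numbers are hypothetical.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      mean_yield, sd_yield = 1500.0, 300.0          # hypothetical cotton yield statistics (kg/ha)
      y_guar = 0.85 * mean_yield                    # hypothetical guaranteed (covered) yield

      def net_rate(sample):
          # net premium rate = E[max(y_guar - Y, 0)] / y_guar
          return np.mean(np.maximum(y_guar - sample, 0.0)) / y_guar

      samples = {
          "normal":   rng.normal(mean_yield, sd_yield, 200000),
          # logistic scale chosen so the variance matches sd_yield**2
          "logistic": stats.logistic.rvs(loc=mean_yield, scale=sd_yield * np.sqrt(3.0) / np.pi,
                                         size=200000, random_state=rng),
          # Weibull scale chosen so the mean matches mean_yield (Gamma(1.2) ~ 0.918 for shape 5)
          "weibull":  stats.weibull_min.rvs(5.0, scale=mean_yield / 0.918,
                                            size=200000, random_state=rng),
      }
      for name, sample in samples.items():
          print(f"{name:8s} net premium rate = {net_rate(sample):.2%}")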

  18. Distribution switchgear

    CERN Document Server

    Stewart, Stan

    2004-01-01

    Switchgear plays a fundamental role within the power supply industry. It is required to isolate faulty equipment, divide large networks into sections for repair purposes, reconfigure networks in order to restore power supplies and control other equipment. This book begins with the general principles of the Switchgear function and leads on to discuss topics such as interruption techniques, fault level calculations, switching transients and electrical insulation, making this an invaluable reference source. Solutions to practical problems associated with Distribution Switchgear are also included.

  19. Mail distribution

    CERN Multimedia

    2007-01-01

    Please note that starting from 1 March 2007, the mail distribution and collection times will be modified for the following buildings: 6, 8, 9, 10, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 29, 69, 40, 70, 101, 102, 109, 118, 152, 153, 154, 155, 166, 167, 169, 171, 174, 261, 354, 358, 576, 579 and 580. Complementary Information on the new times will be posted on the entry doors and left in the mail boxes of each building. TS/FM Group

  20. Beyond word frequency: bursts, lulls, and scaling in the temporal distributions of words.

    Directory of Open Access Journals (Sweden)

    Eduardo G Altmann

    Full Text Available BACKGROUND: Zipf's discovery that word frequency distributions obey a power law established parallels between biological and physical processes, and language, laying the groundwork for a complex systems perspective on human communication. More recent research has also identified scaling regularities in the dynamics underlying the successive occurrences of events, suggesting the possibility of similar findings for language as well. METHODOLOGY/PRINCIPAL FINDINGS: By considering frequent words in USENET discussion groups and in disparate databases where the language has different levels of formality, here we show that the distributions of distances between successive occurrences of the same word display bursty deviations from a Poisson process and are well characterized by a stretched exponential (Weibull) scaling. The extent of this deviation depends strongly on semantic type -- a measure of the logicality of each word -- and less strongly on frequency. We develop a generative model of this behavior that fully determines the dynamics of word usage. CONCLUSIONS/SIGNIFICANCE: Recurrence patterns of words are well described by a stretched exponential distribution of recurrence times, an empirical scaling that cannot be anticipated from Zipf's law. Because the use of words provides a uniquely precise and powerful lens on human thought and activity, our findings also have implications for other overt manifestations of collective human dynamics.

  1. Initiation and propagation life distributions of fatigue cracks and the life evaluation in high cycle fatigue of ADI; ADI zai no ko cycle hiro kiretsu hassei shinten jumyo bunpu tokusei to jumyo hyoka

    Energy Technology Data Exchange (ETDEWEB)

    Ochi, Y.; Ishii, A. [University of Electro Communications, Tokyo (Japan); Ogata, T. [Hitachi Metals, Ltd., Tokyo (Japan); Kubota, M. [Kyushu University, Fukuoka (Japan). Faculty of Engineering

    1997-10-15

    Rotating bending fatigue tests were carried out on austempered ductile cast iron (ADI) in order to investigate the statistical properties of the life distributions of crack initiation and propagation, and also the evaluation of fatigue life. The results are summarized as follows: (1) The size of crack initiation sites of the material was represented by a Weibull distribution, regardless of the kind of crack initiation site, such as microshrinkage or graphite grain. The crack initiation life scattered widely, but the scatter became much smaller as soon as the cracks grew. (2) The crack propagation life Nac, which was defined as the minimum crack propagation rate, showed lower scatter than the crack initiation life. (3) The fatigue life of the material was evaluated well by Nac and the propagation rate after Nac. It was clear that the fatigue life of ductile cast iron was governed by the scatter of Nac. 8 refs., 13 figs., 4 tabs.

  2. Accelerated corrosion test and corrosion failure distribution model of aircraft structural aluminum alloy

    Institute of Scientific and Technical Information of China (English)

    LIU Wen-lin; MU Zhi-tao; JIN Ping

    2006-01-01

    Based on 10 years of corrosion damage data for a type of aircraft aluminum alloy, a statistical analysis was conducted using the Gumbel, normal and two-parameter Weibull distribution functions. The results show that the aluminum alloy structural member has a corrosion history of pitting corrosion-intergranular corrosion-exfoliation corrosion, and that the maximum corrosion depth conforms to a normal distribution. The accelerated corrosion test was carried out with the compiled equivalent airport accelerated environment spectrum. The corrosion damage failure modes of the aluminum alloy structural member indicate that the period of validity of the former protective coating is about 2.5 to 3 years, and that of the novel protective coating is about 4.0 to 4.5 years. The corrosion kinetics law of the aluminum spar flange was established by fitting corrosion damage test data. The law indicates two apparent corrosion stages of the high-strength aluminum alloy section material: pitting corrosion and intergranular corrosion/exfoliation corrosion. The test results agree with the statistical fit of corrosion data collected from corroded members in service. The fractional error is 5.8% for the same calendar year. The accelerated corrosion test validates the corrosion kinetics law of the aircraft aluminum alloy in service.

  3. Computer Simulation of Fiber Length and Width Distribution for Two Poplar Woods

    Institute of Scientific and Technical Information of China (English)

    ZHANGDongmei; HOUZhuqiang; GUANNing

    2004-01-01

    Computer simulation was carried out on fiber length and width for plantation-grown Chinese white poplar (Populus tomentosa Cart. clone) and plantation-grown poplar 1-72 (P. x eurumericana (Dode) Guiner cv.). The skewness and kurtosis of the measured results showed that the distributions of fiber length and width departed from the normal distribution. A three-parameter Weibull density function was used in this investigation and the corresponding program was written in Turbo C. The results showed that the profiles of the simulated length and width histograms were similar to those of the measured histograms, and that there was good agreement between the simulated and measured means of fiber length and width. The seed used in the random number generator and the number of simulated variables had little influence on the simulated means, indicating that the simulation was stable when the seed and the number were altered. Different histograms can be obtained with different values of the location, shape and scale parameters, corresponding to different values of the minimum, mean and standard deviation of fiber length and width. The simulation presented here can be used as a tool for studies on variations in fiber morphology.
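    A brief sketch of the same kind of simulation (not the authors' Turbo C program): drawing fiber lengths from a three-parameter Weibull with hypothetical location, shape and scale values, and checking the simulated mean and standard deviation against the theoretical ones.

      import math
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      loc, shape, scale = 0.3, 2.2, 0.8   # hypothetical location (mm), shape, scale (mm)

      sim = stats.weibull_min.rvs(shape, loc=loc, scale=scale, size=5000, random_state=rng)

      mean_theory = loc + scale * math.gamma(1.0 + 1.0 / shape)
      std_theory = scale * math.sqrt(math.gamma(1.0 + 2.0 / shape) - math.gamma(1.0 + 1.0 / shape) ** 2)
      print(f"simulated mean = {sim.mean():.3f} mm  (theory {mean_theory:.3f} mm)")
      print(f"simulated std  = {sim.std():.3f} mm  (theory {std_theory:.3f} mm)")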

  4. Multi-attribute Fuzzy Decision-making Method of Test Interval Based on Weibull Proportional Hazards Model

    Institute of Scientific and Technical Information of China (English)

    张仕新; 刘义乐; 陈杰翔

    2012-01-01

    Based on a study of the Weibull proportional hazards model, the test interval is calculated with an acceptable failure risk as the constraint. Because equipment use is affected by multiple attributes such as test expense, failure risk, operational availability and downtime, a weighted projection compromise method is used to establish a multi-attribute fuzzy decision-making model for the state test interval, realizing optimal decision-making on the test interval under multi-factor conditions. Finally, the applicability of the model is validated through an example analysis.

  5. The Distribution Model of CNC System to Failure Based on Maximum Likelihood Estimation

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    The Weibull distribution is widely applied in reliability engineering and lifetime data analysis. For the two-parameter Weibull distribution, a parameter estimation model based on maximum likelihood is established, and the second-order convergent Newton-Raphson iteration method is used to solve for the scale and shape parameters. In the iteration process, the region where the likelihood function curve lies near its zero point is preliminarily selected as the interval for the initial value, using likelihood function plots produced in Matlab, and the range of the iterative initial value is further determined from the sufficient conditions for convergence of the Newton-Raphson method. A three-dimensional plot of the iteration trend, drawn in Matlab, agrees with the results of the iterative calculation. Comparison shows that this parameter estimation model and the Newton-Raphson iterative solution are accurate and efficient.
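    A minimal sketch of the profile-likelihood Newton-Raphson iteration for the two-parameter Weibull (not the paper's Matlab code); the failure-time sample is hypothetical.

      import numpy as np

      def weibull_mle_newton(x, k0=1.0, tol=1e-10, max_iter=100):
          """Two-parameter Weibull MLE: Newton-Raphson on the profile score equation for the shape k."""
          x = np.asarray(x, dtype=float)
          lnx = np.log(x)
          k = k0
          for _ in range(max_iter):
              xk = x ** k
              s0, s1, s2 = xk.sum(), (xk * lnx).sum(), (xk * lnx ** 2).sum()
              g = s1 / s0 - 1.0 / k - lnx.mean()                 # score in k (zero at the MLE)
              dg = (s2 * s0 - s1 ** 2) / s0 ** 2 + 1.0 / k ** 2  # its derivative (always positive)
              step = g / dg
              k -= step
              if abs(step) < tol:
                  break
          scale = np.mean(x ** k) ** (1.0 / k)                   # closed-form scale given k
          return k, scale

      rng = np.random.default_rng(6)
      failures = rng.weibull(1.8, size=60) * 400.0               # hypothetical CNC times to failure (h)
      shape_hat, scale_hat = weibull_mle_newton(failures)
      print(f"shape = {shape_hat:.3f}, scale = {scale_hat:.1f} h")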

  6. The Folded t Distribution

    OpenAIRE

    Psarakis, Stelios; Panaretos, John

    1990-01-01

    Measurements are frequently recorded without their algebraic sign. As a consequence the underlying distribution of measurements is replaced by a distribution of absolute measurements. When the underlying distribution is t the resulting distribution is called the “folded-t distribution”. Here we study this distribution, we find the relationship between the folded-t distribution and a special case of the folded normal distribution and we derive relationships of the folded-t distribution to othe...

  7. Rainfall extremes: Toward reconciliation after the battle of distributions.

    Science.gov (United States)

    Serinaldi, Francesco; Kilsby, Chris G

    2014-01-01

    [1] This study attempts to reconcile the conflicting results reported in the literature concerning the behavior of peak-over-threshold (POT) daily rainfall extremes and their distribution. By using two worldwide data sets, the impact of threshold selection and record length on the upper tail behavior of POT observations is investigated. The rainfall process is studied within the framework of generalized Pareto (GP) exceedances according to the classical extreme value theory (EVT), with particular attention paid to the study of the GP shape parameter, which controls the heaviness of the upper tail of the GP distribution. A twofold effect is recognized. First, as the threshold decreases, and nonextreme values are progressively incorporated in the POT samples, the variance of the GP shape parameter reduces and the mean converges to positive values denoting a tendency to heavy tail behavior. Simultaneously, the EVT asymptotic hypotheses are less and less realistic, and the GP asymptote tends to be replaced by the Weibull penultimate asymptote whose upper tail is exponential but apparently heavy. Second, for a fixed high threshold, the variance of the GP shape parameter reduces as the record length (number of years) increases, and the mean values tend to be positive, thus denoting again the prevalence of heavy tail behavior. In both cases, i.e., threshold selection and record length effect, the heaviness of the tail may be ascribed to mechanisms such as the blend of extreme and nonextreme values, and fluctuations of the parent distributions. It is shown how these results provide a link between previous studies and pave the way for more comprehensive analyses which merge empirical, theoretical, and operational points of view. This study also provides several ancillary results, such as a set of formulae to correct the bias of the GP shape parameter estimates due to short record lengths accounting for uncertainty, thus avoiding systematic underestimation of extremes which
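    The threshold effect described above can be sketched (outside the paper's data sets) by fitting the generalized Pareto distribution to exceedances over several thresholds and watching the shape parameter; the daily series below is synthetic.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      daily_rain = rng.gamma(shape=0.4, scale=8.0, size=20000)   # hypothetical daily rainfall (mm)

      for q in (0.90, 0.95, 0.99):
          u = np.quantile(daily_rain, q)
          excess = daily_rain[daily_rain > u] - u                # peaks over threshold
          xi, _, sigma = stats.genpareto.fit(excess, floc=0)     # GP shape xi and scale sigma
          print(f"threshold at the {q:.0%} quantile (u = {u:5.1f} mm): "
                f"xi = {xi:+.3f}, sigma = {sigma:.2f}, n = {excess.size}")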

  8. Statistical distribution of elapsed times and distances of seismic events: the case of the Southern Spain seismic catalogue

    Directory of Open Access Journals (Sweden)

    M. D. Martínez

    2005-01-01

    Full Text Available Several empiric cumulative distributions of elapsed times and distances between seismic events that occurred in the Southern Iberian Peninsula from 1985 to 2000 (data extracted from the seismic catalogue of the Andalusian Institute of Geophysics) are investigated. Elapsed times and distances between consecutive seismic events of the whole catalogue, taking into account threshold magnitudes of 2.5, 3.0, 3.5 and 4.0, and of five seismic crises, without distinguishing magnitudes, are investigated. Additionally, the series of distances and elapsed times from the main event to every aftershock are also analysed for the five seismic crises. Even though a power law is sometimes a satisfactory model for the cumulative distribution of elapsed times and distances between seismic events, in some cases a fit with a Weibull distribution for elapsed times performs better. It is worth mentioning that, in the case of the seismic crises, the fit achieved by the power law is sometimes improved when it is combined with a logarithmic law. The results derived might be a contribution to a better representation of the seismic activity by means of models that could be based on random-walk processes.

  9. Product Distributions for Distributed Optimization. Chapter 1

    Science.gov (United States)

    Bieniawski, Stefan R.; Wolpert, David H.

    2004-01-01

    With connections to bounded rational game theory, information theory and statistical mechanics, Product Distribution (PD) theory provides a new framework for performing distributed optimization. Furthermore, PD theory extends and formalizes Collective Intelligence, thus connecting distributed optimization to distributed Reinforcement Learning (RL). This paper provides an overview of PD theory and details an algorithm for performing optimization derived from it. The approach is demonstrated on two unconstrained optimization problems, one with discrete variables and one with continuous variables. To highlight the connections between PD theory and distributed RL, the results are compared with those obtained using distributed reinforcement learning inspired optimization approaches. The inter-relationship of the techniques is discussed.

  10. Particle size distributions and the sequential fragmentation/transport theory applied to volcanic ash

    Science.gov (United States)

    Wohletz, K. H.; Sheridan, M. F.; Brown, W. K.

    1989-11-01

    The assumption that distributions of mass versus size interval for fragmented materials fit the log normal distribution is empirically based and has historical roots in the late 19th century. Other often used distributions (e.g., Rosin-Rammler, Weibull) are also empirical and have the general form for mass per size interval: n(l) = k l^α exp(-l^β), where n(l) represents the number of particles of diameter l, l is the normalized particle diameter, and k, α, and β are constants. We describe and extend the sequential fragmentation distribution to include transport effects upon observed volcanic ash size distributions. The sequential fragmentation/transport (SFT) distribution is also of the above mathematical form, but it has a physical basis rather than empirical. The SFT model applies to a particle-mass distribution formed by a sequence of fragmentation (comminution) and transport (size sorting) events acting upon an initial mass m': n(x, m) = C ∫∫ n(x', m') p(ξ) dx' dm', where x' denotes spatial location along a linear axis, C is a constant, and integration is performed over distance from an origin to the sample location and mass limits from 0 to m. We show that the probability function that models the production of particles of different size from an initial mass and sorts that distribution, p(ξ), is related to m^g, where g (noted as γ for fragmentation processes) is a free parameter that determines the location, breadth, and skewness of the distribution; g (γ) must be greater than -1, and it increases from that value as the distribution matures with greater number of sequential steps in the fragmentation or transport process; γ is expected to be near -1 for "sudden" fragmentation mechanisms such as single-event explosions and transport mechanisms that are functionally dependent upon particle mass. This free parameter will be more positive for evolved fragmentation mechanisms such as ball milling and complex transport processes such as saltation. The SFT

  11. Size effect on strength and lifetime probability distributions of quasibrittle structures

    Indian Academy of Sciences (India)

    Zdeněk P Bažant; Jia-Liang Le

    2012-02-01

    Engineering structures such as aircraft, bridges, dams, nuclear containments and ships, as well as computer circuits, chips and MEMS, should be designed for failure probability < $10^{-6}-10^{-7}$ per lifetime. The safety factors required to ensure it are still determined empirically, even though they represent much larger and much more uncertain corrections to deterministic calculations than do the typical errors of modern computer analysis of structures. The empirical approach is sufficient for perfectly brittle and perfectly ductile structures since the cumulative distribution function (cdf) of random strength is known, making it possible to extrapolate to the tail from the mean and variance. However, the empirical approach does not apply to structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared to structure size. This paper presents a refined theory on the strength distribution of quasibrittle structures, which is based on the fracture mechanics of nanocracks propagating by activation energy controlled small jumps through the atomic lattice and an analytical model for the multi-scale transition of strength statistics. Based on the power law for creep crack growth rate and the cdf of material strength, the lifetime distribution of quasibrittle structures under constant load is derived. Both the strength and lifetime cdfs are shown to be size- and geometry-dependent. The theory predicts intricate size effects on both the mean structural strength and lifetime, the latter being much stronger. The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.

  12. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

    Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first one is concerned with the continuous distributions and their relations. The second one presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.

  13. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decreasing hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we provide an overview of distributed computing, studying the difference between parallel and distributed computing, the terminology used in distributed computing, task allocation in distribute...

  14. An analysis of the effect of temperature on the pattern of wind energy distribution in the Caribbean region

    Directory of Open Access Journals (Sweden)

    Abrahams Mwasha

    2012-07-01

    Full Text Available The exploitation of the wind energy resource is expected to have a key role in climate change mitigation in the Caribbean region. However, wind energy is also affected by climate change, and its availability and reliability depend on the climate conditions. An evaluation based on the Weibull distribution model is performed on the average monthly wind speed variation at a standard height of 10 m over a ten-year period on the island of Trinidad. The variations in the power density per unit area of the wind are examined in relation to temperature changes over the last ten-year period. Temperature is examined since increased temperatures provide significant evidence of global warming. It was found that in Piarco, Trinidad, the highest average annual power density per unit area was 14.42 W/m² and occurred in 2004 at an annual average temperature of 32.1°C. Comparing these values with the power densities per unit area and temperatures for other years, temperature did not have any significant impact on the distribution of wind energy. As such, the wind power potential in Trinidad is not jeopardized by climate change.
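    A small sketch (not the paper's evaluation) of the Weibull-based calculation: fit the Weibull shape k and scale c to wind speeds and compute the mean power density per unit area as P/A = 0.5 ρ c³ Γ(1 + 3/k); the wind speeds and air density below are hypothetical.

      import math
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      wind = rng.weibull(2.0, size=8760) * 6.5        # hypothetical hourly wind speeds at 10 m (m/s)

      k, _, c = stats.weibull_min.fit(wind, floc=0)   # Weibull shape k and scale c (m/s)
      rho = 1.16                                      # hypothetical air density (kg/m^3)

      # Mean power density per unit area: P/A = 0.5 * rho * E[v^3] = 0.5 * rho * c^3 * Gamma(1 + 3/k)
      p_density = 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
      print(f"k = {k:.2f}, c = {c:.2f} m/s, P/A = {p_density:.1f} W/m^2")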

  15. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    Science.gov (United States)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials--including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software

  16. The Kumaraswamy Laplace Distribution

    Directory of Open Access Journals (Sweden)

    Manal Mohamed Nassar

    2016-12-01

    Full Text Available A generalized Laplace distribution using the Kumaraswamy distribution is introduced. Different structural properties of this new distribution are derived, including the moments, and the moment generating function. We discuss maximum likelihood estimation of the model parameters and obtain the observed and expected information matrix. A real data set is used to compare the new model with widely known distributions.

  17. The impact of distributed ...

    African Journals Online (AJOL)

    eobe

    Keywords: distributed generation; wind energy; integration of renewable sources and technical losses.

  18. Remuestreo Bootstrap y Jackknife en confiabilidad: Caso Exponencial y Weibull

    Directory of Open Access Journals (Sweden)

    Javier Ramírez-Montoya

    2016-01-01

    Full Text Available The Bootstrap-t and Jackknife delete-I and delete-II resampling methods are compared using the nonparametric Kaplan-Meier and Nelson-Aalen estimators, which are frequently used in practice, taking into account different censoring percentages, sample sizes and times of interest. The comparison is carried out via simulation, using the mean squared error.

  19. Weibull Analysis and Area Scaling for Infrared Window Materials (U)

    Science.gov (United States)

    2016-08-01

    ...are ground and polished by the same methods used to make the window. Even if machining of coupons is matched as well as possible to that of the

  20. Extended Poisson Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Anum Fatima

    2015-09-01

    Full Text Available A new mixture of the Modified Exponential (ME) and Poisson distributions is introduced in this paper. Taking the maximum of Modified Exponential random variables when the sample size follows a zero-truncated Poisson distribution, we derive the new distribution, named the Extended Poisson Exponential distribution. This distribution possesses increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential and Exponential distributions are special cases of this distribution. We also investigate some mathematical properties of the distribution, along with its information entropies and order statistics. The parameters are estimated using the maximum likelihood procedure. Finally, we illustrate a real data application of our distribution.

  1. Generalized Lindley Distribution

    Directory of Open Access Journals (Sweden)

    H. Zakerzadeh

    2009-06-01

    Full Text Available In this paper, we introduce a three–parameter generalization of the Lindley distribution. This includes as special cases the exponential and gamma distributions. The distribution exhibits decreasing, increasing and bathtub hazard rate depending on its parameters. We study various properties of the new distribution and provide numerical examples to show the flexibility of the model. We also derive a bivariate version of the proposed distribution.

  2. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  3. CHARACTERISTICS OF FRAGMENT SIZE DISTRIBUTION OF DUCTILE MATERIALS FRAGMENTIZED UNDER HIGH STRAINRATE TENSION

    Institute of Scientific and Technical Information of China (English)

    郑宇轩; 陈磊; 胡时胜; 周风华

    2013-01-01

    The finite element method has been used to simulate the fracture and fragmentation of ductile metallic rings undergoing high-rate expansion, and the numerical fragments obtained from the simulations at different initial expansion velocities were collected for statistical analysis. It is found that: (1) the cumulative distributions of the normalized fragment sizes at different initial expansion velocities are similar, and collectively the fragment size distributions can be modeled as a Weibull distribution with an initial threshold; approximately, this distribution can be further simplified to a Rayleigh distribution, which is the special case with the Weibull parameter equal to 2; (2) the cumulative distribution of the fragment sizes exhibits a step-like nature, which means that the fragment sizes may be "quantized". A Monte-Carlo model is established to describe the origin of such quantization. In the model, fractures occur at the sites where the tensioned material necks, the spacing of the necking sites follows a narrow Weibull distribution, and the fragment size is the sum of a random number of necking spacings. The probabilistic simulations show that, unless the early distribution of necking spacings is very broad, the discreteness of selection necessarily leads to a degree of quantization in the fragment size distribution. Explosive expansion fragmentation experiments were performed on L04 industrial pure aluminum and oxygen-free copper specimens, and the size distributions of the recovered fragments are essentially consistent with the theoretical analysis.

  4. A Comparison Study on the Performances Of X , EWMA and CUSUM Control Charts for Skewed Distributions Using Weighted Standard Deviations Methods

    Directory of Open Access Journals (Sweden)

    Abdu. M. A. Atta

    2011-12-01

    Full Text Available In many statistical process control (SPC) applications, the ease of use of control charts leads to ignoring the fact that the process population of the quality characteristic being measured may be highly skewed, and in many situations the normality assumption is violated. Among the recent heuristic charts for skewed distributions proposed in the literature are those based on the weighted standard deviation (WSD) method. This paper therefore compares the performances of certain WSD charts, namely the WSD X, WSD exponentially weighted moving average (WSD-EWMA) and WSD cumulative sum (WSD-CUSUM) charts, for skewed distributions. The skewed distributions considered are the Weibull, gamma and lognormal. The false alarm and mean shift detection rates were computed to evaluate the performances of the WSD charts. The WSD X chart was found to have the lowest false alarm rate in the cases of both known and unknown parameters. Moreover, for both known and unknown parameters, the WSD-CUSUM chart provided the highest mean shift detection rates. The chart with the lowest false alarm rate and the highest mean shift detection rate for most levels of skewness and sample sizes n is considered to have the better performance.

  5. Time Gap Modeling Using Mixture Distributions under Mixed Traffic Conditions

    Institute of Scientific and Technical Information of China (English)

    DUBEY Subodh Kant; PONNU Balaji; ARKATKAR Shriniwas S

    2013-01-01

    Time-gap data have been modeled with non-composite distributions up to a flow level of 1 800 vph, but such models are not capable of describing time-gap data at higher flow levels. Composite distributions have been proposed to overcome this problem; however, because the calibration of the model parameters used in composite distributions is tedious, their use may be relatively limited. In this paper, five mixture models, namely Exponential + Extreme-value (EEV), Lognormal + Extreme-value (LEV), Weibull + Extreme-value (WEV), Weibull + Lognormal (WLN) and Exponential + Lognormal (ELN), are used to model time-gap data for flows ranging from 1 900 vph to 4 100 vph. Two types of goodness-of-fit tests were performed: one based on the cumulative distribution function (CDF) and one based on the two-sample Cramer-von Mises and K-sample Anderson-Darling tests. Among all five models, Weibull + Extreme-value was found to be the best mixture model for time-gap data, as it performed consistently well in the Cramer-von Mises and K-sample Anderson-Darling tests.
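    A compact sketch of fitting one such mixture (Weibull + Extreme-value) by direct maximization of the log-likelihood; this is a generic illustration with synthetic gaps, not the calibration used in the paper.

      import numpy as np
      from scipy import stats, optimize

      rng = np.random.default_rng(9)
      # Hypothetical time gaps (s): constrained (small) and free-flowing (larger) components.
      gaps = np.concatenate([stats.weibull_min.rvs(2.0, scale=1.2, size=700, random_state=rng),
                             stats.gumbel_r.rvs(loc=3.0, scale=1.0, size=300, random_state=rng)])
      gaps = gaps[gaps > 0]

      def neg_loglik(theta):
          # theta = [logit weight, log Weibull shape, log Weibull scale, Gumbel location, log Gumbel scale]
          w = 1.0 / (1.0 + np.exp(-theta[0]))
          c, s = np.exp(theta[1]), np.exp(theta[2])
          mu, b = theta[3], np.exp(theta[4])
          pdf = (w * stats.weibull_min.pdf(gaps, c, scale=s)
                 + (1.0 - w) * stats.gumbel_r.pdf(gaps, loc=mu, scale=b))
          return -np.sum(np.log(pdf + 1e-300))

      x0 = [0.0, 0.0, 0.0, float(np.median(gaps)), 0.0]
      res = optimize.minimize(neg_loglik, x0, method="Nelder-Mead",
                              options={"maxiter": 20000, "maxfev": 20000})
      w_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
      print(f"Weibull-component weight = {w_hat:.2f}, converged = {res.success}")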

  6. Modified Slash Lindley Distribution

    Directory of Open Access Journals (Sweden)

    Jimmy Reyes

    2017-01-01

    Full Text Available In this paper we introduce a new distribution, called the modified slash Lindley distribution, which can be seen as an extension of the Lindley distribution. We show that this new distribution provides more flexibility in terms of kurtosis and skewness than the Lindley distribution. We derive moments and some basic properties for the new distribution. Moment estimators and maximum likelihood estimators are calculated using numerical procedures. We carry out a simulation study for the maximum likelihood estimators. A fit of the proposed model indicates good performance when compared with other less flexible models.

  7. Hyperfinite Representation of Distributions

    Indian Academy of Sciences (India)

    J Sousa Pinto; R F Hoskins

    2000-11-01

    Hyperfinite representation of distributions is studied following the method introduced by Kinoshita [2, 3], although we use a different approach much in the vein of [4]. Products and Fourier transforms of representatives of distributions are also analysed.

  8. Distributed computer control systems

    Energy Technology Data Exchange (ETDEWEB)

    Suski, G.J.

    1986-01-01

    This book focuses on recent advances in the theory, applications and techniques for distributed computer control systems. Contents (partial): Real-time distributed computer control in a flexible manufacturing system. Semantics and implementation problems of channels in a DCCS specification. Broadcast protocols in distributed computer control systems. Design considerations of distributed control architecture for a thermal power plant. The conic toolset for building distributed systems. Network management issues in distributed control systems. Interprocessor communication system architecture in a distributed control system environment. Uni-level homogenous distributed computer control system and optimal system design. A-nets for DCCS design. A methodology for the specification and design of fault tolerant real time systems. An integrated computer control system - architecture design, engineering methodology and practical experience.

  9. Drinking Water Distribution Systems

    Science.gov (United States)

    Learn about an overview of drinking water distribution systems, the factors that degrade water quality in the distribution system, assessments of risk, future research about these risks, and how to reduce cross-connection control risk.

  10. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

    This paper provides detailed insights into predictability of the entire stock and bond return distribution through the use of quantile regression. This allows us to examine specific parts of the return distribution such as the tails or the center, and for a sufficiently fine grid of quantiles we can trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions are predictable as a function of economic state variables. The results are, however, very different for stocks and bonds. The state variables primarily predict only location shifts in the stock return distribution, while they also predict changes in higher-order moments in the bond return distribution. Out...
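
    As a hedged illustration of the quantile-regression idea (simulated returns and a single made-up state variable, using statsmodels' QuantReg rather than the paper's data or specification), estimating a grid of conditional quantiles traces out how the return distribution shifts with the state:

```python
# Hypothetical sketch: quantile regression of simulated "returns" on a state variable.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
state = rng.normal(size=n)                         # an invented economic state variable
# returns whose location depends on the state; noise is heteroskedastic
returns = 0.2 * state + (1.0 + 0.5 * np.abs(state)) * rng.standard_normal(n)

X = sm.add_constant(state)
for q in (0.05, 0.25, 0.50, 0.75, 0.95):           # a grid of quantiles traces the distribution
    res = sm.QuantReg(returns, X).fit(q=q)
    print(f"q={q:.2f}: intercept={res.params[0]: .3f}, slope on state={res.params[1]: .3f}")
```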

  11. Unintegrated double parton distributions

    CERN Document Server

    Golec-Biernat, K

    2016-01-01

    We present the construction of unintegrated double parton distribution functions which include dependence on transverse momenta of partons. We extend the formulation which was used to obtain the single unintegrated parton distributions from the standard, integrated parton distribution functions. Starting from the homogeneous part of the evolution equations for the integrated double parton distributions, we construct the unintegrated double parton distributions as the convolutions of the integrated double distributions and the splitting functions, multiplied by the Sudakov form factors. We show that there exist three domains of external hard scales which require three distinct forms of the unintegrated double distributions. The additional transverse momentum dependence which arises through the Sudakov form factors leads to non-trivial correlations in the parton momenta. We also discuss the non-homogeneous contribution to the unintegrated double parton distributions, which arises due to the splitting of a singl...

  12. Intelligent Distributed Systems

    Science.gov (United States)

    2015-10-23

    AFRL-AFOSR-VA-TR-2016-0006, Intelligent Distributed Systems, A. Stephen Morse, Yale University, New Haven, CT, Final Report, 10/23/2015, Distribution A (approved for public release). Cited work: ... and D. Fullmer, A distributed algorithm for efficiently solving linear equations and its applications, System and Control Letters, 2015 (submitted). AF Office of Scientific Research (AFOSR)/RTA2, Arlington, Virginia 22203; Air Force Research Laboratory.

  13. Leadership for Distributed Teams

    NARCIS (Netherlands)

    De Rooij, J.P.G.

    2009-01-01

    The aim of this dissertation was to study the little examined, yet important issue of leadership for distributed teams. Distributed teams are defined as: “teams of which members are geographically distributed and are therefore working predominantly via mediated communication means on an

  15. Probability distributions for magnetotellurics

    Energy Technology Data Exchange (ETDEWEB)

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
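
    A small Monte Carlo experiment in the spirit of the report (the true transfer-function value, error level and sample size below are arbitrary assumptions) summarizes the phase and the logarithm of the squared magnitude of a noisy complex ratio, the quantities whose normal approximations are discussed above:

```python
# Hedged illustration: distribution of a ratio of two noisy complex quantities.
import numpy as np

rng = np.random.default_rng(1)
N, Z = 1.0 + 1.0j, 2.0 + 0.5j                     # "true" numerator and denominator
err = 0.25                                        # 25% relative error in each
noise = lambda: (rng.standard_normal(100000) + 1j * rng.standard_normal(100000)) / np.sqrt(2)
est = (N * (1 + err * noise())) / (Z * (1 + err * noise()))   # transfer-function estimates

phase = np.angle(est)
log_mag2 = np.log(np.abs(est) ** 2)
for name, x in [("phase", phase), ("log |T|^2", log_mag2)]:
    skew = ((x - x.mean()) ** 3).mean() / x.std() ** 3        # rough normality check
    print(f"{name:10s}: mean={x.mean(): .3f}, std={x.std():.3f}, skewness={skew: .3f}")
```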

  16. Fire Regime in Marginal Jack Pine Populations at Their Southern Limit of Distribution, Riding Mountain National Park, Central Canada

    Directory of Open Access Journals (Sweden)

    Jacques C. Tardif

    2016-09-01

    Full Text Available In central Canada, long fire history reconstructions are rare. In a context where both anthropogenic and climate influences on fire regime have changed, Parks Canada has a mandate to maintain ecological integrity. Here we present a fire history derived from fire-scarred jack pine (Pinus banksiana Lamb.) trees growing at their southern distribution limit in Riding Mountain National Park (RMNP). In Lake Katherine Fire Management Unit (LKFMU), a subregion within the park, fire history was reconstructed from archival records, tree-ring records, and charcoal in lake sediment. From about 1450 to 1850 common era (CE) the fire return intervals varied from 37 to 125 years, according to models. During the period 1864–1930 the study area burned frequently (Weibull Mean Fire Intervals between 2.66 and 5.62 years); this period coincided with the end of First Nations occupation and the start of European settlement. Major recruitment pulses were associated with the stand-replacing 1864 and 1894 fires. This period nevertheless corresponded to a reduction in charcoal accumulation. The current fire-free period in LKFMU (1930–today) coincides with RMNP establishment, exclusion of First Nations land use and increased fire suppression. Charcoal accumulation further decreased during this period. In the absence of fire, jack pine exclusion in LKFMU is foreseeable and the use of prescribed burning is advocated to conserve this protected jack pine ecosystem, at the southern margins of its range, and in the face of potential climate change.
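
    As a minimal sketch (the fire-scar years below are invented, not the RMNP record), a Weibull Mean Fire Interval of the kind quoted above can be obtained by fitting a Weibull distribution to composite fire-free intervals:

```python
# Hypothetical fire-scar years -> fire-free intervals -> Weibull Mean Fire Interval (WMFI).
import numpy as np
from scipy import stats

fire_years = np.array([1712, 1734, 1756, 1780, 1803, 1824, 1853, 1864, 1872,
                       1885, 1894, 1910, 1918, 1930])          # made-up composite record
intervals = np.diff(fire_years).astype(float)                  # years between fires

# Two-parameter Weibull fit (location fixed at 0) to the fire-free intervals
shape, loc, scale = stats.weibull_min.fit(intervals, floc=0)
wmfi = stats.weibull_min.mean(shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.1f} yr, Weibull Mean Fire Interval={wmfi:.1f} yr")
```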

  17. Regular variation, Paretian distributions, and the interplay of light and heavy tails in the fractality of asymptotic models

    Science.gov (United States)

    Pestana, Dinis D.; Aleixo, Sandra M.; Rocha, J. Leonel

    Classical central limit theorems, culminating in the theory of infinite divisibility, accurately describe the behaviour of stochastic phenomena with asymptotically negligible components. The classical theory fails when a single component may assume an extreme protagonism. The early developments of the speculation theory did not incorporate the pioneer work of Pareto on heavy tailed models, and the proper setup to conciliate regularity and abrupt changes, in a wide range of natural phenomena, is Karamata's concept of regular variation and the role it plays in the theory of domains of attraction, [8], and Resnick's tail equivalence leading to the importance of the generalized Pareto distribution in the scope of extreme value theory, [13]. Waliszewski and Konarski discussed the applicability of the Gompertz curve and its fractal behaviour, for instance in modeling healthy and neoplastic cell tissue growth, [15]. The Gompertz function is the Gumbel extreme value model, whose broad domain of attraction contains intermediate tail weight laws with a wide range of behaviour. Aleixo et al. investigated fractality associated with Beta (p,q) models, [1], [2], [10] and [11]. In this work, we introduce a new family of probability density functions tied to the classical beta family, the Beta*(p,q) models, some of which are generalized Pareto, that span the possible regular variation of tails. We extend the investigation to other extreme stable models, namely Fréchet's and Weibull's types in the General Extreme Value (GEV) model.

  18. Extreme value distributions

    CERN Document Server

    Ahsanullah, Mohammad

    2016-01-01

    The aim of the book is to give a thorough account of the basic theory of extreme value distributions. The book covers a wide range of material available to date. The central ideas and results of extreme value distributions are presented. The book will be useful to applied statisticians as well as statisticians interested in working in the area of extreme value distributions. The monograph gives a self-contained account of the theory and applications of extreme value distributions.

  19. Verification of LHS distributions.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
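
    This is not Sandia's LHS code; the snippet below reproduces the same verification idea with SciPy's Latin hypercube sampler: draw stratified uniforms, transform them to normal, lognormal and uniform samples, and check each with a Kolmogorov-Smirnov test (all parameter values are arbitrary):

```python
# Sketch of distribution-correctness checking for Latin hypercube samples.
import numpy as np
from scipy import stats
from scipy.stats import qmc

u = qmc.LatinHypercube(d=3, seed=42).random(2000)     # stratified uniforms on [0, 1)^3
checks = [
    ("normal",    stats.norm,    (10.0, 2.0)),        # loc, scale
    ("lognormal", stats.lognorm, (0.5, 0.0, np.e)),   # s, loc, scale
    ("uniform",   stats.uniform, (0.0, 5.0)),         # loc, scale
]
for i, (name, dist, args) in enumerate(checks):
    x = dist.ppf(u[:, i], *args)                      # LHS sample of this distribution
    d, p = stats.kstest(x, dist.cdf, args=args)       # compare sample against target CDF
    print(f"{name:9s}: KS statistic={d:.4f}, p-value={p:.3f}")
```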

  20. Distributed security in closed distributed systems

    DEFF Research Database (Denmark)

    Hernandez, Alejandro Mario

    reflected in heterogeneous security aims; the software life cycle entails evolution and this includes security expectations; the distribution is useful if the entire system is “open” to new (a priori unknown) interactions; the distribution itself poses intrinsically more complex security-related problems......, and aim at providing security to each of these individually. The approach taken is by means of access control enforcement mechanisms, providing security to the locations they are related to. We provide a framework for modelling so. All this follows techniques borrowed from the aspect-orientation community....... As this needs to be scaled up to the entire distributed system, we then focus on ways of reasoning about the resulting composition of these individual access control mechanisms. We show how, by means of relying on the semantics of our framework, we can syntactically guarantee some limited set of global security...

  1. Leadership for Distributed Teams

    OpenAIRE

    De Rooij, J.P.G.

    2009-01-01

    The aim of this dissertation was to study the little examined, yet important issue of leadership for distributed teams. Distributed teams are defined as: “teams of which members are geographically distributed and are therefore working predominantly via mediated communication means on an interdependent task and in realizing a joint goal” (adapted from Bell & Kozlowski, 2002 and Dubé & Paré, 2004). Chapter 1 first presents the outline of the dissertation. Next, several characteristics of distri...

  2. Distributed generation systems model

    Energy Technology Data Exchange (ETDEWEB)

    Barklund, C.R.

    1994-12-31

    A slide presentation is given on a distributed generation systems model developed at the Idaho National Engineering Laboratory, and its application to a situation within the Idaho Power Company's service territory. The objectives of the work were to develop a screening model for distributed generation alternatives, to develop a better understanding of distributed generation as a utility resource, and to further INEL's understanding of utility concerns in implementing technological change.

  3. Deciding bisimilarities on distributions

    DEFF Research Database (Denmark)

    Eisentraut, Christian; Hermanns, Holger; Krämer, Julia

    2013-01-01

    Probabilistic automata (PA) are a prominent compositional concurrency model. As a way to justify property-preserving abstractions, in the last years, bisimulation relations over probability distributions have been proposed both in the strong and the weak setting. Different to the usual bisimulati...... is known so far. This paper presents an equivalent state-based reformulation for weak distribution bisimulation, rendering it amenable for algorithmic treatment. Then, decision procedures for the probability distribution-based bisimulation relations are presented....

  4. Sorting a distribution theory

    CERN Document Server

    Mahmoud, Hosam M

    2011-01-01

    A cutting-edge look at the emerging distributional theory of sorting Research on distributions associated with sorting algorithms has grown dramatically over the last few decades, spawning many exact and limiting distributions of complexity measures for many sorting algorithms. Yet much of this information has been scattered in disparate and highly specialized sources throughout the literature. In Sorting: A Distribution Theory, leading authority Hosam Mahmoud compiles, consolidates, and clarifies the large volume of available research, providing a much-needed, comprehensive treatment of the

  5. Parton Distribution Function Uncertainties

    CERN Document Server

    Giele, Walter T.; Kosower, David A.; Giele, Walter T.; Keller, Stephane A.; Kosower, David A.

    2001-01-01

    We present parton distribution functions which include a quantitative estimate of their uncertainties. The parton distribution functions are optimized with respect to deep inelastic proton data, expressing the uncertainties as a density measure over the functional space of parton distribution functions. This leads to a convenient method of propagating the parton distribution function uncertainties to new observables, now expressing the uncertainty as a density in the prediction of the observable. New measurements can easily be included in the optimized sets as added weight functions to the density measure. With this optimized method, no compromises have to be made anywhere in the analysis with regard to the treatment of the uncertainties.

  6. Advanced air distribution

    DEFF Research Database (Denmark)

    Melikov, Arsen Krikor

    2011-01-01

    The aim of total volume air distribution (TVAD) is to achieve uniform temperature and velocity in the occupied zone and an environment designed for an average occupant. The supply of large amounts of clean and cool air is needed to maintain temperature and pollution concentration at acceptable....... Ventilation in hospitals is essential to decrease the risk of airborne cross-infection. At present, mixing air distribution at a minimum of 12 ach is used in infection wards. Advanced air distribution has the potential to aid in achieving healthy, comfortable and productive indoor environments at levels...... higher than what can be achieved today with the commonly used total volume air distribution principles....

  7. Electric distribution systems

    CERN Document Server

    Sallam, A A

    2010-01-01

    "Electricity distribution is the penultimate stage in the delivery of electricity to end users. The only book that deals with the key topics of interest to distribution system engineers, Electric Distribution Systems presents a comprehensive treatment of the subject with an emphasis on both the practical and academic points of view. Reviewing traditional and cutting-edge topics, the text is useful to practicing engineers working with utility companies and industry, undergraduate graduate and students, and faculty members who wish to increase their skills in distribution system automation and monitoring."--

  8. Distributed Energy Technology Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Distributed Energy Technologies Laboratory (DETL) is an extension of the power electronics testing capabilities of the Photovoltaic System Evaluation Laboratory...

  9. Distributed Structure Searchable Toxicity

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Distributed Structure Searchable Toxicity (DSSTox) online resource provides high quality chemical structures and annotations in association with toxicity data....

  10. Smart Distribution Systems

    Directory of Open Access Journals (Sweden)

    Yazhou Jiang

    2016-04-01

    Full Text Available The increasing importance of system reliability and resilience is changing the way distribution systems are planned and operated. To achieve a distribution system self-healing against power outages, emerging technologies and devices, such as remote-controlled switches (RCSs) and smart meters, are being deployed. The higher level of automation is transforming traditional distribution systems into the smart distribution systems (SDSs) of the future. The availability of data and remote control capability in SDSs provides distribution operators with an opportunity to optimize system operation and control. In this paper, the development of SDSs and resulting benefits of enhanced system capabilities are discussed. A comprehensive survey is conducted on the state-of-the-art applications of RCSs and smart meters in SDSs. Specifically, a new method, called Temporal Causal Diagram (TCD), is used to incorporate outage notifications from smart meters for enhanced outage management. To fully utilize the fast operation of RCSs, the spanning tree search algorithm is used to develop service restoration strategies. Optimal placement of RCSs and the resulting enhancement of system reliability are discussed. Distribution system resilience with respect to extreme events is presented. Test cases are used to demonstrate the benefit of SDSs. Active management of distributed generators (DGs) is introduced. Future research in a smart distribution environment is proposed.

  11. Metrics for Food Distribution.

    Science.gov (United States)

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in food distribution, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  12. Epicentral distribution in 2007

    Institute of Scientific and Technical Information of China (English)

    CHEN Pei-shan

    2008-01-01

    To show the epicentral distribution in and near China as well as all over the world, two epicentral maps for the earthquakes that occurred in the preceding year are published annually in the sixth issue of each year. Figures 1 and 2 represent the epicentral distributions in and near China and all over the world in 2007, respectively.

  13. Epicentral distribution in 2004

    Institute of Scientific and Technical Information of China (English)

    CHEN Pei-shan

    2005-01-01

    To show the epicentral distribution in and near China as well as all over the world, two epicentral maps for the earthquakes that occurred in the preceding year are published annually in the sixth issue of each year. Figures 1 and 2 represent the epicentral distributions in and near China and all over the world in 2004, respectively.

  14. Epicentral distribution in 2005

    Institute of Scientific and Technical Information of China (English)

    CHEN Pei-shan

    2006-01-01

    To show the epicentral distribution in and near China as well as all over the world, two epicentral maps for the earthquakes that occurred in the preceding year are published annually in the sixth issue of each year. Figures 1 and 2 represent the epicentral distributions in and near China and all over the world in 2005, respectively.

  15. Epicentral distribution in 2006

    Institute of Scientific and Technical Information of China (English)

    CHEN Pei-shan

    2007-01-01

    To show the epicentral distribution in and near China as well as all over the world, two epicentral maps for the earthquakes that occurred in the preceding year are published annually in the sixth issue of each year. Figures 1 and 2 represent the epicentral distributions in and near China and all over the world in 2006, respectively.

  16. Distribution Functions of Copulas

    Institute of Scientific and Technical Information of China (English)

    LI Yong-hong; He Ping

    2007-01-01

    A general method was proposed to evaluate the distribution function of 〈C1|C2〉. Some examples were presented to validate the application of the method. Then the necessary and sufficient condition under which the distribution function of 〈C1|C2〉 is uniform was proved.

  17. Distributed Control Diffusion

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2007-01-01

    , self-reconfigurable robots, we present the concept of distributed control diffusion: distributed queries are used to identify modules that play a specific role in the robot, and behaviors that implement specific control strategies are diffused throughout the robot based on these role assignments...... perform simple obstacle avoidance in a wide range of different car-like robots constructed using ATRON modules...

  18. Advanced Distribution Management System

    Science.gov (United States)

    Avazov, Artur R.; Sobinova, Liubov A.

    2016-02-01

    This article describes the advisability of using advanced distribution management systems in the electricity distribution networks area and considers premises of implementing ADMS within the Smart Grid era. Also, it gives the big picture of ADMS and discusses the ADMS advantages and functionalities.

  19. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]|[Oak Ridge National Lab., TN (US); Rowan, T.H. [Oak Ridge National Lab., TN (US); Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  20. Property insurance loss distributions

    Science.gov (United States)

    Burnecki, Krzysztof; Kukla, Grzegorz; Weron, Rafał

    2000-11-01

    Property claim services (PCS) provides indices for losses resulting from catastrophic events in the US. In this paper, we study these indices and take a closer look at distributions underlying insurance claims. Surprisingly, the lognormal distribution seems to give a better fit than the Paretian one. Moreover, lagged autocorrelation study reveals a mean-reverting structure of indices returns.

  1. Distributed Energy Implementation Options

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Chandralata N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-13

    This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.

  2. Distributed operating systems

    NARCIS (Netherlands)

    Mullender, Sape J.

    1987-01-01

    In the past five years, distributed operating systems research has gone through a consolidation phase. On a large number of design issues there is now considerable consensus between different research groups. In this paper, an overview of recent research in distributed systems is given. In turn, th

  3. Evaluating Distributed Timing Constraints

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing time-optimal evaluation of timing constraints in distributed real-time systems.

  4. Quantum dense key distribution

    CERN Document Server

    Degiovanni, I P; Castelletto, S; Rastello, M L; Bovino, F A; Colla, A M; Castagnoli, G C

    2004-01-01

    This paper proposes a new protocol for quantum dense key distribution. This protocol embeds the benefits of quantum dense coding and quantum key distribution and is able to generate shared secret keys four times more efficiently than the BB84 protocol. We hereinafter prove the security of this scheme against individual eavesdropping attacks, and we present preliminary experimental results, showing its feasibility.

  5. Electrical Distribution Program Guide.

    Science.gov (United States)

    Georgia Univ., Athens. Dept. of Vocational Education.

    This program guide contains the standard electrical distribution curriculum for technical institutes in Georgia. The curriculum encompasses the minimum competencies required for entry-level workers in the electrical distribution field, and in job skills such as construction, maintenance, and repair of overhead and underground electrical…

  6. Groundwater and Distribution Workbook.

    Science.gov (United States)

    Ekman, John E.

    Presented is a student manual designed for the Wisconsin Vocational, Technical and Adult Education Groundwater and Distribution Training Course. This program introduces waterworks operators-in-training to basic skills and knowledge required for the operation of a groundwater distribution waterworks facility. Arranged according to the general order…

  7. Distributed System Evaluation

    Science.gov (United States)

    1990-07-01

    computers. If a distributed operating system is designed with asynchrony in mind, efficient usage of overall system resources can be employed through the...the complex problem of efficiently balancing CPU, disk, and communications resource usage in the distributed environment must be solved by the...throughput (concurrent processing capability), survivability and availability, and finally interprocess communication. In measuring the concurrent

  8. Pore size distribution mapping

    OpenAIRE

    Strange, John H.; J. Beau W. WEBBER; Schmidt, S.D.

    1996-01-01

    Pore size distribution mapping has been demonstrated using NMR cryoporometry in the presence of a magnetic field gradient. This novel method is extendable to 2D and 3D mapping. It offers a unique nondestructive method of obtaining full pore-size distributions in the range 3 to 100 nm at any point within a bulk sample.

  9. Cache Oblivious Distribution Sweeping

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.

    2002-01-01

    We adapt the distribution sweeping method to the cache oblivious model. Distribution sweeping is the name used for a general approach for divide-and-conquer algorithms where the combination of solved subproblems can be viewed as a merging process of streams. We demonstrate by a series of algorithms...

  10. Sesame allergy threshold dose distribution.

    Science.gov (United States)

    Dano, D; Remington, B C; Astier, C; Baumert, J L; Kruizinga, A G; Bihain, B E; Taylor, S L; Kanny, G

    2015-09-01

    Sesame is a relevant food allergen in France. Compared to other allergens there is a lack of food challenge data, and more data could help sesame allergy risk management. The aim of this study is to collect more sesame challenge data and investigate the most efficient food challenge method for future studies. Records of patients at University Hospital in Nancy (France) with objective symptoms to sesame challenges were collected and combined with previously published data. An estimation of the sesame allergy population threshold was calculated based on individual NOAELs and LOAELs. Clinical dosing schemes at Nancy were investigated to see if the optimal protocol for sesame is currently used. Fourteen patients (10 M/4 F, 22 ± 14.85 years old) with objective symptoms were added to previously published data, making a total of 35 sesame-allergic patients. The most sensitive patient reacted to the first dose at challenge of 1.02 mg sesame protein. The ED05 ranges between 1.2 and 4.0 mg of sesame protein (Log-Normal, Log-Logistic, and Weibull models) and the ED10 between 4.2 and 6.2 mg. The optimal food challenge dosing scheme for sesame follows semi-log dose increases from 0.3 to 3000 mg protein. This article provides a valuable update to the existing clinical literature regarding sesame NOAELs and LOAELs. Establishment of a population threshold for sesame could help increase the credibility of precautionary labelling and decrease the costs associated with unexpected allergic reactions. Also, the use of an optimal dosing scheme would decrease the time spent on diagnosis and thereby the economic burden of sesame allergy diagnosis. Copyright © 2015 Elsevier Ltd. All rights reserved.
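
    The short sketch below uses hypothetical individual reacting doses (not the Nancy records) and a simplified uncensored fit; the published analysis used interval-censored NOAEL/LOAEL data, so this only illustrates how an ED05/ED10 is read off a fitted threshold distribution:

```python
# Illustrative only: read ED05/ED10 population thresholds off fitted dose distributions.
import numpy as np
from scipy import stats

doses_mg_protein = np.array([1.02, 2.5, 5.1, 8.0, 12.0, 30.0, 65.0, 120.0, 260.0,
                             400.0, 640.0, 900.0, 1500.0, 2100.0])   # hypothetical reacting doses

for name, dist in [("Weibull", stats.weibull_min), ("Log-normal", stats.lognorm)]:
    params = dist.fit(doses_mg_protein, floc=0)       # keep the threshold dose positive
    ed05 = dist.ppf(0.05, *params)                    # dose at which 5% of patients react
    ed10 = dist.ppf(0.10, *params)
    print(f"{name:10s}: ED05 = {ed05:.1f} mg, ED10 = {ed10:.1f} mg sesame protein")
```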

  11. Distributed Robotics Education

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    Distributed robotics takes many forms, for instance, multirobots, modular robots, and self-reconfigurable robots. The understanding and development of such advanced robotic systems demand extensive knowledge in engineering and computer science. In this paper, we describe the concept...... of a distributed educational system as a valuable tool for introducing students to interactive parallel and distributed processing programming as the foundation for distributed robotics and human-robot interaction development. This is done by providing an educational tool that enables problem representation...... to be changed, related to multirobot control and human-robot interaction control from virtual to physical representation. The proposed system is valuable for bringing a vast number of issues into education – such as parallel programming, distribution, communication protocols, master dependency, connectivity...

  12. Distributed plot-making

    DEFF Research Database (Denmark)

    Jensen, Lotte Groth; Bossen, Claus

    2016-01-01

    different socio-technical systems (paper-based and electronic patient records). Drawing on the theory of distributed cognition and narrative theory, primarily inspired by the work done within health care by Cheryl Mattingly, we propose that the creation of overview may be conceptualised as ‘distributed plot......-making’. Distributed cognition focuses on the role of artefacts, humans and their interaction in information processing, while narrative theory focuses on how humans create narratives through the plot construction. Hence, the concept of distributed plot-making highlights the distribution of information processing...... between different social actors and artefacts, as well as the filtering, sorting and ordering of such information into a narrative that is made coherent by a plot. The analysis shows that the characteristics of paper-based and electronic patient records support or obstruct the creation of overview in both...

  13. Fast Distributed Gradient Methods

    CERN Document Server

    Jakovetic, Dusan; Moura, Jose M F

    2011-01-01

    The paper proposes new fast distributed optimization gradient methods and proves convergence to the exact solution at rate O(\log k/k), much faster than existing distributed optimization (sub)gradient methods with convergence O(1/\sqrt{k}), while incurring practically no additional communication or computation cost overhead per iteration. We achieve this for convex (with at least one strongly convex), coercive, three times differentiable and with Lipschitz continuous first derivative (private) cost functions. Our work recovers for distributed optimization similar convergence rate gains obtained by centralized Nesterov gradient and fast iterative shrinkage-thresholding algorithm (FISTA) methods over ordinary centralized gradient methods. We also present a constant step size distributed fast gradient algorithm for composite non-differentiable costs. A simulation illustrates the effectiveness of our distributed methods.
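
    The sketch below is not the accelerated method of the paper; it is the baseline distributed gradient iteration (mix with neighbours, then step along the local gradient) that such methods build on, run on a made-up ring network of six nodes with private quadratic costs:

```python
# Baseline distributed gradient descent on a ring network (illustrative, not the paper's method).
import numpy as np

rng = np.random.default_rng(0)
n = 6
b = rng.normal(size=n)                      # private targets; the global optimum is mean(b)

# Metropolis weights for a ring network (symmetric, doubly stochastic mixing matrix W)
W = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 - W[i].sum()

x = np.zeros(n)                             # one scalar estimate per node
alpha = 0.1                                 # constant step size
for _ in range(200):
    grad = x - b                            # local gradients of f_i(x) = 0.5*(x - b_i)^2
    x = W @ x - alpha * grad                # consensus step + local gradient step

print("node estimates:", np.round(x, 3), " true optimum:", round(b.mean(), 3))
```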

  14. Development of distributed target

    CERN Document Server

    Yu Hai Jun; Li Qin; Zhou Fu Xin; Shi Jin Shui; Ma Bing; Chen Nan; Jing Xiao Bing

    2002-01-01

    A linear induction accelerator is expected to generate small-diameter X-ray spots with high intensity. The interaction of the electron beam with plasmas generated at the X-ray converter makes the spot on the target grow with time and degrades the X-ray dose and the imaging resolving power. A distributed target is developed which has about 24 thin 0.05 mm tantalum films distributed over 1 cm. Owing to this structure, spreading the target material over a large volume decreases the energy deposition per unit volume and hence reduces the temperature of the target surface, which in turn reduces the initial plasma formation and its expansion velocity. A comparison and analysis of the two kinds of target structures are presented using numerical calculations and experiments; the results show that the X-ray dose and normalized angular distribution of the two are basically the same, while the surface of the distributed target is not destroyed like that of the previous block target.

  15. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  16. Distributed Propulsion Vehicles

    Science.gov (United States)

    Kim, Hyun Dae

    2010-01-01

    Since the introduction of large jet-powered transport aircraft, the majority of these vehicles have been designed by placing thrust-generating engines either under the wings or on the fuselage to minimize aerodynamic interactions on the vehicle operation. However, advances in computational and experimental tools along with new technologies in materials, structures, and aircraft controls, etc. are enabling a high degree of integration of the airframe and propulsion system in aircraft design. The National Aeronautics and Space Administration (NASA) has been investigating a number of revolutionary distributed propulsion vehicle concepts to increase aircraft performance. The concept of distributed propulsion is to fully integrate a propulsion system within an airframe such that the aircraft takes full synergistic benefits of coupling of airframe aerodynamics and the propulsion thrust stream by distributing thrust using many propulsors on the airframe. Some of the concepts are based on the use of distributed jet flaps, distributed small multiple engines, gas-driven multi-fans, mechanically driven multifans, cross-flow fans, and electric fans driven by turboelectric generators. This paper describes some early concepts of the distributed propulsion vehicles and the current turboelectric distributed propulsion (TeDP) vehicle concepts being studied under NASA's Subsonic Fixed Wing (SFW) Project to drastically reduce aircraft-related fuel burn, emissions, and noise by the year 2030 to 2035.

  17. Ruin Distributions and Their Equations

    Institute of Scientific and Technical Information of China (English)

    卢金余; 王汉兴; 赵飞

    2005-01-01

    In this paper, the ruin distributions were analyzed, including the distribution of the surplus immediately before ruin, the distribution of the claim at the time of ruin, the distribution of the deficit, and the distribution of the surplus at the beginning of the claim period before ruin. Several integral equations for the ruin distributions were derived and some solutions under special conditions were obtained.
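
    A hedged Monte Carlo companion to these quantities (the classical Cramer-Lundberg surplus process with exponential claims and arbitrary parameter values, not the paper's integral equations) estimates the ruin probability together with the surplus immediately before ruin, the claim at ruin and the deficit at ruin:

```python
# Simulate ruin in a Cramer-Lundberg model and record the ruin-related quantities.
import numpy as np

rng = np.random.default_rng(7)
u0, c, lam, mean_claim, n_paths = 5.0, 1.1, 1.0, 1.0, 20000   # surplus, premium rate, claim rate, mean claim

before, claim_at_ruin, deficit = [], [], []
for _ in range(n_paths):
    u, t = u0, 0.0
    while t < 500.0:                        # finite horizon as an approximation of ultimate ruin
        w = rng.exponential(1.0 / lam)      # waiting time until the next claim
        t += w
        u += c * w                          # premium income between claims
        x = rng.exponential(mean_claim)     # claim size
        if x > u:                           # this claim causes ruin
            before.append(u)
            claim_at_ruin.append(x)
            deficit.append(x - u)
            break
        u -= x

print(f"estimated ruin probability ~ {len(deficit) / n_paths:.3f}")
print(f"mean surplus just before ruin ~ {np.mean(before):.2f}, "
      f"mean claim at ruin ~ {np.mean(claim_at_ruin):.2f}, mean deficit ~ {np.mean(deficit):.2f}")
```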

  18. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  19. Distributions an outline

    CERN Document Server

    Marchand, Jean-Paul

    2007-01-01

    In a simple but mathematically coherent manner, this text examines the basis of the distribution theories devised by Schwartz and by Mikusinki. Rigorous and concise, it surveys the functional theory of distributions as well as the algebraic theory. Its easy generalizations offer applications to a wide variety of problems.The two-part treatment begins with the functional theory of distributions, exploring differentiation, formation of products, translation and regularization, convergence, Fourier transforms, and partial differential equations. The second half focuses on the algebraic theory of

  20. Electric power distribution handbook

    CERN Document Server

    Short, Thomas Allen

    2003-01-01

    Of the "big three" components of the electricity infrastructure, distribution typically gets the least attention, and no thorough, up-to-date treatment of the subject has been published in years. Filling that void, the Electric Power Distribution Handbook provides comprehensive information on the electrical aspects of power distribution systems. It is an unparalleled source for the background information, hard-to-find tables, graphs, methods, and statistics that power engineers need, and includes tips and solutions for problem solving and improving performance. In short, this handbook giv

  1. Electric power distribution handbook

    CERN Document Server

    Short, Thomas Allen

    2014-01-01

    Of the ""big three"" components of electrical infrastructure, distribution typically gets the least attention. In fact, a thorough, up-to-date treatment of the subject hasn't been published in years, yet deregulation and technical changes have increased the need for better information. Filling this void, the Electric Power Distribution Handbook delivers comprehensive, cutting-edge coverage of the electrical aspects of power distribution systems. The first few chapters of this pragmatic guidebook focus on equipment-oriented information and applications such as choosing transformer connections,

  2. On the Conditional Distribution of the Multivariate $t$ Distribution

    OpenAIRE

    Ding, Peng

    2016-01-01

    As alternatives to the normal distributions, $t$ distributions are widely applied in robust analysis for data with outliers or heavy tails. The properties of the multivariate $t$ distribution are well documented in Kotz and Nadarajah's book, which, however, states a wrong conclusion about the conditional distribution of the multivariate $t$ distribution. Previous literature has recognized that the conditional distribution of the multivariate $t$ distribution also follows the multivariate $t$ ...
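
    For reference, the conditional distribution in question is again multivariate $t$; a standard statement of the result (given here as a well-known formula from general knowledge, not quoted from the paper) is:

```latex
% X ~ t_p(\mu, \Sigma, \nu), partitioned as X = (X_1, X_2) with dimensions p_1 and p_2.
X_2 \mid X_1 = x_1 \;\sim\;
  t_{p_2}\!\left( \mu_2 + \Sigma_{21}\Sigma_{11}^{-1}(x_1 - \mu_1),\;
  \frac{\nu + d_1}{\nu + p_1}\left(\Sigma_{22} - \Sigma_{21}\Sigma_{11}^{-1}\Sigma_{12}\right),\;
  \nu + p_1 \right),
\qquad d_1 = (x_1 - \mu_1)^{\top}\Sigma_{11}^{-1}(x_1 - \mu_1).
```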

  3. Community Based Distribution

    African Journals Online (AJOL)

    Community Based Distribution (CBD) is a relatively new concept. It is a service that reaches ... neration; Resupply systems; Pricing of contraceptives; Mix of services ... tion on how best to design and implement the project and the community in ...

  4. Bearded Seal Distribution Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains GIS layers that depict the known spatial distributions (i.e., ranges) of the two subspecies of bearded seals (Erignathus barbatus). It was...

  5. Spotted Seal Distribution Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains GIS layers that depict the known spatial distributions (i.e., ranges) and reported breeding areas of spotted seals (Phoca largha). It was...

  6. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots, over 100 PB of storage space on disk or tape. Monitoring of status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  7. Navigating Distributed Services

    DEFF Research Database (Denmark)

    Beute, Berco

    2002-01-01

    , to a situation where they are distributed across the Internet. The second trend is the shift from a virtual environment that solely consists of distributed documents to a virtual environment that consists of both distributed documents and distributed services. The third and final trend is the increasing diversity...... of devices used to access information on the Internet. The focal point of the thesis is an initial exploration of the effects of the trends on users as they navigate the virtual environment of distributed documents and services. To begin, the thesis uses scenarios as a heuristic device to identify and analyse...... the main effects of the trends. This is followed by an exploration of the theory of navigation in Information Spaces, which is in turn followed by an overview of theories, and the state of the art in navigating distributed services. These explorations of both theory and practice resulted in a large number of topics...

  8. Projected Elliptical Distributions

    Institute of Scientific and Technical Information of China (English)

    Winfried Stute; Uwe Werner

    2005-01-01

    We introduce a new parametrization of elliptically contoured densities and study the associated family of projected (circular) distributions. In particular we investigate the trigonometric moments and some convolution properties.

  9. Ribbon Seal Distribution Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains GIS layers that depict the known spatial distributions (i.e., ranges) and reported breeding areas of ribbon seals (Histriophoca fasciata). It...

  10. Ringed Seal Distribution Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains GIS layers that depict the known spatial distributions (i.e., ranges) of the five subspecies of ringed seals (Phoca hispida). It was produced...

  11. DOLIB: Distributed Object Library

    Energy Technology Data Exchange (ETDEWEB)

    D`Azevedo, E.F.; Romine, C.H.

    1994-10-01

    This report describes the use and implementation of DOLIB (Distributed Object Library), a library of routines that emulates global or virtual shared memory on Intel multiprocessor systems. Access to a distributed global array is through explicit calls to gather and scatter. Advantages of using DOLIB include: dynamic allocation and freeing of huge (gigabyte) distributed arrays, both C and FORTRAN callable interfaces, and the ability to mix shared-memory and message-passing programming models for ease of use and optimal performance. DOLIB is independent of language and compiler extensions and requires no special operating system support. DOLIB also supports automatic caching of read-only data for high performance. The virtual shared memory support provided in DOLIB is well suited for implementing Lagrangian particle tracking techniques. We have also used DOLIB to create DONIO (Distributed Object Network I/O Library), which obtains over a 10-fold improvement in disk I/O performance on the Intel Paragon.
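
    To make the gather/scatter programming model concrete, here is a plain-Python toy (no message passing and no DOLIB calls; the class name and partitioning scheme are invented for illustration) in which a "global" array is split into per-process chunks and remote elements are touched only through explicit gather and scatter:

```python
# Conceptual sketch of a DOLIB-style distributed global array with explicit gather/scatter.
import numpy as np

class DistributedArray:
    def __init__(self, n, nprocs):
        self.bounds = np.linspace(0, n, nprocs + 1).astype(int)   # block partition per "process"
        self.chunks = [np.zeros(hi - lo) for lo, hi in zip(self.bounds[:-1], self.bounds[1:])]

    def _locate(self, idx):
        p = np.searchsorted(self.bounds, idx, side="right") - 1   # owning process and local offset
        return p, idx - self.bounds[p]

    def gather(self, indices):
        """Fetch global elements (would be remote reads on a real machine)."""
        return np.array([self.chunks[p][o] for p, o in (self._locate(i) for i in indices)])

    def scatter(self, indices, values):
        """Write global elements (would be remote writes on a real machine)."""
        for i, v in zip(indices, values):
            p, o = self._locate(i)
            self.chunks[p][o] = v

ga = DistributedArray(n=1000, nprocs=4)
ga.scatter([0, 499, 999], [1.0, 2.0, 3.0])
print(ga.gather([0, 499, 999]))            # -> [1. 2. 3.]
```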

  12. White Dwarf Mass Distribution

    CERN Document Server

    Kepler, S O; Romero, Alejandra Daniela; Ourique, Gustavo; Pelisoli, Ingrid

    2016-01-01

    We present the mass distribution for all S/N > 15 pure DA white dwarfs detected in the Sloan Digital Sky Survey up to Data Release 12, fitted with Koester models for ML2/alpha=0.8, and with Teff > 10 000 K, and for DBs with S/N >10, fitted with ML2/alpha=1.25, for Teff > 16 000 K. These mass distributions are for log g > 6.5 stars, i.e., excluding the Extremely Low Mass white dwarfs. We also present the mass distributions corrected by volume with the 1/Vmax approach, for stars brighter than g=19. Both distributions have a maximum at M=0.624 Msun but very distinct shapes. From the estimated z-distances, we deduce a disk scale height of 300 pc. We also present 10 probable halo white dwarfs, from their galactic U, V, W velocities.

  13. Quantum Key Distribution

    Science.gov (United States)

    Seshu, Ch.

    Quantum Key Distribution (QKD) uses Quantum Mechanics to guarantee secure communication. It enables two parties to produce a shared random bit string known only to them, which can be used as a key to encrypt and decrypt messages.

  14. AUTOMATED SOFTWARE DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    J.J. Strasheim

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Automated distribution of computer software via electronic means in large corporate networks is growing in popularity. The relative importance of personal computer software, in financial and logistical terms, is described and the developing need for automated software distribution explained. An actual comparative example of alternative software distribution strategies is presented and discussed, proving the viability of Electronic Software Distribution.

    AFRIKAANS ABSTRACT: Automated distribution of computer software by means of electronic methods in large corporate networks is increasingly popular. The relative importance of personal computer software in financial and logistical terms is discussed and the growing need for automated software distribution is explained. An actual comparative example of alternative software distribution strategies is presented and discussed, which demonstrates the viability of Electronic Software Distribution.

  15. Polygamy of distributed entanglement

    Science.gov (United States)

    Buscemi, Francesco; Gour, Gilad; Kim, Jeong San

    2009-07-01

    While quantum entanglement is known to be monogamous (i.e., shared entanglement is restricted in multipartite settings), here we show that distributed entanglement (or the potential for entanglement) is by nature polygamous. By establishing the concept of one-way unlocalizable entanglement (UE) and investigating its properties, we provide a polygamy inequality of distributed entanglement in tripartite quantum systems of arbitrary dimension. We also provide a polygamy inequality in multiqubit systems and several trade-offs between UE and other correlation measures.

  16. Agile & Distributed Project Management

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene

    2011-01-01

    Scrum has gained surprising momentum as an agile IS project management approach. An obvious question is why Scrum is so useful? To answer that question we carried out a longitudinal study of a distributed project using Scrum. We analyzed the data using coding and categorisation and three carefull...... and coordination mechanisms by allowing both local and global articulation of work in the project. That is why Scrum is especially useful for distributed IS project management and teamwork....

  17. Distributed computing in bioinformatics.

    Science.gov (United States)

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.

  18. Parton Distributions Working Group

    Energy Technology Data Exchange (ETDEWEB)

    de Barbaro, L.; Keller, S. A.; Kuhlmann, S.; Schellman, H.; Tung, W.-K.

    2000-07-20

    This report summarizes the activities of the Parton Distributions Working Group of the QCD and Weak Boson Physics workshop held in preparation for Run II at the Fermilab Tevatron. The main focus of this working group was to investigate the different issues associated with the development of quantitative tools to estimate parton distribution functions uncertainties. In the conclusion, the authors introduce a Manifesto that describes an optimal method for reporting data.

  19. Consistency in Distributed Systems

    OpenAIRE

    Kemme, Bettina; Ramalingam, Ganesan; Schiper, André; Shapiro, Marc; Vaswani, Kapil

    2013-01-01

    International audience; In distributed systems, there exists a fundamental trade-off between data consistency, availability, and the ability to tolerate failures. This trade-off has significant implications on the design of the entire distributed computing infrastructure such as storage systems, compilers and runtimes, application development frameworks and programming languages. Unfortunately, it also has significant, and poorly understood, implications for the designers and developers of en...

  20. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  1. Agile & Distributed Project Management

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene

    2011-01-01

    Scrum has gained surprising momentum as an agile IS project management approach. An obvious question is why Scrum is so useful? To answer that question we carried out a longitudinal study of a distributed project using Scrum. We analyzed the data using coding and categorisation and three carefully...... and coordination mechanisms by allowing both local and global articulation of work in the project. That is why Scrum is especially useful for distributed IS project management and teamwork....

  2. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  3. Moment Distributions of Phase Type

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    -type distributions. We construct representations for moment distributions based on a general matrix-exponential distribution which turns out to be a generalization of the moment distributions based on exponential distributions. For moment distributions based on phase-type distributions we find an appropriate...... alternative representation in terms of sub-intensity matrices. Finally we are able to find explicit expressions for both the Lorenz curve and the Gini index....
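
    As a numerical companion (a generic PH(alpha, S) example with an arbitrary two-phase representation, evaluated by quadrature rather than by the closed-form expressions the paper derives), the mean, Gini index and Lorenz curve of a phase-type distribution can be computed directly from its sub-intensity matrix:

```python
# Numerical Lorenz curve and Gini index for a phase-type distribution PH(alpha, S).
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad
from scipy.optimize import brentq

alpha = np.array([0.7, 0.3])                 # initial phase distribution
S = np.array([[-2.0, 1.0],                   # sub-intensity matrix
              [0.0, -1.0]])
ones = np.ones(2)
exit_rates = -S @ ones                       # exit-rate vector s = -S 1

surv = lambda x: float(alpha @ expm(S * x) @ ones)        # P(X > x)
pdf = lambda x: float(alpha @ expm(S * x) @ exit_rates)   # density alpha e^{Sx} s
mean = float(alpha @ np.linalg.solve(-S, ones))           # E[X] = alpha (-S)^{-1} 1

# Gini index via G = 1 - (1/mean) * int_0^inf (1 - F(x))^2 dx
gini = 1.0 - quad(lambda x: surv(x) ** 2, 0, np.inf)[0] / mean
print(f"mean = {mean:.3f}, Gini index = {gini:.3f}")

# A few Lorenz-curve points: L(p) = (1/mean) * int_0^{q_p} x f(x) dx
for p in (0.25, 0.50, 0.75):
    q_p = brentq(lambda x: (1.0 - surv(x)) - p, 1e-9, 100.0)   # p-quantile of the distribution
    lorenz = quad(lambda x: x * pdf(x), 0, q_p)[0] / mean
    print(f"L({p:.2f}) = {lorenz:.3f}")
```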

  4. Distributed Tracking in Distributed Sensor Networks

    Science.gov (United States)

    1988-05-26

    [Garbled numeric output and Figure 6-5 residue omitted.] Distributed Sensor Program, Final Report, by Jack K. Wolf, Andrew J. Viterbi, and Greg P. Heinzinger, San Diego, California 92121, (619) 587-1121.

  5. Distribution System Pricing with Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Hledik, Ryan [The Brattle Group, Cambridge, MA (United States); Lazar, Jim [The Regulatory Assistance Project, Montpelier, VT (United States); Schwartz, Lisa [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-08-16

    Technological changes in the electric utility industry bring tremendous opportunities and significant challenges. Customers are installing clean sources of on-site generation such as rooftop solar photovoltaic (PV) systems. At the same time, smart appliances and control systems that can communicate with the grid are entering the retail market. Among the opportunities these changes create are a cleaner and more diverse power system, the ability to improve system reliability and system resilience, and the potential for lower total costs. Challenges include integrating these new resources in a way that maintains system reliability, provides an equitable sharing of system costs, and avoids unbalanced impacts on different groups of customers, including those who install distributed energy resources (DERs) and low-income households who may be the least able to afford the transition.

  6. Analysis of meteorological droughts and dry spells in semiarid regions: a comparative analysis of probability distribution functions in the Segura Basin (SE Spain)

    Science.gov (United States)

    Pérez-Sánchez, Julio; Senent-Aparicio, Javier

    2017-08-01

    Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment and whose consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study was conducted to characterize rainfall drought in the Segura River basin, located in eastern Spain and marked by the seasonal nature of these latitudes. A daily precipitation set has been utilized for 29 weather stations during a period of 20 years (1993-2013). Furthermore, four sets of dry spell length (complete series, monthly maximum, seasonal maximum, and annual maximum) are used and simulated for all the weather stations with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offers a good fit for all the weather stations, with Wakeby emerging as the best result, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of dry spell duration probability for return periods of 2, 5, 10, and 25 years reveal the northeast-southeast gradient, with increasing periods with annual rainfall of less than 0.1 mm in the eastern third of the basin, in the proximity of the Mediterranean slope.
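
    A minimal sketch of the fit-and-test workflow described above, using SciPy; the spell lengths below are simulated placeholders, the candidate list is restricted to distributions available in scipy.stats, and the 0.2 significance level follows the abstract.

      # Sketch: fit candidate distributions to annual-maximum dry-spell lengths
      # and rank them with the Kolmogorov-Smirnov test (illustrative data only).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      spells = rng.weibull(1.4, size=20) * 30 + 5          # hypothetical annual maxima (days)

      candidates = {
          "weibull_min": stats.weibull_min,                # Weibull
          "genextreme": stats.genextreme,                  # generalized extreme value
          "genpareto": stats.genpareto,                    # generalized Pareto
          "gumbel_r": stats.gumbel_r,                      # Gumbel Max
      }

      for name, dist in candidates.items():
          params = dist.fit(spells)
          ks_stat, p_value = stats.kstest(spells, name, args=params)
          verdict = "ok" if p_value > 0.2 else "rejected"  # 0.2 significance level
          print(f"{name:12s}  KS={ks_stat:.3f}  p={p_value:.3f}  {verdict}")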

  7. Performance of soil particle-size distribution models for describing deposited soils adjacent to constructed dams in the China Loess Plateau

    Science.gov (United States)

    Zhao, Pei; Shao, Ming-an; Horton, Robert

    2011-02-01

    Soil particle-size distributions (PSD) have been used to estimate soil hydraulic properties. Various parametric PSD models have been proposed to describe the soil PSD from sparse experimental data. It is important to determine which PSD model best represents specific soils. Fourteen PSD models were examined in order to determine the best model for representing the deposited soils adjacent to dams in the China Loess Plateau; these were: Skaggs (S-1, S-2, and S-3), fractal (FR), Jaky (J), Lima and Silva (LS), Morgan (M), Gompertz (G), logarithm (L), exponential (E), log-exponential (LE), Weibull (W), van Genuchten type (VG) as well as Fredlund (F) models. Four hundred and eighty samples were obtained from soils deposited in the Liudaogou catchment. The coefficient of determination (R²), Akaike's information criterion (AIC), and the modified AIC (mAIC) were used. Based upon R² and AIC, the three- and four-parameter models were both good at describing the PSDs of deposited soils, and the LE, FR, and E models were the poorest. However, the mAIC, in conjunction with the R² and AIC results, indicated that the W model was optimal for describing the PSD of the deposited soils once the effect of the number of parameters was emphasized. This analysis was also helpful for identifying the best model. Our results are applicable to the China Loess Plateau.
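
    A sketch of how such a model comparison can be organised, assuming the least-squares form of AIC (AIC = n ln(RSS/n) + 2k); the two candidate curves (a Weibull-type and a logarithmic cumulative PSD) and the size data are illustrative placeholders, not the fourteen models or the Liudaogou samples of the study.

      # Sketch: compare two parametric particle-size distribution (PSD) models
      # by R^2 and the least-squares form of AIC (illustrative data only).
      import numpy as np
      from scipy.optimize import curve_fit

      d = np.array([0.002, 0.01, 0.05, 0.1, 0.25, 0.5, 1.0, 2.0])     # particle size (mm)
      F = np.array([0.12, 0.25, 0.48, 0.60, 0.75, 0.85, 0.93, 1.00])  # cumulative mass fraction

      def weibull_psd(d, a, b):      # Weibull-type cumulative curve
          return 1.0 - np.exp(-a * d**b)

      def log_psd(d, a, b):          # simple logarithmic curve
          return a + b * np.log(d)

      def score(model, k):
          popt, _ = curve_fit(model, d, F, p0=[1.0, 0.5], maxfev=10000)
          rss = float(np.sum((F - model(d, *popt)) ** 2))
          n = len(d)
          r2 = 1.0 - rss / float(np.sum((F - F.mean()) ** 2))
          aic = n * np.log(rss / n) + 2 * k                # least-squares AIC
          return r2, aic

      for name, model in [("Weibull", weibull_psd), ("logarithmic", log_psd)]:
          r2, aic = score(model, k=2)
          print(f"{name:12s}  R2={r2:.3f}  AIC={aic:.1f}")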

  8. Loss optimization in distribution networks with distributed generation

    DEFF Research Database (Denmark)

    Pokhrel, Basanta Raj; Nainar, Karthikeyan; Bak-Jensen, Birgitte

    2017-01-01

    This paper presents a novel power loss minimization approach in distribution grids considering network reconfiguration, distributed generation and storage installation. Identification of optimum configuration in such scenario is one of the main challenges faced by distribution system operators...

  9. A New Distribution-Random Limit Normal Distribution

    OpenAIRE

    Gong, Xiaolin; Yang, Shuzhen

    2013-01-01

    This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.

  10. Distributed Radio Interferometric Calibration

    CERN Document Server

    Yatawatta, Sarod

    2015-01-01

    Increasing data volumes delivered by a new generation of radio interferometers require computationally efficient and robust calibration algorithms. In this paper, we propose distributed calibration as a way of improving both computational cost and robustness in calibration. We exploit the data parallelism across frequency that is inherent in radio astronomical observations that are recorded as multiple channels at different frequencies. Moreover, we also exploit the smoothness of the variation of calibration parameters across frequency. Data parallelism enables us to distribute the computing load across a network of compute agents. Smoothness in frequency enables us to reformulate calibration as a consensus optimization problem. With this formulation, we enable the flow of information between compute agents calibrating data at different frequencies, without actually passing the data, and thereby improve robustness. We present simulation results to show the feasibility as well as the advantages of distribute...

  11. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
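
    The kind of superposition referred to above can be written generically (the smearing density ω(v) is a placeholder, not one of the specific forms derived in the paper) as

      P(x) = \int_0^{\infty} \omega(v)\, \frac{1}{\sqrt{2\pi v}}\, e^{-x^2/(2v)} \, \mathrm{d}v , \qquad v = \sigma^2 .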

  12. Online Distributed Sensor Selection

    CERN Document Server

    Golovin, Daniel; Krause, Andreas

    2010-01-01

    A key problem in sensor networks is to decide which sensors to query when, in order to obtain the most useful information (e.g., for performing accurate prediction), subject to constraints (e.g., on power and bandwidth). In many applications the utility function is not known a priori, must be learned from data, and can even change over time. Furthermore for large sensor networks solving a centralized optimization problem to select sensors is not feasible, and thus we seek a fully distributed solution. In this paper, we present Distributed Online Greedy (DOG), an efficient, distributed algorithm for repeatedly selecting sensors online, only receiving feedback about the utility of the selected sensors. We prove very strong theoretical no-regret guarantees that apply whenever the (unknown) utility function satisfies a natural diminishing returns property called submodularity. Our algorithm has extremely low communication requirements, and scales well to large sensor deployments. We extend DOG to allow observatio...
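
    For intuition about the diminishing-returns (submodularity) property mentioned above, here is a minimal sketch of classic greedy subset selection with a coverage-style utility; it is not the distributed, online DOG algorithm itself, and the sensors and covered locations are invented.

      # Sketch: greedy selection of k sensors under a submodular coverage utility.
      # Hypothetical sensors and the locations each one covers.
      coverage = {
          "s1": {1, 2, 3},
          "s2": {3, 4},
          "s3": {4, 5, 6, 7},
          "s4": {1, 7},
      }

      def utility(selected):
          """Submodular utility: number of distinct locations covered."""
          covered = set()
          for s in selected:
              covered |= coverage[s]
          return len(covered)

      def greedy(k):
          selected = []
          for _ in range(k):
              # Pick the sensor with the largest marginal gain (diminishing returns).
              best = max(
                  (s for s in coverage if s not in selected),
                  key=lambda s: utility(selected + [s]) - utility(selected),
              )
              selected.append(best)
          return selected

      print(greedy(2))   # e.g. ['s3', 's1']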

  13. Electricity Distribution Effectiveness

    Directory of Open Access Journals (Sweden)

    Waldemar Szpyra

    2015-12-01

    Full Text Available This paper discusses the basic concepts of cost accounting in the power industry and selected ways of assessing the effectiveness of electricity distribution. The results of effectiveness analysis of MV/LV distribution transformer replacement are presented, and unit costs of energy transmission through various medium-voltage line types are compared. The calculation results confirm the viability of replacing transformers manufactured before 1975. Replacing transformers manufactured after 1975 – only to reduce energy losses – is not economically justified. Increasing use of a PAS type line for energy transmission in local distribution networks is reasonable. Cabling these networks under the current calculation rules of discounts for excessive power outages is not viable, even in areas particularly exposed to catastrophic wire icing.

  14. Distributed Wind Market Applications

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, T.; Baring-Gould, I.

    2007-11-01

    Distributed wind energy systems provide clean, renewable power for on-site use and help relieve pressure on the power grid while providing jobs and contributing to energy security for homes, farms, schools, factories, private and public facilities, distribution utilities, and remote locations. America pioneered small wind technology in the 1920s, and it is the only renewable energy industry segment that the United States still dominates in technology, manufacturing, and world market share. The series of analyses covered by this report was conducted to assess some of the most likely ways that advanced wind turbines could be utilized apart from large, central station power systems. Each chapter represents a final report on specific market segments written by leading experts in this field. As such, this document does not speak with one voice but is rather a compendium of different perspectives, documented by a variety of people in the U.S. distributed wind field.

  15. Distributed Web Service Repository

    Directory of Open Access Journals (Sweden)

    Piotr Nawrocki

    2015-01-01

    Full Text Available The increasing availability and popularity of computer systems has resulted in a demand for new, language- and platform-independent ways of data exchange. That demand has in turn led to a significant growth in the importance of systems based on Web services. Alongside the growing number of systems accessible via Web services came the need for specialized data repositories that could offer effective means of searching for available services. The development of mobile systems and wireless data transmission technologies has allowed the use of distributed devices and computer systems on a greater scale. The accelerating growth of distributed systems might be a good reason to consider the development of distributed Web service repositories with built-in mechanisms for data migration and synchronization.

  16. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  17. A distribution network review

    Energy Technology Data Exchange (ETDEWEB)

    Fairbairn, R.J.; Maunder, D.; Kenyon, P.

    1999-07-01

    This report summarises the findings of a study reviewing the distribution network in England, Scotland and Wales to evaluate its ability to accommodate more embedded generation from both fossil fuel and renewable energy sources. The background to the study is traced, and descriptions of the existing electricity supply system, the licence conditions relating to embedded generation, and the effects of the Review of Electricity Trading Arrangements are given. The ability of the UK distribution networks to accept embedded generation is examined, and technical benefits/drawbacks arising from embedded generation, and the potential for uptake of embedded generation technologies are considered. The distribution network capacity and the potential uptake of embedded generation are compared, and possible solutions to overcome obstacles are suggested. (UK)

  18. Inferring the eccentricity distribution

    CERN Document Server

    Hogg, David W; Bovy, Jo

    2010-01-01

    Standard maximum-likelihood estimators for binary-star and exoplanet eccentricities are biased high, in the sense that the estimated eccentricity tends to be larger than the true eccentricity. As with most non-trivial observables, a simple histogram of estimated eccentricities is not a good estimate of the true eccentricity distribution. Here we develop and test a hierarchical probabilistic method for performing the relevant meta-analysis, that is, inferring the true eccentricity distribution, taking as input the likelihood functions for the individual-star eccentricities, or samplings of the posterior probability distributions for the eccentricities (under a given, uninformative prior). The method is a simple implementation of a hierarchical Bayesian model; it can also be seen as a kind of heteroscedastic deconvolution. It can be applied to any quantity measured with finite precision--other orbital parameters, or indeed any astronomical measurements of any kind, including magnitudes, parallaxes, or photometr...

  19. Discrete Pearson distributions

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, K.O. [Oak Ridge National Lab., TN (United States); Shenton, L.R. [Georgia Univ., Athens, GA (United States); Kastenbaum, M.A. [Kastenbaum (M.A.), Basye, VA (United States)

    1991-11-01

    These distributions are generated by a first order recursive scheme which equates the ratio of successive probabilities to the ratio of two corresponding quadratics. The use of a linearized form of this model will produce equations in the unknowns matched by an appropriate set of moments (assumed to exist). Given the moments we may find valid solutions. There are two cases: (1) distributions defined on the non-negative integers (finite or infinite) and (2) distributions defined on negative integers as well. For (1), given the first four moments, it is possible to set this up as equations of finite or infinite degree in the probability of a zero occurrence, the sth component being a product of s ratios of linear forms in this probability in general. For (2) the equation for the zero probability is purely linear but may involve slowly converging series; here a particular case is the discrete normal. Regions of validity are being studied. 11 refs.
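
    A minimal sketch of the recursive scheme described above for case (1), a distribution on the non-negative integers; the two quadratics and the truncation point are arbitrary illustrative choices, and the probabilities are normalised at the end.

      # Sketch: generate probabilities on 0, 1, 2, ... from the first-order recursion
      # p(x+1) / p(x) = Q1(x) / Q2(x), with Q1 and Q2 quadratics in x.
      import numpy as np

      def quad(x, a, b, c):
          return a * x**2 + b * x + c

      def discrete_pearson(n_max, q1=(0.0, 1.0, 2.0), q2=(1.0, 3.0, 2.0)):
          p = np.zeros(n_max + 1)
          p[0] = 1.0                       # unnormalised starting value
          for x in range(n_max):
              p[x + 1] = p[x] * quad(x, *q1) / quad(x, *q2)
          return p / p.sum()               # normalise to a probability distribution

      # With these coefficients the ratio is 1/(x+1), so the result is Poisson(1).
      probs = discrete_pearson(20)
      print(probs[:4], probs.sum())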

  20. Distributed photovoltaic grid transformers

    CERN Document Server

    Shertukde, Hemchandra Madhusudan

    2014-01-01

    The demand for alternative energy sources fuels the need for electric power and controls engineers to possess a practical understanding of transformers suitable for solar energy. Meeting that need, Distributed Photovoltaic Grid Transformers begins by explaining the basic theory behind transformers in the solar power arena, and then progresses to describe the development, manufacture, and sale of distributed photovoltaic (PV) grid transformers, which help boost the electric DC voltage (generally at 30 volts) harnessed by a PV panel to a higher level (generally at 115 volts or higher) once it is

  1. Market Sentiments Distribution Law

    Directory of Open Access Journals (Sweden)

    Jorge Reyes-Molina

    2016-09-01

    Full Text Available The Stock Exchange is basically ruled by the extreme market sentiments of euphoria and fear. The type of sentiment is given by the color of the candlestick (white = bullish sentiment, black = bearish sentiment), while the intensity of the sentiment is given by its size. In this paper you will see that the intensity of any sentiment is astonishingly distributed in a robust, systematic and universal way, according to a law of exponential decay, the conclusion of which is supported by the analysis of the Lyapunov exponent, the information entropy and the frequency distribution of candlestick size.
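
    A sketch of how such an exponential-decay claim can be checked, assuming candle body size is measured as |close - open|; the price series below is simulated, not market data.

      # Sketch: fit an exponential law to candlestick body sizes |close - open|.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      closes = 100 + np.cumsum(rng.normal(0.0, 1.0, size=1001))  # simulated daily closes
      bodies = np.abs(np.diff(closes))                           # proxy for candle body size

      loc, scale = stats.expon.fit(bodies, floc=0.0)             # exponential decay fit
      print(f"estimated decay rate: {1.0 / scale:.3f} per unit of body size")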

  2. Nuclear parton distributions

    Directory of Open Access Journals (Sweden)

    Kulagin S. A.

    2017-01-01

    Full Text Available We review a microscopic model of the nuclear parton distribution functions, which accounts for a number of nuclear effects including Fermi motion and nuclear binding, nuclear meson-exchange currents, off-shell corrections to bound nucleon distributions and nuclear shadowing. We also discuss applications of this model to a number of processes including lepton-nucleus deep inelastic scattering, proton-nucleus Drell-Yan lepton pair production at Fermilab, as well as W± and Z0 boson production in proton-lead collisions at the LHC.

  3. Liquidity, welfare and distribution

    Directory of Open Access Journals (Sweden)

    Martín Gil Samuel

    2012-01-01

    Full Text Available This work presents a dynamic general equilibrium model where wealth distribution is endogenous. I provide channels of causality that suggest a complex relationship between financial markets and real activity which breaks down the classical dichotomy. As a consequence, the Friedman rule does not hold. In terms of the current events taking place in the world economy, this paper provides a rationale to warn against the perils of an economy satiated with liquidity. Efficiency and distribution cannot thus be considered as separate attributes once we account for the interactions between financial markets and economic performance.

  4. Theory of distributions

    CERN Document Server

    Georgiev, Svetlin G

    2015-01-01

    This book explains many fundamental ideas on the theory of distributions. The theory of partial differential equations is one of the synthetic branches of analysis that combines ideas and methods from different fields of mathematics, ranging from functional analysis and harmonic analysis to differential geometry and topology. This presents specific difficulties to those studying this field. This book, which consists of 10 chapters, is suitable for upper undergraduate/graduate students and mathematicians seeking an accessible introduction to some aspects of the theory of distributions. It can also be used for a one-semester course.

  5. THERMAL DISTRIBUTION SYSTEM EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    KRAJEWSKI,R.F.; ANDREWS,J.W.; WEI,G.

    1999-09-01

    A laboratory experiment has been conducted which tests for the effects of distribution system purging on system Delivery Effectiveness (DE) as defined in ASHRAE 152P. The experiment is described in its configuration, instrumentation, and data acquisition system. Data gathered in the experiment is given and discussed. The results show that purging of the distribution system alone does not offer any improvement of the system DE. Additional supporting tests were conducted regarding experimental simulations of buffer zones and bare pipe and are also discussed.

  6. A Distributed Tier-1

    DEFF Research Database (Denmark)

    Fischer, Lars; Grønager, Michael; Kleist, Josva

    2008-01-01

    The Tier-1 facility operated by the Nordic DataGrid Facility (NDGF) differs significantly from other Tier-1s in several aspects: firstly, it is not located at one or a few premises, but instead is distributed throughout the Nordic countries; secondly, it is not under the governance of a single organization but instead is a meta-center built of resources under the control of a number of different national organizations. We present some technical implications of these aspects as well as the high-level design of this distributed Tier-1. The focus will be on computing services, storage and monitoring....

  7. Distributed User Interfaces

    CERN Document Server

    Gallud, Jose A; Penichet, Victor M R

    2011-01-01

    The recent advances in display technologies and mobile devices are having an important effect on the way users interact with all kinds of devices (computers, mobile devices, laptops, tablets, and so on). These are opening up new possibilities for interaction, including the distribution of the UI (User Interface) amongst different devices, and imply that the UI can be split and composed, moved, copied or cloned among devices running the same or different operating systems. These new ways of manipulating the UI are considered under the emerging topic of Distributed User Interfaces (DUIs). DUIs

  8. 76 FR 42768 - Capital Distribution

    Science.gov (United States)

    2011-07-19

    ... Office of Thrift Supervision Capital Distribution AGENCY: Office of Thrift Supervision (OTS), Treasury... concerning the following information collection. Title of Proposal: Capital Distribution. OMB Number: 1550..., the information provides the OTS with a mechanism for monitoring capital distributions since...

  9. Ground Wood Fiber Length Distributions

    OpenAIRE

    Lauri Ilmari Salminen; Sari Liukkonen; Alava, Mikko J.

    2014-01-01

    This study considers ground wood fiber length distributions arising from pilot grindings. The empirical fiber length distributions appear to be independent of wood fiber length as well as feeding velocity. In terms of mathematics the fiber fragment distributions of ground wood pulp combine an exponential distribution for high-length fragments and a power-law distribution for smaller lengths. This implies that the fiber length distribution is influenced by the stone surface. A fragmentation-ba...
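
    One generic way to write such a combined fragment-length density (the crossover length x_c, exponent α, and rate λ are placeholders, not fitted values from the study) is

      f(x) \propto
      \begin{cases}
        x^{-\alpha}, & x < x_c \quad \text{(power law for short fragments)},\\[2pt]
        e^{-\lambda x}, & x \ge x_c \quad \text{(exponential for long fragments)}.
      \end{cases}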

  10. Distributed analysis in ATLAS

    CERN Document Server

    Legger, Federica; The ATLAS collaboration

    2015-01-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are daily running on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...

  11. Presenting Distributive Laws

    NARCIS (Netherlands)

    Bonsangue, M.M.; Hansen, H.H.; Kurz, A.; Rot, J.C.; Heckel, R.; Milius, S.

    2013-01-01

    Distributive laws of a monad over a functor F are categorical tools for specifying algebra-coalgebra interaction. They proved to be important for solving systems of corecursive equations, for the specification of well-behaved structural operational semantics and, more recently, also for enhancements

  12. Presenting Distributive Laws

    NARCIS (Netherlands)

    Bonsangue, M.M.; Hansen, H.H.; Kurz, A.; Rot, J.C.

    2015-01-01

    Distributive laws of a monad T over a functor F are categorical tools for specifying algebra-coalgebra interaction. They proved to be important for solving systems of corecursive equations, for the specification of well-behaved structural operational semantics and, more recently, also for enhancements

  13. Presenting distributive laws

    NARCIS (Netherlands)

    Bonsangue, M.M.; Hansen, H.H.; Kurz, A.; Rot, J.

    2015-01-01

    Distributive laws of a monad T over a functor F are categorical tools for specifying algebra-coalgebra interaction. They proved to be important for solving systems of corecursive equations, for the specification of well-behaved structural operational semantics and, more recently, also for enhancements

  14. Parton Distributions Working Group

    CERN Document Server

    de Barbaro, Lucy; Brock, R.; Casey, D.; Demina, R.; Giele, W.T.; Hirosky, R.; Huston, J.; Kalk, J.; Keller, S.A.; Klasen, M.; Kosower, D.A.; Kramer, M.; Kretzer, S.; Kuhlmann, S.; Martin, R.; Olness, Fredrick I.; Plehn, T.; Pumplin, J.; Scalise, R.J.; Schellman, H.; Smith, J.; Soper, D.E.; Sterman, George F.; Stump, D.; Tung, W.K.; Varelas, N.; Vogelsang, W.; Yang, Un-Ki

    2000-01-01

    The main focus of this working group was to investigate the different issues associated with the development of quantitative tools to estimate parton distribution functions uncertainties. In the conclusion, we introduce a "Manifesto" that describes an optimal method for reporting data.

  15. Spherical distributions : Schoenberg revisited

    NARCIS (Netherlands)

    Steerneman, AGM; van Perlo-ten Kleij, F

    2005-01-01

    An m-dimensional random vector X is said to have a spherical distribution if and only if its characteristic function is of the form φ(‖t‖), where t ∈ R^m, ‖·‖ denotes the usual Euclidean norm, and φ is a characteristic function on R. A mo

  16. Distributed debugging and tumult

    NARCIS (Netherlands)

    Scholten, J.; Jansen, P.G.

    1990-01-01

    A description is given of Tumult (Twente university multicomputer) and its operating system, along with considerations about parallel debugging, examples of parallel debuggers, and the proposed debugger for Tumult. Problems related to debugging distributed systems and solutions found in other distri

  17. Parallel and Distributed Databases

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Kemper, Alfons; Prieto, Manuel; Szalay, Alex

    2009-01-01

    Euro-Par Topic 5 addresses data management issues in parallel and distributed computing. Advances in data management (storage, access, querying, retrieval, mining) are inherent to current and future information systems. Today, accessing large volumes of information is a reality: Data-intensive appli

  18. Computing compound distributions faster

    NARCIS (Netherlands)

    P. den Iseger; M.A.J. Smith; R. Dekker (Rommert)

    1997-01-01

    The use of Panjer's algorithm has meanwhile become a widespread standard technique for actuaries (Kuon et al., 1955). Panjer's recursion formula is used for the evaluation of compound distributions and can be applied to life and general insurance problems. The discrete version of Panjer'
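
    As a reminder of the recursion itself, here is a short sketch of the Poisson special case of Panjer's formula on integer claim sizes; the claim-size pmf and the Poisson mean are arbitrary illustrations.

      # Sketch: Panjer's recursion for a compound Poisson distribution.
      # Claim count ~ Poisson(lam); claim sizes take integer values with pmf f.
      import math

      def panjer_poisson(lam, f, s_max):
          """Return g[0..s_max], the pmf of the aggregate claim amount."""
          g = [0.0] * (s_max + 1)
          g[0] = math.exp(-lam * (1.0 - f[0]))      # P(aggregate = 0)
          for s in range(1, s_max + 1):
              g[s] = sum(
                  (lam * j / s) * f[j] * g[s - j]
                  for j in range(1, min(s, len(f) - 1) + 1)
              )
          return g

      f = [0.0, 0.5, 0.3, 0.2]                      # hypothetical claim-size pmf on {0,1,2,3}
      g = panjer_poisson(lam=2.0, f=f, s_max=15)
      print(g[:4], sum(g))                          # the sum approaches 1 as s_max grows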

  19. Distributed usability evaluation

    DEFF Research Database (Denmark)

    Christensen, Lars; Frøkjær, Erik

    2010-01-01

    We present DUE (Distributed Usability Evaluation), a technique for collecting and evaluating usability data. The DUE infrastructure involves a client-server network. A client-based tool resides on the workstation of each user, providing a screen video recording, microphone input of voice commenta...

  20. Highly interactive distributed visualization

    NARCIS (Netherlands)

    Scarpa, M.; Belleman, R.G.; Sloot, P.M.A.; de Laat, C.T.A.M.

    2006-01-01

    We report on our iGrid2005 demonstration, called the "Dead Cat Demo"; an example of a highly interactive augmented reality application consisting of software services distributed over a wide-area, high-speed network. We describe our design decisions, analyse the implications of the design on applica

  1. Factor Determining Income Distribution

    NARCIS (Netherlands)

    J. Tinbergen (Jan)

    1972-01-01

    Since the phrase income distribution covers a large number of different concepts, it is necessary to define these and to indicate the choice made in this article. Income for a given recipient may cover lists of items which are not always the same. Apart from popular misunderstandings abo

  2. A Distributed Magnetometer Network

    CERN Document Server

    Scoville, John; Freund, Friedemann

    2014-01-01

    Various possibilities for a distributed magnetometer network are considered. We discuss strategies such as crowdsourcing smartphone magnetometer data, the use of trees as magnetometers, and performing interferometry using magnetometer arrays to synthesize the magnetometers into the world's largest telescope. Geophysical and other applications of such a network are discussed.

  3. Distributed Treatment Systems.

    Science.gov (United States)

    Zgonc, David; Baideme, Matthew

    2015-10-01

    This section presents a review of the literature published in 2014 on topics relating to distributed treatment systems. This review is divided into the following sections with multiple subsections under each: constituent removal; treatment technologies; and planning and treatment system management.

  4. Spatial distribution of overtopping

    NARCIS (Netherlands)

    Lioutas, A.; Smith, G.M.; Verhagen, H.J.

    2012-01-01

    The scope of this research is to find an empirical formula to describe the distribution of wave overtopping in the region behind the crest. A physical model was set up in which irregular waves were generated. In order to find a formula which adequately describes the test observations, the influence

  5. Income distribution: Second thoughts

    NARCIS (Netherlands)

    J. Tinbergen (Jan)

    1977-01-01

    As a follow-up of his book on income distribution the author reformulates his version on the scarcity theory of income from productive contributions. The need to introduce into an earnings theory several job characteristics, non-cognitive as well as cognitive, and the corresponding perso

  6. Distributed XML Design

    CERN Document Server

    Abiteboul, S; Manna, M; 10.1145/1559795.1559833

    2010-01-01

    A distributed XML document is an XML document that spans several machines. We assume that a distribution design of the document tree is given, consisting of an XML kernel-document T[f1,...,fn] where some leaves are "docking points" for external resources providing XML subtrees (f1,...,fn, standing, e.g., for Web services or peers at remote locations). The top-down design problem consists in, given a type (a schema document that may vary from a DTD to a tree automaton) for the distributed document, "propagating" locally this type into a collection of types, that we call typing, while preserving desirable properties. We also consider the bottom-up design which consists in, given a type for each external resource, exhibiting a global type that is enforced by the local types, again with natural desirable properties. In the article, we lay out the fundamentals of a theory of distributed XML design, analyze problems concerning typing issues in this setting, and study their complexity.

  7. Polarized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    de Florian, D.; Epele, L.N.; Fanchiotti, H.; Garcia Canal, C.A.; Sassot, R. (Laboratorio de Fisica Teorica, Departamento de Fisica, Universidad Nacional de La Plata, C.C. 67-1900 La Plata (Argentina))

    1995-01-01

    We analyze spin-dependent parton distributions consistent with the most recent measurements of the spin-dependent deep inelastic scattering structure functions and obtained in the framework of the spin dilution model. Predictions for the doubly polarized proton-proton Drell-Yan asymmetry, for the high p_T

  8. Intelligent Distributed Control

    Science.gov (United States)

    2012-08-10

  9. Aerosol distribution apparatus

    Science.gov (United States)

    Hanson, W.D.

    An apparatus for uniformly distributing an aerosol to a plurality of filters mounted in a plenum, wherein the aerosol and air are forced through a manifold system by means of a jet pump and released into the plenum through orifices in the manifold. The apparatus allows for the simultaneous aerosol-testing of all the filters in the plenum.

  10. Power distribution arrangement

    DEFF Research Database (Denmark)

    2010-01-01

    An arrangement and a method for distributing power supplied by a power source to two or more of loads (e.g., electrical vehicular systems) is disclosed, where a representation of the power taken by a particular one of the loads from the source is measured. The measured representation of the amount...

  11. Air Distribution in Rooms

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    The research on air distribution in rooms is often done as full-size investigations, scale-model investigations or by Computational Fluid Dynamics (CFD). New activities have taken place within all three areas and this paper draws comparisons between the different methods. The outcome of the l...

  12. Multiagent distributed watershed management

    Science.gov (United States)

    Giuliani, M.; Castelletti, A.; Amigoni, F.; Cai, X.

    2012-04-01

    Deregulation and democratization of water along with increasing environmental awareness are challenging integrated water resources planning and management worldwide. The traditional centralized approach to water management, as described in much of water resources literature, is often unfeasible in most of the modern social and institutional contexts. Thus it should be reconsidered from a more realistic and distributed perspective, in order to account for the presence of multiple and often independent Decision Makers (DMs) and many conflicting stakeholders. Game theory based approaches are often used to study these situations of conflict (Madani, 2010), but they are limited to a descriptive perspective. Multiagent systems (see Wooldridge, 2009), instead, seem to be a more suitable paradigm because they naturally allow to represent a set of self-interested agents (DMs and/or stakeholders) acting in a distributed decision process at the agent level, resulting in a promising compromise alternative between the ideal centralized solution and the actual uncoordinated practices. Casting a water management problem in a multiagent framework allows to exploit the techniques and methods that are already available in this field for solving distributed optimization problems. In particular, in Distributed Constraint Satisfaction Problems (DCSP, see Yokoo et al., 2000), each agent controls some variables according to his own utility function but has to satisfy inter-agent constraints; while in Distributed Constraint Optimization Problems (DCOP, see Modi et al., 2005), the problem is generalized by introducing a global objective function to be optimized that requires a coordination mechanism between the agents. In this work, we apply a DCSP-DCOP based approach to model a steady state hypothetical watershed management problem (Yang et al., 2009), involving several active human agents (i.e. agents who make decisions) and reactive ecological agents (i.e. agents representing

  13. A distribution management system

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P.; Verho, P.; Kaerenlampi, M.; Pitkaenen, M. [Tampere Univ. of Technology (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    The development of new distribution automation applications is considerably wide nowadays. One of the most interesting areas is the development of a distribution management system (DMS) as an expansion to the traditional SCADA system. At the power transmission level such a system is called an energy management system (EMS). The idea of these expansions is to provide supporting tools for control center operators in system analysis and operation planning. Nowadays the SCADA is the main computer system (and often the only) in the control center. However, the information displayed by the SCADA is often inadequate, and several tasks cannot be solved by a conventional SCADA system. A need for new computer applications in control center arises from the insufficiency of the SCADA and some other trends. The latter means that the overall importance of the distribution networks is increasing. The slowing down of load-growth has often made network reinforcements unprofitable. Thus the existing network must be operated more efficiently. At the same time larger distribution areas are for economical reasons being monitored at one control center and the size of the operation staff is decreasing. The quality of supply requirements are also becoming stricter. The needed data for new applications is mainly available in some existing systems. Thus the computer systems of utilities must be integrated. The main data source for the new applications in the control center are the AM/FM/GIS (i.e. the network database system), the SCADA, and the customer information system (CIS). The new functions can be embedded in some existing computer system. This means a strong dependency on the vendor of the existing system. An alternative strategy is to develop an independent system which is integrated with other computer systems using well-defined interfaces. The latter approach makes it possible to use the new applications in various computer environments, having only a weak dependency on the

  14. Summer Steelhead Distribution [ds341

    Data.gov (United States)

    California Department of Resources — Summer Steelhead Distribution October 2009 Version This dataset depicts observation-based stream-level geographic distribution of anadromous summer-run steelhead...

  15. Pulse Distributing Manifold

    Energy Technology Data Exchange (ETDEWEB)

    Schutting, Eberhard [Technische Univ. Graz (Austria); Sams, Theodor [AVL List GmbH, Graz (Austria); Glensvig, Michael [Forschungsgesellschaft mbH, Graz (AT). Kompetenzzentrum 'Das virtuelle Fahrzeug' (VIF)

    2011-07-01

    The Pulse Distributing Manifold is a new charge exchange method for turbocharged diesel engines with exhaust gas recirculation (EGR). The method is characterized in that the EGR mass flow is not diverted from the exhaust gas mass flow continuously, but is broken up over time into sub-streams. The temporal interruption is achieved by two phase-shifted outlet valves which are connected via separate manifolds only with the turbocharger or only with the EGR path. The valve opening times are chosen such that the turbocharger and the exhaust gas aftertreatment process are supplied with high-energy exhaust gas from the blowdown phase, while cooler and less energy-rich exhaust gas from the exhaust period is used for the exhaust gas recirculation. This increases the enthalpy for the turbocharger and the temperature for the exhaust gas treatment, while the cooling efficiency at the EGR cooler is reduced. The elimination of the continuous EGR valve has a positive effect on pumping losses. The basic functioning and the potential of this system could be demonstrated by means of a concept study using one-dimensional simulations. Without disadvantages in fuel consumption for the considered commercial vehicle engine, a reduction of the EGR cooler performance by 15 % and an increase in exhaust temperature of 35 K could be achieved. The presented charge exchange method was developed, evaluated and patented within the scope of the research program 'K2-mobility' of the project partners AVL (Mainz, Federal Republic of Germany) and University of Technology Graz (Austria). The research project 'K2-Mobility' is supported by the competence center 'The virtual vehicle' Forschungsgesellschaft mbH (Graz, Austria).

  16. Distribution management system

    Energy Technology Data Exchange (ETDEWEB)

    Verho, P.; Kaerenlampi, M.; Pitkaenen, M.; Jaerventausta, P.; Partanen, J.

    1997-12-31

    This report comprises a general description of the results obtained in the research projects 'Information system applications of a distribution control center', 'Event analysis in primary substation', and 'Distribution management system' of the EDISON research program during the years 1993 - 1997. The different domains of the project are presented in more detail in other reports. An operational state analysis of a distribution network has been made from the control center point of view and the functions which cannot be solved by a conventional SCADA system are determined. The basis for new computer applications is shown to be integration of the computer systems. The main result of the work is a distribution management system (DMS), which is an autonomous system integrated with the existing information systems, SCADA and AM/FM/GIS. The system uses a large number of modelling and computation methods and provides an extensive group of advanced functions to support distribution network monitoring, fault management, operations planning and optimization. The development platform of the system consists of a Visual C++ programming environment, the Windows NT operating system and a PC. During development the DMS has been tested in a pilot utility and it is nowadays in practical use in several Finnish utilities. The use of a DMS improves the quality and economy of power supply in many ways; the outage times can, in particular, be reduced using the system. Based on the experience gained, some parts of the DMS have also reached the commercialization phase. Initially the commercial products were developed by a software company, Versoft Oy. At present the research results are the basis of a worldwide software product supplied by ABB Transmit Co. (orig.) EDISON Research Programme. 28 refs.

  17. Distribution view: a tool to write and simulate distributions

    OpenAIRE

    Coelho, José; Branco, Fernando; Oliveira, Teresa

    2006-01-01

    In our work we present a tool to write and simulate distributions. This tool allows the user to write mathematical expressions which can contain not only functions and variables, but also statistical distributions, including mixtures. Each time the expression is evaluated, a value is generated for each inner distribution according to that distribution and is used in determining the value of the expression. The inversion method can be used in this language, allowing the generation of all distributions...
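
    The inversion method mentioned above is the standard inverse-transform idea: draw u from Uniform(0, 1) and return F^{-1}(u) for the target cumulative distribution F. A minimal sketch (the exponential target and its rate are arbitrary):

      # Sketch: inverse-transform (inversion) sampling for an exponential distribution.
      import math
      import random

      def sample_exponential(rate):
          u = random.random()                  # u ~ Uniform(0, 1)
          return -math.log(1.0 - u) / rate     # F^{-1}(u) for Exp(rate)

      samples = [sample_exponential(rate=2.0) for _ in range(5)]
      print(samples)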

  18. Distributed Visualization Project

    Science.gov (United States)

    Craig, Douglas; Conroy, Michael; Kickbusch, Tracey; Mazone, Rebecca

    2016-01-01

    Distributed Visualization allows anyone, anywhere to see any simulation at any time. Development focuses on algorithms, software, data formats, data systems and processes to enable sharing simulation-based information across temporal and spatial boundaries without requiring stakeholders to possess highly-specialized and very expensive display systems. It also introduces abstraction between the native and shared data, which allows teams to share results without giving away proprietary or sensitive data. The initial implementation of this capability is the Distributed Observer Network (DON) version 3.1. DON 3.1 is available for public release in the NASA Software Store (https://software.nasa.gov/software/KSC-13775) and works with version 3.0 of the Model Process Control specification (an XML Simulation Data Representation and Communication Language) to display complex graphical information and associated Meta-Data.

  19. Distributed Parameter Modelling Applications

    DEFF Research Database (Denmark)

    2011-01-01

    Here the issue of distributed parameter models is addressed. Spatial variations as well as time are considered important. Several applications, both steady state and dynamic, are given. These relate to the processing of oil shale, the granulation of industrial fertilizers and the d...... sands processing. The fertilizer granulation model considers the dynamics of MAP-DAP (mono and diammonium phosphates) production within an industrial granulator, which involves complex crystallisation, chemical reaction and particle growth, captured through population balances. A final example considers...

  20. Distributed Project Work

    DEFF Research Database (Denmark)

    Borch, Ole; Kirkegaard, B.; Knudsen, Morten

    1998-01-01

    Project work has been used for many years at Aalborg University to improve learning of theory and methods given in courses. In a closed environment where the students form a group in a single room, the interaction behaviour more or less follows naturally. Group work in a distributed fashion over the Internet needs more attention to the interaction protocol, since the physical group room does not exist. The purpose of this paper is to develop a method for online project work using the product Basic Support for Cooperative Work (BSCW). An analysis of a well-proven protocol... to be very precise and successfully used on the second test group. Distributed project work is coming soon, and with little improvement in server tools, projects on different topics with a large and inhomogeneous profile of users are realistic....