WorldWideScience

Sample records for single-modal weibull distribution

  1. A CLASS OF WEIGHTED WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Saman Shahbaz

    2010-07-01

Full Text Available The weighted Weibull model is proposed following the method of Azzalini (1985). Basic properties of the distribution, including moments, generating function, hazard rate function and estimation of parameters, have been studied.
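The Azzalini-style weighting can be illustrated numerically. The sketch below uses a generic Azzalini-type weight F(λx) on a base Weibull density with purely illustrative parameter values (the paper's exact parameterization may differ) and checks that the weighted density is proper:

```python
import numpy as np
from scipy import stats, integrate

# Azzalini-type weighted density built from a base Weibull (pdf f, cdf F):
#   f_w(x) = f(x) * F(lam * x) / C,  with  C = integral of f(x) F(lam x) dx
# (generic construction; the paper's exact parameterization may differ)
shape, scale, lam = 2.0, 1.5, 1.0   # illustrative values, not from the paper
base = stats.weibull_min(shape, scale=scale)

# normalizing constant, computed numerically
C, _ = integrate.quad(lambda x: base.pdf(x) * base.cdf(lam * x), 0, np.inf)

def weighted_pdf(x):
    return base.pdf(x) * base.cdf(lam * x) / C

# sanity check: the weighted density integrates to one
total, _ = integrate.quad(weighted_pdf, 0, np.inf)
print(round(total, 6))  # → 1.0
```

For lam = 1 the constant C equals 1/2 exactly (the integral of f·F is F²/2), which makes the normalization easy to verify by hand.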

  2. (AJST) MULTIPLE DEFECT DISTRIBUTIONS ON WEIBULL ...

    African Journals Online (AJOL)

Weibull model for fatigue life distributions of unfiltered and filtered castings. Statistical parameters by defect type: old oxide film defects (o), characteristic fatigue life N_c = 3.8 x 10^5 cycles; slip mechanisms (s), 5.3 x 10^6 cycles; young and old oxide film defects (o), 3.3 x 10^5 cycles; pores attached to oxide films (p), 2.3 x 10^5 cycles; Weibull modulus, b.

  3. Bayesian estimation of Weibull distribution parameters

    International Nuclear Information System (INIS)

    Bacha, M.; Celeux, G.; Idee, E.; Lannoy, A.; Vasseur, D.

    1994-11-01

In this paper, we present the SEM (Stochastic Expectation Maximization) and WLB-SIR (Weighted Likelihood Bootstrap - Sampling Importance Re-sampling) methods, which are used to estimate Weibull distribution parameters when data are heavily censored. The second method is based on Bayesian inference and allows available prior information on the parameters to be taken into account. An application of this method to real data from nuclear power plant operation feedback analysis is presented. (authors). 8 refs., 2 figs., 2 tabs
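SEM and WLB-SIR themselves are beyond a short example, but the censored-data likelihood they build on can be sketched. Below, a plain maximum-likelihood fit of Weibull parameters to synthetic, heavily right-censored data; all values are illustrative and not taken from the plant feedback data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
true_shape, true_scale = 1.8, 100.0
t = weibull_min.rvs(true_shape, scale=true_scale, size=500, random_state=rng)
censor_time = 80.0                    # heavy right-censoring at a fixed time
observed = np.minimum(t, censor_time)
event = t <= censor_time              # True = failure actually observed

def neg_log_lik(params):
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf
    # observed failures contribute the density, censored units the survival
    ll = np.sum(weibull_min.logpdf(observed[event], k, scale=lam))
    ll += np.sum(weibull_min.logsf(observed[~event], k, scale=lam))
    return -ll

res = minimize(neg_log_lik, x0=[1.0, np.median(observed)], method="Nelder-Mead")
k_hat, lam_hat = res.x
print(k_hat, lam_hat)  # close to the true (1.8, 100.0)
```

About half the units are censored here, yet the likelihood still identifies both parameters; Bayesian methods such as WLB-SIR become attractive when censoring is even heavier and prior information must carry more weight.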

  4. Transmuted New Generalized Inverse Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Shuaib Khan

    2017-06-01

Full Text Available This paper introduces the transmuted new generalized inverse Weibull distribution by using the quadratic rank transmutation map (QRTM) scheme studied by Shaw et al. (2007). The proposed model contains twenty-three lifetime distributions as special sub-models. Some mathematical properties of the new distribution are formulated, such as the quantile function, Rényi entropy, mean deviations, moments, moment generating function and order statistics. The method of maximum likelihood is used for estimating the model parameters. We illustrate the flexibility and potential usefulness of the new distribution by using reliability data.

  5. Inference for the Weibull Distribution: A tutorial

    Directory of Open Access Journals (Sweden)

    Scholz, F. W.

    2015-10-01

Full Text Available This tutorial deals with the 2-parameter Weibull distribution. In particular it covers the construction of confidence bounds and intervals for various parameters of interest: the Weibull scale and shape parameters, its quantiles and tail probabilities. These bounds were pioneered in Thoman, Bain, and Antle, 1969, Thoman, Bain, and Antle, 1970, Bain, 1978, and Bain and Engelhardt, 1991, where tables for their computation were given. These tables were based on simulations and show occasional irregularities. In conjunction with this tutorial we provide R code to perform various tasks (generating plots, performing simulations). It greatly simplifies the application of these methods over trying to use the tables available so far. Today's computing availability and speed makes this very viable. For the freely available R computing platform we refer to R Core Team, 2015. The text identifies R code by using courier font in appropriate places.
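The tutorial's R code builds exact bounds from pivotal quantities; as a rough analogue under stated assumptions (synthetic data, MLE fitting, percentile method rather than pivots), a parametric bootstrap interval for the shape parameter might look like this:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
# synthetic sample standing in for real lifetime data
data = weibull_min.rvs(2.0, scale=10.0, size=60, random_state=rng)

def fit(sample):
    # scipy's MLE with the location pinned at zero (2-parameter Weibull)
    shape, _, scale = weibull_min.fit(sample, floc=0)
    return shape, scale

k_hat, lam_hat = fit(data)

# parametric bootstrap: refit samples drawn from the fitted model
boot = np.array([fit(weibull_min.rvs(k_hat, scale=lam_hat,
                                     size=len(data), random_state=rng))[0]
                 for _ in range(300)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap interval for the shape: ({lo:.2f}, {hi:.2f})")
```

Unlike the tabulated pivotal bounds, the percentile interval is approximate, but it needs no tables and extends directly to quantiles and tail probabilities by bootstrapping those quantities instead of the shape.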

  6. Using the Weibull distribution reliability, modeling and inference

    CERN Document Server

    McCool, John I

    2012-01-01

    Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution

  7. The Multivariate Order Statistics for Exponential and Weibull Distributions

    Directory of Open Access Journals (Sweden)

    Mariyam Hafeez

    2014-09-01

Full Text Available In this paper we derive the distribution of multivariate order statistics for the multivariate exponential and multivariate Weibull distributions. The moment expression for multivariate order statistics is also derived.

  8. A Weibull distribution accrual failure detector for cloud computing.

    Science.gov (United States)

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

Failure detectors are a fundamental component used to build high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
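The core accrual idea, mapping the elapsed time since the last heartbeat to a suspicion level through a fitted distribution, can be sketched with a Weibull model of heartbeat inter-arrival times (synthetic data; the paper's estimator details may differ):

```python
import numpy as np
from scipy.stats import weibull_min

# Fit the inter-arrival times of recent heartbeats, then express suspicion as
#   phi(t) = -log10(P(next heartbeat arrives later than t)),
# the usual accrual scale. Sketch only; the paper's estimator may differ.
rng = np.random.default_rng(2)
inter_arrivals = weibull_min.rvs(1.5, scale=0.1, size=200, random_state=rng)

shape, _, scale = weibull_min.fit(inter_arrivals, floc=0)

def suspicion(elapsed_since_last_heartbeat):
    sf = weibull_min.sf(elapsed_since_last_heartbeat, shape, scale=scale)
    return -np.log10(max(float(sf), 1e-300))   # bounded to avoid log10(0)

# suspicion grows with silence; a threshold (e.g. phi >= 2, i.e. under a 1%
# chance the heartbeat is merely late) declares the monitored node failed
print(suspicion(0.05), suspicion(0.5))
```

The monotone growth of phi with silence is what makes the detector "accrual": each application picks its own threshold on the same output rather than the detector emitting a binary verdict.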

  9. Single versus mixture Weibull distributions for nonparametric satellite reliability

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

Long recognized as a critical design attribute for space systems, satellite reliability has not yet received the proper attention, as only limited on-orbit failure data and statistical analyses can be found in the technical literature. To fill this gap, we recently conducted a nonparametric analysis of satellite reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we provide an advanced parametric fit, based on a mixture of Weibull distributions, and compare it with the single Weibull distribution model obtained with the Maximum Likelihood Estimation (MLE) method. We demonstrate that both parametric fits are good approximations of the nonparametric satellite reliability, but that the mixture Weibull distribution provides significantly better accuracy in capturing all the trends in the failure data, as evidenced by the analysis of the residuals and their quasi-normal dispersion.
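The comparison of a single Weibull fit against a mixture fit can be sketched by direct likelihood maximization on synthetic bimodal lifetime data (illustrative values only; the paper fits actual on-orbit failure data):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(3)
# synthetic "satellite lifetimes": infant mortality mixed with wear-out
times = np.concatenate([
    weibull_min.rvs(0.7, scale=2.0, size=150, random_state=rng),   # early failures
    weibull_min.rvs(3.0, scale=12.0, size=350, random_state=rng)]) # wear-out

def mixture_nll(p):
    # negative log-likelihood of a two-component Weibull mixture
    w, k1, l1, k2, l2 = p
    if not (0 < w < 1) or min(k1, l1, k2, l2) <= 0:
        return np.inf
    pdf = (w * weibull_min.pdf(times, k1, scale=l1)
           + (1 - w) * weibull_min.pdf(times, k2, scale=l2))
    return -np.sum(np.log(np.maximum(pdf, 1e-300)))

res = minimize(mixture_nll, x0=[0.3, 1.0, 1.0, 2.0, 10.0], method="Nelder-Mead",
               options={"maxiter": 5000})

# single-Weibull MLE for comparison: the mixture nests it, so on bimodal
# data the mixture attains a strictly better log-likelihood
k, _, lam = weibull_min.fit(times, floc=0)
single_nll = -np.sum(weibull_min.logpdf(times, k, scale=lam))
print(mixture_nll(res.x) < single_nll)
```

A model-selection criterion such as AIC would be the natural next step, since the mixture's extra three parameters must earn their keep.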

  10. Multiple defect distributions on weibull statistical analysis of fatigue ...

    African Journals Online (AJOL)

    By relaxing the assumptions of a single cast defect distribution, of uniformity throughout the material and of uniformity from specimen to specimen, Weibull statistical analysis for multiple defect distributions have been applied to correctly describe the fatigue life data of aluminium alloy castings having multiple cast defects ...

  11. Comparing normal, lognormal and Weibull distributions for fitting ...

    African Journals Online (AJOL)

Statistical probability density functions are widely used to model tree diameter distributions and to describe stand structure. The objective of this study was to compare the performance of normal, lognormal and three-parameter Weibull distributions for fitting diameter data from Akashmoni (Acacia auriculiformis A.

  12. Wind climate modeling using Weibull and extreme value distribution ...

    African Journals Online (AJOL)

    The expected number of stress cycles in the projected working life of a structure is related to the expected number of hours in the critical wind speed range and wind climate modelling is required to know this. The most popular model for this purpose is Weibull distribution. Again, wind energy is proportional to the cube of the ...

  13. comparison of estimation methods for fitting weibull distribution

    African Journals Online (AJOL)

    Tersor

    method was more accurate in fitting the Weibull distribution to the natural stand. It had the smallest .... were computed from the inventory data: mean diameter, minimum diameter, maximum diameter, number of trees per hectare and basal area. The summary statistics of the .... result with non-linear regression approach.

  14. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

Reliability is an important issue affecting each stage of the life cycle, ranging from birth to death of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability and maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis. It provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds along the various stages of the product life cycle and reliability activities. Reliability data of Systems, Structures and Components (SSCs) in nuclear power plants is the key factor in probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times. It is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in nuclear power plants. An example is given in the paper to present the result of the new method. The Weibull distribution has a strong ability to fit the reliability data of mechanical equipment in nuclear power plants and is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it reflects the reliability characteristics of the equipment and is closer to the actual situation. (author)

  15. Analysis of the upper-truncated Weibull distribution for wind speed

    International Nuclear Information System (INIS)

    Kantar, Yeliz Mert; Usta, Ilhan

    2015-01-01

Highlights: • Upper-truncated Weibull distribution is proposed to model wind speed. • Upper-truncated Weibull distribution nests the Weibull distribution as a special case. • Maximum likelihood is the best method for the upper-truncated Weibull distribution. • Fitting accuracy of the upper-truncated Weibull is analyzed on wind speed data. - Abstract: Accurately modeling wind speed is critical in estimating the wind energy potential of a certain region. In order to model wind speed data smoothly, several statistical distributions have been studied. Truncated distributions are defined as conditional distributions that result from restricting the domain of a statistical distribution, and they also cover the base distribution as a special case. This paper proposes, for the first time, the use of the upper-truncated Weibull distribution in modeling wind speed data and in estimating wind power density. In addition, a comparison is made between the upper-truncated Weibull distribution and the well-known Weibull distribution using wind speed data measured in various regions of Turkey. The obtained results indicate that the upper-truncated Weibull distribution shows better performance than the Weibull distribution in estimating wind speed distribution and wind power. Therefore, the upper-truncated Weibull distribution can be an alternative for use in the assessment of wind energy potential
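The nesting property mentioned in the highlights follows directly from the definition of the truncated density; a minimal sketch with illustrative parameters:

```python
import numpy as np
from scipy import integrate
from scipy.stats import weibull_min

# Upper-truncated Weibull: the base Weibull conditioned on X <= T,
#   f_T(x) = f(x) / F(T)   for 0 <= x <= T
k, c, T = 2.0, 6.0, 15.0   # shape, scale, truncation point (illustrative)
base = weibull_min(k, scale=c)

def trunc_pdf(x):
    x = np.asarray(x, dtype=float)
    return np.where((x >= 0) & (x <= T), base.pdf(x) / base.cdf(T), 0.0)

# it is a proper density on [0, T] ...
area, _ = integrate.quad(trunc_pdf, 0, T)
# ... and as F(T) -> 1 (T large) it recovers the ordinary Weibull
print(round(area, 6), round(float(base.cdf(T)), 4))
```

With the normalizing constant F(T) written explicitly, maximum-likelihood estimation only needs this one extra factor in the log-likelihood relative to the ordinary Weibull.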

  16. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    Science.gov (United States)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

An aero-engine is a complex mechanical-electronic system; in the reliability analysis of such systems, the Weibull distribution model plays an irreplaceable role. Until now, only the two-parameter and three-parameter Weibull distribution models have been widely used. Due to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, so it is a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimation more accurate, thus greatly improving the precision of the mixed-distribution reliability model. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.

  17. Reliability Implications in Wood Systems of a Bivariate Gaussian-Weibull Distribution and the Associated Univariate Pseudo-truncated Weibull

    Science.gov (United States)

    Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield

    2014-01-01

    Two important wood properties are the modulus of elasticity (MOE) and the modulus of rupture (MOR). In the past, the statistical distribution of the MOE has often been modeled as Gaussian, and that of the MOR as lognormal or as a two- or three-parameter Weibull distribution. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior...

  18. Improvement for Amelioration Inventory Model with Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Han-Wen Tuan

    2017-01-01

Full Text Available Most inventory models deal with deteriorating items. On the contrary, just a few papers consider inventory systems in an amelioration environment. We study an amelioration inventory model with Weibull distribution. However, there are some questionable results in the amelioration paper. We first point out those questionable results in the previous paper, which did not derive the optimal solution, and then provide some improvements. We provide a rigorous analytical treatment for different cases depending on the size of the shape parameter. We present a detailed numerical example for different ranges of the shape parameter to illustrate that our solution method attains the optimal solution. We develop a new amelioration model and then provide a detailed analytical procedure to find the optimal solution. Our findings will help researchers develop their new inventory models.

  19. Fuzzification of the Distributed Activation Energy Model Using the Fuzzy Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Alok Dhaundiyal

    2018-01-01

    Full Text Available This study focuses on the influence of some of the relevant parameters of biomass pyrolysis on a fuzzified solution of the Distributed Activation Energy Model (DAEM due to randomness and inaccuracy of data. The study investigates the fuzzified Distributed Activation Energy Model using the fuzzy Weibull distribution. The activation energy, frequency factor, and distribution variables of the 3-parameter Weibull analysis are converted into a non-crisp set. The expression for the fuzzy sets, and their α-cut are discussed with an initial distribution for the activation energies following the Weibull distribution function. The thermo-analytical data for pine needles is used to illustrate the methodology to exhibit the fuzziness of some of the parameters relevant to biomass pyrolysis.

  20. Significance Test of Reliability Evaluation with Three-parameter Weibull Distribution Based on Grey Relational Analysis

    OpenAIRE

    Xintao Xia; Yantao Shang; Yinping Jin; Long Chen

    2013-01-01

With the aid of grey system theory, a grey relational analysis of the reliability with the three-parameter Weibull distribution is made for Weibull parameter evaluation and its significance test. Via the theoretical value set and the experimental value set of the reliability, based on the lifetime data of a product, a model for the constrained optimization of the Weibull parameter evaluation is established based on the maximum grey relational grade. The grey significance of the reliability functi...

  1. A Study on The Mixture of Exponentiated-Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Adel Tawfik Elshahat

    2016-12-01

Full Text Available Mixtures of measures or distributions occur frequently in the theory and applications of probability and statistics. In the simplest case it may, for example, be reasonable to assume that one is dealing with a mixture, in given proportions, of a finite number of normal populations with different means or variances. The mixture parameter may also be denumerably infinite, as in the theory of sums of a random number of random variables, or continuous, as in the compound Poisson distribution. The use of finite mixture distributions, to control for unobserved heterogeneity, has become increasingly popular among those estimating dynamic discrete choice models. One of the barriers to using mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log-likelihood function. In this thesis, the maximum likelihood estimators have been obtained for the parameters of the mixture of exponentiated Weibull distributions when the sample is obtained under a censoring scheme. The maximum likelihood estimators of the parameters and the asymptotic variance-covariance matrix have also been obtained. A numerical illustration of these new results is given.

  2. Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions

    Science.gov (United States)

    2015-12-01

In fact, Type IIA, IIB, and IID Gamma size distributions are nearly identical except for the smallest fragments. Figure 1 shows an example where ... log-log plane, for very large fragments. As noted earlier, Type I and III Weibull distributions are identical to each other. Similarly, Type II ... Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions. Distribution Statement A. Approved for public release.

  3. Weibull distribution and the multiplicity moments in p p (p p¯) collisions

    Science.gov (United States)

    Pandey, Ashutosh Kumar; Sett, Priyanka; Dash, Sadhana

    2017-10-01

A higher-moment analysis of multiplicity distributions is performed using the Weibull description of particle production in pp (pp̄) collisions at Super Proton Synchrotron (SPS) and LHC energies. The calculated normalized moments and factorial moments of the Weibull distribution are compared to the measured data. The calculated Weibull moments are found to be in good agreement with the measured higher moments (up to fifth order), reproducing the observed breaking of Koba-Nielsen-Olesen scaling in the data. The moments for pp collisions at √s = 13 TeV are also predicted.
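Normalized moments of a Weibull variable reduce to ratios of gamma functions, so they depend only on the shape parameter; a minimal sketch (the shape value is illustrative, not fitted to collision data):

```python
import math

# Raw moments of a Weibull(k, lam) variable: E[X^q] = lam**q * Gamma(1 + q/k),
# so the normalized moments C_q = E[X^q] / E[X]**q depend on the shape k only.
def normalized_moment(q, k):
    return math.gamma(1 + q / k) / math.gamma(1 + 1 / k) ** q

k = 1.2  # illustrative shape value, not fitted to any collision data
C = [normalized_moment(q, k) for q in range(2, 6)]  # C_2 .. C_5
print(C)
```

Because the scale cancels, fitting the shape parameter at one energy immediately yields predictions for all higher normalized moments, which is what makes the Weibull description testable against the measured moments.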

  4. On alternative q-Weibull and q-extreme value distributions: Properties and applications

    Science.gov (United States)

    Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin

    2018-01-01

Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis statistics and q-distributions have been applied in different disciplines. Yet, a relationship between some existing q-Weibull distributions and q-extreme value distributions that is parallel to the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation has not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution and their performances are investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatalities data from the National Highway Traffic Safety Administration.

  5. Exponentiated Weibull distribution approach based inflection S-shaped software reliability growth model

    Directory of Open Access Journals (Sweden)

    B.B. Sagar

    2016-09-01

Full Text Available The aim of this paper is to estimate the number of defects in software and remove them successfully. This paper incorporates a Weibull distribution approach along with an inflection S-shaped Software Reliability Growth Model (SRGM). In this combination, a two-parameter Weibull distribution methodology is used. The Relative Prediction Error (RPE) is calculated as the validity criterion of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, showing that the proposed software reliability growth model gives a better estimate for removing defects. This paper presents a software reliability growth model combining features of both the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, and provides help to researchers and software industries in developing highly reliable software products.

  6. Errors in wind resource and energy yield assessments based on the Weibull distribution

    Science.gov (United States)

    Jourdier, Bénédicte; Drobinski, Philippe

    2017-05-01

    The methodology used in wind resource assessments often relies on modeling the wind-speed statistics using a Weibull distribution. In spite of its common use, this distribution has been shown to not always accurately model real wind-speed distributions. Very few studies have examined the arising errors in power outputs, using either observed power productions or theoretical power curves. This article focuses on France, using surface wind measurements at 89 locations covering all regions of the country. It investigates how statistical modeling using a Weibull distribution impacts the prediction of the wind energy content and of the power output in the context of an annual energy production assessment. For this purpose it uses a plausible power curve adapted to each location. Three common methods for fitting the Weibull distribution are tested (maximum likelihood, first and third moments, and the Wind Atlas Analysis and Application Program (WAsP) method). The first two methods generate large errors in the production (mean absolute error around 5 %), especially in the southern areas where the goodness of fit of the Weibull distribution is poorer. The production is mainly overestimated except at some locations with bimodal wind distributions. With the third method, the errors are much lower at most locations (mean absolute error around 2 %). Another distribution, a mixed Rayleigh-Rice distribution, is also tested and shows better skill at assessing the wind energy yield.
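The reason the first-and-third-moment method fares well for energy can be seen directly: it matches the sample third moment, which is exactly what the power density depends on. A sketch on synthetic wind speeds (not the French station data):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma
from scipy.stats import weibull_min

rng = np.random.default_rng(4)
# synthetic wind speeds; a real assessment would use measured data
wind = weibull_min.rvs(1.9, scale=7.0, size=5000, random_state=rng)

def energy_density(k, c, rho=1.225):
    # mean wind power density (W/m^2): 0.5 * rho * E[v^3]
    return 0.5 * rho * c**3 * gamma(1 + 3 / k)

# method 1: first and third moments (matches E[v] and E[v^3] exactly)
m1, m3 = wind.mean(), np.mean(wind**3)
k_mom = brentq(lambda k: gamma(1 + 3 / k) / gamma(1 + 1 / k) ** 3 - m3 / m1**3,
               0.3, 20.0)
c_mom = m1 / gamma(1 + 1 / k_mom)

# method 2: maximum likelihood
k_mle, _, c_mle = weibull_min.fit(wind, floc=0)

true_energy = 0.5 * 1.225 * m3   # energy content straight from the sample
for name, (k, c) in [("moments", (k_mom, c_mom)), ("MLE", (k_mle, c_mle))]:
    err = 100 * (energy_density(k, c) - true_energy) / true_energy
    print(f"{name}: {err:+.3f}% energy error")
```

By construction the moment method reproduces the sample energy content to numerical precision, whereas MLE optimizes the likelihood of the whole speed distribution and can miss the cubed-speed tail, the effect the article quantifies on real, sometimes bimodal, distributions.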

  7. Calculation of life distributions, in particular Weibull distributions, from operational observations

    International Nuclear Information System (INIS)

    Rauhut, J.

    1982-01-01

Established methods are presented by which life distributions of machine elements can be determined on the basis of laboratory experiments and operational observations. Practical observations are given special attention, as results estimated on the basis of conventional methods have not been accurate enough. As an introduction, the stochastic life concept, the general method of determining life distributions, various sampling methods, and the Weibull distribution are explained. Further, possible life testing schedules and maximum-likelihood estimates are discussed for the complete-sample case and for censored sampling without replacement in laboratory experiments. Finally, censored sampling with replacement in laboratory experiments is discussed; it is shown how suitable parameter estimates can be obtained for given life distributions by means of the maximum-likelihood method. (orig./RW) [de

  8. Exponentiated Weibull distribution family under aperture averaging for Gaussian beam waves.

    Science.gov (United States)

    Barrios, Ricardo; Dios, Federico

    2012-06-04

    Nowadays, the search for a distribution capable of modeling the probability density function (PDF) of irradiance data under all conditions of atmospheric turbulence in the presence of aperture averaging still continues. Here, a family of PDFs alternative to the widely accepted Log-Normal and Gamma-Gamma distributions is proposed to model the PDF of the received optical power in free-space optical communications, namely, the Weibull and the exponentiated Weibull (EW) distribution. Particularly, it is shown how the proposed EW distribution offers an excellent fit to simulation and experimental data under all aperture averaging conditions, under weak and moderate turbulence conditions, as well as for point-like apertures. Another very attractive property of these distributions is the simple closed form expression of their respective PDF and cumulative distribution function.
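The closed forms the abstract highlights are short enough to state directly; a sketch with illustrative parameters (not fitted to irradiance data):

```python
import numpy as np
from scipy import integrate

# Exponentiated Weibull: one extra shape parameter alpha on top of the
# Weibull's beta, keeping simple closed forms for both PDF and CDF:
#   F(x) = (1 - exp(-(x/eta)**beta))**alpha
#   f(x) = alpha*beta/eta * (x/eta)**(beta-1) * exp(-(x/eta)**beta)
#          * (1 - exp(-(x/eta)**beta))**(alpha-1)
alpha, beta, eta = 2.5, 1.3, 1.0   # illustrative values, not fitted to data

def ew_cdf(x):
    return (1.0 - np.exp(-(x / eta) ** beta)) ** alpha

def ew_pdf(x):
    z = (x / eta) ** beta
    return (alpha * beta / eta * (x / eta) ** (beta - 1)
            * np.exp(-z) * (1.0 - np.exp(-z)) ** (alpha - 1))

# sanity checks: proper density, monotone distribution function
area, _ = integrate.quad(ew_pdf, 0, np.inf)
print(round(area, 6), bool(ew_cdf(1.0) < ew_cdf(2.0)))
```

Setting alpha = 1 recovers the ordinary Weibull, which is why the family can cover both weak-turbulence and aperture-averaged regimes with a single functional form.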

  9. Inference on the reliability of Weibull distribution with multiply Type-I censored data

    International Nuclear Information System (INIS)

    Jia, Xiang; Wang, Dong; Jiang, Ping; Guo, Bo

    2016-01-01

    In this paper, we focus on the reliability of Weibull distribution under multiply Type-I censoring, which is a general form of Type-I censoring. In multiply Type-I censoring in this study, all units in the life testing experiment are terminated at different times. Reliability estimation with the maximum likelihood estimate of Weibull parameters is conducted. With the delta method and Fisher information, we propose a confidence interval for reliability and compare it with the bias-corrected and accelerated bootstrap confidence interval. Furthermore, a scenario involving a few expert judgments of reliability is considered. A method is developed to generate extended estimations of reliability according to the original judgments and transform them to estimations of Weibull parameters. With Bayes theory and the Monte Carlo Markov Chain method, a posterior sample is obtained to compute the Bayes estimate and credible interval for reliability. Monte Carlo simulation demonstrates that the proposed confidence interval outperforms the bootstrap one. The Bayes estimate and credible interval for reliability are both satisfactory. Finally, a real example is analyzed to illustrate the application of the proposed methods. - Highlights: • We focus on reliability of Weibull distribution under multiply Type-I censoring. • The proposed confidence interval for the reliability is superior after comparison. • The Bayes estimates with a few expert judgements on reliability are satisfactory. • We specify the cases where the MLEs do not exist and present methods to remedy it. • The distribution of estimate of reliability should be used for accurate estimate.

  10. Use of Gumbel and Weibull functions to model extreme values of diameter distributions in forest stands

    OpenAIRE

    Gorgoso-Varela, J. Javier; Rojo-Alboreca, Alberto

    2014-01-01

Context: Families of the Gumbel (type I), Fréchet (type II) and Weibull (type III) distributions can be combined in the generalized extreme value (GEV) family of distributions. Maximum and minimum values of diameters in forest stands can be used in forest modelling, mainly to define parameters of the functions used in diameter class models, as well as in some practical cases, such as modelling maximum diameters for sawing and processing purposes. Aims: The purpose of ...

  11. Probabilistic analysis of glass elements with three-parameter Weibull distribution

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-10-01

Glass and ceramics present brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used in fitting material resistance results, although the method of its use is not always correct. In this work, the results of a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double-ring tests, are first presented. Then, the finite element models made for each type of test, the adjustment of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, d: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. In summary, this work aims to generalize the use of the three-parameter Weibull function in structural glass elements with stress distributions not analytically described, allowing the proposed probabilistic model to be applied to general loading distributions. (Author)

  12. Dyke thicknesses follow a Weibull distribution controlled by host-rock strength and magmatic overpressure

    Science.gov (United States)

    Krumbholz, M.; Hieronymus, C.; Burchardt, S.; Troll, V. R.; Tanner, D. C.; Friese, N.

    2012-04-01

    Dykes are the primary transport channels of magma through the crust and form large parts of volcanic edifices and the oceanic crust. Their dimensions are primary parameters that control magma transport rates and therefore influence, e.g. the size of fissure eruptions and crustal growth. Since the mechanics of dyke emplacement are essentially similar and independent of the tectonic setting, dyke properties should generally follow the same statistical laws. The measurement of dyke thicknesses is, of all parameters, least affected by censoring and truncation effects and therefore most accurately accessible. Nevertheless, dyke thicknesses have been ascribed to follow many different statistical distributions, such as negative exponential and power law. We tested large datasets of dyke thicknesses from different tectonic settings (mid-ocean ridge, oceanic intra-plate) for different statistical distributions (log-normal, exponential, power law (with fixed or variable lower cut-off), Rayleigh, Chi-square, and Weibull). For this purpose, we first converted the probability density functions of each dataset to cumulative distribution functions, thus avoiding arbitrariness in bin size. A non-linear, least-squares fit was then used to compute the parameter(s) of the distribution function. The goodness-of-fit was evaluated using three methods: (1) the residual sum of squares, (2) the Kolmogorov-Smirnov statistics, and (3) p-values using 10,000 synthetic datasets. The results show that, in general, dyke thickness is best described by a Weibull distribution. This suggests material strength is a function of the dimensions of included weaknesses (e.g. fractures), following the "weakest link of a chain" principle. Our datasets may be further subdivided according to dyke lithology (magma type) and type (regional dyke vs. inclined sheet), which leads to an increasingly better fit of the Weibull distribution. 
Weibull is hence the statistical distribution that universally describes dyke thicknesses.
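The fitting procedure described in this record (convert the data to an empirical CDF, then fit the Weibull CDF by non-linear least squares, and score the fit by the residual sum of squares) can be sketched as follows; the thickness sample is synthetic, standing in for the field measurements:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import weibull_min

rng = np.random.default_rng(42)
# Synthetic "dyke thickness" sample (metres), standing in for field data.
thick = weibull_min.rvs(1.8, scale=2.5, size=500, random_state=rng)

# Empirical CDF: sort the data and assign plotting positions i/(n+1),
# which avoids the arbitrariness of histogram bin sizes.
x = np.sort(thick)
F_emp = np.arange(1, len(x) + 1) / (len(x) + 1)

def weibull_cdf(t, k, lam):
    return 1.0 - np.exp(-(t / lam) ** k)

# Non-linear least-squares fit of the Weibull CDF parameters.
(k_hat, lam_hat), _ = curve_fit(weibull_cdf, x, F_emp, p0=(1.0, 1.0))

# Residual sum of squares, the first goodness-of-fit measure in the record.
rss = float(np.sum((weibull_cdf(x, k_hat, lam_hat) - F_emp) ** 2))
print(round(k_hat, 2), round(lam_hat, 2), round(rss, 4))
```

The Kolmogorov–Smirnov and synthetic-dataset p-value checks mentioned in the record would be applied on top of this fit.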

  13. Study of wind speed using the Weibull distribution: the case of the ...

    African Journals Online (AJOL)

    With the aim of improving electric power production, and above all of exploring sources complementary to existing energy sources, we chose to study the wind speed in Benin. This study was carried out using the Weibull probability distribution in order to calculate the wind power ...

  14. Evaluating wind energy potential in Gorgan–Iran using two methods of Weibull distribution function

    Directory of Open Access Journals (Sweden)

    Mehdi Hashemi-Tilehnoee

    2016-02-01

    Full Text Available In this study, the wind energy characteristics of Gorgan, a city in northeast Iran, measured at 10 m height in 2014, are analyzed. The one-hour recorded data from Gorgan airport were extrapolated to 50 m height. The data have been statistically analyzed hourly, daily, monthly, seasonally and annually to determine the wind power potential. The Weibull distribution function has been used to determine the wind power density and then the potential energy. The standard deviation method and the power density method are used to calculate the scale and shape parameters of the Weibull distribution function. The annual mean wind power calculated by the standard deviation method and the power density method is 38.98 W/m2 and 41.32 W/m2, respectively. Comparison of the results shows that the power density method is better than the standard deviation method. In addition, Gorgan's wind energy potential is categorized into class 1, so it is unsuitable to utilize large wind energy turbines. Article History: Received November 21, 2015; Received in revised form January 15, 2016; Accepted February 10, 2016; Available online. How to Cite This Article: Babayani, D., Khaleghi, M., Tashakor, S., and Hashemi-Tilehnoee, M. (2016) Evaluating wind energy potential in Gorgan–Iran using two methods of Weibull distribution function. Int. Journal of Renewable Energy Development, 5(1), 43-48. http://dx.doi.org/10.14710/ijred.5.1.43-48 
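The two estimators compared in this record can be sketched with the standard textbook formulas; the wind record below is synthetic (not the Gorgan airport data), and the "power density method" is implemented here as the usual energy-pattern-factor estimator:

```python
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(0)
# Synthetic hourly wind speeds for one year (m/s); true shape k = 2, scale c = 6.
v = rng.weibull(2.0, size=8760) * 6.0
vbar, sigma = v.mean(), v.std(ddof=1)

# (1) Standard deviation method: k from the coefficient of variation.
k_sd = (sigma / vbar) ** -1.086
c_sd = vbar / gamma(1 + 1 / k_sd)

# (2) Power density (energy pattern factor) method.
epf = np.mean(v**3) / vbar**3
k_pd = 1 + 3.69 / epf**2
c_pd = vbar / gamma(1 + 1 / k_pd)

# Weibull-based mean wind power density (W/m^2); rho = air density (kg/m^3).
rho = 1.225
P_sd = 0.5 * rho * c_sd**3 * gamma(1 + 3 / k_sd)
P_pd = 0.5 * rho * c_pd**3 * gamma(1 + 3 / k_pd)
print(round(P_sd, 1), round(P_pd, 1))
```

Both estimators recover (k, c) close to the generating values here; on real records with calms and skew they can disagree, which is the comparison the study makes.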

  15. Statistical Analysis and Prediction on Tensile Strength of 316L-SS Joints at High Temperature Based on Weibull Distribution

    Science.gov (United States)

    An, Z. L.; Chen, T.; Cheng, D. L.; Chen, T. H.; Y Wang, Z.

    2017-12-01

    In this work, the average tensile strength of 316L stainless steel joints is statistically analyzed and predicted using the Weibull distribution method. Direct diffusion bonding of 316L-SS was performed at a high temperature of 550°C and 8 tension tests were carried out. The results obtained vary between 87.8 MPa and 160.8 MPa. The probability distribution of material failure is obtained by using the Weibull distribution.
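A common way to carry out such an analysis is median-rank regression on a Weibull plot. The eight strengths below are hypothetical, chosen only to span the reported 87.8–160.8 MPa range; the paper's individual measurements are not given in the abstract:

```python
import numpy as np

# Hypothetical strengths (MPa) spanning the reported 87.8-160.8 MPa range.
s = np.sort(np.array([87.8, 101.0, 112.5, 121.0, 130.2, 141.7, 152.3, 160.8]))
n = len(s)

# Median-rank (Bernard's approximation) failure probabilities.
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Weibull plot: ln(-ln(1-F)) vs ln(sigma) is linear with slope m,
# the Weibull modulus; the intercept gives the characteristic strength.
x = np.log(s)
y = np.log(-np.log(1 - F))
m, intercept = np.polyfit(x, y, 1)
sigma0 = np.exp(-intercept / m)   # strength at 63.2 % failure probability
print(round(m, 2), round(sigma0, 1))
```

The Weibull modulus m quantifies the scatter of the joint strengths; sigma0 is the scale parameter of the fitted distribution.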

  16. Anomalous diffusion and q-Weibull velocity distributions in epithelial cell migration.

    Directory of Open Access Journals (Sweden)

    Tatiane Souza Vilela Podestá

    Full Text Available In multicellular organisms, cell motility is central to all morphogenetic processes, tissue maintenance, wound healing and immune surveillance. Hence, the control of cell motion is a major demand in the creation of artificial tissues and organs. Here, cell migration assays on plastic 2D surfaces involving normal (MDCK) and tumoral (B16F10) epithelial cell lines were performed, varying the initial density of plated cells. Through time-lapse microscopy, quantities such as speed distributions, velocity autocorrelations and spatial correlations, as well as the scaling of mean-squared displacements, were determined. We find that these cells exhibit anomalous diffusion with q-Weibull speed distributions that evolve non-monotonically to a Maxwellian distribution as the initial density of plated cells increases. Although short-ranged spatial velocity correlations mark the formation of small cell clusters, the emergence of collective motion was not observed. Finally, simulation results from a correlated random walk and the Vicsek model of collective dynamics evidence that fluctuations in cell velocity orientations are sufficient to produce the q-Weibull speed distributions seen in our migration assays.

  17. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated

  18. Comparison of Wind Energy Generation Using the Maximum Entropy Principle and the Weibull Distribution Function

    Directory of Open Access Journals (Sweden)

    Muhammad Shoaib

    2016-10-01

    Full Text Available Proper knowledge of the wind characteristics of a site is of fundamental importance in estimating wind energy output from a selected wind turbine. The present paper focuses on assessing the suitability and accuracy of the fitted distribution function to the measured wind speed data for the Baburband site in Sindh, Pakistan. Comparison is made between the wind power densities obtained using the fitted functions based on the Maximum Entropy Principle (MEP) and the Weibull distribution. In the case of the MEP-based function, a system of (N+1) non-linear equations containing (N+1) Lagrange multipliers is defined as the probability density function. The maximum entropy probability density function is calculated for 3–9 low-order moments obtained from measured wind speed data. The annual actual wind power density (PA) is found to be 309.25 W/m2 while the Weibull-based wind power density (PW) is 297.25 W/m2. The MEP-based density for orders 5, 7, 8 and 9 (PE) is 309.21 W/m2, whereas for order 6 it is 309.43 W/m2. To validate the MEP-based function, the results are compared with the Weibull function and the measured data. A Kolmogorov–Smirnov test is performed between the cdf of the measured wind data and the fitted distribution function (Q95 = 0.01457 > Q = 10−4). The test confirms the suitability of the MEP-based function for modeling measured wind speed data and for the estimation of wind energy output from a wind turbine. An R2 test is also performed, giving analogous behavior of the fitted MEP-based pdf to the actual wind speed data (R2 ~ 0.9). The annual energy extracted using the chosen wind turbine based on the Weibull function is PW = 2.54 GWh and that obtained using the MEP-based function is PE = 2.57–2.67 GWh depending on the order of moments.
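The Kolmogorov–Smirnov comparison used above can be sketched as follows for the Weibull side, using a synthetic record and a maximum-likelihood fit. Note that because the parameters are estimated from the same sample, the reported p-value is optimistic; a parametric bootstrap would be needed for a strict test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
v = rng.weibull(1.9, 2000) * 7.0   # synthetic wind speed record (m/s)

# Two-parameter Weibull fit (location fixed at zero) by maximum likelihood.
k, loc, c = stats.weibull_min.fit(v, floc=0)

# Kolmogorov-Smirnov statistic between the empirical and fitted cdfs.
D, p = stats.kstest(v, 'weibull_min', args=(k, 0, c))
print(round(k, 2), round(c, 2), round(D, 4))
```

A small D (well below the 95 % critical value of roughly 1.36/sqrt(n)) indicates the fitted cdf tracks the empirical cdf closely.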

  19. Analysis of wind speed distributions: Wind distribution function derived from minimum cross entropy principles as better alternative to Weibull function

    International Nuclear Information System (INIS)

    Kantar, Yeliz Mert; Usta, Ilhan

    2008-01-01

    In this study, the minimum cross entropy (MinxEnt) principle is applied for the first time to the wind energy field. This principle allows the inclusion of previous information on a wind speed distribution and covers the maximum entropy (MaxEnt) principle, which is also discussed by Li and Li and by Ramirez as special cases in their wind power studies. The MinxEnt probability density function (pdf) derived from the MinxEnt principle is used to determine the diurnal, monthly, seasonal and annual wind speed distributions. A comparison between MinxEnt pdfs defined on the basis of the MinxEnt principle and the Weibull pdf on wind speed data, which are taken from different sources and measured in various regions, is conducted. The wind power densities of the considered regions obtained from the Weibull and MinxEnt pdfs are also compared. The results indicate that the pdfs derived from the MinxEnt principle fit a variety of measured wind speed data better than the conventionally applied empirical Weibull pdf. Therefore, it is shown that the MinxEnt principle can be used as an alternative method to estimate both wind distribution and wind power accurately.

  20. A robust approach based on Weibull distribution for clustering gene expression data

    Directory of Open Access Journals (Sweden)

    Gong Binsheng

    2011-05-01

    Full Text Available Abstract Background Clustering is a widely used technique for analysis of gene expression data. Most clustering methods group genes based on distances, while few methods group genes according to the similarities of the distributions of the gene expression levels. Furthermore, as biological annotation resources have accumulated, an increasing number of genes have been annotated into functional categories. As a result, evaluating the performance of clustering methods in terms of the functional consistency of the resulting clusters is of great interest. Results In this paper, we proposed the WDCM (Weibull Distribution-based Clustering Method), a robust approach for clustering gene expression data, in which the gene expressions of individual genes are considered as random variables following unique Weibull distributions. Our WDCM is based on the concept that genes with similar expression profiles have similar distribution parameters, and thus the genes are clustered via the Weibull distribution parameters. We used the WDCM to cluster three cancer gene expression data sets from lung cancer, B-cell follicular lymphoma and bladder carcinoma and obtained well-clustered results. We compared the performance of WDCM with k-means and Self Organizing Map (SOM) using functional annotation information given by the Gene Ontology (GO). The results showed that the functional annotation ratios of WDCM are higher than those of the other methods. We also utilized the external measure Adjusted Rand Index to validate the performance of the WDCM. The comparative results demonstrate that the WDCM provides better clustering performance than the k-means and SOM algorithms. The merit of the proposed WDCM is that it can be applied to cluster incomplete gene expression data without imputing the missing values. Moreover, the robustness of WDCM is also evaluated on the incomplete data sets. 
Conclusions The results demonstrate that our WDCM produces clusters
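The core idea of WDCM — fit a Weibull to each gene's expression values and cluster genes by the fitted parameters — can be sketched on synthetic data; this is an illustration of the concept, not the authors' implementation:

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(5)
# Two synthetic "gene groups" (25 genes x 40 samples each) with expression
# levels drawn from two different Weibull distributions.
genes = np.vstack([
    weibull_min.rvs(1.5, scale=2.0, size=(25, 40), random_state=rng),
    weibull_min.rvs(4.0, scale=6.0, size=(25, 40), random_state=rng),
])

# Fit a two-parameter Weibull to each gene and keep (shape, scale).
params = np.array([weibull_min.fit(g, floc=0)[::2] for g in genes])

# Cluster the genes in the (shape, scale) parameter plane.
_, labels = kmeans2(params, 2, minit='++', seed=0)
print(labels)
```

Because the clustering operates on the fitted parameters rather than on raw expression vectors, genes with missing samples can still be fitted and clustered, which is the robustness property the record highlights.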

  1. Bayesian Approach for Constant-Stress Accelerated Life Testing for Kumaraswamy Weibull Distribution with Censoring

    Directory of Open Access Journals (Sweden)

    Abeer Abd-Alla EL-Helbawy

    2016-09-01

    Full Text Available Accelerated life tests provide quick information on lifetime distributions by testing materials or products at higher than basic conditional levels of stress, such as pressure, high temperature, vibration, voltage or load, to induce failures. In this paper, the acceleration model assumed is the log linear model. Constant stress tests are discussed based on Type I and Type II censoring. The Kumaraswamy Weibull distribution is used. The estimators of the parameters, reliability, hazard rate functions and p-th percentile at normal condition, low stress, and high stress are obtained. In addition, credible intervals for the parameters of the models are constructed. Optimum test plans are designed. Some numerical methods, such as Laplace and Markov Chain Monte Carlo methods, are used to solve the complicated integrals.

  2. Dependence of Weibull distribution parameters on the CNR threshold in wind lidar data

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph

    2015-01-01

    The increase in height and in the area swept by the blades of wind turbines that harvest energy from the air flow in the lower atmosphere has raised a need for better understanding of the structure of the profiles of the wind, its gusts and the monthly to annual long-term statistical distribution...... in the boundary layer. Observations from tall towers in combination with observations from a lidar of wind speed up to 600 m are used to study the long-term variability of the wind profile over sub-urban, rural, coastal and marine areas. The variability is expressed in terms of the shape parameter in the Weibull...... distribution. When the lidar Carrier to Noise Ratio (CNR) is lower than a threshold value, the observations are often not used, as the uncertainty on the wind speed of the lidar measurements increases. This analysis shows that the mean wind speed is a function of the applied CNR threshold, which indicates...

  3. The discrete additive Weibull distribution: A bathtub-shaped hazard for discontinuous failure data

    International Nuclear Information System (INIS)

    Bebbington, Mark; Lai, Chin-Diew; Wellington, Morgan; Zitikis, Ričardas

    2012-01-01

    Although failure data are usually treated as being continuous, they may have been collected in a discrete manner, or in fact be discrete in nature. Reliability models with bathtub-shaped hazard rate are fundamental to the concepts of burn-in and maintenance, but how well do they incorporate discrete data? We explore discrete versions of the additive Weibull distribution, which has the twin virtues of mathematical tractability and the ability to produce bathtub-shaped hazard rate functions. We derive conditions on the parameters for the hazard rate function to be increasing, decreasing, or bathtub shaped. While discrete versions may have the same shaped hazard rate for the same parameter values, we find that when fitted to data the fitted hazard rate shapes can vary between versions. Our results are illustrated using several real-life data sets, and the implications of using continuous models for discrete data discussed.
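One discrete version of the additive Weibull can be sketched directly from its survival function, combining an increasing-hazard and a decreasing-hazard component to produce a bathtub shape; the parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

# Discrete additive Weibull survival function S(x) = P(X >= x):
# one component with increasing hazard (b1 > 1) and one with
# decreasing hazard (b2 < 1). Illustrative parameter values.
q1, b1 = 0.999, 2.5
q2, b2 = 0.70, 0.4

x = np.arange(0, 60)
S = q1 ** (x ** b1) * q2 ** (x ** b2)

pmf = S[:-1] - S[1:]        # p(x) = S(x) - S(x+1)
h = pmf / S[:-1]            # discrete hazard rate h(x) = p(x)/S(x)
print(h[:3].round(4), h[-3:].round(4))
```

The hazard starts high (early failures from the b2 < 1 component), falls to a flat minimum, and then rises again as the b1 > 1 wear-out component dominates.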

  4. Optimal pricing and lot-sizing decisions under Weibull distribution deterioration and trade credit policy

    Directory of Open Access Journals (Sweden)

    Manna S.K.

    2008-01-01

    Full Text Available In this paper, we consider the problem of simultaneous determination of retail price and lot-size (RPLS) under the assumption that the supplier offers a fixed credit period to the retailer. It is assumed that the item in stock deteriorates over time at a rate that follows a two-parameter Weibull distribution and that the price-dependent demand is represented by a constant-price-elasticity function of retail price. The RPLS decision model is developed and solved analytically. Results are illustrated with the help of a base example. Computational results show that the supplier earns more profit when the credit period is greater than the replenishment cycle length. A sensitivity analysis of the solution to changes in the values of the input parameters of the base example is also discussed.

  5. Wind Speed Analysis using Weibull Distribution in the Region Blang Bintang Aceh Besar

    Directory of Open Access Journals (Sweden)

    Khairiaton Khairiaton

    2016-11-01

    A study of the wind speeds in the Blang Bintang region, Aceh Besar district, has been carried out to assess the potential for wind power installation. The wind speed data were obtained from an anemometer installed in that area. The data were analyzed using the Weibull distribution for the years 2012 to 2015. The results show that the shape parameter (k) is small, with a value around 1.4, and the scale parameter (c) tends to be stable, with a value around 4. Based on the values of k and c, the most probable wind speed in 2012 was 1 m/s with a probability of 15%; in 2013 and 2014 it was 0.5 m/s with probabilities of 21% and 19%, respectively; while for 2015 it was 1 m/s with a probability of 17%.

  6. A Weibull distribution with power-law tails that describes the first passage time processes of foreign currency exchanges

    Science.gov (United States)

    Sazuka, Naoya; Inoue, Jun-Ichi

    2007-03-01

    A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorenz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate the average waiting time, which is an important measure to estimate the time customers must wait until the next price change after they log in to their computer systems. By assuming that the first passage time distribution might change its shape from the Weibull to the power law at some critical time, we evaluate the average waiting time by means of the renewal-reward theorem. We find that our correction of the tails of the distribution makes the average waiting time much closer to the value obtained from empirical data analysis. We also discuss the deviation from the estimated average waiting time by deriving the waiting time distribution directly. These results lead us to conclude that the first passage process of foreign currency exchange rates is well described by a Weibull distribution with power-law tails.
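The Gini coefficient of a Weibull distribution has a closed form depending only on the shape parameter, G = 1 − 2^(−1/k), and is easy to verify numerically from the sample Lorenz ordering; the shape value below is illustrative, not the paper's fitted value:

```python
import numpy as np
from scipy.stats import weibull_min

k = 0.6   # illustrative shape parameter

# Closed-form Gini coefficient of a Weibull distribution.
G_analytic = 1 - 2.0 ** (-1.0 / k)

# Numerical check: sample Gini from the sorted-sample formula.
x = np.sort(weibull_min.rvs(k, size=200_000, random_state=0))
n = len(x)
i = np.arange(1, n + 1)
G_sample = 2 * np.sum(i * x) / (n * x.sum()) - (n + 1) / n
print(round(G_analytic, 4), round(G_sample, 4))
```

Because the Gini coefficient is scale-free, the comparison with empirical first-passage-time data tests only the shape of the distribution.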

  7. Weibull Distribution for Estimating the Parameters and Application of Hilbert Transform in case of a Low Wind Speed at Kolaghat

    Directory of Open Access Journals (Sweden)

    P Bhattacharya

    2016-09-01

    Full Text Available The wind resource varies with the time of day and the season of the year, and even to some extent from year to year. Wind energy has inherent variability and hence it has been expressed by distribution functions. In this paper, we present some methods for estimating the Weibull parameters in the case of a low wind speed characterization, namely the shape parameter (k) and the scale parameter (c), and we characterize the discrete wind data sample by the discrete Hilbert transform. The Weibull distribution is an important distribution, especially for reliability and maintainability analysis. Suitable values for both the shape and scale parameters of the Weibull distribution are important for selecting locations for installing wind turbine generators. The scale parameter of the Weibull distribution is also important in determining whether a wind farm is good or not. The use of the discrete Hilbert transform (DHT) for wind speed characterization opens a new area of application for the DHT besides digital signal processing. In this paper, the discrete Hilbert transform has been applied to characterize the wind sample data measured at the College of Engineering and Management, Kolaghat, East Midnapore, India in January 2011.

  8. Two Types of Distributed CFAR Detection Based on Weighting Functions in Fusion Center for Weibull Clutter

    Directory of Open Access Journals (Sweden)

    Amir Zaimbashi

    2013-01-01

    Full Text Available Two types of distributed constant false alarm rate (CFAR) detection using binary and fuzzy weighting functions in the fusion center are developed. In both types of distributed detectors, it is assumed that the clutter parameters at the local sensors are unknown and that each local detector performs CFAR processing based on ML and OS CFAR processors before transmitting data to the fusion center. At the fusion center, the received data are weighted either by a binary or by a fuzzy weighting function and combined according to deterministic rules, constructing global test statistics. Moreover, for Weibull clutter, the expressions of the weighting functions, based on ML and OS CFAR processors in the local detectors, are obtained. In the binary type, we analyze various distributed detection schemes based on maximum, minimum, and summation rules in the fusion center. In the fuzzy type, we consider various distributed detectors based on algebraic product, algebraic sum, probabilistic OR, and Lukasiewicz t-conorm fuzzy rules in the fusion center. The performance of the two types of distributed detectors is analyzed and compared in homogeneous and nonhomogeneous situations, with multiple targets, or at clutter edges. The simulation results indicate the superior and robust performance of the fuzzy type in both homogeneous and nonhomogeneous situations.

  9. Accurate bearing remaining useful life prediction based on Weibull distribution and artificial neural network

    Science.gov (United States)

    Ben Ali, Jaouher; Chebel-Morello, Brigitte; Saidi, Lotfi; Malinowski, Simon; Fnaiech, Farhat

    2015-05-01

    Accurate remaining useful life (RUL) prediction of critical assets is an important challenge in condition-based maintenance to improve reliability and decrease machine breakdowns and maintenance costs. Bearings are among the most important components in industry which need to be monitored, and the user should predict their RUL. The challenge of this study is to propose an original feature able to evaluate the health state of bearings and to estimate their RUL by Prognostics and Health Management (PHM) techniques. In this paper, the proposed method is based on the data-driven prognostic approach. The combination of the Simplified Fuzzy Adaptive Resonance Theory Map (SFAM) neural network and the Weibull distribution (WD) is explored. The WD is used only in the training phase, to fit the measurements and to avoid areas of fluctuation in the time domain. The SFAM training process is based on fitted measurements at the present and previous inspection time points as input, whereas the SFAM testing process is based on real measurements at the present and previous inspections. Thanks to the fuzzy learning process, SFAM has an important ability and a good performance in learning nonlinear time series. As output, seven classes are defined: a healthy bearing and six states of bearing degradation. In order to find the optimal RUL prediction, a smoothing phase is proposed in this paper. Experimental results show that the proposed method can reliably predict the RUL of rolling element bearings (REBs) based on vibration signals. The proposed prediction approach can be applied to the prognostics of various other mechanical assets.

  10. Average capacity for optical wireless communication systems over exponentiated Weibull distribution non-Kolmogorov turbulent channels.

    Science.gov (United States)

    Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng

    2014-06-20

    We model the average channel capacity of optical wireless communication systems for cases of weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L changes from 0 to 200 m, and decreases slowly or tends to a stable value when the propagation length L is greater than 200 m. In the weak turbulence region, by increasing the detection aperture we can improve the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is higher in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels.
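As a sketch, the exponentiated Weibull density used in such channel models can be written down and sanity-checked numerically; the parameter values are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.integrate import quad

# Exponentiated Weibull pdf with shape alpha, Weibull shape beta, scale eta.
# Illustrative parameter values only.
alpha, beta, eta = 2.3, 1.1, 0.9

def ew_pdf(h):
    z = (h / eta) ** beta
    return (alpha * beta / eta) * (h / eta) ** (beta - 1) \
           * np.exp(-z) * (1 - np.exp(-z)) ** (alpha - 1)

# Sanity checks: the density integrates to one and has a finite mean.
total, _ = quad(ew_pdf, 0, np.inf)
mean, _ = quad(lambda h: h * ew_pdf(h), 0, np.inf)
print(round(total, 6), round(mean, 4))
```

With alpha = 1 the density reduces to the ordinary Weibull, which is why the exponentiated form can cover the weak-to-strong turbulence regimes with one family.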

  11. Power Loss Analysis for Wind Power Grid Integration Based on Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Ahmed Al Ameri

    2017-04-01

    Full Text Available The growth of electrical demand increases the need for renewable energy sources, such as wind energy, to meet that need. Electrical power losses are an important factor when wind farm location and size are selected. The capitalized cost of constant power losses during the life of a wind farm can reach high levels. During the operation period, a method to determine whether the losses meet the design requirements is greatly needed. This article presents a Simulink simulation of wind farm integration into the grid; the aim is to achieve a better understanding of the impact of wind variation on grid losses. The real power losses are set as a function of the annual variation, considering a Weibull distribution. An analytical method has been used to select the size and placement of a wind farm, taking into account active power loss reduction. It proposes a fast linear model estimation to find the optimal capacity of a wind farm based on DC power flow and graph theory. The results show that the analytical approach is capable of predicting the optimal size and location of wind turbines. Furthermore, it revealed that the annual variation of wind speed can have a strong effect on real power loss calculations. In addition to helping to improve utility efficiency, the proposed method can inform specific designs to speed up the integration of wind farms into grids.

  12. MEP family of wind speed distribution function and comparison with the empirical Weibull distribution. Paper no. IGEC-1-156

    International Nuclear Information System (INIS)

    Li, M.; Li, X.

    2005-01-01

    The probabilistic distribution of wind speed is one of the important wind characteristics for the assessment of wind energy potential and for the performance of wind energy conversion systems, as well as for the structural and environmental design and analysis. In this study, an exponential family of distribution functions has been developed for the description of the probabilistic distribution of wind speed, and comparison with the wind speed data taken from different sources and measured at different geographical locations in the world has been made. This family of distributions is developed by introducing a pre-exponential term to the theoretical distribution derived from the Maximum Entropy Principle (MEP). The statistical analysis parameter based on the wind power density is used as the suitability judgement for the distribution functions. It is shown that the MEP-type distributions not only agree better with a variety of the measured wind speed data than the conventionally used empirical Weibull distribution, but also can represent the wind power density much more accurately. Therefore, the MEP-type distributions are more suitable for the assessment of the wind energy potential and the performance of wind energy conversion systems. (author)

  13. Statistical analysis of wind speed using two-parameter Weibull distribution in Alaçatı region

    International Nuclear Information System (INIS)

    Ozay, Can; Celiktas, Melih Soner

    2016-01-01

    Highlights: • Wind speed & direction data from September 2008 to March 2014 have been analyzed. • The mean wind speed for the whole data set has been found to be 8.11 m/s. • The highest wind speed is observed in July, with a monthly mean value of 9.10 m/s. • The wind speed carrying the most energy has been calculated as 12.77 m/s. • The observed data have been fit to a Weibull distribution and the k & c parameters have been calculated as 2.05 and 9.16. - Abstract: The Weibull statistical distribution is a common method for analyzing wind speed measurements and determining wind energy potential. The Weibull probability density function can be used to forecast wind speed, wind density and wind energy potential. In this study a two-parameter Weibull statistical distribution is used to analyze the wind characteristics of the Alaçatı region, located in Çeşme, İzmir. The data used in the density function were acquired from a wind measurement station in Alaçatı. Measurements were gathered at three different heights (70, 50 and 30 m) at 10-min intervals for five and a half years. As a result of this study, the wind speed frequency distribution, wind direction trends, mean wind speed, and the shape and scale (k & c) Weibull parameters have been calculated for the region. The mean wind speed for the entirety of the data set is found to be 8.11 m/s, and the k & c parameters are found to be 2.05 and 9.16, respectively. A wind direction analysis along with a wind rose graph for the region is also provided. The analysis suggests that higher wind speeds, ranging from 6 to 12 m/s, are prevalent in the sectors between 340 and 360°. Lower wind speeds, from 3 to 6 m/s, occur in the sectors between 10 and 29°. The results of this study contribute to the general knowledge of the region's wind energy potential and can be used as a source for investors and academics.
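From the reported parameters k = 2.05 and c = 9.16, the study's headline figures follow from the standard Weibull characteristic-speed formulas:

```python
import math

k, c = 2.05, 9.16  # Weibull shape and scale reported for Alaçatı

# Mean wind speed: v_mean = c * Gamma(1 + 1/k)
v_mean = c * math.gamma(1 + 1 / k)

# Most probable wind speed: v_mp = c * ((k - 1)/k)**(1/k)
v_mp = c * ((k - 1) / k) ** (1 / k)

# Wind speed carrying the maximum energy: v_maxE = c * (1 + 2/k)**(1/k)
v_maxE = c * (1 + 2 / k) ** (1 / k)

print(round(v_mean, 2), round(v_mp, 2), round(v_maxE, 2))
```

The computed mean (8.11 m/s) and energy-carrying speed (12.77 m/s) match the values reported in the highlights, confirming the internal consistency of the fitted parameters.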

  14. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Directory of Open Access Journals (Sweden)

    Jose Javier Gorgoso-Varela

    2016-04-01

    Full Text Available Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass.

  15. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Energy Technology Data Exchange (ETDEWEB)

    Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.

    2016-07-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)

  16. A study of two estimation approaches for parameters of Weibull distribution based on WPP

    International Nuclear Information System (INIS)

    Zhang, L.F.; Xie, M.; Tang, L.C.

    2007-01-01

Least-squares estimation (LSE) based on the Weibull probability plot (WPP) is the most basic method for estimating the Weibull parameters. The common procedure is to use the least-squares regression of Y on X, i.e. minimizing the sum of squares of the vertical residuals, to fit a straight line to the data points on the WPP and then compute the LS estimators. This method is known to be biased. In the existing literature, the least-squares regression of X on Y, i.e. minimizing the sum of squares of the horizontal residuals, has also been used by Weibull researchers. This motivated us to compare the estimators of the two LS regression methods using intensive Monte Carlo simulations. Both complete and censored data are examined. Surprisingly, the results show that LS Y on X performs better for small, complete samples, while LS X on Y performs better in the other cases in terms of estimator bias. The two methods are also compared in terms of other model statistics. In general, when the shape parameter is less than one, LS Y on X provides a better model; otherwise, LS X on Y tends to be better.
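The two regressions compared above can be sketched in a few lines. This is a minimal illustration, not the authors' simulation code; Bernard's median-rank approximation is assumed for the plotting positions:

```python
import math
import random

def wpp_points(sample):
    """Coordinates on the Weibull probability plot, using Bernard's
    median-rank approximation F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(sample)
    n = len(xs)
    return [(math.log(x), math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))))
            for i, x in enumerate(xs, start=1)]

def fit_y_on_x(pts):
    """LS regression of Y on X (vertical residuals):
    slope = shape k, intercept = -k * ln(scale)."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    k = (sum((x - mx) * (y - my) for x, y in pts)
         / sum((x - mx) ** 2 for x, _ in pts))
    return k, math.exp(mx - my / k)

def fit_x_on_y(pts):
    """LS regression of X on Y (horizontal residuals): slope = 1/k."""
    b, _ = fit_y_on_x([(y, x) for x, y in pts])  # swap axes, reuse OLS
    k = 1.0 / b
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    return k, math.exp(mx - my / k)

random.seed(1)
data = [random.weibullvariate(2.0, 1.5) for _ in range(50)]  # scale 2, shape 1.5
pts = wpp_points(data)
print(fit_y_on_x(pts))
print(fit_x_on_y(pts))
```

Repeating the experiment over many samples, as in the paper, would show the bias difference between the two estimators.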

  17. Influence of the Determination Methods of K and C Parameters on the Ability of Weibull Distribution to Suitably Estimate Wind Potential and Electric Energy

    Directory of Open Access Journals (Sweden)

    Ruben M. Mouangue

    2014-05-01

Full Text Available The modeling of the wind speed distribution is of great importance for the assessment of wind energy potential and the performance of wind energy conversion systems. In this paper, the choice between two methods for determining the Weibull parameters is shown to influence the performance of the Weibull distribution. Because calm winds are frequent at the site of Ngaoundere airport, we characterize the wind potential using a Weibull distribution whose parameters are determined by the modified maximum likelihood method. This approach is compared to the Weibull distribution with parameters determined by the maximum likelihood method, and to the hybrid distribution, which is recommended for assessing the wind potential of sites with a nonzero probability of calm. Using data provided by the ASECNA Weather Service (Agency for the Safety of Air Navigation in Africa and Madagascar), we evaluate the goodness of fit of the various fitted distributions to the wind speed data using Q-Q plots, Pearson's coefficient of correlation, the mean wind speed, the mean square error, the energy density and its relative error. The results show that the accuracy of the Weibull distribution with parameters determined by the modified maximum likelihood method is higher than that of the others. This approach is then used to estimate the monthly and annual energy production of the Ngaoundere airport site. The largest energy contribution is made in March, with 255.7 MWh. The results also show that a wind turbine generator installed on this particular site could not operate for at least half of the time because of the high frequency of calm. For this kind of site, the modified maximum likelihood method proposed by Seguro and Lambert in 2000 is one of the best methods for determining the Weibull parameters.
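The maximum likelihood fit referred to in this record reduces to a one-dimensional fixed-point iteration for the shape parameter k, after which the scale c follows in closed form. Below is a minimal sketch of the standard MLE on the nonzero speeds; the modified method of Seguro and Lambert, which applies the same equation to frequency-binned data with calms handled separately, is not reproduced here:

```python
import math
import random

def weibull_mle(speeds, tol=1e-8):
    """Standard maximum-likelihood fit of (k, c) to wind speeds.
    Calm (zero) readings are dropped first. Solves the fixed-point
    equation 1/k = sum(v^k ln v)/sum(v^k) - mean(ln v), then
    c = (mean(v^k))^(1/k)."""
    v = [x for x in speeds if x > 0]
    n = len(v)
    logs = [math.log(x) for x in v]
    mean_log = sum(logs) / n
    k = 2.0                                   # starting guess
    for _ in range(200):
        vk = [x ** k for x in v]
        num = sum(a * b for a, b in zip(vk, logs))
        k_new = 1.0 / (num / sum(vk) - mean_log)
        if abs(k_new - k) < tol:
            k = k_new
            break
        k = k_new
    c = (sum(x ** k for x in v) / n) ** (1.0 / k)
    return k, c

random.seed(7)
wind = [random.weibullvariate(4.0, 2.0) for _ in range(2000)]  # c = 4 m/s, k = 2
k_hat, c_hat = weibull_mle(wind)
print(k_hat, c_hat)
```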

  18. A flexible Weibull extension

    International Nuclear Information System (INIS)

    Bebbington, Mark; Lai, C.-D.; Zitikis, Ricardas

    2007-01-01

    We propose a new two-parameter ageing distribution which is a generalization of the Weibull and study its properties. It has a simple failure rate (hazard rate) function. With appropriate choice of parameter values, it is able to model various ageing classes of life distributions including IFR, IFRA and modified bathtub (MBT). The ranges of the two parameters are clearly demarcated to separate these classes. It thus provides an alternative to many existing life distributions. Details of parameter estimation are provided through a Weibull-type probability plot and maximum likelihood. We also derive explicit formulas for the turning points of the failure rate function in terms of its parameters. This, combined with the parameter estimation procedures, will allow empirical estimation of the turning points for real data sets, which provides useful information for reliability policies

  19. PERFORMANCE ANALYSIS OF METHODS FOR ESTIMATING WEIBULL PARAMETERS FOR WIND SPEED DISTRIBUTION IN THE DISTRICT OF MAROUA

    Directory of Open Access Journals (Sweden)

    D. Kidmo Kaoga

    2014-12-01

Full Text Available In this study, five numerical methods for determining Weibull distribution parameters, namely the maximum likelihood method (MLM), the modified maximum likelihood method (MMLM), the energy pattern factor method (EPF), the graphical method (GM), and the empirical method (EM), were explored using hourly synoptic data collected from 1985 to 2013 in the district of Maroua in Cameroon. The performance analysis revealed that the MLM was the most accurate model, followed by the EPF and the GM. Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average, compared to -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM is recommended for estimating the scale and shape parameters for an accurate and efficient wind energy potential evaluation.
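Two of the methods compared above have simple closed forms. The sketch below assumes the standard approximations k = (sigma/v_mean)**(-1.086) for the empirical method and k = 1 + 3.69/Epf**2 for the energy pattern factor method, with c = v_mean / Gamma(1 + 1/k) in both cases:

```python
import math
import random

def empirical_method(v):
    """Empirical (moment-based) method: k from the coefficient of variation."""
    n = len(v)
    mean = sum(v) / n
    sd = (sum((x - mean) ** 2 for x in v) / (n - 1)) ** 0.5
    k = (sd / mean) ** -1.086
    c = mean / math.gamma(1 + 1 / k)
    return k, c

def energy_pattern_factor_method(v):
    """EPF method: k from the ratio of the mean cubed speed to the cubed mean."""
    n = len(v)
    mean = sum(v) / n
    epf = (sum(x ** 3 for x in v) / n) / mean ** 3   # energy pattern factor
    k = 1 + 3.69 / epf ** 2
    c = mean / math.gamma(1 + 1 / k)
    return k, c

random.seed(3)
wind = [random.weibullvariate(5.0, 1.8) for _ in range(5000)]  # c = 5 m/s, k = 1.8
print(empirical_method(wind))
print(energy_pattern_factor_method(wind))
```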

  20. Modeling the kinetics of cobalt Fischer-Tropsch catalyst deactivation trends through an innovative modified Weibull distribution.

    Science.gov (United States)

    Khorashadizadeh, Mahdi; Atashi, Hossein

    2017-07-26

Since the increase in clean energy demand is driven by environmental concerns, energy management is an everlasting global issue. Among the different scenarios for energy production, the catalytic route through the well-known Fischer-Tropsch synthesis provides beneficial consequences, including pollution reduction and economic efficiency, among others. In this regard, catalyst stability must be taken into account as a crucial performance parameter, especially in the expensive cobalt-catalyzed CO hydrogenation processes. As catalyst deactivation seems to be inevitable in catalytic processes, deactivation issues such as its extent, failure rate, or reactivation significantly influence the exploration, development, design, and operation of commercial processes. Accordingly, the deactivation trend of a cobalt-based catalyst was modeled via an innovative modified Weibull distribution, which presents a significant advance over the existing macroscopic deactivation models. The model parameters yield informative equations that provide valuable information about the catalyst lifetime, which can be used as a predictive tool for industrial control purposes.

  1. Applying Weibull Distribution and Discriminant Function Techniques to Predict Damage Cup Anemometers in the 2011 PHM Competition

    Directory of Open Access Journals (Sweden)

    Joshua Cassity

    2012-01-01

Full Text Available Cup anemometers are frequently employed in the wind power industry for wind resource assessment at prospective wind farm sites. In this paper, we demonstrate a method for identifying faulty three-cup anemometers. The method is applicable to cases where data are available from two or more anemometers at equal height, and to cases where data are available from anemometers at different heights. It is based on examining the Weibull parameters of the distribution generated from the differences between the anemometers' reported measurements, and on using a discriminant function technique to separate out the data corresponding to bad cup anemometers. For anemometers at different heights, only data from the same height-pair combinations are compared. In addition, various preprocessing techniques are discussed to improve the performance of the algorithm. These include removing data corresponding to wind directions that are unsuitable for comparing the anemometers and removing data corresponding to frozen anemometers. These methods are applied to the data from the PHM 2011 Data Competition, and results are presented.

  2. Redundancy allocation problem of a system with increasing failure rates of components based on Weibull distribution: A simulation-based optimization approach

    International Nuclear Information System (INIS)

    Guilani, Pedram Pourkarim; Azimi, Parham; Niaki, S.T.A.; Niaki, Seyed Armin Akhavan

    2016-01-01

The redundancy allocation problem (RAP) is a useful method to enhance system reliability. In most works involving RAP, failure rates of the system components are assumed to follow either exponential or k-Erlang distributions. In real-world problems, however, many systems have components with increasing failure rates. This means that, as time passes, the failure rates of the system components increase in comparison to their initial failure rates. In this paper, the redundancy allocation problem of a series–parallel system with components having an increasing failure rate based on the Weibull distribution is investigated. An optimization method via simulation is proposed for modeling, and a genetic algorithm is developed to solve the problem. - Highlights: • The redundancy allocation problem of a series–parallel system is addressed. • Components possess an increasing failure rate based on the Weibull distribution. • An optimization method via simulation is proposed for modeling. • A genetic algorithm is developed to solve the problem.

  3. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    Science.gov (United States)

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distribution of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9 and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to the Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139; m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05). With censored data, higher mean fracture loads were obtained than in the complete-data analysis (GC Initial ZR: μ=1039, σ=152, VITA VM9: μ=1170, σ=166). According to Weibull distributed data, VITA VM9 showed significantly higher fracture load (s=1228, m=9.4) than those of other groups. Both classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  4. Maximum likelihood estimation of the parameters of a bivariate Gaussian-Weibull distribution from machine stress-rated data

    Science.gov (United States)

    Steve P. Verrill; David E. Kretschmann; James W. Evans

    2016-01-01

Two important wood properties are stiffness (modulus of elasticity, MOE) and bending strength (modulus of rupture, MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of wood...

  5. Characterization of the wind behavior in Botucatu-SP region (Brazil) by Weibull distributing; Caracterizacao do comportamento eolico da regiao de Botucatu-SP atraves da distribuicao de Weibull

    Energy Technology Data Exchange (ETDEWEB)

    Gabriel Filho, Luis Roberto Almeida [Universidade Estadual Paulista (CE/UNESP), Tupa, SP (Brazil). Coordenacao de Estagio; Cremasco, Camila Pires [Faculdade de Tecnologia de Presidente Prudente, SP (Brazil); Seraphim, Odivaldo Jose [Universidade Estadual Paulista (FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Faculdade de Engenharia

    2008-07-01

The wind behavior of a region can be described by frequency distributions that provide information and characteristics needed for a possible deployment of wind energy harvesting in the region. These characteristics, such as the annual average speed, the variance and standard deviation of the registered speeds, and the hourly average wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn should be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. Once this function has been determined, all the wind characteristics mentioned above can be determined accurately. The objective of this work is to characterize the wind behavior in the region of Botucatu-SP and to determine the energy potential for the implementation of wind turbines. For the development of the present research, a Young Wind Monitor anemometer (Campbell) was installed at a height of 10 meters. The experiment was developed at the Nucleus of Alternative and Renewable Energies (NEAR) of the Laboratory of Agricultural Energy of the Department of Agricultural Engineering of UNESP, Faculty of Agronomic Sciences, Lageado Experimental Farm, located in the city of Botucatu - SP. The geographic location is defined by the coordinates 22 deg 51' South latitude (S) and 48 deg 26' West longitude (W), at an average altitude of 786 meters above sea level. The analysis was carried out using wind speed records from September 2004 to September 2005. After the frequency distribution of the hourly average wind speed was determined, the associated Weibull function was determined, making possible the determination of the annual average wind speed (2.77 m/s), of the standard deviation of the registered speeds (0.55 m/s), of the

  6. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi-logarithmic coordinates. Some also exhibited what appeared to be a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature, at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
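The Weibullian survival model discussed in this record is commonly written log10 S(t) = -b * t**n. On noiseless isothermal data its parameters can be recovered by a log-log linearization; this is a common shortcut for illustration, not necessarily the authors' exact fitting procedure:

```python
import math

def fit_weibullian(times, log10_survival):
    """Recover (b, n) in log10 S(t) = -b * t**n via the linearization
    ln(-log10 S) = ln(b) + n * ln(t) and ordinary least squares."""
    X = [math.log(t) for t in times]
    Y = [math.log(-s) for s in log10_survival]
    m = len(X)
    mx, my = sum(X) / m, sum(Y) / m
    n = (sum((x - mx) * (y - my) for x, y in zip(X, Y))
         / sum((x - mx) ** 2 for x in X))
    b = math.exp(my - n * mx)
    return b, n

# synthetic isothermal curve: downward concavity on semi-log axes (n > 1)
ts = [float(t) for t in range(1, 11)]
curve = [-0.05 * t ** 1.6 for t in ts]
print(fit_weibullian(ts, curve))
```

A shape power n > 1 produces the downward concavity described in the abstract; a 'shoulder' corresponds to a large n relative to b.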

  7. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning.

    Science.gov (United States)

    Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong

    2016-06-29

The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities, on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, are composed of a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image-statistical-modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers with complementary natures in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it was compared with commonly used methods and showed superior performance, which lays a foundation for the quality control of GPs on assembly lines.

  8. A Study on the Effect of Nudging on Long-Term Boundary Layer Profiles of Wind and Weibull Distribution Parameters in a Rural Coastal Area

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier

    2013-01-01

By use of 1 yr of measurements performed with a wind lidar up to 600-m height, in combination with a tall meteorological tower, the impact of nudging on the simulated wind profile at a flat coastal site (Høvsøre) in western Denmark using the Advanced Research version of the Weather Research and Forecasting (WRF) model is studied. Nudging is found to reduce the scatter between the simulated and measured wind speeds, expressed by the root-mean-square error, by about 20% between altitudes of 100 and 500 m. The root-mean-square error was nearly constant with height for the nudged case (~2.2 m s−1) and slightly increased with height for the non-nudged one, reaching 2.8 m s−1 at 300 and 500 m. In studying the long-term wind speed variability with the Weibull distribution, it was found that nudging had a minor effect on the scale parameter profile, which is closely connected to the mean wind speed. Improvement by nudging was seen in the profile of the shape parameter...

  9. Weibull and lognormal Taguchi analysis using multiple linear regression

    International Nuclear Information System (INIS)

    Piña-Monarrez, Manuel R.; Ortiz-Yañez, Jesús F.

    2015-01-01

The paper provides reliability practitioners with a method (1) to estimate the robust Weibull family when the Taguchi method (TM) is applied, (2) to estimate the normal operational Weibull family in an accelerated life testing (ALT) analysis to give confidence to the extrapolation and (3) to perform the ANOVA analysis on both the robust and the normal operational Weibull families. On the other hand, because the Weibull distribution neither has the normal additive property nor has a direct relationship with the normal parameters (µ, σ), the issues of estimating a Weibull family by using a design of experiments (DOE) are first addressed by using an L9(3^4) orthogonal array (OA) in both the TM and the Weibull proportional hazard model (WPHM) approach. Then, by using the Weibull/Gumbel and the lognormal/normal relationships and multiple linear regression, the direct relationships between the Weibull and the lifetime parameters are derived and used to formulate the proposed method. Moreover, since the derived direct relationships always hold, the method is generalized to the lognormal and ALT analysis. Finally, the method's efficiency is shown through its application to the used OA and to a set of ALT data. - Highlights: • It gives the statistical relations and steps to use the Taguchi method (TM) to analyze Weibull data. • It gives the steps to determine the unknown Weibull family for both the robust TM setting and the normal ALT level. • It gives a method to determine the expected lifetimes and to perform their ANOVA analysis in TM and ALT analysis. • It gives a method to give confidence to the extrapolation in an ALT analysis by using the Weibull family of the normal level.
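The Weibull/Gumbel relationship this record relies on is the standard fact that if X is Weibull with shape k and scale lam, then ln X follows a minimum-type Gumbel distribution with location ln(lam) and scale 1/k. A quick moment-based check of this relationship (illustrative only, not the paper's regression method):

```python
import math
import random

# If X ~ Weibull(shape k, scale lam), then ln X ~ Gumbel(min) with
# location mu = ln(lam) and scale beta = 1/k, so
#   E[ln X]  = mu - gamma_E * beta
#   Var[ln X] = (pi**2 / 6) * beta**2
random.seed(0)
k_true, lam_true = 2.5, 10.0
logs = [math.log(random.weibullvariate(lam_true, k_true)) for _ in range(20000)]
n = len(logs)
mean = sum(logs) / n
var = sum((x - mean) ** 2 for x in logs) / (n - 1)

gamma_e = 0.57721566490153286          # Euler-Mascheroni constant
beta_hat = math.sqrt(6.0 * var) / math.pi
mu_hat = mean + gamma_e * beta_hat
print(1.0 / beta_hat, math.exp(mu_hat))  # moment-based (k, lam) estimates
```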

  10. arXiv Describing dynamical fluctuations and genuine correlations by Weibull regularity

    CERN Document Server

    Nayak, Ranjit K.; Sarkisyan-Grinbaum, Edward K.; Tasevsky, Marek

The Weibull parametrization of the multiplicity distribution is used to describe the multidimensional local fluctuations and genuine multiparticle correlations measured by OPAL in the large-statistics $e^{+}e^{-} \\to Z^{0} \\to hadrons$ sample. The data are found to be well reproduced by the Weibull model up to higher orders. The Weibull predictions are compared to the predictions of two other models, namely the negative binomial and modified negative binomial distributions, which mostly failed to fit the data. The Weibull regularity, which is found to reproduce the multiplicity distributions along with the genuine correlations, appears to be the optimal model to describe the multiparticle production process.

  11. Comparison of Weibull and Probit Analysis in Toxicity Testing of ...

    African Journals Online (AJOL)

    HP

    relationships to assess the toxic effects of chemical substances. These models range from very simple models to extremely complicated models for which the eventual functional forms cannot be easily expressed as single equations. Specifically, these models are (i) tolerance distribution models: log-probit, probit, Weibull, ...

  12. SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL

    Directory of Open Access Journals (Sweden)

    Jenq-Daw Lee

    2008-07-01

Full Text Available A trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can occur only after other events. The Stanford Heart Transplant data are reanalyzed using a trivariate Weibull distribution model with the proposed survival function.

  13. Optimization of Weibull deteriorating items inventory model under ...

    Indian Academy of Sciences (India)

In this study, we discuss the development of an inventory model in which the deterioration rate of the item follows a two-parameter Weibull distribution under the effect of selling-price- and time-dependent demand, since not only the selling price but also time is a crucial factor in enhancing demand in the market as ...

  14. Caracterização analítica e geométrica da metodologia geral de determinação de distribuições de Weibull para o regime eólico e suas aplicações Analytical and geometric characterization of general methodology of determination of Weibull distribution for wind regime and its applications

    Directory of Open Access Journals (Sweden)

    Luís R. A Gabriel Filho

    2011-02-01

Full Text Available The wind regime of a region can be described by frequency distributions that provide information and characteristics extremely necessary for a possible deployment of wind energy capture systems in the region and consequent applications in rural areas and remote regions. These characteristics, such as the annual average speed, the variance of the registered speeds and the hourly average wind power density, can be obtained from the frequency of occurrence of a given speed, which in turn should be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, which can be determined by numerical methods and linear regressions. The objective of this work is to characterize, analytically and geometrically, all the methodological procedures necessary to carry out a complete characterization of the wind regime of a region and its applications in the region of Botucatu - SP, aiming to determine the energy potential for the implementation of wind turbines. It was thus possible to establish theorems related to the characterization of the wind regime, setting out a concise analytical methodology for defining the wind parameters of any region to be studied. For the development of this research, a CAMPBELL anemometer was used.

  15. Uncertainty Evaluation of Weibull Estimators through Monte Carlo Simulation: Applications for Crack Initiation Testing

    Directory of Open Access Journals (Sweden)

    Jae Phil Park

    2016-06-01

    Full Text Available The typical experimental procedure for testing stress corrosion cracking initiation involves an interval-censored reliability test. Based on these test results, the parameters of a Weibull distribution, which is a widely accepted crack initiation model, can be estimated using maximum likelihood estimation or median rank regression. However, it is difficult to determine the appropriate number of test specimens and censoring intervals required to obtain sufficiently accurate Weibull estimators. In this study, we compare maximum likelihood estimation and median rank regression using a Monte Carlo simulation to examine the effects of the total number of specimens, test duration, censoring interval, and shape parameters of the true Weibull distribution on the estimator uncertainty. Finally, we provide the quantitative uncertainties of both Weibull estimators, compare them with the true Weibull parameters, and suggest proper experimental conditions for developing a probabilistic crack initiation model through crack initiation tests.
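The kind of Monte Carlo study described here can be sketched compactly: simulate Weibull failure times, degrade them to censoring-interval midpoints, fit by median rank regression, and summarize the estimator spread. Midpoint imputation is a simplification of the interval-censored likelihood the paper uses, so this is only an illustrative sketch:

```python
import math
import random

def mrr_fit(sample):
    """Median-rank-regression fit of (shape, scale) on the Weibull plot."""
    xs = sorted(sample)
    n = len(xs)
    X = [math.log(x) for x in xs]
    Y = [math.log(-math.log(1 - (i - 0.3) / (n + 0.4))) for i in range(1, n + 1)]
    mx, my = sum(X) / n, sum(Y) / n
    shape = (sum((x - mx) * (y - my) for x, y in zip(X, Y))
             / sum((x - mx) ** 2 for x in X))
    scale = math.exp(mx - my / shape)
    return shape, scale

def mc_uncertainty(n_spec=20, interval=0.2, shape=2.0, scale=1.0, reps=500):
    """Monte Carlo spread of the shape estimator when failure times are
    only observed to the nearest censoring interval (midpoint imputation)."""
    random.seed(42)
    est = []
    for _ in range(reps):
        times = [random.weibullvariate(scale, shape) for _ in range(n_spec)]
        observed = [(math.floor(t / interval) + 0.5) * interval for t in times]
        est.append(mrr_fit(observed)[0])
    mean = sum(est) / reps
    sd = (sum((e - mean) ** 2 for e in est) / (reps - 1)) ** 0.5
    return mean, sd

print(mc_uncertainty())
```

Sweeping `n_spec` and `interval` in such a loop is how the trade-off between specimen count and censoring resolution described in the abstract can be quantified.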

  16. Gumbel Weibull distribution function for Sahel precipitation ...

    African Journals Online (AJOL)

    user

    insecurity, migration, social conflicts, etc.). An efficient management of under and over ground water is a ... affects their incomes (Udual and Ini, 2012). Researches on modeling, prediction and forecasting ... Douentza in Mopti region, situated on the national road 15 highway linking Mopti to Gao and Kidal regions. This small ...

  17. Weibull-k Revisited: “Tall” Profiles and Height Variation of Wind Statistics

    DEFF Research Database (Denmark)

    Kelly, Mark C.; Troen, Ib; Ejsing Jørgensen, Hans

    2014-01-01

The Weibull distribution is commonly used to describe climatological wind-speed distributions in the atmospheric boundary layer. While vertical profiles of mean wind speed in the atmospheric boundary layer have received significant attention, the variation of the shape of the wind distribution with height is less understood.

  18. Calculation of the ceramics Weibull parameters

    Czech Academy of Sciences Publication Activity Database

    Fuis, Vladimír; Návrat, Tomáš

    2011-01-01

    Roč. 58, - (2011), s. 642-647 ISSN 2010-376X. [International Conference on Bioinformatics and Biomedicine 2011. Bali, 26.10.2011-28.10.2011] Institutional research plan: CEZ:AV0Z20760514 Keywords : biomaterial parameters * Weibull statistics * ceramics Subject RIV: BO - Biophysics http://www.waset.org/journals/waset/v58/v58-132.pdf

  19. Brain responses strongly correlate with Weibull image statistics when processing natural images

    NARCIS (Netherlands)

    Scholte, H.S.; Ghebreab, S.; Waldorp, L.; Smeulders, A.W.M.; Lamme, V.A.F.

    2009-01-01

    The visual appearance of natural scenes is governed by a surprisingly simple hidden structure. The distributions of contrast values in natural images generally follow a Weibull distribution, with beta and gamma as free parameters. Beta and gamma seem to structure the space of natural images in an

  20. Weibull-k Revisited: "Tall" Profiles and Height Variation of Wind Statistics

    Science.gov (United States)

    Kelly, Mark; Troen, Ib; Jørgensen, Hans E.

    2014-07-01

The Weibull distribution is commonly used to describe climatological wind-speed distributions in the atmospheric boundary layer. While vertical profiles of mean wind speed in the atmospheric boundary layer have received significant attention, the variation of the shape of the wind distribution with height is less understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter (k), as well as mean wind speed. Towards the aim of improving predictions of the Weibull-k profile, we develop expressions for the profile of long-term variance of wind speed, including a method extending our probabilistic wind-profile theory; together these two profiles lead to a profile of the Weibull shape parameter. Further, an alternate model for the vertical profile of the Weibull shape parameter is made, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol, 1989, Vol. 47, 85-110), and connecting with a newly-corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen (European Wind Atlas, 1989, Risø National Laboratory, Roskilde). Comparing the models for Weibull-k profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion of the models' efficacy and applicability. The latter includes a comparative evaluation of Wieringa-type empirical models and perturbed-geostrophic forms with regard to surface-layer behaviour, as well as for heights where climatological wind-speed variability is not dominated by surface effects.

  1. Effects of Specimen Size on The Flexural Strength and Weibull Modulus of Nuclear Graphite

    International Nuclear Information System (INIS)

    Chi, Se Hwan; Kim, Dae In; Kim, Eung Seon; Hong, Sung Deok; Kim, Yong Wan

    2010-01-01

    Flexural strength and the Weibull modulus of porous graphite are key material data for the design, safety, and lifetime evaluation of VHTR graphite core components. For brittle materials like graphite, experiments show that the mean strength of a set of large specimens is smaller than the mean strength of a set of small specimens, since a major flaw is more likely to be found in a large specimen than in a small one. This size effect on strength is the most prominent and relevant consequence of the statistical behavior of the strength of brittle materials. Weibull was the first to develop a statistical theory of brittle fracture based on the weakest-link hypothesis. Using some empirical arguments necessary to make a simple and good fitting of his experimental data, he derived the Weibull distribution of the probability of failure, whose parameter, m, is used in probabilistic methods in materials science and structural mechanics, for example in the design and safety analysis of graphite core components in VHTRs. The purpose of the present study is to investigate the specimen size effects on the flexural strength and Weibull modulus of nuclear graphites of different coke particle sizes and different forming methods.
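The weakest-link size effect described above has a compact form for a two-parameter Weibull material: equal failure probability links the characteristic strengths of two specimen volumes through the Weibull modulus m. An illustrative sketch (not taken from the study):

```python
def scaled_strength(sigma1, v1, v2, m):
    """Weakest-link prediction of the characteristic strength at stressed
    volume v2, given the strength sigma1 measured at volume v1 and the
    Weibull modulus m: sigma2 = sigma1 * (v1 / v2) ** (1/m)."""
    return sigma1 * (v1 / v2) ** (1.0 / m)
```

For example, with m = 4 a sixteen-fold increase in stressed volume halves the characteristic strength, which is why low-m (high-scatter) graphites show the strongest size effects.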

  2. A study of the slope of cox proportional hazard and Weibull models ...

    African Journals Online (AJOL)

    However, when the distributional assumptions for the Weibull model are not satisfied, the Cox proportional hazard model is used instead; although semi-parametric, it possesses a similar characteristic of covariate inclusion. The main objective of this research work is to determine whether the Cox proportional hazard model depends ...

  3. Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data

    Science.gov (United States)

    Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

    2012-01-01

    A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.

  4. A general Bayes weibull inference model for accelerated life testing

    International Nuclear Information System (INIS)

    Dorp, J. Rene van; Mazzuchi, Thomas A.

    2005-01-01

    This article presents the development of a general Bayes inference model for accelerated life testing. The failure times at a constant stress level are assumed to belong to a Weibull distribution, but the specification of strict adherence to a parametric time-transformation function is not required. Rather, prior information is used to indirectly define a multivariate prior distribution for the scale parameters at the various stress levels and the common shape parameter. Using the approach, Bayes point estimates as well as probability statements for use-stress (and accelerated) life parameters may be inferred from a host of testing scenarios. The inference procedure accommodates both the interval data sampling strategy and type I censored sampling strategy for the collection of ALT test data. The inference procedure uses the well-known MCMC (Markov Chain Monte Carlo) methods to derive posterior approximations. The approach is illustrated with an example

  5. Statistical Analysis of Wind Power Density Based on the Weibull and Rayleigh Models of Selected Site in Malaysia

    Directory of Open Access Journals (Sweden)

    Aliashim Albani

    2014-02-01

    The demand for electricity in Malaysia is growing in tandem with its Gross Domestic Product (GDP) growth. Malaysia is going to need even more energy as it strives to grow towards a high-income economy. Malaysia has taken steps towards exploring renewable energy (RE), including wind energy, as an alternative source for generating electricity. In the present study, the wind energy potential of the site is statistically analyzed based on 1-year measured hourly time-series wind speed data. Wind data were obtained from the Malaysian Meteorological Department (MMD) weather stations at nine selected sites in Malaysia. The data were processed using MATLAB programming to determine and generate the Weibull and Rayleigh distribution functions. Both Weibull and Rayleigh models were fitted and compared to the field data probability distributions of the year 2011. The analysis showed that the Weibull distribution fits the field data better than the Rayleigh distribution for the whole year 2011. The wind power density of every site has been studied based on the Weibull and Rayleigh functions. The Weibull distribution shows a good approximation for estimation of wind power density in Malaysia.
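Under a Weibull wind-speed model the mean wind power density has a closed form, and the Rayleigh case follows by fixing k = 2; a minimal sketch (standard sea-level air density 1.225 kg/m³ is an assumption):

```python
import math

def weibull_power_density(c, k, rho=1.225):
    """Mean wind power density (W/m^2) of a Weibull(c, k) wind speed:
    WPD = 0.5 * rho * E[v^3] = 0.5 * rho * c^3 * Gamma(1 + 3/k)."""
    return 0.5 * rho * c**3 * math.gamma(1 + 3 / k)

def rayleigh_power_density(mean_speed, rho=1.225):
    """Rayleigh special case (k = 2), parameterised by the mean speed
    through c = mean / Gamma(1.5)."""
    c = mean_speed / math.gamma(1.5)
    return weibull_power_density(c, 2.0, rho)
```

Comparing the two functions at a site's fitted parameters reproduces the kind of Weibull-versus-Rayleigh WPD comparison the abstract describes.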

  6. Weibull statistical analysis of tensile strength of vascular bundle in inner layer of moso bamboo culm in molecular parasitology and vector biology.

    Science.gov (United States)

    Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen

    2014-07-01

    Bamboo is a composite material with radial gradient variation, but the vascular bundles in the inner layer are evenly distributed. The objective is to determine the regular size pattern and a Weibull statistical analysis of the vascular bundle tensile strength in the inner layer of Moso bamboo. The size and shape of vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm². A statistical evaluation of the tensile strength of the vascular bundles was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121 and an accurate reliability assessment of the vascular bundle is determined.
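A Weibull modulus like the m = 6.1121 reported here is commonly estimated by median-rank regression on a double-log plot; the sketch below uses that standard technique (the paper's exact estimation procedure is not stated, so this is an assumption):

```python
import math

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m by linear regression of
    y = ln(-ln(1 - F_i)) on x = ln(sigma_i), using median-rank
    plotting positions F_i = (i - 0.3) / (n + 0.4). Slope = m."""
    n = len(strengths)
    xs, ys = [], []
    for i, s in enumerate(sorted(strengths), start=1):
        f = (i - 0.3) / (n + 0.4)
        xs.append(math.log(s))
        ys.append(math.log(-math.log(1.0 - f)))
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    return (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))
```

Higher m means a narrower strength distribution, i.e. a more uniform set of vascular bundles.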

  7. WEISTRABA - a code for the numerical analysis of Weibull stress parameters from ABAQUS finite element stress analysis. Procedural background and code description

    International Nuclear Information System (INIS)

    Riesch-Oppermann, H.; Brueckner-Foit, A.

    1998-08-01

    Numerical analyses are used within the framework of the local approach to determine the critical stress at cleavage fracture. A set of ABAQUS post-processing modules serving this purpose is described in this report. The modules perform the several steps necessary to obtain the parameters of the distribution of the critical Weibull stress at cleavage fracture. The main steps are determination of the first principal stress envelope at the experimentally obtained load levels at fracture, calculation of the Weibull stresses at fracture, and an iterative maximum likelihood procedure for the distribution parameters of the Weibull stress. Some remarks on limits/modifications of the model in the case of other mechanisms are also included in the report. (orig.)

  9. Combined EEG/MEG can outperform single modality EEG or MEG source reconstruction in presurgical epilepsy diagnosis.

    Science.gov (United States)

    Aydin, Ümit; Vorwerk, Johannes; Dümpelmann, Matthias; Küpper, Philipp; Kugel, Harald; Heers, Marcel; Wellmer, Jörg; Kellinghaus, Christoph; Haueisen, Jens; Rampp, Stefan; Stefan, Hermann; Wolters, Carsten H

    2015-01-01

    We investigated two important means for improving source reconstruction in presurgical epilepsy diagnosis. The first investigation is about the optimal choice of the number of epileptic spikes in averaging to (1) sufficiently reduce the noise bias for an accurate determination of the center of gravity of the epileptic activity and (2) still get an estimation of the extent of the irritative zone. The second study focuses on the differences in single modality EEG (80-electrodes) or MEG (275-gradiometers) and especially on the benefits of combined EEG/MEG (EMEG) source analysis. Both investigations were validated with simultaneous stereo-EEG (sEEG) (167-contacts) and low-density EEG (ldEEG) (21-electrodes). To account for the different sensitivity profiles of EEG and MEG, we constructed a six-compartment finite element head model with anisotropic white matter conductivity, and calibrated the skull conductivity via somatosensory evoked responses. Our results show that, unlike single modality EEG or MEG, combined EMEG uses the complementary information of both modalities and thereby allows accurate source reconstructions also at early instants in time (epileptic spike onset), i.e., time points with low SNR, which are not yet subject to propagation and thus supposed to be closer to the origin of the epileptic activity. EMEG is furthermore able to reveal the propagation pathway at later time points in agreement with sEEG, while EEG or MEG alone reconstructed only parts of it. Subaveraging provides important and accurate information about both the center of gravity and the extent of the epileptogenic tissue that neither single nor grand-averaged spike localizations can supply.

  10. On Weibull's Spectrum of Nonrelativistic Energetic Particles at IP Shocks: Observations and Theoretical Interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Pallocchia, G.; Laurenza, M.; Consolini, G. [INAF—Istituto di Astrofisica e Planetologia Spaziali, Via Fosso del Cavaliere 100, I-00133 Roma (Italy)

    2017-03-10

    Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.

  11. Percentile-based Weibull diameter distribution model for Pinus ...

    African Journals Online (AJOL)

    Using a site index equation and stem volume model developed for Pinus kesiya in the Philippines, a yield prediction system was created to predict the volume per ha (VPH) for each diameter class and, subsequently, the total volume of a stand. To evaluate the yield prediction system, the predicted mean VPH for each ...

  12. An exponential distribution

    International Nuclear Information System (INIS)

    Anon

    2009-01-01

    In this presentation the author deals with the probabilistic evaluation of product life using the example of the exponential distribution. The exponential distribution is a special one-parameter case of the Weibull distribution.
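The statement that the exponential distribution is the k = 1 special case of the Weibull can be checked directly from the two CDFs; an illustrative sketch:

```python
import math

def weibull_cdf(t, c, k):
    """Weibull CDF: F(t) = 1 - exp(-(t/c)^k)."""
    return 1.0 - math.exp(-((t / c) ** k))

def exponential_cdf(t, lam):
    """Exponential CDF: F(t) = 1 - exp(-lam * t)."""
    return 1.0 - math.exp(-lam * t)
```

With k = 1 and c = 1/lam the two functions coincide for every t, which is the sense in which the exponential model is one-parametric.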

  13. A comparison of Weibull and βIc analyses of transition range data

    International Nuclear Information System (INIS)

    McCabe, D.E.

    1991-01-01

    Specimen size effects on KJc data scatter in the transition range of fracture toughness have been explained by extremal (weakest-link) statistics. In this investigation, compact specimens of A 533 grade B steel were tested in sizes ranging from 1/2TC(T) to 4TC(T) with sufficient replication to obtain good three-parameter Weibull characterization of the data distributions. The optimum fitting parameters for an assumed Weibull slope of 4 were calculated. Extremal statistics analysis was applied to the 1/2TC(T) data to predict median KJc values for 1TC(T), 2TC(T), and 4TC(T) specimens. The distributions from experimentally developed 1TC(T), 2TC(T), and 4TC(T) data tended to confirm the predictions. However, the extremal prediction model does not work well at lower-shelf toughness. At -150 °C the extremal model predicts a specimen size effect where in reality there is no size effect.
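With the Weibull slope fixed at 4, the weakest-link size adjustment of median KJc between two specimen thicknesses has a one-line algebraic form. A hedged sketch (the threshold value K_min = 20 MPa·√m follows the usual master-curve convention and is an assumption here, not taken from the report):

```python
def predict_median_kjc(k1_med, b1, b2, k_min=20.0):
    """Weakest-link thickness adjustment of median K_Jc for an assumed
    Weibull slope of 4: K2 = K_min + (K1 - K_min) * (B1 / B2) ** (1/4).
    b1, b2 are the specimen thicknesses; k_min is an assumed threshold."""
    return k_min + (k1_med - k_min) * (b1 / b2) ** 0.25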

  14. Weibull Analysis of the Behavior on Tensile Strength of Hemp Fibers for Different Intervals of Fiber Diameters

    Science.gov (United States)

    Rohen, Lázaro A.; Margem, Frederico M.; Neves, Anna C. C.; Gomes, Maycon A.; Monteiro, Sérgio N.; Vieira, Carlos Maurício F.; de Castro, Rafael G.; Borges, Gustavo X.

    Economic and environmental benefits are motivating studies on natural fibers, especially lignocellulosic fibers extracted from plants, as substitutes for synthetic fibers such as glass fiber for reinforcement in polymer matrices. In contrast to synthetic fibers, natural fibers have the disadvantage of being heterogeneous in their dimensions, especially the diameter. Little is known about the dimensional characteristics of hemp fiber. The aim of the present work was to statistically characterize the distribution of the diameter of hemp fibers. Based on this characterization, diameter intervals were set and the dependence of the tensile strength of these fibers on the corresponding diameter was analyzed by the Weibull method. The diameter was measured with precision using a profile projector. Tensile tests were conducted on each fiber to obtain its mechanical strength. The results, interpreted by Weibull statistics, showed a correlation between the strength of the fibers and their diameter.

  15. Prediction and reconstruction of future and missing unobservable modified Weibull lifetime based on generalized order statistics

    Directory of Open Access Journals (Sweden)

    Amany E. Aly

    2016-04-01

    When a system consists of independent components of the same type, appropriate actions may be taken as soon as a portion of them has failed. It is, therefore, important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples for real data sets are analyzed.

  16. On the Performance Analysis of Digital Communications over Weibull-Gamma Channels

    KAUST Repository

    Ansari, Imran Shafique

    2015-05-01

    In this work, the performance analysis of digital communications over a composite Weibull-Gamma (WG) multipath-fading and shadowing channel is presented, wherein the WG distribution is appropriate for modeling fading environments when multipath is superimposed on shadowing. More specifically, in this work, exact closed-form expressions are derived for the probability density function, the cumulative distribution function, the moment generating function, and the moments of a composite WG channel. Capitalizing on these results, new exact closed-form expressions are offered for the outage probability, the higher-order amount of fading, the average error rate for binary and M-ary modulation schemes, and the ergodic capacity under various types of transmission policies, mostly in terms of Meijer's G functions. These new analytical results were also verified via computer-based Monte-Carlo simulation results. © 2015 IEEE.
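Sampling from a composite Weibull-Gamma channel can be sketched by Monte Carlo, conditioning a Weibull draw on a Gamma-distributed local mean; the parameter names and the exact compounding convention below are illustrative assumptions, not the paper's notation:

```python
import math
import random

def sample_weibull_gamma(n, k_w, k_g, theta, rng):
    """Monte Carlo draws from a composite Weibull-Gamma model: the Weibull
    scale is itself Gamma(k_g, theta)-distributed (shadowing superimposed
    on multipath), and the Weibull stage uses inverse-CDF sampling."""
    out = []
    for _ in range(n):
        omega = rng.gammavariate(k_g, theta)  # shadowed local mean
        out.append(omega * (-math.log(1.0 - rng.random())) ** (1.0 / k_w))
    return out
```

Empirical histograms of such draws are the kind of simulation evidence used to verify the closed-form PDF and CDF expressions.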

  17. Weibull Parameters Estimation Based on Physics of Failure Model

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and the Miner's rule. A threshold model is used... for degradation modeling and failure criteria determination. The time dependent accumulated damage is assumed linearly proportional to the time dependent degradation level. It is observed that the deterministic accumulated damage at the level of unity closely estimates the characteristic fatigue life of Weibull...

  18. Influence of the Testing Gage Length on the Strength, Young's Modulus and Weibull Modulus of Carbon Fibres and Glass Fibres

    Directory of Open Access Journals (Sweden)

    Luiz Claudio Pardini

    2002-10-01

    Carbon fibres and glass fibres are reinforcements for advanced composites, and the fibre strength is the most influential factor on the strength of the composites. They are essentially brittle and fail with very little reduction in cross section. Composites made with these fibres are characterized by a high strength/density ratio, and their properties are intrinsically related to their microstructure, i.e., amount and orientation of the fibres, surface treatment, among other factors. Processing parameters have an important role in the fibre mechanical behaviour (strength and modulus). Cracks, voids and impurities in the case of glass fibres, and fibrillar misalignments in the case of carbon fibres, are created during processing. Such inhomogeneities give rise to an appreciable scatter in properties. The most used statistical tool that deals with this characteristic variability in properties is the Weibull distribution. The present work investigates the influence of the testing gage length on the strength, Young's modulus and Weibull modulus of carbon fibres and glass fibres. The Young's modulus is calculated by two methods: (i) ASTM D 3379M, and (ii) the interaction between testing equipment and specimen. The first method resulted in a Young's modulus of 183 GPa for carbon fibre and 76 GPa for glass fibre. The second method gave a Young's modulus of 250 GPa for carbon fibre and 50 GPa for glass fibre. These differences revealed how the interaction between specimen and testing machine can interfere in the Young's modulus calculations. The Weibull modulus can be a tool to evaluate a fibre's homogeneity in terms of properties, and it is a good quality control parameter during processing. In the range of specimen gage lengths tested, the Weibull modulus for carbon fibre is ~3.30 and for glass fibre is ~5.65, which indicates that for the batch of fibres tested, the glass fibre is more uniform in properties.

  19. Determination of Weibull Analysis of the Hypereutectic Silumins Reliability in Failure Time Respect

    Directory of Open Access Journals (Sweden)

    J. Szymszal

    2009-07-01

    The results of a dynamic evaluation of the reliability of hypereutectic AlSi17Cu3NiMg silumin under the effect of symmetrical cyclic tensile-compressive stresses are presented. Studies were carried out on a normal-running fatigue testing machine, a mechanically driven resonant pulsator. For the needs of quantitative reliability evaluation and time-to-failure evaluation, the procedures used in survival analysis, adapted to the analysis of failure-free operation with two- and three-parameter Weibull distributions, were applied. The values of the parameters were estimated using the maximum likelihood method and a rank-based non-parametric method. The results of the evaluation of the reliability and damage intensity are an important element in the determination of casting quality and enable a reliable estimation of the operational suitability time.
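The two- and three-parameter Weibull reliability and damage-intensity (hazard) functions used in such evaluations can be sketched directly from the standard formulas (an illustration, not the authors' code; the third parameter gamma is the failure-free time, and gamma = 0 recovers the two-parameter form):

```python
import math

def reliability_3p_weibull(t, c, k, gamma=0.0):
    """Survival function of the three-parameter Weibull:
    R(t) = exp(-((t - gamma)/c)^k) for t > gamma, else 1."""
    if t <= gamma:
        return 1.0
    return math.exp(-(((t - gamma) / c) ** k))

def failure_intensity(t, c, k, gamma=0.0):
    """Hazard (damage intensity): h(t) = (k/c) * ((t - gamma)/c)^(k-1)
    for t > gamma, else 0."""
    if t <= gamma:
        return 0.0
    return (k / c) * ((t - gamma) / c) ** (k - 1)
```

At t = gamma + c the reliability is exp(-1) regardless of k, which is why c is called the characteristic life.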

  20. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    Science.gov (United States)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating-rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials, and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
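Monte Carlo selection of small test populations from a Weibull parent, as described above, can be sketched with inverse-CDF sampling; the parameter values below are illustrative, not the paper's:

```python
import math
import random

def sample_l10(c, k, n, rng):
    """Draw n Weibull(c, k) lives by inverse-CDF sampling and return the
    empirical L10 life (the order statistic nearest the 10th percentile;
    for n = 10 this is simply the shortest life)."""
    lives = sorted(c * (-math.log(1.0 - rng.random())) ** (1.0 / k)
                   for _ in range(n))
    return lives[max(0, int(0.10 * n) - 1)]

rng = random.Random(42)
# Spread of the empirical L10 over 200 repeated small (10-specimen) trials:
trials = [sample_l10(1000.0, 1.5, 10, rng) for _ in range(200)]
spread = max(trials) / min(trials)
```

The large spread between repeated 10-specimen trials illustrates the paper's point that small-sample L10 variability can exceed real material differences.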

  1. Determining the parameters of Weibull function to estimate the wind power potential in conditions of limited source meteorological data

    Science.gov (United States)

    Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.

    2017-04-01

    We studied the information basis for the assessment of wind power potential on the territory of Russia. We described a methodology to determine the parameters of the Weibull function, which reflects the probability density of wind flow speeds at a defined basic height above the surface of the earth, using the available data on the average speed at this height and its repetition by gradations. The application of the least-squares method for determining these parameters, unlike the use of graphical methods, allows a statistical assessment of the results of approximation of empirical histograms by the Weibull formula. On the basis of a computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes with height, the range of parameter variation of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of changes on the shape of the speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified a methodology for determining the speed parameters of the Weibull function at other heights using the parameter computations for this function at a basic height, which is known or defined by the average speed of wind flow, or the roughness coefficient of the geological substrate. We give examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia under conditions of deficient source meteorological data. The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of repeatability of wind flow speeds, given the wide assortment of wind turbines with different ranges of wind-wheel axis heights and various performance characteristics on the global market; as a result, this methodology can become a powerful tool for
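The least-squares determination of Weibull parameters from speed gradations can be sketched by regressing the double-log transform of the cumulative frequencies on log speed (an illustrative sketch of the general technique, not the authors' code):

```python
import math

def fit_weibull_from_bins(bin_edges, cum_freq):
    """Least-squares Weibull (k, c) from binned wind-speed data: regress
    y = ln(-ln(1 - F)) on x = ln(v) over bin upper edges v with cumulative
    frequencies F. Slope = k; intercept = -k * ln(c)."""
    pts = [(math.log(v), math.log(-math.log(1.0 - f)))
           for v, f in zip(bin_edges, cum_freq) if 0.0 < f < 1.0]
    n = len(pts)
    xbar = sum(x for x, _ in pts) / n
    ybar = sum(y for _, y in pts) / n
    sxx = sum((x - xbar) ** 2 for x, _ in pts)
    sxy = sum((x - xbar) * (y - ybar) for x, y in pts)
    k = sxy / sxx
    c = math.exp(xbar - ybar / k)  # from intercept: y = k*x - k*ln(c)
    return k, c
```

Because the fit is an ordinary regression, residuals give the statistical quality-of-approximation assessment that graphical methods lack.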

  2. Weibull aging models for the single protective channel unavailability analysis by the device of stages

    International Nuclear Information System (INIS)

    Nunes, M.E.C.; Noriega, H.C.; Melo, P.F.F.

    1997-01-01

    Among the features to take into account in the unavailability analysis of protective channels, one plays a dominant role: considering equipment aging. In this sense, the exponential failure model is not adequate, since some transition rates are no longer constant. As a consequence, Markovian models cannot be used. As an alternative, one may use the device of stages, which allows a non-Markovian model to be transformed into an equivalent Markovian one by inserting a set of fictitious states, called stages. For a given time-dependent transition rate, its failure density is analysed to find the best combination of exponential distributions, and then the moments of the original distribution and those of the combination are matched to estimate the necessary parameters. In this paper, the aging of the protective channel is supposed to follow Weibull distributions. Typical means and variances of the times to failure are considered and combinations of stages are checked. Initial-condition features are discussed in connection with the fictitious states and used to check the validity of the developed models. Alternative solutions by discretization of the failure rates are generated. The results obtained agree quite well. (author). 7 refs., 6 figs
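The device of stages can be illustrated in its simplest form: matching the first two moments of a Weibull time-to-failure with an Erlang chain of identical exponential stages. This is a minimal sketch (practical applications use richer stage combinations such as Coxian ones, and the Erlang form only applies for k > 1, where the coefficient of variation is below 1):

```python
import math

def erlang_stages_for_weibull(c, k):
    """Device-of-stages approximation: replace a Weibull(c, k) time to
    failure (k > 1) by an Erlang chain of n exponential stages, matching
    mean and variance: n ~ mu^2 / var, stage rate = n / mu."""
    mu = c * math.gamma(1 + 1 / k)
    var = c**2 * (math.gamma(1 + 2 / k) - math.gamma(1 + 1 / k) ** 2)
    n = max(1, round(mu * mu / var))
    rate = n / mu
    return n, rate
```

The resulting n-stage chain is Markovian, so the usual state-transition machinery for channel unavailability applies to the expanded state space.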

  3. Maximum Likelihood Estimates of Parameters in Various Types of Distribution Fitted to Important Data Cases.

    OpenAIRE

    HIROSE,Hideo

    1998-01-01

    Types of the distribution: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...

  4. The effect of mis-specification on mean and selection between the Weibull and lognormal models

    Science.gov (United States)

    Jia, Xiang; Nadarajah, Saralees; Guo, Bo

    2018-02-01

    The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model could be selected when the true model is Weibull, or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on the mean. The effect on the lognormal mean is first considered if the lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of the lognormal mean are obtained based on the lognormal and Weibull models. Then, the impact is evaluated by computing the ratio of biases and the ratio of mean squared errors (MSEs) between the MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on the Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence can be ignored if some special conditions hold. Finally, a model selection method is proposed by comparing the ratios concerning biases and MSEs. We also present published data to illustrate the study in this paper.

  5. Evaluation of wind power production prospective and Weibull parameter estimation methods for Babaurband, Sindh Pakistan

    International Nuclear Information System (INIS)

    Khahro, Shahnawaz Farhan; Tabbassum, Kavita; Soomro, Amir Mahmood; Dong, Lei; Liao, Xiaozhong

    2014-01-01

    Highlights: • Weibull scale and shape parameters are calculated using 5 numerical methods. • Yearly mean wind speed is 6.712 m/s at 80 m height, with the highest in May at 9.595 m/s. • Yearly mean WPD is 310 W/m² and available energy density is 2716 kWh/m² at 80 m height. • Probability of higher wind speeds is greater in spring and summer than in autumn and winter. • Estimated cost per kWh of electricity from wind is calculated as 0.0263 US$/kWh. - Abstract: Pakistan is currently experiencing an acute shortage of energy and urgently needs new sources of affordable energy that could alleviate the misery of the energy-starved masses. At present the government is increasing not only the conventional energy sources like hydel and thermal but also focusing on the immense potential of renewable energy sources like solar, wind, biogas, waste-to-energy etc. The recent economic crisis worldwide, global warming and climate change have also emphasized the need for utilizing economically feasible energy sources having the lowest carbon emissions. Wind energy, with its sustainability and low environmental impact, is highly prominent. The aim of this paper is to explore the wind power production prospects of one of the sites in the south region of Pakistan. It is worth mentioning here that this type of detailed analysis is hardly done for any location in Pakistan. Wind power densities and frequency distributions of wind speed at four different altitudes, along with estimated wind power expected to be generated through commercial wind turbines, are calculated. Analysis and comparison of 5 numerical methods is presented in this paper to determine the Weibull scale and shape parameters for the available wind data. The yearly mean wind speed of the considered site is 6.712 m/s and it has a power density of 310 W/m² at 80 m height, with high power density from April to August (highest in May, with wind speed 9.595 m/s and power density 732 W/m²). Economic evaluation, to exemplify feasibility
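One of the common numerical estimators compared in such studies is the empirical (standard-deviation) method; a minimal sketch of that single method (the choice is illustrative, as the paper compares five):

```python
import math

def weibull_empirical_method(speeds):
    """Empirical (standard-deviation) method for Weibull parameters from
    a wind-speed sample: k = (sigma / vbar) ** -1.086,
    c = vbar / Gamma(1 + 1/k)."""
    n = len(speeds)
    vbar = sum(speeds) / n
    sigma = math.sqrt(sum((v - vbar) ** 2 for v in speeds) / (n - 1))
    k = (sigma / vbar) ** -1.086
    c = vbar / math.gamma(1 + 1 / k)
    return k, c
```

Once k and c are in hand, the mean power density follows from 0.5 * rho * c³ * Gamma(1 + 3/k), matching the W/m² figures quoted in the highlights.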

  6. Distribution of crushing strength of tablets

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2002-01-01

    The distribution of a given set of data is important since most parametric statistical tests are based on the assumption that the studied data are normally distributed. In analysis of fracture mechanics the Weibull distribution is widely used and the derived Weibull modulus is interpreted as a mate...... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated....

  7. Probabilistic physics-of-failure models for component reliabilities using Monte Carlo simulation and Weibull analysis: a parametric study

    International Nuclear Information System (INIS)

    Hall, P.L.; Strutt, J.E.

    2003-01-01

    In reliability engineering, component failures are generally classified in one of three ways: (1) early life failures; (2) failures having random onset times; and (3) late life or 'wear out' failures. When the time-distribution of failures of a population of components is analysed in terms of a Weibull distribution, these failure types may be associated with shape parameters β having values β < 1, β ≈ 1, and β > 1 respectively. Early life failures are frequently attributed to poor design (e.g. poor materials selection) or problems associated with manufacturing or assembly processes. We describe a methodology for the implementation of physics-of-failure models of component lifetimes in the presence of parameter and model uncertainties. This treats uncertain parameters as random variables described by some appropriate statistical distribution, which may be sampled using Monte Carlo methods. The number of simulations required depends upon the desired accuracy of the predicted lifetime. Provided that the number of sampled variables is relatively small, an accuracy of 1-2% can be obtained using typically 1000 simulations. The resulting collection of times-to-failure is then sorted into ascending order and fitted to a Weibull distribution to obtain a shape factor β and a characteristic life-time η. Examples are given of the results obtained using three different models: (1) the Eyring-Peck (EP) model for corrosion of printed circuit boards; (2) a power-law corrosion growth (PCG) model which represents the progressive deterioration of oil and gas pipelines; and (3) a random shock-loading model of mechanical failure. It is shown that for any specific model the values of the Weibull shape parameters obtained may be strongly dependent on the degree of uncertainty of the underlying input parameters. Both the EP and PCG models can yield a wide range of values of β, from β>1, characteristic of wear-out behaviour, to β<1, characteristic of early-life failure, depending on the degree of
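The workflow this abstract describes — sample uncertain inputs, propagate them through a physics-of-failure model, and fit the resulting lifetimes to a Weibull distribution — can be sketched as follows. The corrosion model and its parameter distributions here are hypothetical stand-ins, not the paper's EP or PCG models:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(42)
n_sim = 1000   # per the abstract, ~1000 simulations give 1-2% accuracy

# Hypothetical physics-of-failure model (a stand-in, not the paper's):
# failure occurs when linear corrosion r*t consumes an uncertain critical
# wall thickness w_c, so the time to failure is w_c / r.
r = rng.lognormal(mean=np.log(0.1), sigma=0.3, size=n_sim)   # rate, mm/year
w_c = rng.normal(loc=5.0, scale=0.25, size=n_sim)            # thickness, mm
ttf = np.sort(w_c / r)                                       # lifetimes, years

# Fit the sorted times-to-failure to a 2-parameter Weibull (location fixed
# at 0) to obtain the shape factor beta and characteristic life eta.
beta, _, eta = weibull_min.fit(ttf, floc=0)
```

Rerunning with wider input distributions shows the abstract's point: the fitted β depends strongly on the degree of uncertainty in the sampled parameters.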

  8. Evaluation of Electrical Tree Degradation in Cross-Linked Polyethylene Cable Using Weibull Process of Propagation Time

    Directory of Open Access Journals (Sweden)

    Donguk Jang

    2017-11-01

    Full Text Available The main purpose of this paper is to evaluate electrical tree degradation in cross-linked polyethylene (XLPE) cable insulation for three different models. In order to show the distribution characteristics using phase resolved partial discharge (PD) analysis, we acquired data using a PD detecting system. These acquired data are presented as four 2D distributions: phase angle-average discharge distribution, pulse magnitude-pulse number distribution, phase angle-pulse number distribution, and phase angle-maximum discharge derived from the distribution of PD. From the analysis of these distributions, each of the tree models is shown to hold its own unique characteristics, and these results were then applied as baseline characteristics. In order to evaluate the progression of an electrical tree, we proposed methods using parameters obtained by fitting a Weibull distribution to the time of tree propagation. We measured the time of tree propagation for 16 specimens of each artificial tree model at the initiation stage, middle stage, and final stage respectively; using these breakdown data, we estimated the shape parameter, scale parameter, and mean time to failure. It is possible to analyze the difference in lifetime between the initial stage, the middle stage, and the final stage, and these results could be used to predict the lifetime of an XLPE cable.
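The mean time to failure quoted alongside the shape and scale parameters follows from the standard Weibull relation MTTF = η·Γ(1 + 1/β); a minimal sketch with illustrative values (the paper's per-stage estimates are not reproduced in this record):

```python
from math import gamma, isclose

def weibull_mttf(beta, eta):
    """Mean time to failure of a two-parameter Weibull(shape=beta, scale=eta)."""
    return eta * gamma(1.0 + 1.0 / beta)

# beta = 1 reduces to the exponential distribution, where MTTF equals eta.
assert isclose(weibull_mttf(1.0, 120.0), 120.0)
```

For β > 1 the MTTF lies slightly below η, e.g. weibull_mttf(2.0, 120.0) ≈ 106.3.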

  9. Weibull strength variations between room temperature and high temperature Ni-3YSZ half-cells

    DEFF Research Database (Denmark)

    Curran, Declan; Frandsen, Henrik Lund; Hendriksen, Peter Vang

    2013-01-01

    and 800°C in a reducing atmosphere. The strength of an as sintered half-cell was also measured at room temperature for comparison. Weibull analysis was performed on large sample sets of 30 for statistical viability. The Weibull strength and elastic modulus of the room temperature tested reduced samples...... show a decrease of approximately 33% and 51% respectively, when compared to the oxidized samples tested at room temperature. When tested at elevated temperatures both Weibull strength and elastic modulus decrease further when compared to the room temperature reduced samples. However these further...... efficiency, increased degradation and/or the complete termination of a functioning stack. This paper investigates the effects of temperature on the mechanical strength of 3% yttria-stabilised zirconia half-cells. Strength was measured using a four-point bend method at room temperature and at 600°C, 700°C...

  10. Estimating the creep strain to failure of PP at different load levels based on short term tests and Weibull characterization

    Directory of Open Access Journals (Sweden)

    L. M. Vas

    2012-12-01

    Full Text Available The short and long term creep behavior is one of the most important properties of polymers used for engineering applications. In order to study this kind of behavior of PP, tensile and short term creep measurements were performed and analyzed using a long term creep estimation method based on short term tensile and creep tests performed at room temperature, viscoelastic behavior, and variable transformations. Applying Weibull distribution based approximations to the measured curves, predictions for the creep strain to failure as a function of the creep load were determined, and the parameters were found by fitting the measurements. The upper, mean, and lower estimations, as well as the confidence interval for the means, give designers a possibility for calculations at arbitrary creep load levels.

  11. Analysis of the fuzzy greatest of CFAR detector in homogeneous and non-homogeneous Weibull clutter

    Science.gov (United States)

    Baadeche, Mohamed; Soltani, Faouzi

    2015-12-01

    In this paper, we analyze the distributed FGO-CFAR detector in homogeneous and non-homogeneous Weibull clutter under the assumption of a known shape parameter. The non-homogeneity is modeled by the presence of a clutter edge in the reference window. We derive the membership function which maps the observations to the false alarm space and compute the threshold at the data fusion center. Applying the 'Maximum', 'Minimum', 'Algebraic Sum' and 'Algebraic Product' fuzzy rules for the L detectors considered at the data fusion center, the obtained results show that the best performance is achieved by the 'Algebraic Product' fuzzy rule followed by the 'Minimum' one, and in these two cases the probability of detection increases significantly with the number of detectors.

  12. Weibull statistics effective area and volume in the ball-on-ring testing method

    DEFF Research Database (Denmark)

    Frandsen, Henrik Lund

    2014-01-01

    to geometries relevant for the application of the material, the effective area or volume for the test specimen must be evaluated. In this work analytical expressions for the effective area and volume of the ball-on-ring test specimen are derived. In the derivation the multiaxial stress field has been accounted...... for by use of the Weibull theory, and the multinomial theorem has been used to handle the integration of multiple terms raised to the power of the Weibull modulus. The analytical solution is verified with a high number of finite element models for various geometric parameters. The finite element model...

  13. Efficient Weibull channel model for salinity induced turbulent underwater wireless optical communications

    KAUST Repository

    Oubei, Hassan M.

    2017-12-13

    Recent advances in underwater wireless optical communications necessitate a better understanding of the underwater channel. We propose the Weibull model to characterize the fading of salinity induced turbulent underwater wireless optical channels. The model shows an excellent agreement with the measured data under all channel conditions.

  14. A Study of the Slope of Cox Proportional Hazard and Weibull Models

    African Journals Online (AJOL)

    Adejumo & Ahmadu

    Keywords: Cox Proportional Hazard Model, Weibull Model,. Slope, Shape parameters, Scale parameter, Survival time. INTRODUCTION. Survival analysis studies the amount of time until a particular event, such as death, occurrence of a disease, marriage, or divorce, occurs. However, the same techniques can ...

  15. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    Science.gov (United States)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
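For the time-terminated case, the maximum-likelihood estimates of the Weibull-process (power-law intensity) parameters have a simple closed form; the failure times below are hypothetical, not the shuttle engine data:

```python
import math

def weibull_process_mle(times, T):
    """MLE for the inhomogeneous Poisson process with Weibull intensity
    u(t) = lam * beta * t**(beta - 1), time-terminated at total test time T."""
    n = len(times)
    beta = n / sum(math.log(T / t) for t in times)
    lam = n / T ** beta
    mtbf = 1.0 / (lam * beta * T ** (beta - 1))   # instantaneous MTBF at T
    return beta, lam, mtbf

# Hypothetical cumulative failure times (hours) from a development program.
fails = [12.0, 45.0, 110.0, 240.0, 480.0, 900.0]
beta, lam, mtbf = weibull_process_mle(fails, T=1000.0)
# beta < 1 indicates a decreasing failure intensity, i.e. reliability growth.
```

With these numbers β ≈ 0.5, so the inter-failure times are lengthening and the current MTBF is well above the naive T/n average.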

  16. Comparison of Weibull strength parameters from flexure and spin tests of brittle materials

    Science.gov (United States)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1991-01-01

    Fracture data from five series of four point bend tests of beam and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume flaw and surface flaw analyses of four point bend data. Volume flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface flaw analysis. The Weibull moduli and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.

  17. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    Science.gov (United States)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2013-01-01

    Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and -4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L50, lives.

  18. Probability Distribution Function of the Upper Equatorial Pacific Current Speeds

    National Research Council Canada - National Science Library

    Chu, Peter C

    2005-01-01

    ...), constructed from hourly ADCP data (1990-2007) at six stations for the Tropical Atmosphere Ocean project satisfies the two-parameter Weibull distribution reasonably well with different characteristics between El Nino and La Nina events...

  19. Some challenges of wind modelling for modern wind turbines: The Weibull distribution

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekatarina; Floors, Rogier

    2012-01-01

    Wind power assessments, as well as forecast of wind energy production, are key issues in wind energy and grid related studies. However the hub height of today’s wind turbines is well above the surface layer. Wind profiles studies based on mast data show that the wind profile above the surface layer...

  20. Effect of Endocrown Restorations with Different CAD/CAM Materials: 3D Finite Element and Weibull Analyses

    Directory of Open Access Journals (Sweden)

    Laden Gulec

    2017-01-01

    Full Text Available The aim of this study was to evaluate the effects of two endocrown designs and computer aided design/manufacturing (CAD/CAM) materials on stress distribution and failure probability of restorations applied to severely damaged endodontically treated maxillary first premolar tooth (MFP). Two types of designs without and with 3 mm intraradicular extensions, endocrown (E) and modified endocrown (ME), were modeled on a 3D Finite element (FE) model of the MFP. Vitablocks Mark II (VMII), Vita Enamic (VE), and Lava Ultimate (LU) CAD/CAM materials were used for each type of design. von Mises and maximum principal values were evaluated and the Weibull function was incorporated with FE analysis to calculate the long term failure probability. Regarding the stresses that occurred in enamel, for each group of material, ME restoration design transmitted less stress than endocrown. During normal occlusal function, the overall failure probability was minimum for ME with VMII. ME restoration design with VE was the best restorative option for premolar teeth with extensive loss of coronal structure under high occlusal loads. Therefore, ME design could be a favorable treatment option for MFPs with missing palatal cusp. Among the CAD/CAM materials tested, VMII and VE were found to be more tooth-friendly than LU.

  1. Effect of Endocrown Restorations with Different CAD/CAM Materials: 3D Finite Element and Weibull Analyses.

    Science.gov (United States)

    Gulec, Laden; Ulusoy, Nuran

    2017-01-01

    The aim of this study was to evaluate the effects of two endocrown designs and computer aided design/manufacturing (CAD/CAM) materials on stress distribution and failure probability of restorations applied to severely damaged endodontically treated maxillary first premolar tooth (MFP). Two types of designs without and with 3 mm intraradicular extensions, endocrown (E) and modified endocrown (ME), were modeled on a 3D Finite element (FE) model of the MFP. Vitablocks Mark II (VMII), Vita Enamic (VE), and Lava Ultimate (LU) CAD/CAM materials were used for each type of design. von Mises and maximum principal values were evaluated and the Weibull function was incorporated with FE analysis to calculate the long term failure probability. Regarding the stresses that occurred in enamel, for each group of material, ME restoration design transmitted less stress than endocrown. During normal occlusal function, the overall failure probability was minimum for ME with VMII. ME restoration design with VE was the best restorative option for premolar teeth with extensive loss of coronal structure under high occlusal loads. Therefore, ME design could be a favorable treatment option for MFPs with missing palatal cusp. Among the CAD/CAM materials tested, VMII and VE were found to be more tooth-friendly than LU.

  2. Lifetime modelling with a Weibull law: comparison of three Bayesian Methods

    International Nuclear Information System (INIS)

    Billy, F.; Remy, E.; Bousquet, N.; Celeux, G.

    2006-01-01

    For a nuclear power plant, being able to estimate the lifetime of important components is strategic. But data is usually insufficient to do so. Thus, it is relevant to use expert judgement, together with data, in order to assess lifetimes on the grounds of both sources. The Bayesian framework, with a Weibull law chosen to model the random time to replacement, is well suited to this problem and has been adopted in this article. Two indicators are computed: the mean lifetime of any component and the mean residual lifetime of a given component after it has been inspected. Three different Bayesian methods are compared on three sets of data. The article shows that the three methods lead to coherent results and that uncertainties are strongly reduced. The method developed around PMC has two main advantages: it models a conditional dependence between the two parameters of the Weibull law, which enables more coherent priors; and it has a parameter that weights the strength of the expertise. This last point is very important for lifetime assessments, because expertise is then used not so much to augment small samples as to perform a real extrapolation, far beyond what the data themselves say. (authors)

  3. Utilization of Weibull equation to obtain soil-water diffusivity in horizontal infiltration

    International Nuclear Information System (INIS)

    Guerrini, I.A.

    1982-06-01

    Water movement was studied in horizontal infiltration experiments using laboratory columns of air-dry, homogeneous soil in order to obtain a simple and suitable equation for soil-water diffusivity. Many water content profiles for each of the ten soil columns utilized were obtained through the gamma-ray attenuation technique using a 137Cs source. During the measurement of a particular water content profile, the soil column was held in the same position in order to measure changes in time and so reduce the errors in water content determination. The Weibull equation fitted the experimental water content profiles very well. The use of an analytical function for ν, the Boltzmann variable, according to the Weibull model, allowed a simple equation for soil-water diffusivity to be obtained. Comparisons were made between the diffusivity equation obtained here and other solutions found in the literature, and the unsuitability of a simple exponential variation of diffusivity with water content over the full range of the latter was shown. The necessity of admitting a time dependency for diffusivity was confirmed, and the possibility of fixing that dependency on a well-known value was found to extend to generalized soil-water infiltration studies. Finally, it was shown that the soil-water diffusivity function given by the equation proposed here can be obtained just by analysing the advance of the wetting front as a function of time. (Author) [pt

  4. Combining Generalized Renewal Processes with Non-Extensive Entropy-Based q-Distributions for Reliability Applications

    Directory of Open Access Journals (Sweden)

    Isis Didier Lins

    2018-03-01

    Full Text Available The Generalized Renewal Process (GRP) is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from the Tsallis’ non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative for the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for their parameters’ estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest GRP plus q-distributions are promising techniques for the analyses of repairable systems.
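Both q-distributions are built on Tsallis' q-exponential function, which recovers the ordinary exponential as q → 1; a minimal numeric sketch (note that parameterisation conventions for the q-Exponential and q-Weibull vary between papers):

```python
import math

def exp_q(x, q):
    """Tsallis q-exponential: [1 + (1-q)*x]_+ ** (1/(1-q)), which tends to
    exp(x) as q -> 1; the cut-off at a non-positive base is the standard
    convention."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

# For q > 1 and x < 0 the q-exponential decays as a power law rather than
# exponentially -- the heavy-tail behaviour the abstract cites as an
# advantage when fitting data containing extreme values.
```

For example, exp_q(2.0, 0.5) evaluates the polynomial (1 + x/2)² at x = 2, while exp_q(x, 1.0) is the ordinary exponential.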

  5. Modelling Wind for Wind Farm Layout Optimization Using Joint Distribution of Wind Speed and Wind Direction

    OpenAIRE

    Ju Feng; Wen Zhong Shen

    2015-01-01

    Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distributions of wind speed and wind direction, which is based on the parameters of sector-wise Weibull distributions and interpolations between direction sectors. It is applied to the wind measurement data a...

  6. Foam-forming properties of Ilex paraguariensis (mate) saponin: foamability and foam lifetime analysis by Weibull equation

    Directory of Open Access Journals (Sweden)

    Janine Treter

    2010-01-01

    Full Text Available Saponins are natural soaplike foam-forming compounds widely used in foods and cosmetic and pharmaceutical preparations. In this work the foamability and foam lifetime of foams obtained from Ilex paraguariensis unripe fruits were analyzed. Polysorbate 80 and sodium dodecyl sulfate were used as reference surfactants. Aiming at a better understanding of the data, a linearized 4-parameter Weibull function was proposed. The mate hydroethanolic extract (ME) and a mate saponin enriched fraction (MSF) afforded foamability and foam lifetime comparable to the synthetic surfactants. The linearization of the Weibull equation allowed the statistical comparison of foam decay curves, improving on former mathematical approaches.
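The linearization idea can be sketched for the basic two-parameter Weibull decay law (the 4-parameter variant proposed in the paper is not reproduced in this record); the foam-volume data below are synthetic:

```python
import numpy as np

# Weibull foam-decay model V(t) = V0 * exp(-(t/tau)**b).  Taking logs twice
# linearizes it:  ln(-ln(V/V0)) = b*ln(t) - b*ln(tau).
V0, tau, b = 100.0, 40.0, 1.3                 # hypothetical parameters
t = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # time points
V = V0 * np.exp(-((t / tau) ** b))            # noiseless synthetic volumes

# Ordinary least squares on the linearized form recovers the parameters.
y = np.log(-np.log(V / V0))
slope, intercept = np.polyfit(np.log(t), y, 1)
b_hat = float(slope)
tau_hat = float(np.exp(-intercept / slope))
```

Because the synthetic data are noiseless, the fit recovers b and tau exactly; on real decay curves, the linear form reduces comparing foam-decay curves to an ordinary regression problem, which is the point of the linearization.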

  7. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1976-01-01

    Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the fewest of the data groups considered in this study
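The probability-plotting check amounts to asking whether the data fall on a straight line in the distribution's transformed coordinates; a sketch for the Weibull case using synthetic data (not the surveillance records) and median-rank plotting positions:

```python
import numpy as np

def weibull_plot_coords(x):
    """Coordinates for a Weibull probability plot: ln(x) against
    ln(-ln(1 - F)), with F from Benard's median-rank approximation.
    A near-linear pattern suggests consistency with a Weibull model."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    return np.log(x), np.log(-np.log(1.0 - F))

rng = np.random.default_rng(1)
sample = rng.weibull(1.5, 200) * 3.0      # synthetic Weibull-distributed data
lx, ly = weibull_plot_coords(sample)
r = float(np.corrcoef(lx, ly)[0, 1])      # straightness of the plot
```

A correlation coefficient near 1 plays the role of the visual straight-line check on the special graph paper mentioned in the abstract.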

  8. Modeling the reliability and maintenance costs of wind turbines using Weibull analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)

    1996-12-31

    A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.

  9. A study of optimization problem for amplify-and-forward relaying over weibull fading channels

    KAUST Repository

    Ikki, Salama Said

    2010-09-01

    This paper addresses the power allocation and relay positioning problems in amplify-and-forward cooperative networks operating in Weibull fading environments. We study adaptive power allocation (PA) with fixed relay location, optimal relay location with fixed power allocation, and joint optimization of the PA and relay location under total transmit power constraint, in order to minimize the outage probability and average error probability at high signal-to-noise ratios (SNR). Analytical results are validated by numerical simulations and comparisons between the different optimization schemes and their performance are provided. Results show that optimum PA brings only coding gain, while optimum relay location yields, in addition to the latter, diversity gains as well. Also, joint optimization improves both, the diversity gain and coding gain. Furthermore, results illustrate that the analyzed adaptive algorithms outperform uniform schemes. ©2010 IEEE.

  10. Fracture Strength, Failure Types, and Weibull Characteristics of Three-Unit Zirconia Fixed Dental Prostheses After Cyclic Loading: Effects of Veneering and Air-Abrasion Protocols.

    Science.gov (United States)

    Campos, Fernanda; Souza, Rodrigo Oa; Bottino, Marco A; Özcan, Mutlu

    The required connector dimension for zirconia fixed dental prostheses (FDPs) may be a clinical limitation due to limited space in the occlusogingival direction. Using no veneering in the gingival regions of the pontics and connectors may solve this problem. This study evaluated the mechanical durability of zirconia FDPs with and without veneering in the gingival area of the connectors and pontics and subsequent air abrasion of this region with different protocols. Models were made of resin abutments (diameter = 6 or 8 mm, height = 6 mm, 6 degrees convergence) and embedded in polyurethane resin (distance = 11 mm). Zirconia frameworks were milled and randomly distributed by veneering (veneering of the entire framework [VEN] or no veneering at gingival regions of the pontic and connector [NVEN]) and by air-abrasion (Al₂O₃/SiO₂, 30 μm; or 45 μm Al₂O₃). FDPs were adhesively cemented and subjected to mechanical cycling (1,200,000 cycles, 200 N, 4 Hz, with water cooling). Specimens were tested until fracture (1 mm/min), and failure modes were classified. Data (N) were subjected to one-way analysis of variance in two sets, the Tukey test (α = .05), and Weibull analysis. While veneering did not significantly affect the results (VEN: 1,958 ± 299 N; NVEN: 1,788 ± 152 N; P = .094), air abrasion did (P = .006), with the worst results for the groups conditioned with 45 μm Al₂O₃ (SiO₂: 1,748 ± 273 N; Al₂O₃: 1,512 ± 174 N). The NVEN group demonstrated the highest Weibull modulus (12.8) compared with the other groups (5.3-7.2). Fractures commonly initiated from the gingival side of the connector. Veneering of the gingival region of the connectors and pontics in zirconia FDPs did not diminish the fracture strength, but air abrasion of this area with 45 μm Al₂O₃ decreased the results.

  11. Effect of thermocycling on flexural strength and weibull statistics of machinable glass-ceramic and composite resin.

    Science.gov (United States)

    Peampring, Chaimongkon; Sanohkan, Sasiwimol

    2014-12-01

    To evaluate the durability of machinable dental restorative materials, this study performed an experiment to evaluate the flexural strength and Weibull statistics of a machinable lithium disilicate glass-ceramic and a machinable composite resin after being thermocycled for a set number of cycles. A total of 40 bar-shaped specimens were prepared with dimensions of 20 mm × 4 mm × 2 mm and divided into four groups of 10 specimens. Ten specimens of machinable lithium disilicate glass-ceramic (IPS e.max CAD, Ivoclar Vivadent, Liechtenstein) and 10 specimens of machinable composite resin (Paradigm MZ 100, 3M ESPE, USA) were subjected to the 3-point flexural strength test. Another 10 specimens of each material were thermocycled between water temperatures of 5 and 55 °C for 10,000 cycles. After that, they were tested using the 3-point flexural strength test. Statistical analysis was performed using two-way analysis of variance and Tukey multiple comparisons. Weibull analysis was performed to evaluate the reliability of the strength. Means of strength and their standard deviations were: thermocycled IPS e.max CAD 389.10 (50.75), non-thermocycled IPS e.max CAD 349.96 (38.34), thermocycled Paradigm MZ 100 157.51 (12.85), non-thermocycled Paradigm MZ 100 153.33 (19.97). Within each material group, there was no significant difference in flexural strength between thermocycled and non-thermocycled specimens. Considering the Weibull analysis, there was no statistical difference in Weibull modulus in any experimental group. Within the limitations of this study, the results showed that there was no significant effect of thermocycling on the flexural strength and Weibull modulus of a machinable glass-ceramic and a machinable composite resin.

  12. The effect of core material, veneering porcelain, and fabrication technique on the biaxial flexural strength and weibull analysis of selected dental ceramics.

    Science.gov (United States)

    Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean

    2012-07-01

    The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain as follows: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and 1.5 mm core/veneer group: 0.8 mm core with 0.7 mm corresponding veneering porcelain with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p ≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p < 0.05). The veneered ZirCAD groups showed greater flexural strength than the monolithic Empress and e.max groups, regardless of core thickness and fabrication techniques. Comparing fabrication techniques, Empress Esthetic/CAD and e.max Press/CAD had similar biaxial flexural strength (p = 0.28 for Empress pair; p = 0.87 for e.max pair); however, e.max CAD/Press groups had

  13. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Science.gov (United States)

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf

    2016-04-01

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used EasyFit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
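The model-selection step described above can be sketched with SciPy: fit two- and three-parameter Weibull and Frechet models to inter-event times and rank them by Kolmogorov-Smirnov distance. The data below are synthetic stand-ins for a real earthquake catalogue, and all parameter values are illustrative, not from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
interevent_years = rng.weibull(1.4, size=200) * 12.0  # hypothetical M >= 6 gaps

candidates = {
    "weibull_2p": stats.weibull_min,   # shape + scale, location fixed at 0
    "weibull_3p": stats.weibull_min,   # shape + location + scale
    "frechet": stats.invweibull,       # Frechet is SciPy's inverse Weibull
}

results = {}
for name, dist in candidates.items():
    # fix the location at zero only for the two-parameter Weibull
    params = dist.fit(interevent_years, floc=0.0) if name == "weibull_2p" \
        else dist.fit(interevent_years)
    ks_stat, p_value = stats.kstest(interevent_years, dist.cdf, args=params)
    results[name] = (ks_stat, p_value)

best = min(results, key=lambda name: results[name][0])  # smallest K-S distance
print(best, results[best])
```

The K-S statistic is the largest gap between the empirical and fitted cumulative curves, so the smallest value identifies the best-matching model, mirroring the comparison reported in the abstract.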

  14. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used EasyFit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.

  15. Spatial and temporal patterns of global onshore wind speed distribution

    International Nuclear Information System (INIS)

    Zhou, Yuyu; Smith, Steven J

    2013-01-01

    Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/climate forecast system reanalysis (CFSR) data over land areas. The Weibull distribution performs well in fitting the time series wind speed data at most locations according to R², root mean square error, and power density error. The wind speed frequency distribution, as represented by the Weibull k parameter, exhibits a large amount of spatial variation, a regionally varying amount of seasonal variation, and relatively low decadal variation. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in non-negligible errors. While large-scale wind speed data are often presented in the form of mean wind speeds, these results highlight the need to also provide information on the wind speed frequency distribution. (letter)
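The Rayleigh-assumption error discussed above can be illustrated numerically: fit a Weibull shape k to wind speeds, then force k = 2 while preserving the mean speed and compare the implied power densities. The site data and parameter values here are synthetic assumptions, not CFSR output.

```python
import numpy as np
from math import gamma
from scipy import stats

rng = np.random.default_rng(1)
wind = rng.weibull(2.6, size=5000) * 7.0   # hypothetical site, true shape k ~ 2.6

k, _, c = stats.weibull_min.fit(wind, floc=0.0)

def mean_power_density(k, c, rho=1.225):
    # E[v^3] of a Weibull(k, c) wind speed gives power density 0.5 * rho * E[v^3]
    return 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)

fitted = mean_power_density(k, c)
# Rayleigh assumption: keep the observed mean speed but force k = 2
c_rayleigh = wind.mean() / gamma(1.5)   # Weibull mean with k = 2 is c * Gamma(1.5)
rayleigh = mean_power_density(2.0, c_rayleigh)
rel_error = abs(rayleigh - fitted) / fitted
print(f"fitted k = {k:.2f}, Rayleigh power-density error = {rel_error:.1%}")
```

Because power density depends on the third moment of the speed distribution, even a modest mismatch in k translates into a sizeable power-density error, which is the paper's point about regional k variation.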

  16. The distribution of first-passage times and durations in FOREX and future markets

    Science.gov (United States)

    Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico

    2009-07-01

    Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The view-point of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter tmax, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: The Weibull distribution with a power-law tail. This distribution compensates the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. 
Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting
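A minimal numeric sketch of the construction discussed above (not the paper's calibration): a Weibull survival body exp(-(t/tau)^m) up to a crossover t_x, matched to a power-law tail t^(-alpha), with the mean waiting time obtained by integrating the survival function. All parameter values are assumed for illustration; the mean exists whenever alpha > 1.

```python
import numpy as np
from scipy import integrate

m, tau = 0.6, 1.0        # Weibull body (assumed values)
alpha, t_x = 3.0, 5.0    # power-law tail exponent and crossover point (assumed)

S_x = np.exp(-((t_x / tau) ** m))   # survival probability at the crossover
A = S_x * t_x ** alpha              # matching constant so the pieces join at t_x

def survival(t):
    # Weibull body below the crossover, matched power-law tail above it
    if t < t_x:
        return np.exp(-((t / tau) ** m))
    return A * t ** (-alpha)

# mean waiting time = integral of the survival function over [0, inf)
body, _ = integrate.quad(survival, 0.0, t_x)
tail = A * t_x ** (1.0 - alpha) / (alpha - 1.0)   # closed form for the tail part
mean_wait = body + tail
print(round(mean_wait, 3))
```

Unlike a Mittag-Leffler tail, the power-law exponent here is free, so it can exceed 1 and keep the mean finite without any cut-off parameter, which is the flexibility the abstract emphasizes.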

  17. Linear vs. piecewise Weibull model for genetic evaluation of sires for longevity in Simmental cattle

    Directory of Open Access Journals (Sweden)

    Nikola Raguž

    2014-09-01

    Full Text Available This study focused on the genetic evaluation of longevity in Croatian Simmental cattle using linear and survival models. The main objective was to create the genetic model that best describes the longevity data. Survival analysis, using a piecewise Weibull proportional hazards model, used all information on the length of productive life, including censored as well as uncensored observations. Linear models considered culled animals only. Relative milk production within herd had the highest impact on cows' longevity. Comparing estimated genetic parameters among methods, survival analysis yielded a higher heritability value (0.075) than the linear sire (0.037) and linear animal model (0.056). When linear models were used, the genetic trend of Simmental bulls for longevity was slightly increasing over the years, unlike the decreasing trend in the case of the survival analysis methodology. The average reliability of bulls' breeding values was higher for survival analysis. The rank correlations between survival analysis and linear model bulls' breeding values for longevity ranged between 0.44 and 0.46, implying substantial differences in the ranking of sires.

  18. Two sample Bayesian prediction intervals for order statistics based on the inverse exponential-type distributions using right censored sample

    Directory of Open Access Journals (Sweden)

    M.M. Mohie El-Din

    2011-10-01

    Full Text Available In this paper, two-sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of inverse exponential-type distributions using a right-censored sample. A general class of prior density functions is used and the predictive cumulative function is obtained in the two-sample case. The class of inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model, such as the inverse exponential model and the inverse Rayleigh model, are considered.
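The nesting of special cases mentioned above can be checked directly with SciPy's inverse Weibull: shape c = 1 recovers the inverse exponential model and c = 2 the inverse Rayleigh model. The scale value below is an arbitrary assumption for illustration.

```python
import numpy as np
from scipy import stats

x = np.linspace(0.1, 10.0, 50)
scale = 1.5   # assumed scale parameter

inv_rayleigh = stats.invweibull.cdf(x, 2.0, scale=scale)     # shape c = 2
inv_exponential = stats.invweibull.cdf(x, 1.0, scale=scale)  # shape c = 1

# Closed form of the inverse Weibull CDF: F(x) = exp(-(scale/x)^c)
assert np.allclose(inv_rayleigh, np.exp(-((scale / x) ** 2.0)))
assert np.allclose(inv_exponential, np.exp(-(scale / x)))
print("special cases recovered from the inverse Weibull CDF")
```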

  19. Comparison between the Normal and Weibull distributions for analysis of the compressive strength of concrete (doi:10.5216/reec.v9i3.28814)

    Directory of Open Access Journals (Sweden)

    Paulo Eduardo Teodoro

    2014-11-01

    Full Text Available The design of concrete structures involves mathematical modeling of a rather subjective nature. The purpose of this research was therefore to verify whether the Normal and Weibull distributions can be applied to compressive strength data for commercially batched ready-mixed concrete. The study was conducted during 2011 in the city of Campo Grande/MS. Compressive strength was evaluated in tests of 189 samples at 28 days, taken from different reinforced concrete constructions carried out in the city. The tests were performed as prescribed by NBR 5739 (ABNT, 2007). To quantify how well the Normal and Weibull distributions fitted the experimental data, three goodness-of-fit tests were used: chi-square, Anderson-Darling and Kolmogorov-Smirnov. Based on this study, the Weibull distribution can be applied to compressive strength data for concrete. This suggests that, despite the complex processes involved in the compressive failure of a quasi-brittle composite material such as concrete, a statistical strength model is effective. Furthermore, comparing the fit tests shows a large practical difference between the Normal and Weibull distributions. This information is an important experimental addition to the scientific literature concerning the failure of quasi-brittle materials.

  20. On the performance of dual-hop mixed RF/FSO wireless communication system in urban area over aggregated exponentiated Weibull fading channels with pointing errors

    Science.gov (United States)

    Wang, Yue; Wang, Ping; Liu, Xiaoxia; Cao, Tian

    2018-03-01

    The performance of decode-and-forward dual-hop mixed radio frequency / free-space optical system in urban area is studied. The RF link is modeled by the Nakagami-m distribution and the FSO link is described by the composite exponentiated Weibull (EW) fading channels with nonzero boresight pointing errors (NBPE). For comparison, the ABER results without pointing errors (PE) and those with zero boresight pointing errors (ZBPE) are also provided. The closed-form expression for the average bit error rate (ABER) in RF link is derived with the help of hypergeometric function, and that in FSO link is obtained by Meijer's G and generalized Gauss-Laguerre quadrature functions. Then, the end-to-end ABERs with binary phase shift keying modulation are achieved on the basis of the computed ABER results of RF and FSO links. The end-to-end ABER performance is further analyzed with different Nakagami-m parameters, turbulence strengths, receiver aperture sizes and boresight displacements. The result shows that with ZBPE and NBPE considered, FSO link suffers a severe ABER degradation and becomes the dominant limitation of the mixed RF/FSO system in urban area. However, aperture averaging can bring significant ABER improvement of this system. Monte Carlo simulation is provided to confirm the validity of the analytical ABER expressions.

  1. Probability distribution of machining center failures

    International Nuclear Information System (INIS)

    Jia Yazhou; Wang Molin; Jia Zhixin

    1995-01-01

    Through field tracing research on 24 Chinese cutter-changeable CNC machine tools (machining centers) over a period of one year, a database of operation and maintenance for machining centers was built. The failure data were fitted to the Weibull distribution and the exponential distribution, goodness of fit was tested, and the failure distribution pattern of machining centers was identified. Finally, reliability characterizations for machining centers are proposed.
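The fitting-and-testing step described above can be sketched by fitting both Weibull and exponential models to between-failure times and comparing them with a likelihood-based criterion such as AIC. The failure times below are synthetic, not the study's database.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
tbf_hours = rng.weibull(0.5, size=200) * 300.0  # hypothetical times between failures

def aic(dist, data):
    # location is fixed at 0 for both models; the constant offset in the
    # reported parameter count cancels when ranking the two fits
    params = dist.fit(data, floc=0.0)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik

aic_weibull = aic(stats.weibull_min, tbf_hours)
aic_expon = aic(stats.expon, tbf_hours)
print("Weibull" if aic_weibull < aic_expon else "Exponential", "fits better")
```

The exponential model is the Weibull with shape fixed at 1, so when the fitted Weibull shape departs from 1 (early-failure or wear-out behaviour), the extra parameter pays for itself and AIC prefers the Weibull.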

  2. Comparison of Weibull and Lognormal Cure Models with Cox in the Survival Analysis Of Breast Cancer Patients in Rafsanjan.

    Science.gov (United States)

    Hoseini, Mina; Bahrampour, Abbas; Mirzaee, Moghaddameh

    2017-02-16

    Breast cancer is the most common cancer after lung cancer and the second leading cause of cancer death. In this study we compared Weibull and lognormal cure models with Cox regression for the survival of breast cancer patients. This retrospective cohort study was conducted on 140 patients suffering from breast cancer referred to Ali Ibn Abitaleb Hospital, Rafsanjan, southeastern Iran, from 2001 to 2015. We determined and analyzed the factors affecting survival with the different models using STATA 14. According to AIC, the lognormal model was more consistent with the data than the Weibull. In the multivariable lognormal model, factors such as smoking, second-hand smoking, drinking herbal tea and the last breast-feeding period were included. In addition, in the Cox regression the significant factors were disease grade, tumor size and metastasis. The effect of pesticides on breast cancer was also studied, and the results showed that it was not in agreement with the models used in this study. Based on the different methods for survival analysis, researchers can decide how to reach a better conclusion. This comparison indicates that the result of the semi-parametric Cox method is closer to clinical experience and evidence.
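The AIC-based comparison of parametric survival models reported above can be sketched as follows: fit Weibull and lognormal models to survival times and keep the model with the lower AIC. The times below are synthetic and uncensored, a simplification of the study's censored cohort.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
months = rng.lognormal(mean=3.0, sigma=0.6, size=140)  # hypothetical survival times

aic = {}
for name, dist in [("weibull", stats.weibull_min), ("lognormal", stats.lognorm)]:
    params = dist.fit(months, floc=0.0)            # two free parameters each
    loglik = np.sum(dist.logpdf(months, *params))
    aic[name] = 2 * 2 - 2 * loglik                 # AIC = 2k - 2 log L

preferred = min(aic, key=aic.get)
print(f"model preferred by AIC: {preferred}")
```

With right-censored records, the log-likelihood would instead sum log-density terms for events and log-survival terms for censored times; the AIC comparison itself is unchanged.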

  3. Influence of Hydrophilic Polymers on the β Factor in Weibull Equation Applied to the Release Kinetics of a Biologically Active Complex of Aesculus hippocastanum

    Directory of Open Access Journals (Sweden)

    Justyna Kobryń

    2017-01-01

    Full Text Available Escin, a triterpenoid saponin complex of biological origin, exhibits significant clinical activity in chronic venous insufficiency, skin inflammation, epidermal abrasions, allergic dermatitis, and acute impact injuries, especially in topical application. The aim of the study is to compare various hydrogel formulations as carriers for a horse chestnut seed extract (EH). Methylcellulose (MC), two polyacrylic acid derivatives (PA1 and PA2), and polyacrylate crosspolymer 11 (PC-11) were employed. The release rates of EH were examined and a comparison with the Weibull model equation was performed. Application of MC as the carrier in the hydrogel preparation resulted in a fast release rate of EH, whereas in the case of the hydrogel composed with PC-11 the release was rather prolonged. The applied Weibull function adhered best to the experimental data. Based on the evaluated shape parameter β in the Weibull equation, the systems under study released the active compound according to Fickian diffusion.
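The release-kinetics fit discussed above can be sketched with the Weibull dissolution equation F(t) = Fmax · (1 − exp(−(t/τ)^β)); the shape parameter β is the quantity interpreted mechanistically (values at or below roughly 0.75 are commonly read as Fickian diffusion in the dissolution literature). The time points and release percentages below are hypothetical, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, fmax, tau, beta):
    # Weibull dissolution equation: cumulative % released at time t
    return fmax * (1.0 - np.exp(-((t / tau) ** beta)))

t_min = np.array([5, 10, 20, 30, 45, 60, 90, 120], dtype=float)
released = np.array([18, 27, 39, 47, 56, 62, 71, 77], dtype=float)  # hypothetical % EH

popt, _ = curve_fit(weibull_release, t_min, released, p0=(80.0, 30.0, 0.7))
fmax, tau, beta = popt
print(f"shape parameter beta = {beta:.2f}")
```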

  4. Postharvest evaluation and estimation of the shelf life of fresh guava using the Weibull model

    Directory of Open Access Journals (Sweden)

    Carlos García Mogollón

    2010-07-01

    Full Text Available Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life due to inadequate storage and packing conditions. In this work the shelf life of fresh guava was estimated using the probabilistic Weibull model, and fruit quality was evaluated during storage under different conditions of temperature and packaging. The postharvest evaluation was carried out over 15 days with guavas of the red regional variety. A completely randomized design with a factorial arrangement of three factors was used: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and 85-90% relative humidity) and refrigerated (9 ± 2 °C and 85-90% RH); and two types of packaging, a polystyrene tray with PVC plastic film and aluminium foil. A structured three-point satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava based on the fit criteria and the acceptance and failure confidence limits. During the storage period it was observed that storage time, temperature and type of packaging had a statistically significant effect (P

  5. Reliability prediction of I&C cable insulation materials by DSC and Weibull theory for probabilistic safety assessment of NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Santhosh, T.V., E-mail: santutv@barc.gov.in [Reactor Safety Division, Bhabha Atomic Research Centre (India); Gopika, V. [Reactor Safety Division, Bhabha Atomic Research Centre (India); Ghosh, A.K. [Raja Ramanna Fellow, Department of Atomic Energy (India); Fernandes, B.G. [Department of Electrical Engineering, Indian Institute of Technology Bombay (India); Dubey, K.A. [Radiation Technology Development Division, Bhabha Atomic Research Centre (India)

    2016-01-15

    Highlights: • An approach for time-dependent reliability prediction of I&C cable insulation materials for use in PSA of NPPs has been developed based on OIT and OITp measurements and Weibull theory. • OITs were determined from the measured OITp based on fundamental thermodynamic principles, and the correlations obtained from DSC and FTIR are in good agreement with the EAB. • SEM of thermally aged and irradiated samples of the insulation materials was performed to support the degradation behaviour observed from OIT and EAB measurements. • The proposed methodology has been illustrated with accelerated thermal and radiation ageing data on low-voltage cables used in NPPs for I&C applications. • The time-dependent reliability predicted from the OIT based on Weibull theory will be useful in incorporating cable ageing into PSA of NPPs. - Abstract: Instrumentation and control (I&C) cables used in nuclear power plants (NPPs) are exposed to various deteriorative environmental effects during their operational lifetime. Factors including long-term irradiation and elevated temperature eventually result in insulation degradation. Monitoring of the actual state of the cable insulation and prediction of its residual service life consist of measuring properties that are directly proportional to the functionality of the cables (usually, elongation at break is used as the critical parameter). Although several condition monitoring (CM) and life estimation techniques are available, currently there is no standard methodology or approach for incorporating cable ageing effects into probabilistic safety assessment (PSA) of NPPs. In view of this, accelerated thermal and radiation ageing of I&C cable insulation materials has been carried out and the degradation due to thermal and radiation ageing has been assessed using oxidation induction time (OIT) and oxidation induction temperature (OITp) measurements by differential scanning

  6. Reliability prediction of I&C cable insulation materials by DSC and Weibull theory for probabilistic safety assessment of NPPs

    International Nuclear Information System (INIS)

    Santhosh, T.V.; Gopika, V.; Ghosh, A.K.; Fernandes, B.G.; Dubey, K.A.

    2016-01-01

    Highlights: • An approach for time-dependent reliability prediction of I&C cable insulation materials for use in PSA of NPPs has been developed based on OIT and OITp measurements and Weibull theory. • OITs were determined from the measured OITp based on fundamental thermodynamic principles, and the correlations obtained from DSC and FTIR are in good agreement with the EAB. • SEM of thermally aged and irradiated samples of the insulation materials was performed to support the degradation behaviour observed from OIT and EAB measurements. • The proposed methodology has been illustrated with accelerated thermal and radiation ageing data on low-voltage cables used in NPPs for I&C applications. • The time-dependent reliability predicted from the OIT based on Weibull theory will be useful in incorporating cable ageing into PSA of NPPs. - Abstract: Instrumentation and control (I&C) cables used in nuclear power plants (NPPs) are exposed to various deteriorative environmental effects during their operational lifetime. Factors including long-term irradiation and elevated temperature eventually result in insulation degradation. Monitoring of the actual state of the cable insulation and prediction of its residual service life consist of measuring properties that are directly proportional to the functionality of the cables (usually, elongation at break is used as the critical parameter). Although several condition monitoring (CM) and life estimation techniques are available, currently there is no standard methodology or approach for incorporating cable ageing effects into probabilistic safety assessment (PSA) of NPPs. In view of this, accelerated thermal and radiation ageing of I&C cable insulation materials has been carried out and the degradation due to thermal and radiation ageing has been assessed using oxidation induction time (OIT) and oxidation induction temperature (OITp) measurements by differential scanning

  7. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1975-09-01

    Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
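The W-test screening described above can be sketched with the Shapiro-Wilk test: apply it to the raw values (normal hypothesis) and to their logarithms (lognormal hypothesis) and compare the resulting p-values. The surveillance-like data below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
activity = rng.lognormal(mean=0.5, sigma=0.9, size=60)  # synthetic surveillance data

w_normal, p_normal = stats.shapiro(activity)            # test the normal hypothesis
w_lognormal, p_lognormal = stats.shapiro(np.log(activity))  # lognormal hypothesis
print(f"normal p = {p_normal:.3g}, lognormal p = {p_lognormal:.3g}")
```

A markedly higher p-value on the log scale indicates the lognormal is the better description, which matches the common finding for environmental concentration data.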

  8. Size effect on strength and lifetime probability distributions of ...

    Indian Academy of Sciences (India)

    The safety factors required to ensure it are still determined empirically, even though they represent much larger and much more uncertain corrections to ... The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.

  9. Diameter Distribution for Gmelina Arborea (ROXB) Plantations in ...

    African Journals Online (AJOL)

    Weibull distribution-based models were developed and applied in the characterization of trees for sustainable production of Gmelina arborea in Ukpon River Forest Reserve, Cross River State, Nigeria. Forty-two sample plots of size 20 m × 20 m were randomly laid in the 19 - 33-year G. arborea age series and complete ...

  10. An eoq model for weibull deteriorating item with ramp type demand and salvage value under trade credit system

    Directory of Open Access Journals (Sweden)

    Lalit Mohan Pradhan

    2014-03-01

    Full Text Available Background: In the present competitive business scenario, researchers have developed various inventory models for deteriorating items considering various practical situations for better inventory control. Permissible delay in payments with various demands and deteriorations is a relatively new concept introduced in developing various inventory models. These models are very useful for both the consumer and the manufacturer. Methods: In the present work an inventory model has been developed for a three-parameter Weibull deteriorating item with ramp-type demand and salvage value under a trade credit system. A single item is considered in developing the model. Results and conclusion: The optimal order quantity, optimal cycle time and total variable cost during a cycle have been derived for the proposed inventory model. The results obtained in this paper are illustrated with the help of numerical examples and sensitivity analysis.
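A toy numeric sketch of the optimization step, not the paper's model: the deteriorated fraction follows a three-parameter Weibull law 1 − exp(−a(t−g)^b) after the location delay g, and a simple average-cost function (ordering + holding + deterioration loss, with all cost rates hypothetical) is minimized over the cycle time T.

```python
import numpy as np
from scipy import integrate, optimize

a, b, g = 0.002, 2.0, 1.0        # three-parameter Weibull deterioration (assumed)
demand, order_cost, hold_cost = 40.0, 150.0, 0.5   # hypothetical rates

def deteriorated_fraction(t):
    # no deterioration before the location parameter g
    return 0.0 if t <= g else 1.0 - np.exp(-a * (t - g) ** b)

def avg_cost(T):
    held, _ = integrate.quad(lambda t: demand * (T - t), 0.0, T)  # stock-time area
    spoiled = demand * T * deteriorated_fraction(T)               # rough loss term
    return (order_cost + hold_cost * held + 2.0 * spoiled) / T

res = optimize.minimize_scalar(avg_cost, bounds=(0.5, 30.0), method="bounded")
print(f"optimal cycle ~ {res.x:.2f}, average cost ~ {res.fun:.2f}")
```

The paper's closed-form results come from an explicit ramp-type demand and trade-credit cost structure; the point of the sketch is only that the cycle-time optimum balances ordering cost against holding and Weibull-driven deterioration loss.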

  11. Evaluation of uncertainty in experimental active buckling control of a slender beam-column with disturbance forces using Weibull analysis

    Science.gov (United States)

    Enss, Georg C.; Platz, Roland

    2016-10-01

    Buckling of slender load-bearing beam-columns is a crucial failure scenario in light-weight structures as it may result in the collapse of the entire structure. If axial load and load capacity are unknown, stability becomes uncertain. To compensate this uncertainty, the authors successfully developed and evaluated an approach for active buckling control for a slender beam-column, clamped at the base and pinned at the upper end. Active lateral forces are applied with two piezoelectric stack actuators in opposing directions near the beam-column's clamped base to prevent buckling. A Linear Quadratic Regulator is designed and implemented on the experimental demonstrator, and statistical tests are conducted to demonstrate the effectiveness of the active approach. The load capacity of the beam-column could be increased by 40%, and the scatter of buckling occurrences for increasing axial loads is reduced. Weibull analysis is used to evaluate the increase of the load capacity and its related uncertainty compensation.
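The Weibull evaluation used above can be sketched by fitting shape and scale parameters to two sets of buckling loads: a larger shape parameter means less scatter, and the scale parameter gives the characteristic load. Both samples below are synthetic, with assumed parameters; they are not the experimental data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
passive = rng.weibull(8.0, 50) * 100.0   # hypothetical buckling loads without control
active = rng.weibull(20.0, 50) * 140.0   # higher, less scattered loads with control

k_p, _, lam_p = stats.weibull_min.fit(passive, floc=0.0)
k_a, _, lam_a = stats.weibull_min.fit(active, floc=0.0)

gain = (lam_a - lam_p) / lam_p           # relative increase in characteristic load
print(f"shape passive = {k_p:.1f}, active = {k_a:.1f}, load gain = {gain:.0%}")
```

Reporting the shape parameter alongside the load gain captures both findings of the study in one analysis: capacity increase (scale) and scatter reduction (shape).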

  12. WEIBULL MULTIPLICATIVE MODEL AND MACHINE LEARNING MODELS FOR FULL-AUTOMATIC DARK-SPOT DETECTION FROM SAR IMAGES

    Directory of Open Access Journals (Sweden)

    A. Taravat

    2013-09-01

    Full Text Available As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (a tool that offers a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The new approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter created based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate the false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05 % and 95.20 % were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection with a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective. The proposed approach can be applied to future spaceborne SAR images.

  13. Weibull Multiplicative Model and Machine Learning Models for Full-Automatic Dark-Spot Detection from SAR Images

    Science.gov (United States)

    Taravat, A.; Del Frate, F.

    2013-09-01

    As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (a tool that offers a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The new approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter created based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate the false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05 % and 95.20 % were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection with a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective. The proposed approach can be applied to future spaceborne SAR images.

  14. Probabilistic analysis of annealed glass elements using a three-parameter Weibull distribution

    Directory of Open Access Journals (Sweden)

    Alberto Ramos

    2015-07-01

    With all this, the work aims to generalize the use of the three-parameter Weibull cumulative distribution function for structural glass elements with load distributions that are not described analytically, allowing the proposed probabilistic model to be applied to general stress distributions.

  15. Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity.

    Directory of Open Access Journals (Sweden)

    James D Englehardt

    Full Text Available Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponential. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study.
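The three-way model comparison reported above can be sketched by fitting Weibull, lognormal and Pareto I distributions to a severity sample and ranking them by Kolmogorov-Smirnov distance. The sample below is synthetic (drawn from a Weibull), so it only illustrates the comparison machinery, not the paper's datasets.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
severity = rng.weibull(0.7, size=500) * 2.0   # heavy-bodied synthetic severities

models = {
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
    "pareto": stats.pareto,
}
distance = {}
for name, dist in models.items():
    params = dist.fit(severity, floc=0.0)     # fix the location at zero
    distance[name], _ = stats.kstest(severity, dist.cdf, args=params)

ranked = sorted(distance, key=distance.get)   # best-fitting model first
print(ranked)
```

The Pareto I fit fails near the origin because its support starts at a positive threshold, while the Weibull accommodates both the small-severity body and the stretched tail, which is the qualitative point of the paper's comparison.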

  16. Inference for exponentiated general class of distributions based on record values

    Directory of Open Access Journals (Sweden)

    Samah N. Sindi

    2017-09-01

    Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC) of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases, and the exponentiated Gompertz distribution are applied to the EGC of distributions.

  17. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of homogeneity and scale hypothesis when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed exclusively. The photoemulsion experiment analysis and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed.

  18. Fique Fiber Tensile Elastic Modulus Dependence with Diameter Using the Weibull Statistical Analysis

    OpenAIRE

    Teles,Maria Carolina Andrade; Altoé,Giulio Rodrigues; Amoy Netto,Pedro; Colorado,Henry; Margem,Frederico Muylaert; Monteiro,Sergio Neves

    2015-01-01

    Fique is a plant native of Colombia with fibers extracted from its leaves presenting relevant physical characteristics and mechanical properties for possible engineering applications, such as reinforcement of polymer composites. The main physico-mechanical properties of the fique fiber have already been investigated for both untreated and mechanically treated fibers. The statistical distribution of the fique fiber diameter was analyzed and the effect of microfibrillar angle on the tensile str...

  19. Compositional Analyses and Shelf-Life Modeling of Njangsa (Ricinodendron heudelotii) Seed Oil Using the Weibull Hazard Analysis.

    Science.gov (United States)

    Abaidoo-Ayin, Harold K; Boakye, Prince G; Jones, Kerby C; Wyatt, Victor T; Besong, Samuel A; Lumor, Stephen E

    2017-08-01

    This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content of which alpha eleostearic acid (α-ESA), an unusual conjugated linoleic acid was the most prevalent (about 52%). Linoleic acid was also present in appreciable amounts (approximately 34%). Our investigations also indicated that the acid-catalyzed transesterification of NSO resulted in lower yields of α-ESA methyl esters, due to isomerization, a phenomenon which was not observed under basic conditions. The triacylglycerol (TAG) profile analysis showed the presence of at least 1 α-ESA fatty acid chain in more than 95% of the oil's TAGs. Shelf-life was determined by the Weibull Hazard Sensory Method, where the end of shelf-life was defined as the time at which 50% of panelists found the flavor of NSO to be unacceptable. This was determined as 21 wk. Our findings therefore support the potential commercial viability of NSO as an important source of physiologically beneficial PUFAs. © 2017 Institute of Food Technologists®.
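    The shelf-life criterion above, the time at which 50% of panelists reject the product, has a closed form once the Weibull hazard parameters are known. The sketch below uses made-up shape and scale values, not the ones fitted in the study:

    ```python
    import math

    # Hypothetical Weibull parameters for the panelists' time-to-rejection
    # distribution (illustrative only; the study does not report them):
    shape, scale = 2.0, 25.0   # scale in weeks

    # End of shelf-life: time at which the rejection CDF reaches 50%,
    # F(t) = 1 - exp(-(t/scale)**shape)  =>  t50 = scale * (ln 2)**(1/shape)
    t50 = scale * math.log(2.0) ** (1.0 / shape)
    print(f"estimated shelf-life: {t50:.1f} weeks")
    ```

    With these invented parameters t50 comes out near 20.8 weeks, the same order as the 21 wk the study reports.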

  20. Weibull Statistical Analysis on the Mechanical Properties of SiC by Immersion in Acidic and Alkaline Solutions

    International Nuclear Information System (INIS)

    Ahn, Seok-Hwan; Jeong, Sang-Cheol; Nam, Ki-Woo

    2016-01-01

    A Weibull statistical analysis of the mechanical properties of SiC ceramics was carried out by immersion in acidic and alkaline solutions. The heat treatment was carried out at 1373 K. The corrosion of SiC was carried out in acidic and alkaline solutions under KSL1607. The bending strength of corroded crack-healed specimens decreased by 47% and 70% compared with that of uncorroded specimens in acidic and alkaline solutions, respectively. The corrosion of SiC ceramics is faster in alkaline solution than in acidic solution. The scale and shape parameters were evaluated for the as-received and corroded materials, respectively. The shape parameter of the as-received material corroded in acidic and alkaline solutions was significantly more apparent in the acidic solution. Further, that of the heat-treated material was large in acidic solution but small in alkaline solution. The shape parameters of the as-received and heat-treated materials were smaller in both acidic and alkaline solutions.
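    Scale and shape parameters like those evaluated above are commonly estimated from strength data with a linearized Weibull (median-rank) plot. A hedged sketch with invented bending-strength values, not the paper's data:

    ```python
    import math

    # Illustrative bending-strength data (MPa); values are made up.
    strengths = sorted([412, 385, 450, 398, 430, 405, 441, 420, 390, 415])
    n = len(strengths)

    # Weibull plot: ln(-ln(1 - F_i)) vs ln(sigma_i), with median-rank estimator
    # F_i = (i - 0.3) / (n + 0.4). The slope is the Weibull modulus (shape) m;
    # the intercept gives the characteristic strength (scale).
    xs = [math.log(s) for s in strengths]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))) for i in range(1, n + 1)]

    xbar = sum(xs) / n
    ybar = sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    sigma0 = math.exp(xbar - ybar / m)   # strength at F = 63.2%
    print(round(m, 1), round(sigma0, 1))
    ```

    The tight, low-scatter fake data give a high Weibull modulus (around 20) and a characteristic strength near the upper end of the sample.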

  1. Approximation of the breast height diameter distribution of two-cohort stands by mixture models III Kernel density estimators vs mixture models

    Science.gov (United States)

    Rafal Podlaski; Francis A. Roesch

    2014-01-01

    Two-component mixtures of either the Weibull distribution or the gamma distribution and the kernel density estimator were used for describing the diameter at breast height (dbh) empirical distributions of two-cohort stands. The data consisted of study plots from the Świętokrzyski National Park (central Poland) and areas close to and including the North Carolina section...

  2. Approximation of the breast height diameter distribution of two-cohort stands by mixture models II Goodness-of-fit tests

    Science.gov (United States)

    Rafal Podlaski; Francis A. Roesch

    2013-01-01

    The goals of this study are (1) to analyse the accuracy of the approximation of empirical distributions of diameter at breast height (dbh) using two-component mixtures of either the Weibull distribution or the gamma distribution in two-cohort stands, and (2) to discuss the procedure of choosing goodness-of-fit tests. The study plots were...

  3. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when (X1, X2) follows a general bivariate distribution. Such distributions include the bivariate compound Weibull, bivariate compound Gompertz and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.
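    For the simpler case of independent Weibull stress and strength (the paper's dependent bivariate compound case is more general), R = P(X1 < X2) can be estimated by straightforward Monte Carlo; the parameters below are arbitrary:

    ```python
    import random

    random.seed(7)

    # Simplified sketch: independent Weibull stress X1 and strength X2.
    # The paper treats dependent bivariate compound distributions, which
    # this independence assumption deliberately ignores.
    N = 200_000
    k = 2.0                      # common shape
    a, b = 1.0, 1.5              # stress scale, strength scale
    hits = sum(random.weibullvariate(a, k) < random.weibullvariate(b, k)
               for _ in range(N))
    R = hits / N

    # For equal shapes the closed form is R = b**k / (a**k + b**k) ~ 0.692,
    # a handy sanity check on the simulation.
    print(round(R, 3))
    ```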

  4. Recurrent frequency-size distribution of characteristic events

    Directory of Open Access Journals (Sweden)

    S. G. Abaimov

    2009-04-01

    Full Text Available Statistical frequency-size (frequency-magnitude properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault. The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values ranging from 1.6 to 2.2 with the corresponding aperiodicities CV of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
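    The aperiodicity CV quoted above follows directly from the Weibull shape via moment ratios: for shape β, CV = sqrt(Γ(1+2/β) − Γ(1+1/β)²) / Γ(1+1/β). A short check that the reported shape range 1.6-2.2 indeed maps onto CV values near 0.64-0.48:

    ```python
    import math

    def weibull_cv(beta):
        """Coefficient of variation (aperiodicity) of a Weibull with shape beta."""
        g1 = math.gamma(1.0 + 1.0 / beta)
        g2 = math.gamma(1.0 + 2.0 / beta)
        return math.sqrt(g2 - g1 * g1) / g1

    for beta in (1.6, 2.2):
        print(beta, round(weibull_cv(beta), 2))
    ```

    Note the CV depends only on the shape, not the scale, which is why the abstract can pair the β range directly with a CV range.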

  5. Impact of Blending on Strength Distribution of Ambient Cured Metakaolin and Palm Oil Fuel Ash Based Geopolymer Mortar

    Directory of Open Access Journals (Sweden)

    Taliat Ola Yusuf

    2014-01-01

    Full Text Available This paper investigates the influence of blending metakaolin with silica-rich palm oil fuel ash (POFA) on the strength distribution of geopolymer mortar. The broadness of the strength distribution of quasi-brittle to brittle materials depends strongly on the existence of flaws such as voids, microcracks, and impurities in the material. Blending of materials containing alumina and silica with the objective of improving the performance of geopolymer makes comprehensive characterization necessary. The Weibull distribution is used to study the strength distribution and the reliability of geopolymer mortar specimens prepared from 100% metakaolin, 50% and 70% palm oil fuel ash, and cured under ambient conditions. Mortar prisms and cubes were used to test the materials in flexure and compression, respectively, at 28 days, and the results were analyzed using the Weibull distribution. In flexure, the Weibull modulus increased with POFA replacement, indicating reduced broadness of the strength distribution from an increased homogeneity of the material. The modulus, however, decreased with increasing POFA replacement in the specimens tested under compression. It is concluded that the Weibull distribution is suitable for analyses of the blended geopolymer system. While a porous microstructure is mainly responsible for flexural failure, heterogeneity of reaction relics is responsible for the compression failure.

  6. Corrigendum to "Assessing tephra total grain-size distribution: Insights from field data analysis" [Earth Planet. Sci. Lett. 443 (2016) 90-107]

    Science.gov (United States)

    Costa, A.; Pioli, L.; Bonadonna, C.

    2017-05-01

    The authors found a mistake in the formulation of the distribution named Bi-Weibull distribution reported in the equation (A.2) of the Appendix A. The error affects equation (4) (which is the same as eq. (A.2)) and Table 4 in the original manuscript.

  7. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    El-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. This paper analyzes several statistical distributions used in reliability in order to find the best-fitting distribution. The calculations rely on circuit quantity parameters obtained using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the Temperature Alarm Circuit (TAC) data set. However, the exponential distribution is found to be the best fit for modeling the failure rate.
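    Distribution selection of the kind described above is often done by comparing information criteria across candidate fits. A self-contained sketch using synthetic data and only two candidate families (this is a generic AIC comparison, not the Relex 2009 workflow):

    ```python
    import math
    import random

    random.seed(3)

    # Synthetic time-to-failure data drawn from a Weibull (shape 2, scale 2),
    # so the Weibull should beat the exponential in the AIC comparison.
    data = [random.weibullvariate(2.0, 2.0) for _ in range(500)]
    n = len(data)
    logs = [math.log(x) for x in data]

    # Exponential MLE (rate = 1/mean) and its log-likelihood.
    lam = n / sum(data)
    ll_exp = n * math.log(lam) - lam * sum(data)
    aic_exp = 2 * 1 - 2 * ll_exp

    # Weibull MLE: damped fixed-point iteration for the shape k,
    # then the closed-form scale given k.
    k = 1.0
    for _ in range(200):
        xk = [x ** k for x in data]
        k_new = 1.0 / (sum(w * g for w, g in zip(xk, logs)) / sum(xk)
                       - sum(logs) / n)
        k = 0.5 * (k + k_new)
    scale = (sum(x ** k for x in data) / n) ** (1.0 / k)
    ll_wei = sum(math.log(k / scale) + (k - 1) * math.log(x / scale)
                 - (x / scale) ** k for x in data)
    aic_wei = 2 * 2 - 2 * ll_wei

    print("AIC exponential:", round(aic_exp, 1), " AIC Weibull:", round(aic_wei, 1))
    ```

    The lower AIC wins; here the Weibull should come out ahead by a wide margin, with the fitted shape close to the true value of 2.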

  8. Effects of Mixtures on Liquid and Solid Fragment Size Distributions

    Science.gov (United States)

    2016-05-01

    Fragmenting events considered include nuclear weapon detonations; cased munitions, including sympathetic detonations; debris from high-speed projectile penetration; and eruptions of bubble-bearing magma. Durand & Soulard (2012, 2013) studied shock-loaded metal melts resulting in planar jets. Test data were obtained by detonating a 35 mm-long, 1.05 g granulated pellet of tetryl; Figure 1 shows a typical result, a Type II Weibull distribution fitted to the test data.

  9. Optimum single modal and bimodal buckling design of symmetric laminates

    Science.gov (United States)

    Qian, B.; Reiss, R.; Aung, W.

    1989-01-01

    Variational calculus is used to determine the design that maximizes the resistance of classical symmetric laminates against buckling. The orientations of the constituent orthotropic laminae with respect to the principal axes of the laminate are the design variables. It is shown that the optimal design may not be a point of analyticity of the buckling load. Local analytic extrema are obtained from the design derivatives of the buckling load. Nonanalytic extrema occur whenever the buckling load is a repeated eigenvalue. A novel approach, using a directional design derivative, is employed to determine nonanalytic extrema. Specific examples are presented for biaxial buckling for several different boundary conditions.

  10. Empleo de la función Weibull para evaluar la emergencia de las plántulas de Albizia lebbeck (L.) Benth.

    Directory of Open Access Journals (Sweden)

    Marlen Navarro

    Full Text Available With the objective of assessing the vigor of Albizia lebbeck seeds through the evaluation of seedling emergence, using the modified Weibull function, sowing was carried out under three environmental conditions and at different seed storage times. The design was completely randomized, with a factorial arrangement. Analysis of variance was performed for the parameters M (maximum cumulative emergence), k (emergence rate) and Z (lag before the onset of emergence) of the modified Weibull function. From six months after the start of storage (44.1%), an abrupt loss in the percentage of M was observed in the nursery (A), and slight variations in the cabin (C), compared with A and B (shade house). The dispersion range of the parameter k varied between 0.4-2.6, 0.29-1.9 and 0.5-1.4% emergence d-1 for the evaluations carried out in A, B and C, respectively. From the analysis of Z, the time to the onset of emergence, regardless of the sowing environment, fell between 3.0 and 7.3 days after sowing. In the full-sun nursery, in the evaluation at 6 months after the start of storage, the best results for the biological parameters of the Weibull equation were obtained, which allowed a global analysis indicating a high degree of vigor in A. lebbeck seeds compared with the remaining evaluations.

  11. Do Insect Populations Die at Constant Rates as They Become Older? Contrasting Demographic Failure Kinetics with Respect to Temperature According to the Weibull Model.

    Directory of Open Access Journals (Sweden)

    Petros Damos

    Full Text Available Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death by estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, whereas at extreme temperatures each individual drifts toward death in a linear fashion and has a constant chance of passing away. Moreover, the slope of the hazard rates shifts towards a constant initial rate, a pattern demonstrated by systems which are not wearing out (e.g. non-aging), since the failure, or death, is a random event independent of time. This finding may appear surprising because, traditionally, it was mostly thought as a rule that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in relation to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling-off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that was applied.
    Moreover, semi-log probability hazard
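    The contrast drawn above between accelerating and constant death rates corresponds directly to the Weibull hazard h(t) = (k/λ)(t/λ)^(k-1): shape k > 1 gives a hazard that rises with age (wear-out), while k = 1 reduces to the exponential case with an age-independent chance of dying. A minimal illustration (parameter values arbitrary):

    ```python
    def weibull_hazard(t, shape, scale):
        # h(t) = f(t) / S(t) = (shape / scale) * (t / scale) ** (shape - 1)
        return (shape / scale) * (t / scale) ** (shape - 1)

    # shape > 1: hazard rises with age (as at optimal temperatures);
    # shape = 1: constant hazard, death independent of age
    #            (as reported at the temperature extremes).
    for shape in (2.5, 1.0):
        print([round(weibull_hazard(t, shape, 10.0), 3) for t in (2, 5, 8)])
    ```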

  12. Evaluation of the reliability of Si3N4-Al2O3 -CTR2O3 ceramics through Weibull analysis

    Directory of Open Access Journals (Sweden)

    Santos Claudinei dos

    2003-01-01

    Full Text Available The objective of this work has been to compare the reliability of two Si3N4 ceramics, with Y2O3/Al2O3 or CTR2O3/Al2O3 mixtures as additives, in regard to their 4-point bending strength, and to confirm the potential of the rare earth oxide mixture, CTR2O3, produced at FAENQUIL, as an alternative, low-cost sinter additive to pure Y2O3 in the sintering of Si3N4 ceramics. The oxide mixture CTR2O3 is a solid solution formed mainly by Y2O3, Er2O3, Yb2O3 and Dy2O3 with other minor constituents, and is obtained at a cost of only 20% of that of pure Y2O3. Samples were sintered by a gas pressure sintering process at 1900 °C under a nitrogen pressure of 1.5 MPa and an isothermal holding time of 2 h. The obtained materials were characterized by their relative density, phase composition and bending strength. The Weibull analysis was used to describe the reliability of these materials. Both materials presented relative densities higher than 99.5% t.d., β-Si3N4 and Y3Al5O12 (YAG) as crystalline phases, and bending strengths higher than 650 MPa, thus demonstrating similar behaviors regarding their physical, chemical and mechanical characteristics. The statistical analysis of their strength also showed similar results for both materials, with Weibull moduli m of about 15 and characteristic stress values σ0 of about 700 MPa. These results confirmed the possibility of using the rare earth oxide mixture, CTR2O3, as a sinter additive for high-performance Si3N4 ceramics, without prejudice to the mechanical properties when compared to Si3N4 ceramics sintered with pure Y2O3.

  13. Estimating the probability distribution of the incubation period for rabies using data from the 1948-1954 rabies epidemic in Tokyo.

    Science.gov (United States)

    Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K

    2016-01-01

    Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. The Analysis of Phytoplankton Abundance Using Weibull Distribution (A Case Study in the Coastal Area of East Yapen in the Regency of Yapen Islands, Papua)

    Science.gov (United States)

    Indrayani, Ervina; Dimara, Lisiard; Paiki, Kalvin; Reba, Felix

    2018-01-01

    The coastal waters of East Yapen is one of the spawning sites and areas of care for marine biota in Papua. Because of its very open location, it is widely used by human activities such as fishing, residential, industrial and cruise lines. This indirectly affects the balance of coastal waters condition of East Yapen that impact on the existence of…

  15. Long-Term Profiles of Wind and Weibull Distribution Parameters up to 600 m in a Rural Coastal and an Inland Suburban Area

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Batchvarova, Ekaterina; Floors, Rogier Ralph

    2014-01-01

    An investigation of the long-term variability of wind profiles for wind energy applications is presented. The observations consist of wind measurements obtained from a ground-based wind lidar at heights between 100 and 600 m, in combination with measurements from tall meteorological towers at a f...

  16. Weibull Wind-Speed Distribution Parameters Derived from a Combination of Wind-Lidar and Tall-Mast Measurements Over Land, Coastal and Marine Sites

    DEFF Research Database (Denmark)

    Gryning, Sven-Erik; Floors, Rogier Ralph; Peña, Alfredo

    2016-01-01

    Wind-speed observations from tall towers are used in combination with observations up to 600 m in altitude from a Doppler wind lidar to study the long-term conditions over suburban (Hamburg), rural coastal (Høvsøre) and marine (FINO3) sites. The variability in the wind field among the sites is ex...

  17. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow minimizing the cost of conventional power generation is solved. A reliability assessment for the distribution grid is thus implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
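    As a baseline for comparison, the Monte Carlo assessment that such a fast method speeds up can be sketched directly: sample Weibull wind speeds and Beta irradiance fractions, convert them to power, and count load-loss states. Everything below (capacities, turbine curve, load) is an invented toy system, not the RBTS BUS6 data:

    ```python
    import random

    random.seed(11)

    # Hypothetical toy system (illustrative numbers only):
    # wind speed ~ Weibull, solar irradiance fraction ~ Beta, fixed load.
    N = 50_000
    WIND_CAP, SOLAR_CAP, CONV_CAP, LOAD = 50.0, 30.0, 60.0, 100.0  # MW

    def wind_power(v):
        # Simplified turbine curve: cut-in 3 m/s, rated 12 m/s, cut-out 25 m/s.
        if v < 3 or v > 25:
            return 0.0
        return WIND_CAP * min(1.0, (v - 3) / (12 - 3))

    lolp_hits, eens = 0, 0.0
    for _ in range(N):
        v = random.weibullvariate(8.0, 2.0)    # scale 8 m/s, shape 2
        g = random.betavariate(2.0, 2.0)       # irradiance fraction in [0, 1]
        supply = wind_power(v) + SOLAR_CAP * g + CONV_CAP
        if supply < LOAD:
            lolp_hits += 1
            eens += LOAD - supply

    LOLP = lolp_hits / N
    EENS = eens / N   # expected shortfall per sampled hour (MWh)
    print(round(LOLP, 3), round(EENS, 2))
    ```

    The many thousands of samples needed for stable LOLP/EENS estimates are exactly the cost the paper's discretized, linearized formulation is meant to avoid.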

  18. Shapes of reaction-time distributions and shapes of learning curves: a test of the instance theory of automaticity.

    Science.gov (United States)

    Logan, G D

    1992-09-01

    The instance theory assumes that automatic performance is based on single-step direct-access retrieval from memory of prior solutions to present problems. The theory predicts that the shape of the learning curve depends on the shape of the distribution of retrieval times. One can deduce from the fundamental assumptions of the theory that (1) the entire distribution of reaction times, not just the mean, will decrease as a power function of practice; (2) asymptotically, the retrieval-time distribution must be a Weibull distribution; and (3) the exponent of the Weibull, which is the parameter that determines its shape, must be the reciprocal of the exponent of the power function. These predictions were tested and mostly confirmed in 12 data sets from 2 experiments. The ability of the instance theory to predict the power law is contrasted with the ability of other theories to account for it.
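    The reciprocal relation between the power-law and Weibull exponents can be checked by simulating the theory's race model: the response time after n practice trials is the minimum of n memory-retrieval times. With Weibull(shape c) retrievals, the minimum is again Weibull with scale shrinking as n^(-1/c), so mean RT falls as a power function with exponent 1/c (the parameter values below are assumptions for illustration):

    ```python
    import random

    random.seed(5)

    c = 2.0   # assumed retrieval-time shape

    def mean_min_rt(n, reps=20_000):
        # Mean of the minimum of n Weibull(scale 1, shape c) retrieval times.
        return sum(min(random.weibullvariate(1.0, c) for _ in range(n))
                   for _ in range(reps)) / reps

    m1, m16 = mean_min_rt(1), mean_min_rt(16)
    # Power law of practice predicts mean(n) = mean(1) * n**(-1/c),
    # so with c = 2 the ratio at n = 16 should be 16**(-1/2) = 0.25.
    print(round(m16 / m1, 3))
    ```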

  19. Comparative analysis of methods for modelling the short-term probability distribution of extreme wind turbine loads

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov

    2016-01-01

    extrapolation techniques: the Weibull, Gumbel and Pareto distributions and a double-exponential asymptotic extreme value function based on the ACER method. For the successful implementation of a fully automated extrapolation process, we have developed a procedure for automatic identification of tail threshold...... levels, based on the assumption that the response tail is asymptotically Gumbel distributed. Example analyses were carried out, aimed at comparing the different methods, analysing the statistical uncertainties and identifying the factors, which are critical to the accuracy and reliability...

  20. Evaluación poscosecha y estimación de vida útil de guayaba fresca utilizando el modelo de Weibull

    Directory of Open Access Journals (Sweden)

    García Mogollón Carlos

    2010-09-01

    Full Text Available

    Guava (Psidium guajava L.) is a tropical fruit susceptible to undesirable alterations that affect its shelf life due to inadequate storage and handling conditions. In this work, the shelf life of fresh guava was estimated using the Weibull probabilistic model, and fruit quality during storage was evaluated under different temperature and packaging conditions. The postharvest evaluation was carried out for 15 days with guavas of the red regional variety. A completely randomized design with a factorial arrangement was used, consisting of three factors: storage time with six levels (0, 3, 6, 9, 12 and 15 days); storage temperature with two levels, ambient (37 °C and 85-90% relative humidity (RH)) and refrigeration (9±2 °C and 85-90% RH); and two types of packaging, polystyrene tray with PVC plastic film and aluminium foil. A three-point structured satisfaction scale was used for the sensory evaluation during the storage period. The Weibull model proved adequate for predicting the shelf life of fresh guava, based on the fit criteria and the acceptance and failure confidence limits. During storage it was observed that time, temperature and packaging type had a statistically significant effect (P < 0.05) on the equivalent diameter, sphericity, apparent specific mass, total soluble solids, pH, acidity and sensory evaluation of the fruit. The product can be consumed as fresh fruit for up to ten days of storage at ambient temperature and at most fifteen days under refrigerated storage.

  2. A modified Weibull model for growth and survival of Listeria innocua and Salmonella Typhimurium in chicken breasts during refrigerated and frozen storage.

    Science.gov (United States)

    Pradhan, A K; Li, M; Li, Y; Kelso, L C; Costello, T A; Johnson, M G

    2012-06-01

    The potential of food-borne pathogens to survive and grow during refrigerated and frozen storage has raised serious concerns over the safety of stored poultry products. In this study, the effect of refrigeration and freezing temperatures (-20, -12, 0, 4, and 8°C) on growth and survival of Listeria innocua and Salmonella enterica serovar Typhimurium in raw chicken breasts for storage times of 3, 7, 10, 14, and 21 d was investigated. A modified Weibull model was also developed to analyze the microbial behavior of both microorganisms in raw chicken breasts under different refrigerated storage conditions over time. The results showed that the bacterial loads of L. innocua at 4 and 8°C and Salmonella Typhimurium at 8°C were significantly different (P < 0.05). The maximum growth of L. innocua at 4 and 8°C was 2.1 log cfu/g and 3.7 log cfu/g, respectively, and that of Salmonella Typhimurium at 8°C was 1.2 log cfu/g. The root mean square errors, median relative error, mean absolute relative error, and the plot of predicted versus observed bacterial loads showed a good performance of the model. The results from this study provided useful information regarding the behavior of Listeria and Salmonella in raw chicken breast meat during refrigerated and frozen storage, which would be helpful in giving insight into the safety of poultry product storage.

  3. Life prediction for a vacuum fluorescent display based on two improved models using the three-parameter Weibull right approximation method.

    Science.gov (United States)

    Zhang, Jianping; Zhang, Xing; Zong, Yu; Pan, Yaofang; Wu, Helen; Tang, Jieshuo

    2018-02-01

    To obtain precise life information for vacuum fluorescent displays (VFDs), luminance degradation data for VFDs were collected from a group of normal life tests. Instead of an exponential function, the three-parameter Weibull right approximation method (TPWRAM) was applied to describe the luminance degradation path of optoelectronic products, and two improved models were established. One of these models calculated the average life by fitting average luminance degradation data, and the other obtained VFD life by combining the approximation method with luminance degradation test data from each individual sample. The results indicated that the test design under normal working stress was appropriate, and the selection of censored test data was simple. The two models improved by TPWRAM both revealed the luminance decay law for VFDs, and the pseudo failure time was accurately extrapolated. Comparison of relative errors further confirmed that the second model gave a more accurate prediction of VFD life. The improved models in this study can provide technical references for researchers and manufacturers developing life prediction methodology. Copyright © 2017 John Wiley & Sons, Ltd.
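    The abstract does not spell out the TPWRAM formulas, so as a hedged illustration of the general idea, the sketch below assumes a three-parameter Weibull-shaped luminance decay and extrapolates the pseudo failure time at which luminance falls to 50% of its initial value (form and parameter values are invented, not the paper's fitted model):

    ```python
    import math

    # Assumed decay: L(t)/L0 = exp(-(((t - gamma) / eta) ** beta)) for t >= gamma,
    # where gamma is a threshold (location), eta a scale and beta a shape.
    beta, eta, gamma = 1.3, 9000.0, 200.0   # hours

    def rel_luminance(t):
        return math.exp(-(((t - gamma) / eta) ** beta)) if t > gamma else 1.0

    # Pseudo failure time: luminance dropping to 50% of its initial value,
    # obtained by inverting the decay curve.
    t_fail = gamma + eta * math.log(2.0) ** (1.0 / beta)
    print(round(t_fail), round(rel_luminance(t_fail), 3))
    ```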

  4. Handbook of exponential and related distributions for engineers and scientists

    CERN Document Server

    Pal, Nabendu; Lim, Wooi K

    2005-01-01

    The normal distribution is widely known and used by scientists and engineers. However, there are many cases when the normal distribution is not appropriate, due to the data being skewed. Rather than leaving you to search through journal articles, advanced theoretical monographs, or introductory texts for alternative distributions, the Handbook of Exponential and Related Distributions for Engineers and Scientists provides a concise, carefully selected presentation of the properties and principles of selected distributions that are most useful for application in the sciences and engineering. The book begins with all the basic mathematical and statistical background necessary to select the correct distribution to model real-world data sets. This includes inference, decision theory, and computational aspects including the popular Bootstrap method. The authors then examine four skewed distributions in detail: exponential, gamma, Weibull, and extreme value. For each one, they discuss general properties and applicabi...

  5. SMALL-SCALE AND GLOBAL DYNAMOS AND THE AREA AND FLUX DISTRIBUTIONS OF ACTIVE REGIONS, SUNSPOT GROUPS, AND SUNSPOTS: A MULTI-DATABASE STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.; Longcope, Dana W. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Senkpeil, Ryan R. [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Tlatov, Andrey G. [Kislovodsk Mountain Astronomical Station of the Pulkovo Observatory, Kislovodsk 357700 (Russian Federation); Nagovitsyn, Yury A. [Pulkovo Astronomical Observatory, Russian Academy of Sciences, St. Petersburg 196140 (Russian Federation); Pevtsov, Alexei A. [National Solar Observatory, Sunspot, NM 88349 (United States); Chapman, Gary A.; Cookson, Angela M. [San Fernando Observatory, Department of Physics and Astronomy, California State University Northridge, Northridge, CA 91330 (United States); Yeates, Anthony R. [Department of Mathematical Sciences, Durham University, South Road, Durham DH1 3LE (United Kingdom); Watson, Fraser T. [National Solar Observatory, Tucson, AZ 85719 (United States); Balmaceda, Laura A. [Institute for Astronomical, Terrestrial and Space Sciences (ICATE-CONICET), San Juan (Argentina); DeLuca, Edward E. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States); Martens, Petrus C. H., E-mail: munoz@solar.physics.montana.edu [Department of Physics and Astronomy, Georgia State University, Atlanta, GA 30303 (United States)

    2015-02-10

    In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions—where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10²¹ Mx (10²² Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).

  6. The stochastic distribution of available coefficient of friction on quarry tiles for human locomotion.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    The available coefficient of friction (ACOF) for human locomotion is the maximum coefficient of friction that can be supported without a slip at the shoe and floor interface. A statistical model was introduced to estimate the probability of slip by comparing the ACOF with the required coefficient of friction, assuming that both coefficients have stochastic distributions. This paper presents an investigation of the stochastic distributions of the ACOF of quarry tiles under dry, water and glycerol conditions. One hundred friction measurements were performed on a walkway under the surface conditions of dry, water and 45% glycerol concentration. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF appears to fit the normal and log-normal distributions better than the Weibull distribution for the water and glycerol conditions. However, no match was found between the distribution of ACOF under the dry condition and any of the three continuous distributions evaluated. Based on limited data, a normal distribution might be more appropriate due to its simplicity, practicality and familiarity among the three distributions evaluated.
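    The fitting-and-testing step described above can be sketched with scipy: fit each candidate distribution by maximum likelihood and compare Kolmogorov-Smirnov results. The friction values below are synthetic stand-ins for the study's 100 measurements, not the actual data.

    ```python
    # Sketch: comparing normal, log-normal and Weibull fits to a friction
    # sample with the Kolmogorov-Smirnov test. Synthetic data, not the
    # paper's measurements.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    acof = rng.normal(loc=0.85, scale=0.08, size=100)  # synthetic ACOF sample

    candidates = {
        "normal": stats.norm,
        "log-normal": stats.lognorm,
        "weibull": stats.weibull_min,
    }

    results = {}
    for name, dist in candidates.items():
        params = dist.fit(acof)                       # maximum likelihood fit
        d, p = stats.kstest(acof, dist.cdf, args=params)
        results[name] = (d, p)

    best = max(results, key=lambda k: results[k][1])  # highest K-S p-value
    for name, (d, p) in results.items():
        print(f"{name:10s} D={d:.3f} p={p:.3f}")
    print("best fit:", best)
    ```

    A higher K-S p-value means the sample is more consistent with the fitted model; in the study this comparison favored the normal and log-normal over the Weibull for the wet conditions.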

  7. Changes of Probability Distributions in Tsunami Heights with Fault Parameters

    Science.gov (United States)

    Kim, Kwan-Hyuck; Kwon, Hyun-Han; Park, Yong Sung; Cho, Yong-Sik

    2017-04-01

    This study explored changes in the probability distribution of tsunami heights along the eastern coastline of Korea for virtual earthquakes, and confirmed that the probability distribution of tsunami heights changes with the tsunami fault parameters. A statistical model was developed to jointly analyse tsunami heights over a variety of events by exploiting these functional relationships: the parameters of a Weibull distribution could be estimated from earthquake characteristics, all within a Bayesian regression framework. The proposed model can be effective and informative for estimating the tsunami risk from an earthquake of a given magnitude at a particular location. Indeed, the coefficients of determination between the true and estimated values of the Weibull distribution parameters were over 90% for both virtual and historical tsunamis. Keywords: Tsunami heights, Bayesian model, Regression analysis, Risk analysis. Acknowledgements: This research was supported by a grant from the Study on Solitary Wave Run-up for Hazard Mitigation of Coastal Communities against Sea Level Rise Project [No. 20140437] funded by the Korea Institute of Marine Science and Technology promotion.

  8. Optimal power flow for distribution networks with distributed generation

    Directory of Open Access Journals (Sweden)

    Radosavljević Jordan

    2015-01-01

    Full Text Available This paper presents a genetic algorithm (GA) based approach for the solution of the optimal power flow (OPF) in distribution networks with distributed generation (DG) units, including fuel cells, micro turbines, diesel generators, photovoltaic systems and wind turbines. The OPF is formulated as a nonlinear multi-objective optimization problem with equality and inequality constraints. Due to the stochastic nature of energy produced from renewable sources, i.e. wind turbines and photovoltaic systems, as well as load uncertainties, a probabilistic algorithm is introduced in the OPF analysis. The Weibull and normal distributions are employed to model the input random variables, namely the wind speed, solar irradiance and load power. The 2m+1 point estimate method and the Gram-Charlier expansion theory are used to obtain the statistical moments and the probability density functions (PDFs) of the OPF results. The proposed approach is examined and tested on a modified IEEE 34 node test feeder with five different integrated DG units. The obtained results prove the efficiency of the proposed approach to solve both deterministic and probabilistic OPF problems for different forms of the multi-objective function. As such, it can serve as a useful decision-making supporting tool for distribution network operators. [Projekat Ministarstva nauke Republike Srbije, br. TR33046]
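    The probabilistic input modelling described above (Weibull wind speed, normally distributed load) can be sketched as a simple Monte Carlo draw; all parameter values and the piecewise power curve below are illustrative assumptions, not taken from the paper.

    ```python
    # Sketch of the probabilistic input step of a probabilistic OPF: wind
    # speed drawn from a Weibull distribution, load from a normal
    # distribution, and turbine power from a simple piecewise curve.
    # Parameter values are assumptions for illustration only.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    k, c = 2.0, 8.0                       # assumed Weibull shape / scale (m/s)
    wind = c * rng.weibull(k, size=n)     # wind speed samples

    load = rng.normal(loc=5.0, scale=0.5, size=n)  # assumed load (MW)

    def turbine_power(v, p_rated=2.0, v_in=3.0, v_rated=12.0, v_out=25.0):
        """Piecewise linear power curve (MW); a common simplification."""
        p = np.where((v >= v_in) & (v < v_rated),
                     p_rated * (v - v_in) / (v_rated - v_in), 0.0)
        p = np.where((v >= v_rated) & (v <= v_out), p_rated, p)
        return p

    p_wind = turbine_power(wind)
    print(f"mean wind power {p_wind.mean():.2f} MW, mean load {load.mean():.2f} MW")
    ```

    The sampled inputs would then feed the power flow solver; the paper itself uses the 2m+1 point estimate method rather than full Monte Carlo to keep the number of power flow evaluations small.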

  9. An all-timescales rainfall probability distribution

    Science.gov (United States)

    Papalexiou, S. M.; Koutsoyiannis, D.

    2009-04-01

    The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance as typically the hydraulic design strongly depends on the rainfall model choice. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable to adequately describe the rainfall in all timescales, is a difficult task. A search in hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, three- and four-parameter Kappa distribution, and many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions, and is capable of describing rainfall in a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution in various rainfall samples from different areas and for timescales varying from sub-hourly to annual.

  10. Determining the distribution of fitness effects using a generalized Beta-Burr distribution.

    Science.gov (United States)

    Joyce, Paul; Abdo, Zaid

    2017-07-12

    In Beisel et al. (2007), a likelihood framework, based on extreme value theory (EVT), was developed for determining the distribution of fitness effects for adaptive mutations. In this paper we extend this framework beyond the extreme distributions and develop a likelihood framework for testing whether or not extreme value theory applies. By making two simple adjustments to the Generalized Pareto Distribution (GPD) we introduce a new simple five parameter probability density function that incorporates nearly every common (continuous) probability model ever used. This means that all of the common models are nested. This has important implications in model selection beyond determining the distribution of fitness effects. However, we demonstrate the use of this distribution utilizing likelihood ratio testing to evaluate alternative distributions to the Gumbel and Weibull domains of attraction of fitness effects. We use a bootstrap strategy, utilizing importance sampling, to determine where in the parameter space, and at what sample size, the test is most powerful in detecting deviations from these domains, with focus on small sample sizes (n<20). Our results indicate that the likelihood ratio test is most powerful in detecting deviation from the Gumbel domain when the shape parameters of the model are small, while the test is more powerful in detecting deviations from the Weibull domain when these parameters are large. As expected, an increase in sample size improves the power of the test. This improvement is observed to occur quickly with sample size n≥10 in tests related to the Gumbel domain and n≥15 in the case of the Weibull domain. This manuscript is in tribute to the contributions of Dr. Paul Joyce to the areas of Population Genetics, Probability Theory and Mathematical Statistics. A Tribute section is provided at the end that includes Paul's original writing in the first iterations of this manuscript. The Introduction and Alternatives to the GPD sections
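    The nested-model likelihood ratio testing described above can be illustrated with a simpler, standard instance of the same idea: testing the Gumbel model (GEV with shape 0) against the full GEV. This is an analogy to the paper's GPD-based test, not its actual method, and the data are synthetic.

    ```python
    # Sketch: a likelihood-ratio test of the Gumbel model (GEV shape = 0)
    # against the full GEV, illustrating nested-model testing of domains
    # of attraction. Synthetic data; not the paper's GPD framework.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = stats.genextreme.rvs(c=-0.3, size=200, random_state=rng)  # heavy-tailed maxima

    # Full GEV fit (3 free parameters)
    c_hat, loc_hat, scale_hat = stats.genextreme.fit(x)
    ll_gev = np.sum(stats.genextreme.logpdf(x, c_hat, loc_hat, scale_hat))

    # Gumbel fit (shape fixed at 0, 2 free parameters)
    loc_g, scale_g = stats.gumbel_r.fit(x)
    ll_gum = np.sum(stats.gumbel_r.logpdf(x, loc_g, scale_g))

    lrt = 2 * (ll_gev - ll_gum)            # ~ chi-square with 1 df under H0
    p_value = stats.chi2.sf(lrt, df=1)
    print(f"LRT = {lrt:.2f}, p = {p_value:.4f}")
    ```

    A small p-value rejects the restricted (Gumbel) model in favour of the richer family, which is the same logic the paper applies with its five-parameter generalization of the GPD.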

  11. Stand diameter distribution modelling and prediction based on Richards function.

    Directory of Open Access Journals (Sweden)

    Ai-guo Duan

    Full Text Available The objective of this study was to introduce the application of the Richards equation to the modelling and prediction of stand diameter distribution. The long-term repeated measurement data sets, consisting of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in southern China, were used: 150 stands served as fitting data and the other 159 stands were used for testing. The nonlinear regression method (NRM) or the maximum likelihood estimation method (MLEM) was applied to estimate the parameters of the models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) the R distribution presented a more accurate simulation than the three-parameter Weibull function; (2) the parameters p, q and r of the R distribution proved to be its scale, location and shape parameters, and have a deep relationship with stand characteristics, which means the parameters of the R distribution have a good theoretical interpretation; (3) the ordinate of the inflection point of the R distribution has significant relativity with its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed that diameter distributions of unknown stands can be well estimated by applying the R distribution based on PRM, or the combination of PPM and PRM, under the condition that only quadratic mean DBH or additionally stand age is known; the non-rejection rates were near 80%, which is higher than the 72.33% non-rejection rate of the three-parameter Weibull function based on the combination of PPM and PRM.

  12. On Six-Parameter Frechet Distribution: Properties and Applications

    Directory of Open Access Journals (Sweden)

    Haitham M. Yousof

    2016-06-01

    Full Text Available This paper introduces a new generalization of the transmuted Marshall-Olkin Fréchet distribution of Afify et al. (2015), using the Kumaraswamy generalized family. The new model is referred to as the Kumaraswamy transmuted Marshall-Olkin Fréchet distribution. This model contains sixty-two sub-models as special cases, such as the Kumaraswamy transmuted Fréchet, Kumaraswamy transmuted Marshall-Olkin, generalized inverse Weibull and Kumaraswamy Gumbel type II distributions, among others. Various mathematical properties of the proposed distribution, including closed forms for ordinary and incomplete moments, quantile and generating functions and Rényi and -entropies, are derived. The unknown parameters of the new distribution are estimated using maximum likelihood estimation. We illustrate the importance of the new model by means of two applications to real data sets.

  13. Modeling of speed distribution for mixed bicycle traffic flow

    Directory of Open Access Journals (Sweden)

    Cheng Xu

    2015-11-01

    Full Text Available Speed is a fundamental measure of traffic performance for highway systems. While the speed characteristics of motorized vehicles have been studied extensively, in this article we studied the speed distribution for mixed bicycle traffic, which has largely been ignored in the past. Field speed data were collected from Hangzhou, China, under different survey sites, traffic conditions, and percentages of electric bicycles. The statistics of the field data show that the total mean speed of electric bicycles is 17.09 km/h, 3.63 km/h faster and 27.0% higher than that of regular bicycles. Normal, log-normal, gamma, and Weibull distribution models were used for testing the speed data. The results of goodness-of-fit hypothesis tests imply that the log-normal and Weibull models fit the field data very well. Relationships between mean speed and electric bicycle proportion were then proposed using linear regression models, from which the mean speed for purely electric or purely regular bicycle traffic can be obtained. The findings of this article will provide effective help for the safety and traffic management of mixed bicycle traffic.
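    The article's regression step, mean speed against electric-bicycle proportion, can be sketched as a one-line least-squares fit; the data points below are synthetic stand-ins for the field observations, and extrapolating to proportions 0 and 1 gives the pure regular- and electric-bicycle speeds.

    ```python
    # Sketch: linear regression of mean traffic speed on the proportion of
    # electric bicycles, then extrapolation to p = 0 (all regular) and
    # p = 1 (all electric). Synthetic site data, not the Hangzhou sample.
    import numpy as np

    p = np.array([0.2, 0.35, 0.5, 0.6, 0.75])     # e-bike proportion per site
    v = np.array([14.2, 14.8, 15.3, 15.7, 16.3])  # mean speed (km/h) per site

    slope, intercept = np.polyfit(p, v, 1)        # least-squares line
    v_regular = intercept                         # predicted mean speed at p = 0
    v_electric = intercept + slope                # predicted mean speed at p = 1
    print(f"regular: {v_regular:.1f} km/h, electric: {v_electric:.1f} km/h")
    ```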

  14. Generalized Extreme Value Distribution Models for the Assessment of Seasonal Wind Energy Potential of Debuncha, Cameroon

    Directory of Open Access Journals (Sweden)

    Nkongho Ayuketang Arreyndip

    2016-01-01

    Full Text Available The method of the generalized extreme value family of distributions (Weibull, Gumbel, and Frechet) is employed for the first time to assess the wind energy potential of Debuncha, South-West Cameroon, and to study the variation of energy over the seasons at this site. The 29-year (1983–2013) average daily wind speed data over Debuncha (excluding the years 1992 and 1994 due to missing values) were obtained from NASA satellite data through the RETScreen software tool provided by CANMET Canada. The data were partitioned into min-monthly, mean-monthly, and max-monthly series and fitted using the maximum likelihood method to the two-parameter Weibull, Gumbel, and Frechet distributions in order to determine the best fit for assessing the wind energy potential at this site. The respective shape and scale parameters were estimated. Based on the P values of the Kolmogorov-Smirnov (K-S) statistic and standard error (s.e.) analysis, the results show that the Frechet distribution best fits the min-monthly, mean-monthly, and max-monthly data compared to the Weibull and Gumbel distributions. Wind speed distributions and wind power densities of the wet and dry seasons are compared. The results show that the wind power density in the wet season was higher than in the dry season. The wind speeds at this site are quite low; maximum wind speeds lie between 3.1 and 4.2 m/s, which is below the cut-in wind speed of many modern turbines (6–10 m/s). We therefore recommend the installation of low cut-in wind turbines such as the Savonius or Aircon (10 kW) for stand-alone low energy needs.
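    The model-comparison step above, maximum likelihood fits of Weibull, Gumbel and Frechet followed by a K-S check, can be sketched as below. The wind speeds are synthetic, not the Debuncha series, and scipy's `invweibull` is used as the Frechet model.

    ```python
    # Sketch: MLE fits of two-parameter Weibull, Gumbel and Frechet models
    # to a monthly wind-speed sample, compared by K-S p-value.
    # Synthetic speeds; not the Debuncha data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    speeds = 3.5 * rng.weibull(2.2, size=120) + 0.1  # synthetic monthly means (m/s)

    fits = {
        "weibull": (stats.weibull_min, stats.weibull_min.fit(speeds, floc=0)),
        "gumbel": (stats.gumbel_r, stats.gumbel_r.fit(speeds)),
        "frechet": (stats.invweibull, stats.invweibull.fit(speeds, floc=0)),
    }

    pvals = {}
    for name, (dist, params) in fits.items():
        d, p = stats.kstest(speeds, dist.cdf, args=params)
        pvals[name] = p
        print(f"{name:8s} K-S p = {p:.3f}")

    print("best fit:", max(pvals, key=pvals.get))
    ```

    Fixing `floc=0` gives the two-parameter (shape/scale) versions of the Weibull and Frechet families, matching the two-parameter fits used in the study.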

  15. Use of the Pearson type V, Weibull and hyperbolic functions for modelling diameter distributions

    Directory of Open Access Journals (Sweden)

    Daniel Henrique Breda Binoti

    2013-09-01

    Full Text Available The objective of this study was to evaluate the efficiency of the log-Pearson type V function for describing the diameter structure of even-aged eucalyptus stands, and to propose a diameter distribution model based on this function. The modelling performed with the log-Pearson type V function was compared with that performed with the Weibull and hyperbolic functions, using data from permanent eucalyptus plots located in the midwest region of the state of Minas Gerais, Brazil. The Pearson type V function was tested in three different configurations: with three parameters, with two parameters, and with the location parameter replaced by the minimum diameter of the plot. The adherence of the functions to the data was verified by the Kolmogorov-Smirnov (K-S) test. All fits showed adherence to the data under the K-S test. The Weibull and hyperbolic functions outperformed the Pearson type V function.

  16. Landscape-structure metrics influence on fire size distribution in Portugal

    Science.gov (United States)

    Gonçalves, N. J.; Pereira, M. G.; Fernandes, P.; Loureiro, C.; DaCamara, C. C.; Calado, M. T.

    2012-04-01

    The spatial patterns of fire frequency are a proxy for the landscape-level mosaic of vegetation composition, structure and loading that is expected to affect wildfire spread and growth. This work aims to assess the role of landscape heterogeneity on fire size distribution in Portugal. The dataset used includes 2,200 fire records with size greater than or equal to 100 ha registered from 1998 to 2008. We used the Portuguese Forest Service digital fire atlas and ArcGis to calculate previous fire recurrence (number of times burned since 1975) for the patches within the fire perimeter, whereby each patch is unique in its fire history in relation to the adjacent patches. Patch Analyst was used to compute the landscape metrics for each fire, which express landscape structure in terms of the composition, complexity and diversity of fire recurrence. Several distribution functions were tested to fit the positively skewed fire size samples with burnt area values above an increasing lower threshold. Estimates of the distribution parameters were obtained based on the maximum likelihood method while the ability of each function to fit the empirical distribution was assessed with standard goodness of fit statistical tests (e.g., Kolmogorov-Smirnov, Cramér-von Mises and Anderson-Darling) as well as with probability- or quantile-plots. Results indicate that the Weibull and its truncated version allow an adjustment for all records in the database, and that the truncated Weibull provides the best fit (with the highest p-value). For higher values of the lower threshold (>150 ha), other functions (e.g., gamma) provide a good fit to the data but only for fires larger than 350 ha (n=726) the truncated Weibull is not the best statistical model. Then, several landscape metrics of fire recurrence (e.g., mean, dominant, relative patch richness, area weighted mean shape Index, mean perimeter-area ratio, mean patch fractal dimension, edge density, median patch size, patch density) with expected
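    Fitting a truncated distribution to fire sizes above a lower threshold, as done above, amounts to maximizing a conditional likelihood sum log[f(x) / (1 - F(u))]. A minimal sketch for a left-truncated Weibull, on synthetic fire sizes rather than the Portuguese atlas data:

    ```python
    # Sketch: MLE of a left-truncated Weibull for fire sizes >= u, using
    # the conditional log-likelihood log f(x) - log(1 - F(u)).
    # Synthetic sizes; parameters are illustrative assumptions.
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(3)
    u = 100.0                                       # lower threshold (ha)
    sizes = stats.weibull_min.rvs(0.7, scale=60.0, size=5000, random_state=rng)
    sizes = sizes[sizes >= u]                       # only fires >= 100 ha recorded

    def nll(theta):
        shape, scale = theta
        if shape <= 0 or scale <= 0:
            return np.inf
        logf = stats.weibull_min.logpdf(sizes, shape, scale=scale)
        logtail = stats.weibull_min.logsf(u, shape, scale=scale)  # log(1 - F(u))
        return -(logf - logtail).sum()

    res = optimize.minimize(nll, x0=[1.0, 100.0], method="Nelder-Mead")
    shape_hat, scale_hat = res.x
    print(f"shape={shape_hat:.2f} scale={scale_hat:.1f}")
    ```

    Refitting while raising `u` reproduces the paper's procedure of testing how the best-fitting family changes with the lower threshold.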

  17. Robust D-optimal designs under correlated error, applicable invariantly for some lifetime distributions

    International Nuclear Information System (INIS)

    Das, Rabindra Nath; Kim, Jinseog; Park, Jeong-Soo

    2015-01-01

    In quality engineering, the most commonly used lifetime distributions are log-normal, exponential, gamma and Weibull. Experimental designs are useful for predicting the optimal operating conditions of the process in lifetime improvement experiments. In the present article, invariant robust first-order D-optimal designs are derived for correlated lifetime responses having the above four distributions. Robust designs are developed for some correlated error structures. It is shown that robust first-order D-optimal designs for these lifetime distributions are always robust rotatable but the converse is not true. Moreover, it is observed that these designs depend on the respective error covariance structure but are invariant to the above four lifetime distributions. This article generalizes the results of Das and Lin [7] for the above four lifetime distributions with general (intra-class, inter-class, compound symmetry, and tri-diagonal) correlated error structures. - Highlights: • This paper presents invariant robust first-order D-optimal designs under correlated lifetime responses. • The results of Das and Lin [7] are extended for the four lifetime (log-normal, exponential, gamma and Weibull) distributions. • This paper also generalizes the results of Das and Lin [7] to more general correlated error structures

  18. Extreme value theory (EVT) application on estimating the distribution of maxima

    Science.gov (United States)

    Ramadhani, F. A.; Nurrohmah, S.; Novita, M.

    2017-07-01

    Extreme Value Theory (EVT) has emerged as one of the most important statistical theories for the applied sciences. EVT provides a firm theoretical foundation for building a statistical model describing extreme events. The feature that distinguishes extreme value analysis from other statistical analyses is the ability to quantify the behavior of unusually large values even when those values are scarce. One of the key results from EVT is the ability to estimate the distribution of the maximum value, usually called the maxima, using an asymptotic argument. In order to build such models, the Fisher-Tippett theorem, which specifies the form of the limit distribution for transformed maxima, is greatly used. Furthermore, it can be shown that there are only three families of possible limit laws for the distribution of maxima: the Gumbel, Frechet, and Weibull distributions. These three families can be expressed in a single distribution function called the generalized extreme value (GEV) distribution.
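    The block-maxima recipe sketched above (split the data into blocks, take each block's maximum, fit the GEV) looks like this in scipy on a synthetic daily series; note that scipy's shape parameter `c` is the negative of the usual EVT shape ξ.

    ```python
    # Sketch: block-maxima estimation with the GEV, which unifies the
    # Gumbel, Frechet and Weibull limit laws. Synthetic daily data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    data = rng.exponential(scale=1.0, size=365 * 30)  # synthetic daily series
    maxima = data.reshape(30, 365).max(axis=1)        # 30 "annual" maxima

    shape, loc, scale = stats.genextreme.fit(maxima)
    # Exponential parents lie in the Gumbel domain, so shape should be near 0
    print(f"GEV shape={shape:.2f} loc={loc:.2f} scale={scale:.2f}")

    # 100-block return level: the value exceeded once per 100 blocks on average
    rl100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc, scale)
    print(f"100-block return level: {rl100:.2f}")
    ```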

  19. Cell-size distribution and scaling in a one-dimensional Kolmogorov-Johnson-Mehl-Avrami lattice model with continuous nucleation

    Science.gov (United States)

    Néda, Zoltán; Járai-Szabó, Ferenc; Boda, Szilárd

    2017-10-01

    The Kolmogorov-Johnson-Mehl-Avrami (KJMA) growth model is considered on a one-dimensional (1D) lattice. Cells can grow with constant speed and continuously nucleate on the empty sites. We offer an alternative mean-field-like approach for describing theoretically the dynamics and derive an analytical cell-size distribution function. Our method reproduces the same scaling laws as the KJMA theory and has the advantage that it leads to a simple closed form for the cell-size distribution function. It is shown that a Weibull distribution is appropriate for describing the final cell-size distribution. The results are discussed in comparison with Monte Carlo simulation data.

  20. Fitting Statistical Distributions Functions on Ozone Concentration Data at Coastal Areas

    International Nuclear Information System (INIS)

    Muhammad Yazid Nasir; Nurul Adyani Ghazali; Muhammad Izwan Zariq Mokhtar; Norhazlina Suhaimi

    2016-01-01

    Ozone is known as one of the pollutants that contribute to the air pollution problem. Therefore, it is important to carry out studies on ozone. The objective of this study is to find the best statistical distribution for ozone concentration. Three distributions, namely the Inverse Gaussian, Weibull and Lognormal, were chosen to fit one year of hourly average ozone concentration data from 2010 at Port Dickson and Port Klang. The maximum likelihood estimation (MLE) method was used to estimate the parameters needed to develop the probability density function (PDF) and cumulative density function (CDF) graphs. Three performance indicators (PI), namely the normalized absolute error (NAE), prediction accuracy (PA), and coefficient of determination (R²), were used to determine the goodness-of-fit of the distributions. Results show that the Weibull distribution is the best distribution, with the smallest error measure (NAE) values of 0.08 at Port Klang and 0.31 at Port Dickson. The best score for the highest adequacy measure (PA: 0.99) came with R² values of 0.98 (Port Klang) and 0.99 (Port Dickson). These results provide useful information to local authorities for prediction purposes. (author)
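    The performance indicators named above can be computed in a few lines. The abstract does not give formulas, so the definitions below are common assumptions from the air-quality literature: NAE as the sum of absolute errors normalized by the sum of observations, R² as the squared correlation, and PA taken here as the correlation coefficient itself.

    ```python
    # Sketch of the three performance indicators under assumed, commonly
    # used definitions (the record does not state the formulas).
    # Observed/predicted values are synthetic stand-ins.
    import numpy as np

    obs = np.array([12.0, 15.0, 9.0, 20.0, 18.0, 11.0])    # observed, synthetic
    pred = np.array([11.5, 14.0, 10.0, 21.0, 17.0, 12.0])  # predicted, synthetic

    nae = np.abs(pred - obs).sum() / obs.sum()   # normalized absolute error
    r = np.corrcoef(pred, obs)[0, 1]             # correlation coefficient
    r2 = r ** 2                                  # coefficient of determination
    pa = r                                       # prediction accuracy (assumed = r)

    print(f"NAE={nae:.3f} PA={pa:.3f} R2={r2:.3f}")
    ```

    Smaller NAE and larger PA/R² indicate a better-fitting distribution, which is how the Weibull was selected in the study.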

  1. Statistical Evidence for the Preference of Frailty Distributions with Regularly-Varying-at-Zero Densities

    DEFF Research Database (Denmark)

    Missov, Trifon I.; Schöley, Jonas

    Missov and Finkelstein (2011) prove an Abelian and its corresponding Tauberian theorem regarding distributions for modeling unobserved heterogeneity in fixed-frailty mixture models. The main property of such distributions is the regular variation at zero of their densities. According to this criterion, admissible distributions are, for example, the gamma, the beta, the truncated normal, the log-logistic and the Weibull, while distributions like the log-normal and the inverse Gaussian do not satisfy this condition. In this article we show that models with admissible frailty distributions and a Gompertz baseline provide a better fit to adult human mortality data than the corresponding models with non-admissible frailty distributions. We implement estimation procedures for mixture models with a Gompertz baseline and frailty that follows a gamma, truncated normal, log-normal, or inverse Gaussian distribution.

  2. Estimation of the parameters of the Weibull probability density function by nested regression in a thinned stand of Pinus taeda L.

    Directory of Open Access Journals (Sweden)

    Paulo Renato Schneider

    2008-01-01

    Full Text Available This work was carried out with the objective of predicting the parameters of the probability density function for thinned stands of Pinus taeda in southern Brazil, in order to prognose the frequencies per unit area by diameter class. The parameters of the Weibull function were estimated through nested regressions with independent variables that express the density of the population. The fits of the probability density function to the frequencies per diameter class in the first and second thinnings and at the final cut showed excellent statistical precision, with good estimates of values and population density per diameter class. The prognosed values of the probability density per diameter class, for different densities and ages of the population, also showed excellent precision, with values close to those observed.

  3. Distribution analysis of airborne nicotine concentrations in hospitality facilities.

    Science.gov (United States)

    Schorp, Matthias K; Leyden, Donald E

    2002-02-01

    A number of publications report statistical summaries for environmental tobacco smoke (ETS) concentrations. Despite compelling evidence for the data not being normally distributed, these publications typically report the arithmetic mean and standard deviation of the data, thereby losing important information related to the distribution of values contained in the original data. We were interested in the frequency distributions of reported nicotine concentrations in hospitality environments and subjected available data to distribution analyses. The distribution of experimental indoor airborne nicotine concentration data taken from hospitality facilities worldwide was fit to lognormal, Weibull, exponential, Pearson (Type V), logistic, and loglogistic distribution models. Comparison of goodness of fit (GOF) parameters and indications from the literature verified the selection of a lognormal distribution as the overall best model. When individual data were not reported in the literature, statistical summaries of results were used to model sets of lognormally distributed data that are intended to mimic the original data distribution. Grouping the data into various categories led to 31 frequency distributions that were further interpreted. The median values in nonsmoking environments are about half of the median values in smoking sections. When different continents are compared, Asian, European, and North American median values in restaurants are about a factor of three below levels encountered in other hospitality facilities. On a comparison of nicotine concentrations in North American smoking sections and nonsmoking sections, median values are about one-third of the European levels. The results obtained may be used to address issues related to exposure to ETS in the hospitality sector.
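    Fitting the lognormal model selected above, and reading off the median rather than the arithmetic mean, can be sketched as follows; the concentrations are synthetic, not the hospitality-sector measurements, and the median of a lognormal with zero location equals its scale parameter exp(mu).

    ```python
    # Sketch: lognormal fit to airborne nicotine concentrations and the
    # median summary the paper compares across venue categories.
    # Synthetic concentrations; units assumed ug/m3.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    conc = rng.lognormal(mean=1.0, sigma=0.8, size=200)  # synthetic sample

    sigma_hat, loc_hat, scale_hat = stats.lognorm.fit(conc, floc=0)
    median = scale_hat                   # scale = exp(mu) when loc = 0
    print(f"fitted mu={np.log(scale_hat):.2f} sigma={sigma_hat:.2f} "
          f"median={median:.2f}")
    ```

    Reporting the median (and the fitted sigma) preserves the distributional information that the paper argues is lost when only arithmetic means and standard deviations are published.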

  4. Fitting diameter distribution models to data from forest inventories with concentric plot design

    Energy Technology Data Exchange (ETDEWEB)

    Nanos, N.; Sjöstedt de Luna, S.

    2017-11-01

    Aim: Several national forest inventories use a complex plot design based on multiple concentric subplots where smaller diameter trees are inventoried when lying in the smaller-radius subplots and ignored otherwise. Data from these plots are truncated with threshold (truncation) diameters varying according to the distance from the plot centre. In this paper we designed a maximum likelihood method to fit the Weibull diameter distribution to data from concentric plots. Material and methods: Our method (M1) was based on multiple truncated probability density functions to build the likelihood. In addition, we used an alternative method (M2) presented recently. We used methods M1 and M2 as well as two other reference methods to estimate the Weibull parameters in 40000 simulated plots. The spatial tree pattern of the simulated plots was generated using four models of spatial point patterns. Two error indices were used to assess the relative performance of M1 and M2 in estimating relevant stand-level variables. In addition, we estimated the Quadratic Mean plot Diameter (QMD) using Expansion Factors (EFs). Main results: Methods M1 and M2 produced comparable estimation errors in random and cluster tree spatial patterns. Method M2 produced biased parameter estimates in plots with inhomogeneous Poisson patterns. Estimation of QMD using EFs produced biased results in plots within inhomogeneous intensity Poisson patterns. Research highlights: We designed a new method to fit the Weibull distribution to forest inventory data from concentric plots that achieves high accuracy and precision in parameter estimates regardless of the within-plot spatial tree pattern.
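    The multiple-truncation idea behind method M1 can be sketched as a likelihood in which each recorded tree contributes a Weibull density truncated at the threshold diameter of the subplot where it was recorded, f(d) / (1 - F(u_i)). The data, thresholds and parameter values below are assumptions for illustration, not the paper's simulation design.

    ```python
    # Sketch of a multiple-truncation Weibull likelihood for concentric
    # plot data: each tree is truncated at its own subplot threshold.
    # Synthetic data; thresholds and parameters are illustrative.
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(9)
    true_shape, true_scale = 2.5, 25.0
    thresholds = np.array([7.5, 12.5, 22.5])  # truncation diameters (cm), assumed

    # Simulate recorded trees; each tree gets the threshold of its subplot
    d_all = stats.weibull_min.rvs(true_shape, scale=true_scale, size=3000,
                                  random_state=rng)
    u = rng.choice(thresholds, size=d_all.size)
    d, u = d_all[d_all >= u], u[d_all >= u]   # keep only recorded trees

    def nll(theta):
        shape, scale = theta
        if shape <= 0 or scale <= 0:
            return np.inf
        logf = stats.weibull_min.logpdf(d, shape, scale=scale)
        logtail = stats.weibull_min.logsf(u, shape, scale=scale)
        return -(logf - logtail).sum()     # sum of truncated log-densities

    res = optimize.minimize(nll, x0=[1.5, 20.0], method="Nelder-Mead")
    print("shape, scale:", res.x)
    ```

    Because every tree is weighted by the survival function at its own threshold, the estimator remains unbiased however the thresholds vary across subplots, which is the property the paper's simulations verify.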

  5. Modelling Wind for Wind Farm Layout Optimization Using Joint Distribution of Wind Speed and Wind Direction

    DEFF Research Database (Denmark)

    Feng, Ju; Shen, Wen Zhong

    2015-01-01

    Reliable wind modelling is of crucial importance for wind farm development. The common practice of using sector-wise Weibull distributions has been found inappropriate for wind farm layout optimization. In this study, we propose a simple and easily implementable method to construct joint distributions of wind speed and wind direction, which fit the data quite well in terms of the coefficient of determination R². The best of these joint distributions is then used in the layout optimization of the Horns Rev 1 wind farm, and the choice of bin sizes for wind speed and wind direction is also investigated. It is found that the choice of bin size for wind direction is especially critical for layout optimization, and a recommended choice of bin sizes for wind speed and wind direction is finally presented.

  6. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    Energy Technology Data Exchange (ETDEWEB)

    Caleyo, F. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)], E-mail: fcaleyo@gmail.com; Velazquez, J.C. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico); Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400, La Habana (Cuba); Hallen, J.M. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)

    2009-09-15

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.

  7. The stochastic distribution of available coefficient of friction for human locomotion of five different floor surfaces.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2014-05-01

    The maximum coefficient of friction that can be supported at the shoe-floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine whether the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions matched the normal and log-normal distributions slightly better than the Weibull, with statistical significance in only three of the 15 cases. The results are far more complex than what had heretofore been published, and different scenarios could emerge. Since the ACOF is compared with the RCOF to estimate slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons, based on their skewness and kurtosis values, without statistical significance. No representation could be found in three of the 15 cases. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
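The screening step described above can be sketched with SciPy's Kolmogorov-Smirnov test against fitted candidates. The data below are synthetic stand-ins for one floor/condition (not the study's measurements), and note that using fitted parameters makes the classical KS p-value optimistic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical ACOF sample for one floor surface under one condition
# (the study took 100 measurements per condition)
acof = rng.lognormal(mean=np.log(0.45), sigma=0.15, size=100)

results = {}
for name, dist in [("norm", stats.norm),
                   ("lognorm", stats.lognorm),
                   ("weibull_min", stats.weibull_min)]:
    params = dist.fit(acof)                       # MLE fit of each candidate
    d_stat, p_val = stats.kstest(acof, name, args=params)
    results[name] = (d_stat, p_val)               # smaller D = closer fit
```

A proper test with estimated parameters would use a Lilliefors-style correction or a parametric bootstrap; the plain KS statistic still serves to rank the candidates.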

  8. Multi-objective and Perishable Fuzzy Inventory Models Having Weibull Life-time With Time Dependent Demand, Demand Dependent Production and Time Varying Holding Cost: A Possibility/Necessity Approach

    Science.gov (United States)

    Pathak, Savita; Mondal, Seema Sarkar

    2010-10-01

    A multi-objective inventory model of a deteriorating item has been developed with a Weibull rate of decay, time-dependent demand, demand-dependent production and time-varying holding cost, allowing shortages, in fuzzy environments for non-integrated and integrated businesses. The objective is to maximize the profit from different deteriorating items under a space constraint. The impreciseness of inventory parameters and goals for the non-integrated business has been expressed by linear membership functions. The compromise solutions are obtained by different fuzzy optimization methods. To incorporate the relative importance of the objectives, different crisp/fuzzy cardinal weights have been assigned. The models are illustrated with numerical examples, and the results of models with crisp and fuzzy weights are compared. The result for the integrated business model is obtained using the Generalized Reduced Gradient (GRG) method. The fuzzy integrated model with imprecise inventory cost is formulated to optimize the possibility/necessity measure of the fuzzy goal of the objective function, using a credibility measure of the fuzzy event via fuzzy expectation. The results of the crisp/fuzzy integrated model are illustrated with numerical examples and compared.

  9. Multi-choice stochastic transportation problem involving general form of distributions.

    Science.gov (United States)

    Quddoos, Abdul; Ull Hasan, Md Gulzar; Khalid, Mohammad Masood

    2014-01-01

    Many authors have presented studies of the multi-choice stochastic transportation problem (MCSTP) where availability and demand parameters follow a particular probability distribution (such as exponential, Weibull, Cauchy or extreme value). In this paper an MCSTP is considered where availability and demand parameters follow a general form of distribution, and a generalized equivalent deterministic model (GMCSTP) of the MCSTP is obtained. It is also shown that all previous models obtained by different authors can be deduced with the help of the GMCSTP. MCSTPs with Pareto, power function or Burr-XII distributions are also considered and equivalent deterministic models are obtained. To illustrate the proposed model, two numerical examples are presented and solved using the LINGO 13.0 software package.

  10. A distributed delay approach for modeling delayed outcomes in pharmacokinetics and pharmacodynamics studies.

    Science.gov (United States)

    Hu, Shuhua; Dunlavey, Michael; Guzy, Serge; Teuscher, Nathan

    2018-04-01

    A distributed delay approach was proposed in this paper to model delayed outcomes in pharmacokinetics and pharmacodynamics studies. This approach was shown to be general enough to incorporate a wide array of pharmacokinetic and pharmacodynamic models as special cases including transit compartment models, effect compartment models, typical absorption models (either zero-order or first-order absorption), and a number of atypical (or irregular) absorption models (e.g., parallel first-order, mixed first-order and zero-order, inverse Gaussian, and Weibull absorption models). Real-life examples were given to demonstrate how to implement distributed delays in Phoenix ® NLME™ 8.0, and to numerically show the advantages of the distributed delay approach over the traditional methods.

  11. The Research and Analysis on Failure Distribution Model of Diesel Engine Component Parts

    Directory of Open Access Journals (Sweden)

    Wang Shaokun

    2016-01-01

    Reliability research not only provides direction for quality improvement in new product research and development, but also yields the product failure distribution model, enabling the logistics organization to improve its service level. Using the reliability data from the research group's database, we analyzed the failure distribution of a high-frequency-failure component (the fuel injection pump) and built a Weibull model. The parameters of the model are estimated by the uniform linear method and the least squares method. After solving the model, the density function curves and the failure rate curves are drawn, and a model for forecasting spare-parts demand based on the failure distribution is obtained.
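Least-squares estimation of Weibull parameters is typically done by linearizing the CDF (median-rank regression). A sketch with made-up failure times, not the study's data:

```python
import numpy as np

# Hypothetical times-to-failure (hours) for a pump component
t = np.sort(np.array([120., 180., 260., 350., 470., 610., 790., 1020.]))
n = t.size

# Median-rank (Bernard's approximation) empirical CDF
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Linearized Weibull CDF: ln(-ln(1-F)) = k*ln(t) - k*ln(lambda),
# so a straight-line fit gives shape k (slope) and scale lambda
x = np.log(t)
y = np.log(-np.log(1.0 - F))
k, intercept = np.polyfit(x, y, 1)
lam = np.exp(-intercept / k)
```

The slope being greater or less than 1 distinguishes wear-out from infant-mortality failure regimes, which is what makes this plot useful beyond parameter estimation.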

  12. Fracture strength and Weibull analysis of Ba{sub 0.5}Sr{sub 0.5}Co{sub 0.8}Fe{sub 0.2}O{sub 3−δ} oxygen transport membranes evaluated by biaxial and uniaxial bending tests

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li, E-mail: li.wang-5@postgrad.manchester.ac.uk [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Technology and Engineering Centre for Space Utilisation, Chinese Academy of Science, Beijing 10094 (China); Dou, Rui; Wang, Gong [Technology and Engineering Centre for Space Utilisation, Chinese Academy of Science, Beijing 10094 (China); Li, Yizhe; Bai, Mingwen; Hall, David [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Chen, Ying, E-mail: ying.chen-2@manchester.ac.uk [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom)

    2016-07-18

    The present study evaluates the fracture strengths and the Weibull modulus of Ba{sub 0.5}Sr{sub 0.5}Co{sub 0.8}Fe{sub 0.2}O{sub 3−δ} (BSCF) oxygen transport membranes by means of biaxial and uniaxial bending tests at both room temperature (RT) and 800 °C. The fracture strengths obtained from the biaxial bending tests are much lower than those obtained from the uniaxial bending tests while Weibull moduli (m) are similar. By utilising Weibull statistics the uniaxial strengths can be predicted from the biaxial values at both RT and 800 °C. Fracture surfaces at both RT and 800 °C show only a transgranular fracture mode. Failure origins are also determined by scanning electron microscope (SEM) based on the fractographic principles. Most defects determining the fracture strength of this particular material are found to be pores with a relatively large size.
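The link between biaxial and uniaxial strengths rests on Weibull size scaling: characteristic strengths of two configurations relate through the ratio of their effective volumes raised to 1/m. A sketch with illustrative numbers (not the BSCF values from the paper):

```python
# Weibull size scaling between two loading configurations:
# sigma_1 / sigma_2 = (V_eff,2 / V_eff,1) ** (1/m)
def predicted_strength(sigma_ref, v_eff_ref, v_eff_new, m):
    return sigma_ref * (v_eff_ref / v_eff_new) ** (1.0 / m)

sigma_biaxial = 100.0  # MPa, characteristic biaxial strength (illustrative)
m = 10.0               # Weibull modulus (illustrative)
# Assume the uniaxial bars stress a 5x smaller effective volume,
# so they sample fewer critical flaws and appear stronger
sigma_uniaxial = predicted_strength(sigma_biaxial, 1.0, 0.2, m)
```

This is why the biaxial strengths reported above are lower than the uniaxial ones even though the Weibull moduli agree: the larger effectively stressed region is more likely to contain a critical pore.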

  13. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

    Interesting properties and propositions in many branches of science, such as economics, have been obtained from the property that the cumulative distribution function of a random variable is concave. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice-differentiable real-valued function g whose domain is an interval on the extended real line: g is said to be log-concave on the interval (a,b) if ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b). Earlier authors have obtained log-concavity for distributions such as the normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, found the log-concavity property for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. The same has been done for the Burr family and for each distribution belonging to it. Log-concavity results for the generalized gamma, Feller-Pareto, generalized inverse Gaussian and generalized log-normal distributions have also been obtained.

  14. Probability distribution of surface wind speed induced by convective adjustment on Venus

    Science.gov (United States)

    Yamamoto, Masaru

    2017-03-01

    The influence of convective adjustment on the spatial structure of Venusian surface wind and the probability distribution of its wind speed is investigated using an idealized weather research and forecasting model. When the initially uniform wind is much weaker than the convective wind, patches of both prograde and retrograde winds with scales of a few kilometers are formed during active convective adjustment. After the active convective adjustment, because the small-scale convective cells and their related vertical momentum fluxes dissipate quickly, the large-scale (>4 km) prograde and retrograde wind patches remain on the surface and in the longitude-height cross-section. This suggests the coexistence of local prograde and retrograde flows, which may correspond to those observed by Pioneer Venus below 10 km altitude. The probability distributions of surface wind speed V during the convective adjustment have a similar form in different simulations, with a sharp peak around ∼0.1 m s⁻¹ and a bulge developing on the flank of the probability distribution. This flank bulge is associated with the most active convection, which has a probability distribution with a peak at a wind speed 1.5 times greater than the Weibull fitting parameter c during the convective adjustment. The Weibull distribution P(>V) = exp[-(V/c)^k] with best-estimate coefficients of Lorenz (2016) is reproduced during convective adjustments induced by a potential energy of ∼7 × 10⁷ J m⁻², which is calculated from the difference in total potential energy between the initially unstable and neutral states. The maximum vertical convective heat flux magnitude is proportional to the potential energy of the convective adjustment in the experiments in which the initial unstable-layer thickness is altered. The present work suggests that convective adjustment is a promising process for producing the wind structure, occasionally generating surface winds of ∼1 m s⁻¹ and retrograde wind patches.
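The Weibull exceedance form quoted in the abstract, P(>V) = exp[-(V/c)^k], is straightforward to evaluate; the parameter values below are illustrative placeholders, not the Lorenz (2016) best estimates:

```python
import numpy as np

def weibull_exceedance(v, c, k):
    """P(> V) = exp[-(V/c)^k]: probability the wind speed exceeds v."""
    return np.exp(-(v / c) ** k)

# Illustrative scale c (m/s) and shape k, not the paper's fitted values
c, k = 0.4, 2.0
p = weibull_exceedance(1.0, c, k)  # probability of surface wind > 1 m/s
```

With a scale parameter near the ∼0.1 m s⁻¹ peak, winds of ∼1 m s⁻¹ sit far out on the tail, which is consistent with the abstract's "occasionally generating" phrasing.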

  15. Tailoring speckles with Weibull intensity statistics

    Science.gov (United States)

    Amaral, João P.; Fonseca, Eduardo J. S.; Jesus-Silva, Alcenisio J.

    2015-12-01

    We use a phase-only computer-generated hologram to encode both phase and amplitude of a power of Rayleigh speckles. This method allows us to generate speckles with enhanced and reduced contrast without any optimization process. We explore non-Rayleigh speckles and unveil, theoretically and experimentally, their first-order statistical properties. These speckles may find applications in syntheses of disordered optical potentials for cold atoms and colloidal particles, in speckle illumination imaging, and in wave interference studied through spatial intensity correlation.

  16. A Comparison of Airborne Laser Scanning and Image Point Cloud Derived Tree Size Class Distribution Models in Boreal Ontario

    Directory of Open Access Journals (Sweden)

    Margaret Penner

    2015-11-01

    Airborne Laser Scanning (ALS) metrics have been used to develop area-based forest inventories; these metrics generally include estimates of stand-level, per-hectare values and mean tree attributes. Tree-based ALS inventories contain desirable information on individual tree dimensions and how much they vary within a stand. Adding size class distribution information to area-based inventories helps to bridge the gap between area- and tree-based inventories. This study examines the potential of ALS and stereo-imagery point clouds to predict size class distributions in a boreal forest. With an accurate digital terrain model, both ALS and imagery point clouds can be used to estimate size class distributions with comparable accuracy. Nonparametric imputations were generally superior to parametric imputations; this may be related to the limitation of using a unimodal Weibull function on a relatively small prediction unit (e.g., 400 m²).
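A parametric size class distribution of the kind discussed here is obtained by integrating a fitted unimodal Weibull over diameter-class boundaries. A sketch with illustrative parameters (shape, scale and stand density are not from the study):

```python
import numpy as np
from scipy.stats import weibull_min

# Illustrative fitted Weibull for stem diameters (cm) and stand density
shape, scale, stems_per_ha = 2.2, 18.0, 900.0

edges = np.arange(6.0, 42.0, 4.0)  # 4 cm diameter-class boundaries
# Probability mass in each class = difference of the CDF at its edges
class_probs = np.diff(weibull_min.cdf(edges, shape, scale=scale))
stems = stems_per_ha * class_probs  # expected stems/ha per size class
```

The single mode of the Weibull is exactly the limitation the abstract mentions: a small prediction unit with a bimodal diameter structure cannot be represented this way, whereas a nonparametric imputation can.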

  17. Frequency distribution of rainfall for the South-Central region of Ceará, Brazil

    Directory of Open Access Journals (Sweden)

    Ítalo Nunes Silva

    2013-09-01

    Seven probability distributions were analysed: Exponential, Gamma, Log-normal, Normal, Weibull, Gumbel and Beta, for monthly and annual rainfall in the South-Central region of Ceará, Brazil. To verify the fit of the data to the probability density functions, the non-parametric Kolmogorov-Smirnov test was used at the 5% level of significance. The rainfall data were obtained from the SUDENE database, recorded from 1913 to 1989. For the total annual rainfall, the fit of the data to the Gamma, Gumbel, Normal and Weibull distributions was satisfactory, and there was no fit to the Exponential, Log-normal and Beta distributions. Use of the Normal distribution is recommended for estimating probable annual rainfall in the region, as it is easy to apply and performed well in the tests. The Gumbel frequency distribution best represented the monthly rainfall data, with the greatest number of fits during the rainy season. In the dry season, the rainfall data were better represented by the Exponential distribution.

  18. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  19. Comprehensive evaluation of wind speed distribution models: A case study for North Dakota sites

    International Nuclear Information System (INIS)

    Zhou Junyi; Erdem, Ergin; Li Gong; Shi Jing

    2010-01-01

    Accurate analysis of long-term wind data is critical to the estimation of wind energy potential for a candidate location and its nearby area. Investigating the wind speed distribution is one critical task for this purpose. This paper presents a comprehensive evaluation of probability density functions for the wind speed data from five representative sites in North Dakota. Besides the popular Weibull and Rayleigh distributions, we also include other distributions such as gamma, lognormal, inverse Gaussian, and maximum entropy principle (MEP)-derived probability density functions (PDFs). Six goodness-of-fit (GOF) statistics are used to determine the appropriate distributions for the wind speed data at each site. It is found that no particular distribution outperforms the others for all five sites, while the Rayleigh distribution performs poorly for most of the sites. As with other models, the performance of MEP-derived PDFs in fitting wind speed data varies from site to site. Also, the results demonstrate that MEP-derived PDFs are flexible and have the potential to capture other possible distribution patterns of wind speed data. Meanwhile, different GOF statistics may generate inconsistent rankings of fit performance among the candidate PDFs. In addition, one comprehensive metric that combines all individual statistics is proposed to rank the overall performance of the chosen statistical distributions.
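A comparison of candidate wind speed distributions, with a simple combined metric in the spirit described above (here just the average of per-criterion ranks), can be sketched as follows. The data are synthetic and only two GOF criteria are used, not the paper's six:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic hourly wind speeds (m/s); shape and scale are illustrative
wind = rng.weibull(1.6, size=1000) * 8.0

scores = {}
for name in ["weibull_min", "rayleigh", "gamma", "lognorm"]:
    dist = getattr(stats, name)
    params = dist.fit(wind, floc=0)  # pin location at 0 for speed data
    ks = stats.kstest(wind, name, args=params).statistic
    loglik = np.sum(dist.logpdf(wind, *params))
    scores[name] = (ks, loglik)

# Combined metric: average the ranks from each criterion
names = list(scores)
ks_rank = np.argsort(np.argsort([scores[n][0] for n in names]))
ll_rank = np.argsort(np.argsort([-scores[n][1] for n in names]))
best = names[int(np.argmin(ks_rank + ll_rank))]
```

Because the Rayleigh distribution is a Weibull with shape fixed at 2, it cannot track the heavier lower tail of shape-1.6 data, which mirrors its poor showing in the paper.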

  20. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
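The Rosin-Rammler form used for lump coke is, up to parameterization, the complement of a two-parameter Weibull CDF, which is why the two appear side by side in this literature (the paper's modified Weibull adds flexibility beyond this classical identity). A minimal check with arbitrary values:

```python
import math

def rosin_rammler_retained(d, d_star, n):
    """Mass fraction retained above size d in the Rosin-Rammler form."""
    return math.exp(-(d / d_star) ** n)

def weibull_cdf(d, scale, shape):
    """Two-parameter Weibull CDF (fraction passing below size d)."""
    return 1.0 - math.exp(-(d / scale) ** shape)

# Retained + passing fractions must sum to one (values are arbitrary)
d, d_star, n = 40.0, 55.0, 2.8
retained = rosin_rammler_retained(d, d_star, n)
passing = weibull_cdf(d, d_star, n)
```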

  1. Stochastic distribution of the required coefficient of friction for level walking--an in-depth study.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    This study investigated the stochastic distribution of the required coefficient of friction (RCOF) which is a critical element for estimating slip probability. Fifty participants walked under four walking conditions. The results of the Kolmogorov-Smirnov two-sample test indicate that 76% of the RCOF data showed a difference in distribution between both feet for the same participant under each walking condition; the data from both feet were kept separate. The results of the Kolmogorov-Smirnov goodness-of-fit test indicate that most of the distribution of the RCOF appears to have a good match with the normal (85.5%), log-normal (84.5%) and Weibull distributions (81.5%). However, approximately 7.75% of the cases did not have a match with any of these distributions. It is reasonable to use the normal distribution for representation of the RCOF distribution due to its simplicity and familiarity, but each foot had a different distribution from the other foot in 76% of cases. The stochastic distribution of the required coefficient of friction (RCOF) was investigated for use in a statistical model to improve the estimate of slip probability in risk assessment. The results indicate that 85.5% of the distribution of the RCOF appears to have a good match with the normal distribution.

  2. Scaling of strength and lifetime probability distributions of quasibrittle structures based on atomistic fracture mechanics

    Science.gov (United States)

    Bažant, Zdeněk P.; Le, Jia-Liang; Bazant, Martin Z.

    2009-01-01

    The failure probability of engineering structures such as aircraft, bridges, dams, nuclear structures, and ships, as well as microelectronic components and medical implants, must be kept extremely low. The theory for the strength cdf of quasibrittle structures is refined by deriving it from the fracture mechanics of nanocracks propagating by small, activation-energy-controlled, random jumps through the atomic lattice. This refinement also provides a plausible physical justification of the power law for subcritical creep crack growth, hitherto considered empirical. The theory is further extended to predict the cdf of structural lifetime at constant load, which is shown to be size- and geometry-dependent. The size effects on structure strength and lifetime are shown to be related and the latter to be much stronger. The theory fits previously unexplained deviations of experimental strength and lifetime histograms from the Weibull distribution. Finally, a boundary layer method for numerical calculation of the cdf of structural strength and lifetime is outlined. PMID:19561294

  3. Scaling of strength and lifetime probability distributions of quasibrittle structures based on atomistic fracture mechanics.

    Science.gov (United States)

    Bazant, Zdenek P; Le, Jia-Liang; Bazant, Martin Z

    2009-07-14

    The failure probability of engineering structures such as aircraft, bridges, dams, nuclear structures, and ships, as well as microelectronic components and medical implants, must be kept extremely low. Here the theory for the strength cdf of quasibrittle structures is refined by deriving it from the fracture mechanics of nanocracks propagating by small, activation-energy-controlled, random jumps through the atomic lattice. This refinement also provides a plausible physical justification of the power law for subcritical creep crack growth, hitherto considered empirical. The theory is further extended to predict the cdf of structural lifetime at constant load, which is shown to be size- and geometry-dependent. The size effects on structure strength and lifetime are shown to be related and the latter to be much stronger. The theory fits previously unexplained deviations of experimental strength and lifetime histograms from the Weibull distribution. Finally, a boundary layer method for numerical calculation of the cdf of structural strength and lifetime is outlined.

  4. Carbon dioxide at an unpolluted site analysed with the smoothing kernel method and skewed distributions.

    Science.gov (United States)

    Pérez, Isidro A; Sánchez, M Luisa; García, M Ángeles; Pardo, Nuria

    2013-07-01

    CO₂ concentrations recorded for two years using a Picarro G1301 analyser at a rural site were studied using two procedures. Firstly, the smoothing kernel method, which to date had been used with one linear and one circular variable, was applied to pairs of circular variables: wind direction, time of day, and time of year. This showed that the daily cycle was the prevailing cyclical evolution and that the highest concentrations were explained by the influence of one nearby city source, which was only revealed by directional analysis. Secondly, histograms were obtained; these revealed that most observations lay between 380 and 410 ppm and that there was a sharp contrast during the year. Finally, histograms were fitted to 14 distributions, the best-known ones using analytical procedures and the remainder using numerical procedures. RMSE was used as the goodness-of-fit indicator to compare and select distributions. Most functions provided similar RMSE values. However, the best fits were obtained using numerical procedures due to their greater flexibility, the triangular distribution being the simplest function of this kind. This distribution allowed us to identify directions and months of noticeable CO₂ input (SSE and April-May, respectively) as well as the daily cycle of the distribution symmetry. Among the functions whose parameters were calculated using an analytical expression, Erlang distributions provided satisfactory fits for the monthly analysis, and gamma for the rest. By contrast, the Rayleigh and Weibull distributions gave the worst RMSE values. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Distributed Visualization

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed Visualization allows anyone, anywhere, to see any simulation, at any time. Development focuses on algorithms, software, data formats, data systems and...

  6. Dyadic distributions

    International Nuclear Information System (INIS)

    Golubov, B I

    2007-01-01

    On the basis of the concept of pointwise dyadic derivative dyadic distributions are introduced as continuous linear functionals on the linear space D d (R + ) of infinitely differentiable functions compactly supported by the positive half-axis R + together with all dyadic derivatives. The completeness of the space D' d (R + ) of dyadic distributions is established. It is shown that a locally integrable function on R + generates a dyadic distribution. In addition, the space S d (R + ) of infinitely dyadically differentiable functions on R + rapidly decreasing in the neighbourhood of +∞ is defined. The space S' d (R + ) of dyadic distributions of slow growth is introduced as the space of continuous linear functionals on S d (R + ). The completeness of the space S' d (R + ) is established; it is proved that each integrable function on R + with polynomial growth at +∞ generates a dyadic distribution of slow growth. Bibliography: 25 titles.

  7. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.

  8. Momentum distributions

    International Nuclear Information System (INIS)

    Simmons, R.O.

    1984-01-01

    The content of the portion of the workshop concerned with momentum distributions in condensed matter is outlined and the neutron scattering approach to their measurement is briefly described. Results concerning helium systems are reviewed. Some theoretical aspects are briefly mentioned

  9. A hierarchical modeling approach for estimating national distributions of chemicals in public drinking water systems.

    Science.gov (United States)

    Qian, Song S; Schulman, Andrew; Koplos, Jonathan; Kotros, Alison; Kellar, Penny

    2004-02-15

    Water quality studies often include the analytical challenge of incorporating censored data and quantifying error of estimation. Many analytical methods exist for estimating distribution parameters when censored data are present. This paper presents a Bayesian-based hierarchical model for estimating the national distribution of the mean concentrations of chemicals occurring in U.S. public drinking water systems using fluoride and thallium as examples. The data used are Safe Drinking Water Act compliance monitoring data (with a significant proportion of left-censored data). The model, which assumes log-normality, was evaluated using simulated data sets generated from a series of Weibull distributions to illustrate the robustness of the model. The hierarchical model is easily implemented using the Markov chain Monte Carlo simulation method. In addition, the Bayesian method is able to quantify the uncertainty in the estimated cumulative density function. The estimated fluoride and thallium national distributions are presented. Results from this study can be used to develop prior distributions for future U.S. drinking water regulatory studies of contaminant occurrence.
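The hierarchical model itself requires an MCMC sampler, but the censored-data likelihood at its core can be sketched without one. Below, a maximum likelihood fit of a lognormal to left-censored observations (all values synthetic; the detection limit and parameters are illustrative):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
# Synthetic concentrations with a detection limit (left-censoring),
# mimicking compliance monitoring data
true_mu, true_sigma, dl = np.log(2.0), 0.8, 1.0
x = rng.lognormal(true_mu, true_sigma, size=400)
censored = x < dl  # non-detects: only "below dl" is recorded

def nll(params):
    """Censored-lognormal negative log-likelihood on the log scale."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # keep sigma positive
    ll_obs = norm.logpdf(np.log(x[~censored]), mu, sigma).sum()
    ll_cens = censored.sum() * norm.logcdf(np.log(dl), mu, sigma)
    return -(ll_obs + ll_cens)

fit = minimize(nll, x0=[0.0, 0.0])
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
```

Each non-detect contributes P(X < DL) rather than a density value; the Bayesian hierarchical model uses the same likelihood term inside the Markov chain Monte Carlo updates.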

  10. Particle size distributions by transmission electron microscopy: an interlaboratory comparison case study

    Science.gov (United States)

    Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A

    2015-01-01

    This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a
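
    One reason the lognormal reference model is convenient is that its parameters have closed-form estimates on the log scale, and the relative standard errors of the fitted mean and fitted spread can be compared directly. The sketch below uses synthetic diameters near RM8012's nominal 27.6 nm (the sample size and spread are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic area-equivalent diameters (nm) near RM8012's ~27.6 nm nominal value
d = rng.lognormal(mean=np.log(27.6), sigma=0.09, size=500)

# Lognormal parameters have closed-form estimates on the log scale
logs = np.log(d)
mu_hat, s_hat = logs.mean(), logs.std(ddof=1)
geo_mean = float(np.exp(mu_hat))             # geometric mean diameter (nm)

# Relative standard errors: the spread is much harder to pin down than the mean,
# consistent with the RSE pattern reported in the study
rse_mu = (s_hat / np.sqrt(d.size)) / abs(mu_hat)
rse_s = 1.0 / np.sqrt(2.0 * (d.size - 1))    # RSE of s for normal log-data
print(round(geo_mean, 1), rse_s > rse_mu)
```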

  11. Distributed creativity

    DEFF Research Database (Denmark)

    Glaveanu, Vlad Petre

    This book challenges the standard view that creativity comes only from within an individual by arguing that creativity also exists ‘outside’ of the mind or more precisely, that the human mind extends through the means of action into the world. The notion of ‘distributed creativity’ is not commonly...... used within the literature and yet it has the potential to revolutionise the way we think about creativity, from how we define and measure it to what we can practically do to foster and develop creativity. Drawing on cultural psychology, ecological psychology and advances in cognitive science......, this book offers a basic framework for the study of distributed creativity that considers three main dimensions of creative work: sociality, materiality and temporality. Starting from the premise that creativity is distributed between people, between people and objects and across time, the book reviews...

  12. Distributed systems

    CERN Document Server

    Van Steen, Maarten

    2017-01-01

    For this third edition of "Distributed Systems," the material has been thoroughly revised and extended, integrating principles and paradigms into nine chapters: 1. Introduction 2. Architectures 3. Processes 4. Communication 5. Naming 6. Coordination 7. Replication 8. Fault tolerance 9. Security A separation has been made between basic material and more specific subjects. The latter have been organized into boxed sections, which may be skipped on first reading. To assist in understanding the more algorithmic parts, example programs in Python have been included. The examples in the book leave out many details for readability, but the complete code is available through the book's Website, hosted at www.distributed-systems.net.

  13. Estimadores não viciados para o tempo médio até a falha e para percentis obtidos do modelo de regressão de Weibull Unbiased estimator for MTTF and quantiles obtained from Weibull’s regression model

    Directory of Open Access Journals (Sweden)

    Linda Lee Ho

    2005-04-01

    Full Text Available Consider a reliability model described by a Weibull regression, whose parameters are estimated by the maximum likelihood method. These parameters are used to estimate other quantities of interest, such as the MTTF (mean time to failure) and quantiles, which in turn play an important role in a reliability analysis. This paper proposes improvements in MTTF and quantile estimators, whose maximum likelihood estimates (MLEs) are biased, particularly in cases involving small sample sizes. A Bootstrap procedure is presented to correct such biases. The proposed procedure is evaluated based on simulations in which the sample size, censoring mechanisms, and percentiles of censored data are varied. These simulations allow one to quantify the bias, since its analytical determination is highly complex. The proposed procedure is illustrated through an example.
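
    A minimal sketch of the bootstrap bias-correction idea for a Weibull MTTF, without the regression structure (Python/SciPy; the true parameters, sample size of 15 and 200 replicates are illustrative choices, not the paper's setup). The MTTF of a two-parameter Weibull is scale × Γ(1 + 1/shape):

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

rng = np.random.default_rng(0)

def mttf(sample):
    # Two-parameter Weibull MLE; MTTF = scale * Gamma(1 + 1/shape)
    shape, _, scale = stats.weibull_min.fit(sample, floc=0)
    return scale * gamma(1.0 + 1.0 / shape)

# Small complete sample, where MLE bias matters most
data = stats.weibull_min.rvs(2.0, scale=2.0, size=15, random_state=rng)

est = mttf(data)
# Refit on bootstrap resamples and apply the basic bias correction
boot = np.array([mttf(rng.choice(data, size=data.size, replace=True))
                 for _ in range(200)])
corrected = 2.0 * est - boot.mean()
print(round(est, 2), round(corrected, 2))
```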

  14. Spatial distribution

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger

    2008-01-01

    populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns...

  15. Statistical theory on the analytical form of cloud particle size distributions

    Science.gov (United States)

    Wu, Wei; McFarquhar, Greg

    2017-11-01

    Several analytical forms of cloud particle size distributions (PSDs) have been used in numerical modeling and remote sensing retrieval studies of clouds and precipitation, including exponential, gamma, lognormal, and Weibull distributions. However, there is no satisfying physical explanation as to why certain distribution forms preferentially occur instead of others. Theoretically, the analytical form of a PSD can be derived by directly solving the general dynamic equation, but no analytical solutions have been found yet. Instead of using a process-level approach, we examine the use of the principle of maximum entropy (MaxEnt) for determining the analytical form of PSDs from the perspective of the system as a whole. The issue of variability under coordinate transformations that arises when using the Gibbs/Shannon definition of entropy is identified, and the use of relative entropy to avoid this problem is discussed. Focusing on cloud physics, the four-parameter generalized gamma distribution is proposed as the analytical form of a PSD using the principle of maximum (relative) entropy, with assumptions of power-law relations between state variables, scale invariance, and a further constraint on the expectation of one state variable (e.g. bulk water mass). DOE ASR.

  16. Diameter distribution in a Brazilian tropical dry forest domain: predictions for the stand and species.

    Science.gov (United States)

    Lima, Robson B DE; Bufalino, Lina; Alves, Francisco T; Silva, José A A DA; Ferreira, Rinaldo L C

    2017-01-01

    Currently, there is a lack of studies on the correct utilization of continuous distributions for dry tropical forests. This work therefore investigates the diameter structure of a Brazilian tropical dry forest and selects suitable continuous distributions by means of statistical tools for the stand and the main species. Two subsets were randomly selected from 40 plots, and the diameter at base height was obtained. The following functions were tested: log-normal, gamma, Weibull 2P and Burr. The best fits were selected by Akaike's information criterion. Overall, the diameter distribution of the dry tropical forest was better described by negative exponential curves and positive skewness. The forest studied showed diameter distributions with decreasing probability for larger trees, a behavior observed for both the main species and the stand. The generalization of the function fitted for the main species shows that the development of individual models is needed. The Burr function showed good flexibility in describing the diameter structure of the stand and the behavior of the Mimosa ophthalmocentra and Bauhinia cheilantha species. For Poincianella bracteosa, Aspidosperma pyrifolium and Myracrodum urundeuva, better fits were obtained with the log-normal function.
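
    The candidate-fitting and AIC-selection workflow can be sketched as follows (Python/SciPy; synthetic diameters with the decreasing, negative-exponential shape the study reports; the Burr function and the species data are not reproduced, and the 3 cm minimum size is an invented example):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic diameters (cm) with a decreasing, negative-exponential shape
dbh = 3.0 + rng.exponential(scale=8.0, size=400)   # 3 cm minimum measured size

candidates = {"log-normal": stats.lognorm,
              "gamma": stats.gamma,
              "Weibull 2P": stats.weibull_min}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(dbh, floc=3.0)   # location fixed at the minimum diameter
    k = len(params) - 1                # location was fixed, not estimated
    loglik = dist.logpdf(dbh, *params).sum()
    aic[name] = 2 * k - 2 * loglik     # smaller AIC = preferred model

best = min(aic, key=aic.get)
print(best)
```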

  17. Quasihomogeneous distributions

    CERN Document Server

    von Grudzinski, O

    1991-01-01

    This is a systematic exposition of the basics of the theory of quasihomogeneous (in particular, homogeneous) functions and distributions (generalized functions). A major theme is the method of taking quasihomogeneous averages. It serves as the central tool for the study of the solvability of quasihomogeneous multiplication equations and of quasihomogeneous partial differential equations with constant coefficients. Necessary and sufficient conditions for solvability are given. Several examples are treated in detail, among them the heat and the Schrödinger equation. The final chapter is devoted to quasihomogeneous wave front sets and their application to the description of singularities of quasihomogeneous distributions, in particular to quasihomogeneous fundamental solutions of the heat and of the Schrödinger equation.

  18. MAIL DISTRIBUTION

    CERN Multimedia

    J. Ferguson

    2002-01-01

    Following discussions with the mail contractor and Mail Service personnel, an agreement has been reached which permits deliveries to each distribution point to be maintained, while still achieving a large proportion of the planned budget reduction in 2002. As a result, the service will revert to its previous level throughout the Laboratory as rapidly as possible. Outgoing mail will be collected from a single collection point at the end of each corridor. Further discussions are currently in progress between ST, SPL and AS divisions on the possibility of an integrated distribution service for internal mail, stores items and small parcels, which could lead to additional savings from 2003 onwards, without affecting service levels. J. Ferguson AS Division

  19. Distributed SLAM

    Science.gov (United States)

    Binns, Lewis A.; Valachis, Dimitris; Anderson, Sean; Gough, David W.; Nicholson, David; Greenway, Phil

    2002-07-01

    Previously, we have developed techniques for Simultaneous Localization and Map Building based on the augmented state Kalman filter. Here we report the results of experiments conducted over multiple vehicles each equipped with a laser range finder for sensing the external environment, and a laser tracking system to provide highly accurate ground truth. The goal is simultaneously to build a map of an unknown environment and to use that map to navigate a vehicle that otherwise would have no way of knowing its location, and to distribute this process over several vehicles. We have constructed an on-line, distributed implementation to demonstrate the principle. In this paper we describe the system architecture, the nature of the experimental set up, and the results obtained. These are compared with the estimated ground truth. We show that distributed SLAM has a clear advantage in the sense that it offers a potential super-linear speed-up over single vehicle SLAM. In particular, we explore the time taken to achieve a given quality of map, and consider the repeatability and accuracy of the method. Finally, we discuss some practical implementation issues.

  20. Distribution switchgear

    CERN Document Server

    Stewart, Stan

    2004-01-01

    Switchgear plays a fundamental role within the power supply industry. It is required to isolate faulty equipment, divide large networks into sections for repair purposes, reconfigure networks in order to restore power supplies and control other equipment. This book begins with the general principles of the switchgear function and leads on to discuss topics such as interruption techniques, fault level calculations, switching transients and electrical insulation, making this an invaluable reference source. Solutions to practical problems associated with distribution switchgear are also included.

  1. Parton Distributions

    CERN Document Server

    Dittmar, M.; Glazov, A.; Moch, S.; Altarelli, G.; Anderson, J.; Ball, R.D.; Beuf, G.; Boonekamp, M.; Burkhardt, H.; Caola, F.; Ciafaloni, M.; Colferai, D.; Cooper-Sarkar, A.; de Roeck, A.; Del Debbio, L.; Feltesse, J.; Gelis, F.; Grebenyuk, J.; Guffanti, A.; Halyol, V.; Latorre, J.I.; Lendermann, V.; Li, G.; Motyka, L.; Petersen, T.; Piccione, A.; Radescu, V.; Rogal, M.; Rojo, J.; Royon, C.; Salam, G.P.; Salek, D.; Stasto, A.M.; Thorne, R.S.; Ubiali, M.; Vermaseren, J.A.M.; Vogt, A.; Watt, G.; White, C.D.

    2009-01-01

    We provide an assessment of the state of the art in various issues related to experimental measurements, phenomenological methods and theoretical results relevant for the determination of parton distribution functions (PDFs) and their uncertainties, with the specific aim of providing benchmarks of different existing approaches and results in view of their application to physics at the LHC. We discuss higher order corrections, we review and compare different approaches to small x resummation, and we assess the possible relevance of parton saturation in the determination of PDFS at HERA and its possible study in LHC processes. We provide various benchmarks of PDF fits, with the specific aim of studying issues of error propagation, non-gaussian uncertainties, choice of functional forms of PDFs, and combination of data from different experiments and different processes. We study the impact of combined HERA (ZEUS-H1) structure function data, their impact on PDF uncertainties, and their implications for the computa...

  2. Numerical and machine learning simulation of parametric distributions of groundwater residence time in streams and wells

    Science.gov (United States)

    Starn, J. J.; Belitz, K.; Carlson, C.

    2017-12-01

    Groundwater residence-time distributions (RTDs) are critical for assessing the susceptibility of water resources to contamination. In this novel approach for estimating regional RTDs, groundwater flow was first simulated using existing regional digital data sets in 13 intermediate-size watersheds (each an average of 7,000 square kilometers) that are representative of a wide range of glacial systems. RTDs were simulated with particle tracking. We refer to these models as "general models" because they are based on regional, as opposed to site-specific, digital data. Parametric RTDs were created from particle RTDs by fitting 1- and 2-component Weibull, gamma, and inverse Gaussian distributions, thus reducing a large number of particle travel times to 3 to 7 parameters (shape, location, and scale for each component, plus a mixing fraction) for each modeled area. The scale parameter of these distributions is related to the mean exponential age; the shape parameter controls departure from the ideal exponential distribution and is partly a function of interaction with bedrock and with drainage density. Given the flexible shape and mathematical similarity of these distributions, any of them is potentially a good fit to particle RTDs. The 1-component gamma distribution provided a good fit to basin-wide particle RTDs. RTDs at monitoring wells and streams often have more complicated shapes than basin-wide RTDs, caused in part by heterogeneity in the model, and generally require 2-component distributions. A machine learning model was trained on the RTD parameters using features derived from regionally available watershed characteristics such as recharge rate, material thickness, and stream density. RTDs appeared to vary systematically across the landscape in relation to watershed features. This relation was used to produce maps of useful metrics with respect to risk-based thresholds, such as the time to first exceedance, time to maximum concentration, time above the threshold
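
    The reduction from thousands of particle travel times to a few parametric-RTD parameters can be sketched as below (Python/SciPy; the near-exponential ages with a 40-year mean are invented, not taken from the 13 modeled watersheds, and only the 1-component gamma case is shown):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical particle travel times (years); near-exponential, with an
# invented 40-year mean purely for illustration
ages = rng.gamma(shape=1.0, scale=40.0, size=5000)

# Reduce ~5000 particle ages to a 2-parameter gamma description
shape_hat, _, scale_hat = stats.gamma.fit(ages, floc=0)
mean_age = shape_hat * scale_hat                     # mean residence time
frac_young = stats.gamma.cdf(10.0, shape_hat, scale=scale_hat)

print(round(mean_age, 1), round(frac_young, 2))      # fraction younger than 10 yr
```

    Risk-based metrics such as the time to first exceedance then become simple functionals of the fitted distribution rather than of the raw particle cloud.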

  3. Initiation and propagation life distributions of fatigue cracks and the life evaluation in high cycle fatigue of ADI; ADI zai no ko cycle hiro kiretsu hassei shinten jumyo bunpu tokusei to jumyo hyoka

    Energy Technology Data Exchange (ETDEWEB)

    Ochi, Y.; Ishii, A. [University of Electro Communications, Tokyo (Japan); Ogata, T. [Hitachi Metals, Ltd., Tokyo (Japan); Kubota, M. [Kyushu University, Fukuoka (Japan). Faculty of Engineering

    1997-10-15

    Rotating bending fatigue tests were carried out on austempered ductile cast iron (ADI) in order to investigate the statistical properties of the life distributions of crack initiation and propagation, and also the evaluation of fatigue life. The results are summarized as follows: (1) The size of the crack initiation sites of the material was represented by a Weibull distribution regardless of the kind of initiation site, such as microshrinkage or graphite grains. The crack initiation life scattered widely, but the scatter became much smaller as soon as the cracks grew. (2) The crack propagation life Nac, defined by the minimum crack propagation rate, showed less scatter than the crack initiation life. (3) The fatigue life of the material was evaluated well by Nac and the propagation rate after Nac. It was clear that the fatigue life of ductile cast iron was governed by the scatter of Nac. 8 refs., 13 figs., 4 tabs.

  4. ESTIMATION OF THE SCALE PARAMETER FROM THE RAYLEIGH DISTRIBUTION FROM TYPE II SINGLY AND DOUBLY CENSORED DATA

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2009-01-01

    Full Text Available As common as the normal distribution is the Rayleigh distribution, which occurs in work on radar, properties of sine waves plus noise, etc. Rayleigh (1880) derived it from the amplitude of sound resulting from many independent sources. The Rayleigh distribution is widely used in communication engineering, reliability analysis and applied statistics. Since the Rayleigh distribution has a linearly increasing hazard rate, it is appropriate for components which might not have manufacturing defects but age rapidly with time; several types of electro-vacuum devices have this feature. It is connected with one- and two-dimensional random walks and is sometimes referred to as a random walk frequency distribution. It is a special case of the Weibull distribution (1951), which is of wide applicability. It can easily be derived from the bivariate normal distribution with equal variances and ρ = 0. For further applications of the Rayleigh distribution, we refer to Johnson and Kotz (1994). Adatia (1995) obtained the best linear unbiased estimator (BLUE) of the Rayleigh scale parameter based on fairly large censored samples. Dyer and Whisenand (1973) obtained the BLUE of the scale parameter based on type II censored samples for small N = 2(1)15. With the advance of computer technology it is now possible to obtain BLUEs for large samples. Hirai (1978) obtained estimates of the scale parameter from the Rayleigh distribution under singly type II censoring from the left and from the right, together with their variances. In this paper, we estimate the scale parameter from type II singly and doubly censored data from the Rayleigh distribution using Blom's (1958) unbiased nearly best estimates and compare the efficiency of these estimates with the BLUE and the MLE.
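
    For type II right-censored Rayleigh data the maximum likelihood estimator (not the BLUE or Blom estimators discussed above) has a closed form, because x² is exponential with mean 2σ² when x is Rayleigh. A sketch with invented σ, n and r:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma_true, n, r = 3.0, 500, 350       # keep only the r smallest of n values

# Rayleigh draws by inverse-CDF: x = sigma * sqrt(-2 ln U)
x = np.sort(sigma_true * np.sqrt(-2.0 * np.log(rng.random(n))))
obs = x[:r]                            # type II right censoring at x_(r)

# x^2 is exponential with mean 2*sigma^2, so the classical
# censored-exponential MLE carries over directly
sigma2_hat = (np.sum(obs**2) + (n - r) * obs[-1] ** 2) / (2.0 * r)
sigma_hat = float(np.sqrt(sigma2_hat))
print(round(sigma_hat, 2))
```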

  5. Handbook of distribution

    International Nuclear Information System (INIS)

    Mo, In Gyu

    1992-01-01

    This book tells of business strategy and distribution innovation, purpose of intelligent distribution, intelligent supply distribution, intelligent production distribution, intelligent sale distribution software for intelligence and future and distribution. It also introduces component technology keeping intelligent distribution such as bar cord, OCR, packing, and intelligent auto-warehouse, system technology, and cases in America, Japan and other countries.

  6. CA-CFAR Adjustment Factor Correction with a priori Knowledge of the Clutter Distribution Shape Parameter

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2017-08-01

    Full Text Available The operation of oceanic and coastal radars is degraded because target information is received mixed with an undesired contribution called sea clutter. In particular, the popular CA-CFAR processor is incapable of maintaining its design false alarm probability when facing clutter with statistical variations. In opposition to the classic alternative of using a fixed adjustment factor, the authors propose a modification of the CA-CFAR scheme in which the factor is constantly corrected according to statistical changes in the background signal. Mathematically translated as variations in the shape parameter of the clutter distribution, the background signal changes were simulated through the Weibull, Log-Normal and K distributions, deriving expressions which allow an appropriate factor to be chosen for each possible statistical state. The investigation contributes to the improvement of radar detection by suggesting the application of an adaptive scheme which assumes the clutter shape parameter is known a priori. The offered mathematical expressions are valid for three false alarm probabilities and several window sizes, covering a wide range of clutter conditions.
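
    For the textbook baseline case of exponentially distributed (square-law Rayleigh) clutter, the CA-CFAR adjustment factor has the closed form Pfa = (1 + α/N)^(−N), so α = N(Pfa^(−1/N) − 1). The sketch below verifies this by Monte Carlo (N and Pfa are illustrative; the paper's Weibull, Log-Normal and K corrections are not reproduced here):

```python
import numpy as np

def ca_cfar_factor(n_cells: int, pfa: float) -> float:
    """Adjustment factor alpha for a CA-CFAR with n_cells reference cells,
    assuming exponentially distributed (square-law Rayleigh) clutter."""
    return n_cells * (pfa ** (-1.0 / n_cells) - 1.0)

# Monte Carlo check that the design false-alarm probability is maintained
rng = np.random.default_rng(9)
N, pfa = 16, 1e-2
alpha = ca_cfar_factor(N, pfa)
clutter = rng.exponential(size=(200_000, N + 1))
cut, reference = clutter[:, 0], clutter[:, 1:]     # cell under test + window
threshold = alpha * reference.mean(axis=1)
pfa_hat = float(np.mean(cut > threshold))
print(round(alpha, 2), round(pfa_hat, 3))          # pfa_hat ≈ 0.01 by design
```

    The adaptive scheme described above amounts to recomputing such a factor whenever the estimated clutter shape parameter changes.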

  7. Histological and Demographic Characteristics of the Distribution of Brain and Central Nervous System Tumors' Sizes: Results from SEER Registries Using Statistical Methods.

    Science.gov (United States)

    Pokhrel, Keshav P; Vovoras, Dimitrios; Tsokos, Chris P

    2012-09-01

    The examination of brain tumor growth and its variability among cancer patients is an important aspect of epidemiologic and medical data. Several studies of brain tumors have interpreted descriptive data; in this study we perform inference to the extent possible, suggesting possible explanations for the differentiation in the survival rates apparent in the epidemiologic data. Population-based information from nine registries in the USA is classified with respect to age, gender, race and tumor histology to study tumor size variation. The Weibull and Dagum distributions are fitted to the highly skewed tumor size distributions; the parametric analysis of the tumor sizes showed significant differentiation between sexes, increased skewness for both the male and female populations, and decreased kurtosis for the black female population. The effect of population characteristics on the distribution of tumor sizes is estimated by a quantile regression model and then compared with the ordinary least squares results. The higher quantiles of the distribution of tumor sizes for whites are significantly higher than those of other races. Our model predicted that the effect of age on the lower quantiles of the tumor size distribution is negative, given the variables race and sex. We apply probability and regression models to explore the effects of demographic and histology types and observe significant racial and gender differences in the form of the distributions. Efforts are made to link tumor size data with available survival rates in relation to other prognostic variables.

  8. Characteristics of service requests and service processes of fire and rescue service dispatch centers: analysis of real world data and the underlying probability distributions.

    Science.gov (United States)

    Krueger, Ute; Schimmelpfeng, Katja

    2013-03-01

    A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives, so it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This rests on the underlying assumption that each caller decides independently whether or not to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident, and thus these calls are not independent. This paper therefore focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as superior to the Poisson distribution in describing the call arrival rate, and the Weibull distribution as more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
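
    The Erlang-C staffing calculation mentioned above can be sketched as follows (plain Python; the offered load of 6 Erlangs and the 20% waiting target are invented numbers, and the model assumes independent Poisson arrivals, which the paper shows is only an approximation for dispatch centers):

```python
import math

def erlang_c(servers: int, load: float) -> float:
    """Erlang-C probability that an arriving call must wait.
    load = lambda/mu is the offered traffic in Erlangs; requires load < servers."""
    b = 1.0
    for k in range(1, servers + 1):      # iterative Erlang-B recursion
        b = load * b / (k + load * b)
    rho = load / servers
    return b / (1.0 - rho * (1.0 - b))   # convert Erlang-B to Erlang-C

# Smallest number of dispatchers so that at most 20% of calls wait, at 6 Erlangs
load, target = 6.0, 0.20
c = math.ceil(load) + 1                  # must exceed the load for stability
while erlang_c(c, load) > target:
    c += 1
print(c, round(erlang_c(c, load), 3))
```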

  9. Impact factor distribution revisited

    Science.gov (United States)

    Huang, Ding-wei

    2017-09-01

    We explore the consistency of a new type of frequency distribution whose corresponding rank distribution is the Lavalette distribution. Empirical data on journal impact factors can be well described by it. This distribution is distinct from the Poisson and negative binomial distributions, which were suggested by previous studies. By a log transformation, we obtain a bell-shaped distribution, which is then compared to Gaussian and catenary curves. Possible mechanisms behind the shape of the impact factor distribution are suggested.

  10. A methodology to quantify the stochastic distribution of friction coefficient required for level walking.

    Science.gov (United States)

    Chang, Wen-Ruey; Chang, Chien-Chi; Matz, Simon; Lesch, Mary F

    2008-11-01

    The required friction coefficient is defined as the minimum friction needed at the shoe-floor interface to support human locomotion; the available friction is the maximum friction coefficient that can be supported without a slip at that interface. A statistical model was recently introduced to estimate the probability of slip and fall incidents by comparing the available friction with the required friction, assuming that both friction coefficients have stochastic distributions. This paper presents a methodology to investigate the stochastic distributions of the required friction coefficient for level walking. In this experiment, a walkway with a layout of three force plates was specially designed in order to capture a large number of successful strikes without causing fatigue in participants. The required coefficient of friction data of one participant, who repeatedly walked on this walkway under four different walking conditions, are presented as an example of the methodology examined in this paper. The results of the Kolmogorov-Smirnov goodness-of-fit test indicated that the required friction coefficients generated from each foot and walking condition by this participant appear to fit the normal, log-normal or Weibull distributions, with few exceptions. Among these three distributions, the normal distribution appears to fit all the data generated by this participant. The average number of successful strikes per walk achieved with the three force plates in this experiment was 2.49, ranging from 2.14 to 2.95 across walking conditions. The methodology and layout of the experimental apparatus presented in this paper are suitable for a full-scale study.
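
    If both the available and the required friction are modeled as independent normal distributions (the family the study found to fit all of this participant's data), the slip probability reduces to a single CDF evaluation, since the difference of independent normals is normal. All numbers below are invented for illustration:

```python
from math import sqrt
from scipy.stats import norm

# Hypothetical normal models for available and required friction coefficients
mu_av, sd_av = 0.50, 0.08      # available COF of a shoe/floor pairing
mu_req, sd_req = 0.22, 0.04    # required COF for one walking condition

# A slip occurs when available < required; the difference of independent
# normals is normal, so P(slip) = P(available - required < 0)
p_slip = norm.cdf(0.0, loc=mu_av - mu_req,
                  scale=sqrt(sd_av**2 + sd_req**2))
print(f"P(slip) ≈ {p_slip:.4f}")
```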

  11. Product Distributions for Distributed Optimization. Chapter 1

    Science.gov (United States)

    Bieniawski, Stefan R.; Wolpert, David H.

    2004-01-01

    With connections to bounded rational game theory, information theory and statistical mechanics, Product Distribution (PD) theory provides a new framework for performing distributed optimization. Furthermore, PD theory extends and formalizes Collective Intelligence, thus connecting distributed optimization to distributed Reinforcement Learning (RL). This paper provides an overview of PD theory and details an algorithm for performing optimization derived from it. The approach is demonstrated on two unconstrained optimization problems, one with discrete variables and one with continuous variables. To highlight the connections between PD theory and distributed RL, the results are compared with those obtained using optimization approaches inspired by distributed reinforcement learning. The inter-relationship of the techniques is discussed.

  12. Use of Frequency Distribution Functions to Establish Safe Conditions in Relation to the Foodborne Pathogen Bacillus cereus

    Directory of Open Access Journals (Sweden)

    Begoña Delgado

    2005-01-01

    Full Text Available The implementation of minimal processing greatly depends on a detailed knowledge of the effects of preservation factors, and their combinations, on spoilage and foodborne pathogenic microorganisms. The effectiveness of mild preservation conditions will become increasingly dependent on a more stochastic approach linking microbial physiological factors with product preservation factors. In this study, the validity of frequency distributions for efficiently describing the inactivation and growth of Bacillus cereus in the presence of natural antimicrobials (essential oils) was examined. For this purpose, vegetative cells were exposed to 0.6 mM of thymol or cymene, yielding survival curves that were best described by the Weibull distribution, since a tailing effect was observed. B. cereus was also exposed in a growth medium to a low concentration (0.1 mM) of both antimicrobials, separately or combined, and the lag times obtained were fitted to a normal distribution, which allowed the dispersion of the start of growth to be described. This permits a more efficient evaluation of the experimental data for establishing safe processing conditions based on accurate parameters, and their implementation in risk assessment.

  13. Modelling of extreme minimum rainfall using generalised extreme value distribution for Zimbabwe

    Directory of Open Access Journals (Sweden)

    Delson Chikobvu

    2015-09-01

    Full Text Available We modelled the mean annual rainfall for data recorded in Zimbabwe from 1901 to 2009. Extreme value theory was used to estimate the probabilities of meteorological droughts. Droughts can be viewed as extreme events which go beyond and/or below normal rainfall occurrences, such as exceptionally low mean annual rainfall. The duality between the distributions of minima and maxima was exploited to fit the generalised extreme value distribution (GEVD) to the data and hence find the probabilities of extremely low levels of mean annual rainfall. The augmented Dickey-Fuller test confirmed that the rainfall data were stationary, while the normal quantile-quantile plot indicated that the rainfall data deviated from the normality assumption at both tails of the distribution. The maximum likelihood estimation method and the Bayesian approach were used to find the parameters of the GEVD. The Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests showed that the Weibull class of distributions was a good fit to the minima of mean annual rainfall under the maximum likelihood estimation method. The estimated mean return period of a meteorological drought, using a threshold mean annual rainfall of 473 mm, was 8 years. This implies that if there is a meteorological drought in a given year, another drought of the same or greater intensity is expected after 8 years. The use of Bayesian inference is expected to better quantify the level of uncertainty associated with the GEVD parameter estimates than the maximum likelihood estimation method. The Markov chain Monte Carlo algorithm for the GEVD was applied to construct the model parameter estimates using the Bayesian approach. These findings are significant because results based on non-informative priors (Bayesian method) and the maximum likelihood approach are expected to be similar.
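
    The minima/maxima duality can be sketched directly: fit a GEV to the negated series, then read off the probability of falling below a threshold and its return period. The rainfall series below is synthetic (a Weibull-type lower tail with invented parameters, not the Zimbabwe record), so the resulting return period will not match the 8-year estimate above:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Synthetic mean annual rainfall series (mm), 109 "years", with a
# Weibull-type lower tail; all parameter values here are invented
rain = 600.0 - stats.weibull_min.rvs(1.5, scale=120.0, size=109,
                                     random_state=rng)

# Duality: minima of X are maxima of -X, so fit a GEV to the negated series
params = stats.genextreme.fit(-rain)
p_below = stats.genextreme.sf(-473.0, *params)   # P(rainfall < 473 mm)
return_period = 1.0 / p_below                    # mean years between events
print(round(return_period, 1))
```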

  14. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decrease in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we present an overview of distributed computing. We study the difference between parallel and distributed computing, terminologies used in distributed computing, task allocation in distribute...

  15. Reactor power distribution monitor

    International Nuclear Information System (INIS)

    Hoizumi, Atsushi.

    1986-01-01

    Purpose: To grasp the margin to the limit value of the power distribution peaking factor inside the reactor under operation by using the reactor power distribution monitor. Constitution: The monitor is composed of the 'constant' file (which stores in-reactor power distributions obtained from analysis), TIP and thermocouple, a lateral output distribution calibrating apparatus, an axial output distribution synthesizer and a peaking factor synthesizer. The lateral output distribution calibrating apparatus makes a calibration by comparing the power distribution obtained from the thermocouples to the power distribution obtained from the TIP, and then provides the power distribution lateral peaking factors. The axial output distribution synthesizer provides the power distribution axial peaking factors in accordance with the signals from the out-pile neutron flux detector. These axial and lateral power peaking factors are synthesized with high precision in a three-dimensional format and can be monitored at any time. (Kamimura, M.)

  16. Performance Analysis of Methods for Estimating Weibull Parameters ...

    African Journals Online (AJOL)

    Furthermore, the comparison between the wind speed standard deviation predicted by the proposed models and the measured data showed that the MLM has a smaller relative error of -3.33% on average compared to -11.67% on average for the EPF and -8.86% on average for the GM. As a result, the MLM was precisely ...

  17. TAIL ASYMPTOTICS OF LIGHT-TAILED WEIBULL-LIKE SUMS

    DEFF Research Database (Denmark)

    Asmussen, Soren; Hashorva, Enkelejd; Laub, Patrick J.

    2017-01-01

    We consider sums of n i.i.d. random variables with tails close to exp{-x^beta} for some beta > 1. Asymptotics developed by Rootzen (1987) and Balkema, Kluppelberg, and Resnick (1993) are discussed from the point of view of tails rather than of densities, using a somewhat different angle.

  18. determination of weibull parameters and analysis of wind power

    African Journals Online (AJOL)

    HOD

    for sustainable energy sources. The 2016 International Energy Agency (IEA) world energy outlook report assesses the growth in the renewable energy sector as quite impressive [1]. With respect to wind turbine installations, about 63,135 MW of wind power capacity was added globally in 2015, indicating a 23.2% increase.

  19. Extended Poisson Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Anum Fatima

    2015-09-01

    Full Text Available A new mixture of the Modified Exponential (ME) and Poisson distributions has been introduced in this paper. Taking the maximum of ME random variables when the sample size follows a zero-truncated Poisson distribution, we have derived the new distribution, named the Extended Poisson Exponential distribution. This distribution possesses increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential and Exponential distributions are special cases of this distribution. We have also investigated some mathematical properties of the distribution along with information entropies and order statistics. The estimation of parameters has been obtained using the maximum likelihood estimation procedure. Finally, we have illustrated a real data application of our distribution.
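    The construction in this record (a maximum over a zero-truncated-Poisson number of lifetimes) is easy to simulate. The Modified Exponential component is not specified here, so the sketch below uses plain exponentials, which gives the Poisson-Exponential special case the abstract mentions; the closed-form CDF used as a check follows directly from conditioning on the sample size.

    ```python
    # Sketch: X = max(Y_1, ..., Y_N), N ~ zero-truncated Poisson(lam),
    # Y_i ~ Exponential(rate) -- the Poisson-Exponential special case.
    import numpy as np

    rng = np.random.default_rng(0)
    lam, rate, n_samples = 2.0, 1.0, 100_000

    def zt_poisson(lam, size, rng):
        """Zero-truncated Poisson via rejection: redraw any zeros."""
        out = rng.poisson(lam, size)
        while (mask := out == 0).any():
            out[mask] = rng.poisson(lam, mask.sum())
        return out

    ns = zt_poisson(lam, n_samples, rng)
    samples = np.array([rng.exponential(1.0 / rate, n).max() for n in ns])

    # Conditioning on N gives F(x) = (e^{lam*(1 - e^{-rate*x})} - 1) / (e^lam - 1)
    x = 1.0
    F = (np.exp(lam * (1 - np.exp(-rate * x))) - 1) / (np.exp(lam) - 1)
    emp = (samples <= x).mean()
    print(F, emp)  # theoretical vs empirical CDF at x
    ```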

  20. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  1. Distributed security in closed distributed systems

    DEFF Research Database (Denmark)

    Hernandez, Alejandro Mario

    in their design. There should always exist techniques for ensuring that the required security properties are met. This has been thoroughly investigated through the years, and many varied methodologies have come through. In the case of distributed systems, there are even harder issues to deal with. Many approaches...... have been taken towards solving security problems, yet many questions remain unanswered. Most of these problems are related to some of the following facts: distributed systems do not usually have any central controller providing security to the entire system; the system heterogeneity is usually...... reflected in heterogeneous security aims; the software life cycle entails evolution and this includes security expectations; the distribution is useful if the entire system is “open” to new (a priori unknown) interactions; the distribution itself poses intrinsically more complex security-related problems...

  2. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

    This paper provides detailed insights into predictability of the entire stock and bond return distribution through the use of quantile regression. This allows us to examine specific parts of the return distribution such as the tails or the center, and for a sufficiently fine grid of quantiles we can...... trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions...... are predictable as a function of economic state variables. The results are, however, very different for stocks and bonds. The state variables primarily predict only location shifts in the stock return distribution, while they also predict changes in higher-order moments in the bond return distribution. Out...
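    The quantile-regression idea in this record, estimating a grid of conditional quantiles as functions of a state variable, can be sketched by minimizing the pinball (check) loss directly. The returns and the single state variable below are synthetic, not the US data the paper uses; in a pure location-shift model the slopes are roughly constant across quantiles.

    ```python
    # Sketch: quantile regression of returns on an economic state variable
    # by minimizing the pinball loss. All data are synthetic placeholders.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    n = 2000
    state = rng.normal(size=n)                    # synthetic state variable
    returns = 0.2 * state + rng.standard_t(5, n)  # location shift + fat tails

    def pinball_loss(params, q, y, x):
        """Check loss for quantile q of the linear model a + b*x."""
        resid = y - (params[0] + params[1] * x)
        return np.mean(np.maximum(q * resid, (q - 1) * resid))

    slopes = {}
    for q in (0.05, 0.25, 0.5, 0.75, 0.95):
        res = minimize(pinball_loss, x0=[0.0, 0.0], args=(q, returns, state),
                       method="Nelder-Mead")
        slopes[q] = res.x[1]

    # Near-constant slopes across q indicate a pure location shift; slopes
    # that diverge across q indicate changes in higher-order moments.
    for q, b in slopes.items():
        print(f"q={q:.2f}: slope={b:+.3f}")
    ```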

  3. Drinking Water Distribution Systems

    Science.gov (United States)

    Learn about an overview of drinking water distribution systems, the factors that degrade water quality in the distribution system, assessments of risk, future research about these risks, and how to reduce cross-connection control risk.

  4. Distributed System Control

    National Research Council Canada - National Science Library

    Berea, James

    1997-01-01

    Global control in distributed systems had not been well researched. Control had only been addressed in a limited manner, such as for data-update consistency in distributed, redundant databases or for confidentiality controls...

  5. An Extended Pareto Distribution

    OpenAIRE

    Mohamad Mead

    2014-01-01

    For the first time, a new continuous distribution, called the generalized beta exponentiated Pareto type I (GBEP) [McDonald exponentiated Pareto] distribution, is defined and investigated. The new distribution contains as special sub-models several well-known and new distributions, such as the generalized beta Pareto (GBP) [McDonald Pareto], the Kumaraswamy exponentiated Pareto (KEP), Kumaraswamy Pareto (KP), beta exponentiated Pareto (BEP), beta Pareto (BP), exponentiated Pareto (EP) and ...

  6. Leadership for Distributed Teams

    NARCIS (Netherlands)

    De Rooij, J.P.G.

    2009-01-01

    The aim of this dissertation was to study the little examined, yet important issue of leadership for distributed teams. Distributed teams are defined as: “teams of which members are geographically distributed and are therefore working predominantly via mediated communication means on an

  7. Distributed plot-making

    DEFF Research Database (Denmark)

    Jensen, Lotte Groth; Bossen, Claus

    2016-01-01

    different socio-technical systems (paper-based and electronic patient records). Drawing on the theory of distributed cognition and narrative theory, primarily inspired by the work done within health care by Cheryl Mattingly, we propose that the creation of overview may be conceptualised as ‘distributed plot......-making’. Distributed cognition focuses on the role of artefacts, humans and their interaction in information processing, while narrative theory focuses on how humans create narratives through the plot construction. Hence, the concept of distributed plot-making highlights the distribution of information processing...

  8. Extreme value distributions

    CERN Document Server

    Ahsanullah, Mohammad

    2016-01-01

    The aim of the book is to give a thorough account of the basic theory of extreme value distributions. The book covers a wide range of materials available to date and presents the central ideas and results of extreme value distributions. It will be useful to applied statisticians as well as statisticians interested in working in the area of extreme value distributions. The monograph gives a self-contained account of the theory and applications of extreme value distributions.

  9. Hierarchical species distribution models

    Science.gov (United States)

    Hefley, Trevor J.; Hooten, Mevin B.

    2016-01-01

    Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.

  10. Weighted Lomax distribution.

    Science.gov (United States)

    Kilany, N M

    2016-01-01

    The Lomax distribution (Pareto Type-II) is widely applicable in reliability and life testing problems in engineering as well as in survival analysis as an alternative distribution. In this paper, Weighted Lomax distribution is proposed and studied. The density function and its behavior, moments, hazard and survival functions, mean residual life and reversed failure rate, extreme values distributions and order statistics are derived and studied. The parameters of this distribution are estimated by the method of moments and the maximum likelihood estimation method and the observed information matrix is derived. Moreover, simulation schemes are derived. Finally, an application of the model to a real data set is presented and compared with some other well-known distributions.
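    The maximum likelihood step described in this record can be sketched with scipy's built-in Lomax (Pareto Type-II) distribution. The weighted variant proposed in the paper is not available in scipy, so the example below only illustrates MLE for the plain Lomax baseline on synthetic data.

    ```python
    # Sketch: MLE for the plain Lomax (Pareto Type-II) distribution, the
    # baseline that the Weighted Lomax extends. Data are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    true_shape = 2.5
    data = stats.lomax.rvs(true_shape, scale=1.0, size=5000, random_state=rng)

    # Fix loc=0 (support starts at zero); estimate shape and scale by MLE
    c_hat, loc, scale_hat = stats.lomax.fit(data, floc=0)
    print(f"shape ~ {c_hat:.2f}, scale ~ {scale_hat:.2f}")
    ```

    Fixing `floc=0` matters: letting scipy estimate a free location parameter for a distribution whose support starts at zero usually degrades the shape and scale estimates.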

  11. Leadership for Distributed Teams

    OpenAIRE

    De Rooij, J.P.G.

    2009-01-01

    The aim of this dissertation was to study the little examined, yet important issue of leadership for distributed teams. Distributed teams are defined as: “teams of which members are geographically distributed and are therefore working predominantly via mediated communication means on an interdependent task and in realizing a joint goal” (adapted from Bell & Kozlowski, 2002 and Dubé & Paré, 2004). Chapter 1 first presents the outline of the dissertation. Next, several characteristics of distri...

  12. Are Parton Distributions Positive?

    CERN Document Server

    Forte, Stefano; Ridolfi, Giovanni; Altarelli, Guido; Forte, Stefano; Ridolfi, Giovanni

    1999-01-01

    We show that the naive positivity conditions on polarized parton distributions which follow from their probabilistic interpretation in the naive parton model are reproduced in perturbative QCD at the leading log level if the quark and gluon distribution are defined in terms of physical processes. We show how these conditions are modified at the next-to-leading level, and discuss their phenomenological implications, in particular in view of the determination of the polarized gluon distribution

  13. Are parton distributions positive?

    International Nuclear Information System (INIS)

    Forte, Stefano; Altarelli, Guido; Ridolfi, Giovanni

    1999-01-01

    We show that the naive positivity conditions on polarized parton distributions which follow from their probabilistic interpretation in the naive parton model are reproduced in perturbative QCD at the leading log level if the quark and gluon distribution are defined in terms of physical processes. We show how these conditions are modified at the next-to-leading level, and discuss their phenomenological implications, in particular in view of the determination of the polarized gluon distribution

  14. Distributed Structure Searchable Toxicity

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Distributed Structure Searchable Toxicity (DSSTox) online resource provides high quality chemical structures and annotations in association with toxicity data....

  15. Electric distribution systems

    CERN Document Server

    Sallam, A A

    2010-01-01

    "Electricity distribution is the penultimate stage in the delivery of electricity to end users. The only book that deals with the key topics of interest to distribution system engineers, Electric Distribution Systems presents a comprehensive treatment of the subject with an emphasis on both the practical and academic points of view. Reviewing traditional and cutting-edge topics, the text is useful to practicing engineers working with utility companies and industry, undergraduate and graduate students, and faculty members who wish to increase their skills in distribution system automation and monitoring."--

  16. Cooling water distribution system

    Science.gov (United States)

    Orr, Richard

    1994-01-01

    A passive containment cooling system for a nuclear reactor containment vessel. Disclosed is a cooling water distribution system for introducing cooling water by gravity uniformly over the outer surface of a steel containment vessel using an interconnected series of radial guide elements, a plurality of circumferential collector elements and collector boxes to collect and feed the cooling water into distribution channels extending along the curved surface of the steel containment vessel. The cooling water is uniformly distributed over the curved surface by a plurality of weirs in the distribution channels.

  17. Distributed Energy Technology Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Distributed Energy Technologies Laboratory (DETL) is an extension of the power electronics testing capabilities of the Photovoltaic System Evaluation Laboratory...

  18. Advanced air distribution

    DEFF Research Database (Denmark)

    Melikov, Arsen Krikor

    2011-01-01

    The aim of total volume air distribution (TVAD) involves achieving uniform temperature and velocity in the occupied zone and an environment designed for an average occupant. The supply of large amounts of clean and cool air is needed to maintain temperature and pollution concentration at acceptable....... Ventilation in hospitals is essential to decrease the risk of airborne cross-infection. At present, mixing air distribution at a minimum of 12 ach is used in infection wards. Advanced air distribution has the potential to aid in achieving healthy, comfortable and productive indoor environments at levels...... higher than what can be achieved today with the commonly used total volume air distribution principles....

  19. Sorting a distribution theory

    CERN Document Server

    Mahmoud, Hosam M

    2011-01-01

    A cutting-edge look at the emerging distributional theory of sorting Research on distributions associated with sorting algorithms has grown dramatically over the last few decades, spawning many exact and limiting distributions of complexity measures for many sorting algorithms. Yet much of this information has been scattered in disparate and highly specialized sources throughout the literature. In Sorting: A Distribution Theory, leading authority Hosam Mahmoud compiles, consolidates, and clarifies the large volume of available research, providing a much-needed, comprehensive treatment of the

  20. Modelling the vertical distribution of canopy fuel load using national forest inventory and low-density airborne laser scanning data.

    Directory of Open Access Journals (Sweden)

    Eduardo González-Ferreiro

    Full Text Available The fuel complex variables canopy bulk density and canopy base height are often used to predict crown fire initiation and spread. Direct measurement of these variables is impractical, and they are usually estimated indirectly by modelling. Recent advances in predicting crown fire behaviour require accurate estimates of the complete vertical distribution of canopy fuels. The objectives of the present study were to model the vertical profile of available canopy fuel in pine stands by using data from the Spanish national forest inventory plus low-density airborne laser scanning (ALS) metrics. In a first step, the vertical distribution of the canopy fuel load was modelled using the Weibull probability density function. In a second step, two different systems of models were fitted to estimate the canopy variables defining the vertical distributions; the first system related these variables to stand variables obtained in a field inventory, and the second system related the canopy variables to airborne laser scanning metrics. The models of each system were fitted simultaneously to compensate for the effects of the inherent cross-model correlation between the canopy variables. Heteroscedasticity was also analyzed, but no correction in the fitting process was necessary. The estimated canopy fuel load profiles from field variables explained 84% and 86% of the variation in canopy fuel load for maritime pine and radiata pine respectively, whereas the estimated canopy fuel load profiles from ALS metrics explained 52% and 49% of the variation for the same species. The proposed models can be used to assess the effectiveness of different forest management alternatives for reducing crown fire hazard.
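    The first modelling step in this record, describing a vertical canopy-fuel-load profile with a Weibull density, can be sketched as a curve fit. The heights and load values below are invented for illustration (a noisy Weibull profile), not the Spanish inventory data.

    ```python
    # Sketch: fitting a Weibull probability density to a vertical profile of
    # canopy fuel load. Heights/loads are synthetic placeholders.
    import numpy as np
    from scipy import stats
    from scipy.optimize import curve_fit

    # Hypothetical relative fuel-load density sampled at canopy heights (m)
    heights = np.linspace(2, 20, 19)
    true_k, true_lam = 2.3, 11.0  # shape, scale of the "true" profile
    loads = stats.weibull_min.pdf(heights, true_k, scale=true_lam)
    loads = loads + np.random.default_rng(3).normal(0.0, 0.002, heights.size)

    def weibull_profile(h, k, lam):
        """Weibull density as a function of height."""
        return stats.weibull_min.pdf(h, k, scale=lam)

    (k_hat, lam_hat), _ = curve_fit(weibull_profile, heights, loads, p0=[2.0, 10.0])
    print(f"shape ~ {k_hat:.2f}, scale ~ {lam_hat:.2f}")
    ```

    In the study the Weibull parameters are in turn regressed on stand variables or ALS metrics; the fit above corresponds only to the per-plot profile step.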

  1. Smart Distribution Systems

    Directory of Open Access Journals (Sweden)

    Yazhou Jiang

    2016-04-01

    Full Text Available The increasing importance of system reliability and resilience is changing the way distribution systems are planned and operated. To achieve a distribution system self-healing against power outages, emerging technologies and devices, such as remote-controlled switches (RCSs and smart meters, are being deployed. The higher level of automation is transforming traditional distribution systems into the smart distribution systems (SDSs of the future. The availability of data and remote control capability in SDSs provides distribution operators with an opportunity to optimize system operation and control. In this paper, the development of SDSs and resulting benefits of enhanced system capabilities are discussed. A comprehensive survey is conducted on the state-of-the-art applications of RCSs and smart meters in SDSs. Specifically, a new method, called Temporal Causal Diagram (TCD, is used to incorporate outage notifications from smart meters for enhanced outage management. To fully utilize the fast operation of RCSs, the spanning tree search algorithm is used to develop service restoration strategies. Optimal placement of RCSs and the resulting enhancement of system reliability are discussed. Distribution system resilience with respect to extreme events is presented. Test cases are used to demonstrate the benefit of SDSs. Active management of distributed generators (DGs is introduced. Future research in a smart distribution environment is proposed.
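    The spanning-tree search mentioned in this record for service restoration can be reduced to a small graph sketch: model buses and switchable lines as a graph and grow a BFS tree from an energized source, which yields one radial set of switches to close. The feeder topology below is invented for illustration.

    ```python
    # Sketch: BFS spanning-tree search over a distribution feeder graph.
    # Closing the returned (parent, child) switches restores every bus
    # reachable from the source while keeping the network radial.
    from collections import deque

    feeder = {  # adjacency list: bus -> neighbouring buses via switches
        "source": ["b1", "b2"],
        "b1": ["source", "b3"],
        "b2": ["source", "b3", "b4"],
        "b3": ["b1", "b2"],
        "b4": ["b2"],
    }

    def restoration_tree(graph, root):
        """Return switch pairs (parent, child) forming a radial BFS tree."""
        tree, seen, queue = [], {root}, deque([root])
        while queue:
            bus = queue.popleft()
            for nxt in graph[bus]:
                if nxt not in seen:
                    seen.add(nxt)
                    tree.append((bus, nxt))
                    queue.append(nxt)
        return tree

    closed = restoration_tree(feeder, "source")
    print(closed)  # -> [('source', 'b1'), ('source', 'b2'), ('b1', 'b3'), ('b2', 'b4')]
    ```

    A practical restoration algorithm would additionally check line capacities and voltage limits before committing a tree; this sketch covers only the topology search.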

  2. Distributed Energy Implementation Options

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Chandralata N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-13

    This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.

  3. Cache Oblivious Distribution Sweeping

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.

    2002-01-01

    We adapt the distribution sweeping method to the cache oblivious model. Distribution sweeping is the name used for a general approach for divide-and-conquer algorithms where the combination of solved subproblems can be viewed as a merging process of streams. We demonstrate by a series of algorith...

  4. Distributed intelligence in CAMAC

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1977-01-01

    A simple extension of the CAMAC standard is described which allows distributed intelligence at the crate level. By distributed intelligence is meant that there is more than one source of control in a system. This standard is just now emerging from the NIM Dataway Working Group and its European counterpart. 1 figure

  5. Distributed Operating Systems

    NARCIS (Netherlands)

    Mullender, Sape J.

    1987-01-01

    In the past five years, distributed operating systems research has gone through a consolidation phase. On a large number of design issues there is now considerable consensus between different research groups. In this paper, an overview of recent research in distributed systems is given. In turn, the

  6. Distributed intelligence in CAMAC

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1977-01-01

    The CAMAC digital interface standard has served us well since 1969. During this time there have been enormous advances in digital electronics. In particular, low cost microprocessors now make it feasible to consider use of distributed intelligence even in simple data acquisition systems. This paper describes a simple extension of the CAMAC standard which allows distributed intelligence at the crate level

  7. Distributed Operating Systems

    NARCIS (Netherlands)

    Tanenbaum, A.S.; van Renesse, R.

    1985-01-01

    Distributed operating systems have many aspects in common with centralized ones, but they also differ in certain ways. This paper is intended as an introduction to distributed operating systems, and especially to current university research about them. After a discussion of what constitutes a

  8. Evaluating Distributed Timing Constraints

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing time-optimal evaluation of timing constraints in distributed real-time systems.

  9. Distributed Language and Dialogism

    DEFF Research Database (Denmark)

    Steffensen, Sune Vork

    2015-01-01

    addresses Linell’s critique of Distributed Language as rooted in biosemiotics and in theories of organism-environment systems. It is argued that Linell’s sense-based approach entails an individualist view of how conspecific Others acquire their status as prominent parts of the sense-maker’s environment......This article takes a starting point in Per Linell’s (2013) review article on the book Distributed Language (Cowley, 2011a) and other contributions to the field of ‘Distributed Language’, including Cowley et al. (2010) and Hodges et al. (2012). The Distributed Language approach is a naturalistic...... and anti-representational approach to language that builds on recent developments in the cognitive sciences. With a starting point in Linell’s discussion of the approach, the article aims to clarify four aspects of a distributed view of language vis-à-vis the tradition of Dialogism, as presented by Linell...

  10. Distributed Control Diffusion

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2007-01-01

    . Programming a modular, self-reconfigurable robot is however a complicated task: the robot is essentially a real-time, distributed embedded system, where control and communication paths often are tightly coupled to the current physical configuration of the robot. To facilitate the task of programming modular....... This approach allows the programmer to dynamically distribute behaviors throughout a robot and moreover provides a partial abstraction over the concrete physical shape of the robot. We have implemented a prototype of a distributed control diffusion system for the ATRON modular, self-reconfigurable robot......, self-reconfigurable robots, we present the concept of distributed control diffusion: distributed queries are used to identify modules that play a specific role in the robot, and behaviors that implement specific control strategies are diffused throughout the robot based on these role assignments...

  11. Distributed Robotics Education

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    Distributed robotics takes many forms, for instance, multirobots, modular robots, and self-reconfigurable robots. The understanding and development of such advanced robotic systems demand extensive knowledge in engineering and computer science. In this paper, we describe the concept...... of a distributed educational system as a valuable tool for introducing students to interactive parallel and distributed processing programming as the foundation for distributed robotics and human-robot interaction development. This is done by providing an educational tool that enables problem representation...... to be changed, related to multirobot control and human-robot interaction control from virtual to physical representation. The proposed system is valuable for bringing a vast number of issues into education – such as parallel programming, distribution, communication protocols, master dependency, connectivity...

  12. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  13. Development of distributed target

    CERN Document Server

    Yu Hai Jun; Li Qin; Zhou Fu Xin; Shi Jin Shui; Ma Bing; Chen Nan; Jing Xiao Bing

    2002-01-01

    Linear induction accelerators are expected to generate small-diameter X-ray spots with high intensity. The interaction of the electron beam with plasmas generated at the X-ray converter makes the spot on the target grow with time and degrades the X-ray dose and the imaging resolving power. A distributed target was developed which has about 24 pieces of thin 0.05 mm tantalum films distributed over 1 cm. Owing to this structure, distributing the target material over a large volume decreases the energy deposition per unit volume and hence reduces the temperature of the target surface, which in turn reduces the initial plasma formation and its expansion velocity. A comparison and analysis of the two target structures is presented using numerical calculation and experiments; the results show that the X-ray dose and normalized angular distribution of the two are basically the same, while the surface of the distributed target is not destroyed like that of the previous block target.

  14. Pervasive Electricity Distribution System

    Directory of Open Access Journals (Sweden)

    Muhammad Usman Tahir

    2017-06-01

    Full Text Available Nowadays a country cannot become economically strong unless it has enough electrical power to fulfil industrial and domestic needs. Electrical power, being the pillar of any country's economy, needs to be used in an efficient way. The same step is taken here by proposing a new system for energy distribution from substation to consumer houses, which also monitors consumer consumption and records data. Unlike traditional manual electrical systems, the pervasive electricity distribution system (PEDS) introduces a fresh perspective to monitor the feeder line status at distribution and consumer level. In this system an effort is made to address the issues of electricity theft, manual billing, online monitoring of the electrical distribution system and automatic control of electrical distribution points. The project is designed using a microcontroller and different sensors; its GUI is designed in LabVIEW software.

  15. Distributed Propulsion Vehicles

    Science.gov (United States)

    Kim, Hyun Dae

    2010-01-01

    Since the introduction of large jet-powered transport aircraft, the majority of these vehicles have been designed by placing thrust-generating engines either under the wings or on the fuselage to minimize aerodynamic interactions on the vehicle operation. However, advances in computational and experimental tools along with new technologies in materials, structures, and aircraft controls, etc. are enabling a high degree of integration of the airframe and propulsion system in aircraft design. The National Aeronautics and Space Administration (NASA) has been investigating a number of revolutionary distributed propulsion vehicle concepts to increase aircraft performance. The concept of distributed propulsion is to fully integrate a propulsion system within an airframe such that the aircraft takes full synergistic benefit of the coupling of airframe aerodynamics and the propulsion thrust stream by distributing thrust using many propulsors on the airframe. Some of the concepts are based on the use of distributed jet flaps, distributed small multiple engines, gas-driven multi-fans, mechanically driven multi-fans, cross-flow fans, and electric fans driven by turboelectric generators. This paper describes some early concepts of the distributed propulsion vehicles and the current turboelectric distributed propulsion (TeDP) vehicle concepts being studied under NASA's Subsonic Fixed Wing (SFW) Project to drastically reduce aircraft-related fuel burn, emissions, and noise by the year 2030 to 2035.

  16. Electric power distribution handbook

    CERN Document Server

    Short, Thomas Allen

    2014-01-01

    Of the "big three" components of electrical infrastructure, distribution typically gets the least attention. In fact, a thorough, up-to-date treatment of the subject hasn't been published in years, yet deregulation and technical changes have increased the need for better information. Filling this void, the Electric Power Distribution Handbook delivers comprehensive, cutting-edge coverage of the electrical aspects of power distribution systems. The first few chapters of this pragmatic guidebook focus on equipment-oriented information and applications such as choosing transformer connections,

  17. Annular Flow Distribution test

    Energy Technology Data Exchange (ETDEWEB)

    Kielpinski, A.L. (ed.) (Westinghouse Savannah River Co., Aiken, SC (United States)); Childerson, M.T.; Knoll, K.E.; Manolescu, M.I.; Reed, M.J. (Babcock and Wilcox Co., Alliance, OH (United States). Research Center)

    1990-12-01

    This report documents the Babcock and Wilcox (B&W) Annular Flow Distribution testing for the Savannah River Laboratory (SRL). The objective of the Annular Flow Distribution Test Program is to characterize the flow distribution between annular coolant channels for the Mark-22 fuel assembly with the bottom fitting insert (BFI) in place. Flow rate measurements for each annular channel were obtained by establishing "hydraulic similarity" between an instrumented fuel assembly with the BFI removed and a "reference" fuel assembly with the BFI installed. Empirical correlations of annular flow rates were generated for a range of boundary conditions.

  18. Annular Flow Distribution test

    International Nuclear Information System (INIS)

    Kielpinski, A.L.; Childerson, M.T.; Knoll, K.E.; Manolescu, M.I.; Reed, M.J.

    1990-12-01

    This report documents the Babcock and Wilcox (B&W) Annular Flow Distribution testing for the Savannah River Laboratory (SRL). The objective of the Annular Flow Distribution Test Program is to characterize the flow distribution between annular coolant channels for the Mark-22 fuel assembly with the bottom fitting insert (BFI) in place. Flow rate measurements for each annular channel were obtained by establishing "hydraulic similarity" between an instrumented fuel assembly with the BFI removed and a "reference" fuel assembly with the BFI installed. Empirical correlations of annular flow rates were generated for a range of boundary conditions.

  19. Sheaves of Schwartz distributions

    International Nuclear Information System (INIS)

    Damyanov, B.P.

    1991-09-01

    The theory of sheaves is a relevant mathematical language for describing the localization principle, known to be valid for the Schwartz distributions (generalized functions). After introducing some fundamentals of sheaves and the basic facts about distribution spaces, the distribution sheaf D_Ω of topological C-vector spaces over an open set Ω in R^n is systematically studied. A sheaf D_M of distributions on a C^∞-manifold M is then introduced, following a definition of Hörmander's for its particular elements. Further, a general definition of sheaves on a manifold that are locally isomorphic to (or, modelled on) a sheaf on R^n is proposed. The sheaf properties of D_M are studied and this sheaf is shown to be locally isomorphic to D_Ω, as a sheaf of topological vector spaces. (author). 14 refs

  20. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...... in conjunction with informal roles and relationships such as clan-like control inherent in agile development. Overall, the study demonstrates that, if appropriately applied, communication technologies can significantly support distributed, agile practices by allowing concurrent enactment of both formal...

  1. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  2. Distribution System White Papers

    Science.gov (United States)

    EPA worked with stakeholders and developed a series of white papers on distribution system issues ranked of potentially significant public health concern (see list below) to serve as background material for EPA, expert and stakeholder discussions.

  3. DOLIB: Distributed Object Library

    Energy Technology Data Exchange (ETDEWEB)

    D'Azevedo, E.F.

    1994-01-01

    This report describes the use and implementation of DOLIB (Distributed Object Library), a library of routines that emulates global or virtual shared memory on Intel multiprocessor systems. Access to a distributed global array is through explicit calls to gather and scatter. Advantages of using DOLIB include: dynamic allocation and freeing of huge (gigabyte) distributed arrays, both C and FORTRAN callable interfaces, and the ability to mix shared-memory and message-passing programming models for ease of use and optimal performance. DOLIB is independent of language and compiler extensions and requires no special operating system support. DOLIB also supports automatic caching of read-only data for high performance. The virtual shared memory support provided in DOLIB is well suited for implementing Lagrangian particle tracking techniques. We have also used DOLIB to create DONIO (Distributed Object Network I/O Library), which obtains over a 10-fold improvement in disk I/O performance on the Intel Paragon.
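The gather/scatter access pattern described above can be sketched in a few lines. The class and method names below are hypothetical illustrations of the concept only, not DOLIB's actual C/FORTRAN interface: a "global" array is block-distributed across owners, and all remote reads and writes go through explicit gather and scatter calls.

```python
class GlobalArray:
    """Emulates a global array block-distributed across processors.

    Illustrative sketch only; names and layout are assumptions,
    not DOLIB's real API.
    """
    def __init__(self, size, nprocs):
        self.chunk = (size + nprocs - 1) // nprocs
        # Each "processor" owns one contiguous block of the array.
        self.blocks = [[0.0] * max(0, min(self.chunk, size - p * self.chunk))
                       for p in range(nprocs)]

    def _locate(self, i):
        # Map a global index to (owning processor, local offset).
        return i // self.chunk, i % self.chunk

    def gather(self, indices):
        """Fetch values at arbitrary global indices (remote reads)."""
        return [self.blocks[p][off]
                for p, off in (self._locate(i) for i in indices)]

    def scatter(self, indices, values):
        """Write values to arbitrary global indices (remote writes)."""
        for i, v in zip(indices, values):
            p, off = self._locate(i)
            self.blocks[p][off] = v

ga = GlobalArray(size=10, nprocs=4)
ga.scatter([0, 5, 9], [1.0, 2.0, 3.0])
print(ga.gather([0, 5, 9]))  # [1.0, 2.0, 3.0]
```

In the real library the blocks live in separate address spaces and gather/scatter imply message passing; here they are plain list accesses, which is enough to show why explicit calls (rather than transparent indexing) make the communication cost visible to the programmer.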

  4. DOLIB: Distributed Object Library

    Energy Technology Data Exchange (ETDEWEB)

    D`Azevedo, E.F.; Romine, C.H.

    1994-10-01

    This report describes the use and implementation of DOLIB (Distributed Object Library), a library of routines that emulates global or virtual shared memory on Intel multiprocessor systems. Access to a distributed global array is through explicit calls to gather and scatter. Advantages of using DOLIB include: dynamic allocation and freeing of huge (gigabyte) distributed arrays, both C and FORTRAN callable interfaces, and the ability to mix shared-memory and message-passing programming models for ease of use and optimal performance. DOLIB is independent of language and compiler extensions and requires no special operating system support. DOLIB also supports automatic caching of read-only data for high performance. The virtual shared memory support provided in DOLIB is well suited for implementing Lagrangian particle tracking techniques. We have also used DOLIB to create DONIO (Distributed Object Network I/O Library), which obtains over a 10-fold improvement in disk I/O performance on the Intel Paragon.

  5. Global Landslide Hazard Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Landslide Hazard Distribution is a 2.5 minute grid of global landslide and snow avalanche hazards based upon work of the Norwegian Geotechnical Institute...

  6. Financing Distributed Generation

    Energy Technology Data Exchange (ETDEWEB)

    Walker, A.

    2001-06-29

    This paper introduces the engineer who is undertaking distributed generation projects to a wide range of financing options. Distributed generation systems (such as internal combustion engines, small gas turbines, fuel cells and photovoltaics) all require an initial investment, which is recovered over time through revenues or savings. An understanding of the cost of capital and financing structures helps the engineer develop realistic expectations and not be offended by the common requirements of financing organizations. This paper discusses several mechanisms for financing distributed generation projects: appropriations; debt (commercial bank loan); mortgage; home equity loan; limited partnership; vendor financing; general obligation bond; revenue bond; lease; Energy Savings Performance Contract; utility programs; chauffage (end-use purchase); and grants. The paper also discusses financial strategies for businesses focusing on distributed generation: venture capital; informal investors ("business angels"); bank and debt financing; and the stock market.

  7. Distributed debugging and tumult

    NARCIS (Netherlands)

    Scholten, Johan; Jansen, P.G.

    1990-01-01

    A description is given of Tumult (Twente university multicomputer) and its operating system, along with considerations about parallel debugging, examples of parallel debuggers, and the proposed debugger for Tumult. Problems related to debugging distributed systems and solutions found in other

  8. Financing Distributed Generation

    International Nuclear Information System (INIS)

    Walker, A.

    2001-01-01

    This paper introduces the engineer who is undertaking distributed generation projects to a wide range of financing options. Distributed generation systems (such as internal combustion engines, small gas turbines, fuel cells and photovoltaics) all require an initial investment, which is recovered over time through revenues or savings. An understanding of the cost of capital and financing structures helps the engineer develop realistic expectations and not be offended by the common requirements of financing organizations. This paper discusses several mechanisms for financing distributed generation projects: appropriations; debt (commercial bank loan); mortgage; home equity loan; limited partnership; vendor financing; general obligation bond; revenue bond; lease; Energy Savings Performance Contract; utility programs; chauffage (end-use purchase); and grants. The paper also discusses financial strategies for businesses focusing on distributed generation: venture capital; informal investors ("business angels"); bank and debt financing; and the stock market.

  9. Diphoton generalized distribution amplitudes

    International Nuclear Information System (INIS)

    El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.

    2008-01-01

    We calculate the leading order diphoton generalized distribution amplitudes by calculating the amplitude of the process γ*γ→γγ in the low energy and high photon virtuality region at the Born order and in the leading logarithmic approximation. As in the case of the anomalous photon structure functions, the γγ generalized distribution amplitudes exhibit a characteristic ln Q² behavior and obey inhomogeneous QCD evolution equations.

  10. Distributed password cracking

    OpenAIRE

    Crumpacker, John R.

    2009-01-01

    Approved for public release, distribution unlimited Password cracking requires significant processing power, which in today's world is located at a workstation or home in the form of a desktop computer. Berkeley Open Infrastructure for Network Computing (BOINC) is the conduit to this significant source of processing power and John the Ripper is the key. BOINC is a distributed data processing system that incorporates client-server relationships to generically process data. The BOINC structu...

  11. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  12. Parton Distributions Working Group

    International Nuclear Information System (INIS)

    Barbaro, L. de; Keller, S. A.; Kuhlmann, S.; Schellman, H.; Tung, W.-K.

    2000-01-01

    This report summarizes the activities of the Parton Distributions Working Group of the QCD and Weak Boson Physics workshop held in preparation for Run II at the Fermilab Tevatron. The main focus of this working group was to investigate the different issues associated with the development of quantitative tools to estimate parton distribution functions uncertainties. In the conclusion, the authors introduce a Manifesto that describes an optimal method for reporting data

  13. Uniform-Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Kareema Abid Al Kadhim

    2017-11-01

    Full Text Available We introduce the uniform-Pareto distribution (U-PD) and discuss some of its properties: distribution function, probability density, reliability function, hazard and reversed hazard functions, moments, mode, median, and its order statistics. Furthermore, the study estimates the shape parameter. We also present a simulation study of the estimation of the parameter and the survival function, and an application using data on spina bifida, the most common birth defect in Babylon province.

  14. Agile & Distributed Project Management

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene

    2011-01-01

    Scrum has gained surprising momentum as an agile IS project management approach. An obvious question is why Scrum is so useful? To answer that question we carried out a longitudinal study of a distributed project using Scrum. We analyzed the data using coding and categorisation and three carefully...... and coordination mechanisms by allowing both local and global articulation of work in the project. That is why Scrum is especially useful for distributed IS project management and teamwork....

  15. Parton Distributions Working Group

    Energy Technology Data Exchange (ETDEWEB)

    de Barbaro, L.; Keller, S. A.; Kuhlmann, S.; Schellman, H.; Tung, W.-K.

    2000-07-20

    This report summarizes the activities of the Parton Distributions Working Group of the QCD and Weak Boson Physics workshop held in preparation for Run II at the Fermilab Tevatron. The main focus of this working group was to investigate the different issues associated with the development of quantitative tools to estimate parton distribution functions uncertainties. In the conclusion, the authors introduce a Manifesto that describes an optimal method for reporting data.

  16. TensorFlow Distributions

    OpenAIRE

    Dillon, Joshua V.; Langmore, Ian; Tran, Dustin; Brevdo, Eugene; Vasudevan, Srinivas; Moore, Dave; Patton, Brian; Alemi, Alex; Hoffman, Matt; Saurous, Rif A.

    2017-01-01

    The TensorFlow Distributions library implements a vision of probability theory adapted to the modern deep-learning paradigm of end-to-end differentiable computation. Building on two basic abstractions, it offers flexible building blocks for probabilistic computation. Distributions provide fast, numerically stable methods for generating samples and computing statistics, e.g., log density. Bijectors provide composable volume-tracking transformations with automatic caching. Together these enable...

  17. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  18. Distribution System Pricing with Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Hledik, Ryan [The Brattle Group, Cambridge, MA (United States); Lazar, Jim [The Regulatory Assistance Project, Montpelier, VT (United States); Schwartz, Lisa [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-08-16

    Technological changes in the electric utility industry bring tremendous opportunities and significant challenges. Customers are installing clean sources of on-site generation such as rooftop solar photovoltaic (PV) systems. At the same time, smart appliances and control systems that can communicate with the grid are entering the retail market. Among the opportunities these changes create are a cleaner and more diverse power system, the ability to improve system reliability and system resilience, and the potential for lower total costs. Challenges include integrating these new resources in a way that maintains system reliability, provides an equitable sharing of system costs, and avoids unbalanced impacts on different groups of customers, including those who install distributed energy resources (DERs) and low-income households who may be the least able to afford the transition.

  19. An Extended Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Mohamad Mead

    2014-10-01

    Full Text Available For the first time, a new continuous distribution, called the generalized beta exponentiated Pareto type I (GBEP [McDonald exponentiated Pareto] distribution, is defined and investigated. The new distribution contains as special sub-models some well-known and not known distributions, such as the generalized beta Pareto (GBP [McDonald Pareto], the Kumaraswamy exponentiated Pareto (KEP, Kumaraswamy Pareto (KP, beta exponentiated Pareto (BEP, beta Pareto (BP, exponentiated Pareto (EP and Pareto, among several others. Various structural properties of the new distribution are derived, including explicit expressions for the moments, moment generating function, incomplete moments, quantile function, mean deviations and Rényi entropy. Lorenz, Bonferroni and Zenga curves are derived. The method of maximum likelihood is proposed for estimating the model parameters. We obtain the observed information matrix. The usefulness of the new model is illustrated by means of two real data sets. We hope that this generalization may attract wider applications in reliability, biology and lifetime data analysis.
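The abstract above proposes maximum likelihood for the model parameters. For the plain Pareto sub-model with known scale x_m, the MLE of the shape has a closed form, alpha_hat = n / Σ ln(x_i / x_m). The sketch below illustrates only that special case, not the full GBEP fit; the parameter values and sample size are arbitrary choices for the demonstration.

```python
import math
import random

def pareto_mle_alpha(xs, xm):
    """Closed-form MLE of the Pareto shape alpha for known scale xm:
    alpha_hat = n / sum(ln(x_i / xm))."""
    return len(xs) / sum(math.log(x / xm) for x in xs)

random.seed(1)
xm, alpha = 1.0, 2.5
# Inverse-CDF sampling: X = xm * U**(-1/alpha) for U ~ Uniform(0, 1)
xs = [xm * random.random() ** (-1.0 / alpha) for _ in range(50_000)]

est = pareto_mle_alpha(xs, xm)
print(round(est, 2))  # close to the true alpha of 2.5
```

With 50,000 samples the standard error of the estimate is roughly alpha / sqrt(n) ≈ 0.01, so the estimate lands very near the true value; the same likelihood machinery, applied numerically, is what a full fit of the generalized model would require.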

  20. Distributed processor systems

    International Nuclear Information System (INIS)

    Zacharov, B.

    1976-01-01

    In recent years, there has been a growing tendency in high-energy physics and in other fields to solve computational problems by distributing tasks among the resources of inter-coupled processing devices and associated system elements. This trend has gained further momentum more recently with the increased availability of low-cost processors and with the development of the means of data distribution. In two lectures, the broad question of distributed computing systems is examined and the historical development of such systems reviewed. An attempt is made to examine the reasons for the existence of these systems and to discern the main trends for the future. The components of distributed systems are discussed in some detail and particular emphasis is placed on the importance of standards and conventions in certain key system components. The ideas and principles of distributed systems are discussed in general terms, but these are illustrated by a number of concrete examples drawn from the context of the high-energy physics environment. (Auth.)

  1. A New Distribution-Random Limit Normal Distribution

    OpenAIRE

    Gong, Xiaolin; Yang, Shuzhen

    2013-01-01

    This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.

  2. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
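The heavy-tail effect of superposing Gaussians of different variances can be checked numerically. This minimal sketch (an equal-weight mixture over two arbitrarily chosen variances, not a distribution from the paper) shows the mixture's kurtosis exceeding the Gaussian value of 3, i.e. the superposition is leptokurtic.

```python
import random
import statistics

random.seed(0)

def mixture_sample(n):
    """Draw from an equal-weight superposition of N(0, v) with v in {0.25, 4.0}."""
    xs = []
    for _ in range(n):
        v = random.choice([0.25, 4.0])  # smearing distribution over variances
        xs.append(random.gauss(0.0, v ** 0.5))
    return xs

xs = mixture_sample(100_000)
m2 = statistics.fmean(x * x for x in xs)
m4 = statistics.fmean(x ** 4 for x in xs)
kurtosis = m4 / m2 ** 2
# A single Gaussian has kurtosis exactly 3; for this mixture the
# theoretical value is 3 * E[v^2] / E[v]^2 ≈ 5.3.
print(round(kurtosis, 1))
```

The excess kurtosis is exactly the memory effect in miniature: conditioning on the drawn variance v restores Gaussianity, but marginalizing over v produces fatter tails than any single member of the family.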

  3. Distributed data transmitter

    Science.gov (United States)

    Brown, Kenneth Dewayne (Grain Valley, MO); Dunson, David (Kansas City, MO)

    2008-06-03

    A distributed data transmitter (DTXR) which is an adaptive data communication microwave transmitter having a distributable architecture of modular components, and which incorporates both digital and microwave technology to provide substantial improvements in physical and operational flexibility. The DTXR has application in, for example, remote data acquisition involving the transmission of telemetry data across a wireless link, wherein the DTXR is integrated into and utilizes available space within a system (e.g., a flight vehicle). In a preferred embodiment, the DTXR broadly comprises a plurality of input interfaces; a data modulator; a power amplifier; and a power converter, all of which are modularly separate and distinct so as to be substantially independently physically distributable and positionable throughout the system wherever sufficient space is available.

  4. Electricity Distribution Effectiveness

    Directory of Open Access Journals (Sweden)

    Waldemar Szpyra

    2015-12-01

    Full Text Available This paper discusses the basic concepts of cost accounting in the power industry and selected ways of assessing the effectiveness of electricity distribution. The results of effectiveness analysis of MV/LV distribution transformer replacement are presented, and unit costs of energy transmission through various medium-voltage line types are compared. The calculation results confirm the viability of replacing transformers manufactured before 1975. Replacing transformers manufactured after 1975 – only to reduce energy losses – is not economically justified. Increasing use of a PAS type line for energy transmission in local distribution networks is reasonable. Cabling these networks under the current calculation rules of discounts for excessive power outages is not viable, even in areas particularly exposed to catastrophic wire icing.

  5. Industrial power distribution

    CERN Document Server

    Fehr, Ralph

    2016-01-01

    In this fully updated version of Industrial Power Distribution, the author addresses key areas of electric power distribution from an end-user perspective for both electrical engineers, as well as students who are training for a career in the electrical power engineering field. Industrial Power Distribution, Second Edition, begins by describing how industrial facilities are supplied from utility sources, which is supported with background information on the components of AC power, voltage drop calculations, and the sizing of conductors and transformers. Important concepts and discussions are featured throughout the book including those for sequence networks, ladder logic, motor application, fault calculations, and transformer connections. The book concludes with an introduction to power quality, how it affects industrial power systems, and an expansion of the concept of power factor, including a distortion term made necessary by the existence of harmonics.

  6. Distributed Wind Market Applications

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, T.; Baring-Gould, I.

    2007-11-01

    Distributed wind energy systems provide clean, renewable power for on-site use and help relieve pressure on the power grid while providing jobs and contributing to energy security for homes, farms, schools, factories, private and public facilities, distribution utilities, and remote locations. America pioneered small wind technology in the 1920s, and it is the only renewable energy industry segment that the United States still dominates in technology, manufacturing, and world market share. The series of analyses covered by this report were conducted to assess some of the most likely ways that advanced wind turbines could be utilized apart from large, central station power systems. Each chapter represents a final report on specific market segments written by leading experts in this field. As such, this document does not speak with one voice but rather a compendium of different perspectives, which are documented from a variety of people in the U.S. distributed wind field.

  7. Discrete Pearson distributions

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, K.O. [Oak Ridge National Lab., TN (United States); Shenton, L.R. [Georgia Univ., Athens, GA (United States); Kastenbaum, M.A. [Kastenbaum (M.A.), Basye, VA (United States)

    1991-11-01

    These distributions are generated by a first order recursive scheme which equates the ratio of successive probabilities to the ratio of two corresponding quadratics. The use of a linearized form of this model will produce equations in the unknowns matched by an appropriate set of moments (assumed to exist). Given the moments we may find valid solutions. There are two cases: (1) distributions defined on the non-negative integers (finite or infinite) and (2) distributions defined on negative integers as well. For (1), given the first four moments, it is possible to set this up as equations of finite or infinite degree in the probability of a zero occurrence, the sth component being a product of s ratios of linear forms in this probability in general. For (2) the equation for the zero probability is purely linear but may involve slowly converging series; here a particular case is the discrete normal. Regions of validity are being studied. 11 refs.
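The first-order recursive scheme described above is straightforward to implement directly. The sketch below (function names are our own, not from the report) builds probabilities from the recursion p[x+1]/p[x] = num(x)/den(x), and recovers the Poisson distribution as the special case where the two quadratics degenerate to the constant λ and the linear form x + 1.

```python
import math

def pearson_discrete(p0, num, den, nmax):
    """Generate probabilities on 0..nmax from the first-order recursion
    p[x+1] / p[x] = num(x) / den(x), a ratio of two polynomials in x."""
    p = [p0]
    for x in range(nmax):
        p.append(p[-1] * num(x) / den(x))
    return p

# Poisson(lam) is the special case num(x) = lam, den(x) = x + 1:
lam = 3.0
probs = pearson_discrete(math.exp(-lam), lambda x: lam, lambda x: x + 1.0, 20)

# Check term 5 against the closed form lam**x * exp(-lam) / x!
print(abs(probs[5] - lam ** 5 * math.exp(-lam) / math.factorial(5)) < 1e-12)  # True
```

Other members of the family (e.g. negative binomial, hypergeometric) arise from genuinely quadratic numerators and denominators; matching the recursion's coefficients to the first four moments is the fitting problem the abstract describes.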

  8. Distributed System Contract Monitoring

    Directory of Open Access Journals (Sweden)

    Adrian Francalanza Ph.D

    2011-09-01

    Full Text Available The use of behavioural contracts, to specify, regulate and verify systems, is particularly relevant to runtime monitoring of distributed systems. System distribution poses major challenges to contract monitoring, from monitoring-induced information leaks to computation load balancing, communication overheads and fault-tolerance. We present mDPi, a location-aware process calculus, for reasoning about monitoring of distributed systems. We define a family of Labelled Transition Systems for this calculus, which allow formal reasoning about different monitoring strategies at different levels of abstractions. We also illustrate the expressivity of the calculus by showing how contracts in a simple contract language can be synthesised into different mDPi monitors.

  9. A distribution network review

    International Nuclear Information System (INIS)

    Fairbairn, R.J.; Maunder, D.; Kenyon, P.

    1999-01-01

    This report summarises the findings of a study reviewing the distribution network in England, Scotland and Wales to evaluate its ability to accommodate more embedded generation from both fossil fuel and renewable energy sources. The background to the study is traced, and descriptions of the existing electricity supply system, the licence conditions relating to embedded generation, and the effects of the Review of Electricity Trading Arrangements are given. The ability of the UK distribution networks to accept embedded generation is examined, and technical benefits/drawbacks arising from embedded generation, and the potential for uptake of embedded generation technologies are considered. The distribution network capacity and the potential uptake of embedded generation are compared, and possible solutions to overcome obstacles are suggested. (UK)

  10. A distribution network review

    Energy Technology Data Exchange (ETDEWEB)

    Fairbairn, R.J.; Maunder, D.; Kenyon, P.

    1999-07-01

    This report summarises the findings of a study reviewing the distribution network in England, Scotland and Wales to evaluate its ability to accommodate more embedded generation from both fossil fuel and renewable energy sources. The background to the study is traced, and descriptions of the existing electricity supply system, the licence conditions relating to embedded generation, and the effects of the Review of Electricity Trading Arrangements are given. The ability of the UK distribution networks to accept embedded generation is examined, and technical benefits/drawbacks arising from embedded generation, and the potential for uptake of embedded generation technologies are considered. The distribution network capacity and the potential uptake of embedded generation are compared, and possible solutions to overcome obstacles are suggested. (UK)

  11. Spectral distributions and symmetries

    International Nuclear Information System (INIS)

    Quesne, C.

    1980-01-01

    As it is now well known, the spectral distribution method has both statistical and group theoretical aspects which make for great simplifications in many-Fermion system calculations with respect to more conventional ones. Although both aspects intertwine and are equally essential to understand what is going on, we are only going to discuss some of the group theoretical aspects, namely those connected with the propagation of information, in view of their fundamental importance for the actual calculations of spectral distributions. To be more precise, let us recall that the spectral distribution method may be applied in principle to many-Fermion spaces which have a direct-product structure, i.e., are obtained by distributing a certain number n of Fermions over N single-particle states (0 less than or equal to n less than or equal to N), as it is the case for instance for the nuclear shell model spaces. For such systems, the operation of a central limit theorem is known to provide us with a simplifying principle which, when used in conjunction with exact or broken symmetries, enables us to make definite predictions in those cases which are not amenable to exact shell model diagonalizations. The distribution (in energy) of the states corresponding to a fixed symmetry is then defined by a small number of low-order energy moments. Since the Hamiltonian is defined in few-particle subspaces embedded in the n-particle space, the low-order moments, we are interested in, can be expressed in terms of simpler quantities defined in those few-particle subspaces: the information is said to propagate from the simple subspaces to the more complicated ones. The possibility of actually calculating spectral distributions depends upon the finding of simple ways to propagate the information

  12. Remote entanglement distribution

    International Nuclear Information System (INIS)

    Sanders, B.C.; Gour, G.; Meyer, D.A.

    2005-01-01

    Full text: Shared bipartite entanglement is a crucial shared resource for many quantum information tasks such as teleportation, entanglement swapping, and remote state preparation. In general different nodes of a quantum network share an entanglement resource, such as ebits, that are consumed during the task. In practice, generating entangled states is expensive, but here we establish a protocol by which a quantum network requires only a single supplier of entanglement to all nodes who, by judicious measurements and classical communication, provides the nodes with a unique pairwise entangled state independent of the measurement outcome. Furthermore, we extend this result to a chain of suppliers and nodes, which enables an operational interpretation of concurrence. In the special case that the supplier shares bipartite states with two nodes, and such states are pure and maximally entangled, our protocol corresponds to entanglement swapping. However, in the practical case that initial shared entanglement between suppliers and nodes involves partially entangled or mixed states, we show that general local operations and classical communication by all parties (suppliers and nodes) yields distributions of entangled states between nodes. In general a distribution of bipartite entangled states between any two nodes will include states that do not have the same entanglement; thus we name this general process remote entanglement distribution. In our terminology entanglement swapping with partially entangled states is a particular class of remote entanglement distribution protocols. Here we identify which distributions of states can or cannot be created by remote entanglement distribution. In particular we prove a powerful theorem that establishes an upper bound on the entanglement of formation that can be produced between two qubit nodes.
We extend this result to the case of a linear chain of parties that play the roles of suppliers and nodes; this extension provides

  13. Analysis on Voltage Profile of Distribution Network with Distributed Generation

    Science.gov (United States)

    Shao, Hua; Shi, Yujie; Yuan, Jianpu; An, Jiakun; Yang, Jianhua

    2018-02-01

    Penetration of distributed generation has impacts on a distribution network's load flow, voltage profile, reliability, power loss and so on. After these impacts and the typical structures of grid-connected distributed generation are analyzed, the back/forward sweep method for load flow calculation of the distribution network is modelled to include distributed generation. The voltage profiles of the distribution network as affected by the installation location and the capacity of distributed generation are thoroughly investigated and simulated. The impacts on the voltage profiles are summarized and corresponding suggestions on the installation location and the capacity of distributed generation are given.
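    The back/forward sweep idea described above can be sketched for a toy radial feeder, with the distributed generator modelled as a negative load. All network data below (impedances, loads, DG injection, bus count) are hypothetical illustration values, not taken from the paper.

```python
# Toy 4-bus radial feeder solved by backward/forward sweep load flow.
# Bus 0 is the slack (substation); all quantities are hypothetical per-unit data.
line_z = [complex(0.02, 0.04)] * 3                      # impedance of line feeding bus k+1
s_load = [0.03 + 0.01j, 0.02 + 0.005j, 0.04 + 0.015j]   # complex bus loads
s_dg = [0.0, 0.025, 0.0]                                # DG injection = negative load

v = [1.0 + 0j] * 4                                      # flat start; slack held at 1.0 pu
for _ in range(20):
    # Backward sweep: accumulate branch currents from the feeder end.
    i_branch = [0j] * 3
    for k in reversed(range(3)):
        i_inj = ((s_load[k] - s_dg[k]) / v[k + 1]).conjugate()  # I = (S/V)*
        i_branch[k] = i_inj + (i_branch[k + 1] if k + 1 < 3 else 0j)
    # Forward sweep: update voltages from the slack bus outward.
    for k in range(3):
        v[k + 1] = v[k] - i_branch[k] * line_z[k]

for k, vk in enumerate(v):
    print(f"bus {k}: |V| = {abs(vk):.4f} pu")
```

Raising `s_dg[1]` lifts the voltage profile along the feeder, which is the installation-location and capacity effect the record studies.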

  14. Rayleigh Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Kareema Abed Al-Kadim

    2017-12-01

    Full Text Available In this paper the Rayleigh Pareto distribution, denoted by R_PD, is introduced. We state some useful functions and give some of its properties, such as the entropy function, mean, mode, median, variance, the r-th moment about the mean, the r-th moment about the origin, the reliability and hazard functions, and the coefficients of variation, skewness and kurtosis. Finally, we estimate the parameters; the aim of this work is to introduce a new distribution.

  15. Liquidity, welfare and distribution

    Directory of Open Access Journals (Sweden)

    Martín Gil Samuel

    2012-01-01

    Full Text Available This work presents a dynamic general equilibrium model where wealth distribution is endogenous. I provide channels of causality that suggest a complex relationship between financial markets and real activity, which breaks down the classical dichotomy. As a consequence, the Friedman rule does not hold. In terms of the current events taking place in the world economy, this paper provides a rationale to warn against the perils of an economy satiated with liquidity. Efficiency and distribution cannot thus be considered as separate attributes once we account for the interactions between financial markets and economic performance.

  16. The Signal Distribution System

    CERN Document Server

    Belohrad, D; CERN. Geneva. AB Department

    2005-01-01

    For the purpose of LHC signal observation and high-frequency signal distribution, the Signal Distribution System (SDS) was built. The SDS can contain up to 5 switching elements, where each element allows the user to select one of up to 8 bi-directional signals. Coaxial relays are used to switch the signals. Depending on the type of coaxial relay used, the transfer bandwidth can reach 18 GHz. The SDS is controllable via TCP/IP, the parallel port, or locally by a rotary switch.

  17. Distributed Project Work

    DEFF Research Database (Denmark)

    Borch, Ole; Kirkegaard, B.; Knudsen, Morten

    1998-01-01

    in a distributed fashion over the Internet needs more attention to the interaction protocol, since the physical group room does not exist. The purpose of this paper is to develop a method for online project work using the product Basic Support for Cooperative Work (BSCW). An analysis of a well-proven protocol... to be very precise and used with success on the second test group. Distributed project work is coming soon, and with little improvement in server tools, projects on different topics with a large and inhomogeneous profile of users are realistic....

  18. A Distributed Tier-1

    DEFF Research Database (Denmark)

    Fischer, Lars; Grønager, Michael; Kleist, Josva

    2008-01-01

    The Tier-1 facility operated by the Nordic DataGrid Facility (NDGF) differs significantly from other Tier-1s in several aspects: firstly, it is not located at one or a few premises, but instead is distributed throughout the Nordic countries; secondly, it is not under the governance of a single organization but instead is a meta-center built of resources under the control of a number of different national organizations. We present some technical implications of these aspects as well as the high-level design of this distributed Tier-1. The focus will be on computing services, storage and monitoring.

  19. Securing Distributed Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Global science calls for global infrastructure. A typical large-scale research group will use a suite of international services and involve hundreds of collaborating institutes and users from around the world. How can these users access those services securely? How can their digital identities be established, verified and maintained? We will explore the motivation for distributed authentication and the ways in which research communities are addressing the challenges. We will discuss security incident response in distributed environments - a particular challenge for the operators of these infrastructures. Through this course you should gain an overview of federated identity technologies and protocols, including x509 certificates, SAML and OIDC.

  20. Distributed photovoltaic grid transformers

    CERN Document Server

    Shertukde, Hemchandra Madhusudan

    2014-01-01

    The demand for alternative energy sources fuels the need for electric power and controls engineers to possess a practical understanding of transformers suitable for solar energy. Meeting that need, Distributed Photovoltaic Grid Transformers begins by explaining the basic theory behind transformers in the solar power arena, and then progresses to describe the development, manufacture, and sale of distributed photovoltaic (PV) grid transformers, which help boost the electric DC voltage (generally at 30 volts) harnessed by a PV panel to a higher level (generally at 115 volts or higher) once it is

  1. Nuclear parton distributions

    Directory of Open Access Journals (Sweden)

    Kulagin S. A.

    2017-01-01

    Full Text Available We review a microscopic model of the nuclear parton distribution functions, which accounts for a number of nuclear effects including Fermi motion and nuclear binding, nuclear meson-exchange currents, off-shell corrections to bound nucleon distributions and nuclear shadowing. We also discuss applications of this model to a number of processes including lepton-nucleus deep inelastic scattering, proton-nucleus Drell-Yan lepton pair production at Fermilab, as well as W± and Z0 boson production in proton-lead collisions at the LHC.

  2. Theory of distributions

    CERN Document Server

    Georgiev, Svetlin G

    2015-01-01

    This book explains many fundamental ideas on the theory of distributions. The theory of partial differential equations is one of the synthetic branches of analysis that combines ideas and methods from different fields of mathematics, ranging from functional analysis and harmonic analysis to differential geometry and topology. This presents specific difficulties to those studying this field. This book, which consists of 10 chapters, is suitable for upper undergraduate/graduate students and mathematicians seeking an accessible introduction to some aspects of the theory of distributions. It can also be used for a one-semester course.

  3. Distributed Spacecraft Control Architectures

    Science.gov (United States)

    Carpenter, James Russell; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    A fundamental issue for estimation and control of distributed systems such as formation flying spacecraft is the information exchange architecture. In centralized schemes, each subordinate need only share its measurement data with a central hub, and the subordinates depend on the center to direct their actions. In decentralized schemes, all nodes participate in the data exchange, so that each has the same information as the center, and may thereby self-direct the same action that the center would have commanded, assuming all share a common goal. This talk compares and contrasts the centralized and decentralized schemes in the context of autonomously maintaining a distributed satellite formation.

  4. Analysis of meteorological droughts and dry spells in semiarid regions: a comparative analysis of probability distribution functions in the Segura Basin (SE Spain)

    Science.gov (United States)

    Pérez-Sánchez, Julio; Senent-Aparicio, Javier

    2017-08-01

    Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment and whose consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study characterizes rainfall drought in the Segura River basin in eastern Spain, marked by the seasonal nature of these latitudes. A daily precipitation data set was used for 29 weather stations over a period of 20 years (1993-2013). Four sets of dry spell lengths (complete series, monthly maxima, seasonal maxima, and annual maxima) are fitted for all the weather stations with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offers a good fit for all the weather stations, with the Wakeby distribution giving the best result, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of dry spell duration probabilities for return periods of 2, 5, 10, and 25 years reveal the northeast-southeast gradient, with increasing periods of annual rainfall of less than 0.1 mm in the eastern third of the basin, in the proximity of the Mediterranean slope.
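    The fit-and-test procedure described above can be sketched with SciPy for a single station: fit candidate families to a series of annual-maximum dry-spell lengths and compare Kolmogorov-Smirnov statistics. The data below are synthetic stand-ins, not the Segura Basin records, and only two of the study's fourteen candidate families are shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic annual-maximum dry-spell lengths (days) for one hypothetical station.
annual_max = rng.weibull(1.5, size=20) * 40 + 10

# Two of the candidate families from the study, under their SciPy names
# ("weibull_min" for Weibull, "genextreme" for generalized extreme value).
for name in ("weibull_min", "genextreme"):
    dist = getattr(stats, name)
    params = dist.fit(annual_max)                     # maximum-likelihood fit
    ks = stats.kstest(annual_max, name, args=params)  # one-sample K-S test
    print(f"{name:12s} D={ks.statistic:.3f} p={ks.pvalue:.3f}")
```

The family with the largest p value (equivalently, the smallest K-S statistic D) would be retained, mirroring how Wakeby was selected in the study.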

  5. Gap length distributions by PEPR

    International Nuclear Information System (INIS)

    Warszawer, T.N.

    1980-01-01

    Conditions guaranteeing exponential gap length distributions are formulated and discussed. Exponential gap length distributions of bubble chamber tracks first obtained on a CRT device are presented. Distributions of resulting average gap lengths and their velocity dependence are discussed. (orig.)

  6. Bilateral matrix-exponential distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Esparza, Luz Judith R; Nielsen, Bo Friis

    2012-01-01

    In this article we define the classes of bilateral and multivariate bilateral matrix-exponential distributions. These distributions have support on the entire real space and have rational moment-generating functions. These distributions extend the class of bilateral phase-type distributions of [1]. As an application we demonstrate that certain multivariate distributions, which are governed by the underlying Markov jump process generating a phase-type distribution, have a bilateral matrix-exponential distribution at the time of absorption, see also [4].
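    The absorption mechanism behind phase-type distributions can be illustrated numerically: simulate a Markov jump process on the transient states until absorption and compare the empirical mean with the analytic value E[tau] = alpha(-T)^{-1}1. The sub-generator and initial distribution below are hypothetical illustration values, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical phase-type example: two transient states with sub-generator T.
T = np.array([[-3.0, 1.0],
              [0.5, -2.0]])
t_exit = -T.sum(axis=1)        # absorption rate out of each transient state
alpha = np.array([0.7, 0.3])   # initial distribution over transient states

def sample_absorption_time():
    state = rng.choice(2, p=alpha)
    t = 0.0
    while True:
        rate = -T[state, state]                  # total event rate in this state
        t += rng.exponential(1.0 / rate)
        if rng.random() < t_exit[state] / rate:  # absorb vs. jump onward
            return t
        state = 1 - state                        # only one other transient state

# Analytic mean time to absorption: E[tau] = alpha (-T)^{-1} 1.
mean_analytic = alpha @ np.linalg.solve(-T, np.ones(2))
draws = [sample_absorption_time() for _ in range(50_000)]
print(mean_analytic, sum(draws) / len(draws))
```

The two printed values agree closely, confirming that the time-to-absorption of the jump process follows the phase-type law with representation (alpha, T).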

  7. Dissipative distributed systems

    NARCIS (Netherlands)

    Willems, JC; Djaferis, TE; Schick, IC

    2000-01-01

    A controllable distributed dynamical system described by a system of linear constant-coefficient partial differential equations is said to be conservative if for compact support trajectories the integral of the supply rate is zero. It is said to be dissipative if this integral is non-negative. The

  8. Factor Determining Income Distribution

    NARCIS (Netherlands)

    J. Tinbergen (Jan)

    1972-01-01

    textabstractSince the phrase income distribution covers a large number of different concepts, it is necessary to define these and to indicate the choice made in this article. Income for a given recipient may cover lists of items which are not always the same. Apart from popular misunderstandings

  9. Two Photon Distribution Amplitudes

    International Nuclear Information System (INIS)

    El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.

    2008-01-01

    The factorization of the amplitude of the process γ*γ→γγ in the low energy and high photon virtuality region is demonstrated at the Born order and in the leading logarithmic approximation. The leading order two photon (generalized) distribution amplitudes exhibit a characteristic ln Q 2 behaviour and obey new inhomogeneous evolution equations

  10. Distributed Treatment Systems.

    Science.gov (United States)

    Zgonc, David; Plante, Luke

    2017-10-01

    This section presents a review of the literature published in 2016 on topics relating to distributed treatment systems. This review is divided into the following sections with multiple subsections under each: constituent removal; treatment technologies; and planning and treatment system management.

  11. Tensions in Distributed Leadership

    Science.gov (United States)

    Ho, Jeanne; Ng, David

    2017-01-01

    Purpose: This article proposes the utility of using activity theory as an analytical lens to examine the theoretical construct of distributed leadership, specifically to illuminate tensions encountered by leaders and how they resolved these tensions. Research Method: The study adopted the naturalistic inquiry approach of a case study of an…

  12. Species Distribution Modelling

    DEFF Research Database (Denmark)

    Gomes, Vitor H. F.; Ijff, Stephanie D.; Raes, Niels

    2018-01-01

    Species distribution models (SDMs) are widely used in ecology and conservation. Presence-only SDMs such as MaxEnt frequently use natural history collections (NHCs) as occurrence data, given their huge numbers and accessibility. NHCs are often spatially biased which may generate inaccuracies in SD...

  13. Enabling distributed petascale science

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Bharathi, Shishir; Bresnahan, John

    2007-01-01

    Petascale science is an end-to-end endeavour, involving not only the creation of massive datasets at supercomputers or experimental facilities, but the subsequent analysis of that data by a user community that may be distributed across many laboratories and universities. The new SciDAC Center for Enabling Distributed Petascale Science (CEDPS) is developing tools to support this end-to-end process. These tools include data placement services for the reliable, high-performance, secure, and policy-driven placement of data within a distributed science environment; tools and techniques for the construction, operation, and provisioning of scalable science services; and tools for the detection and diagnosis of failures in end-to-end data placement and distributed application hosting configurations. In each area, we build on a strong base of existing technology and have made useful progress in the first year of the project. For example, we have recently achieved order-of-magnitude improvements in transfer times (for lots of small files) and implemented asynchronous data staging capabilities; demonstrated dynamic deployment of complex application stacks for the STAR experiment; and designed and deployed end-to-end troubleshooting services. We look forward to working with SciDAC application and technology projects to realize the promise of petascale science

  14. A distributed multimedia toolbox

    NARCIS (Netherlands)

    Scholten, Johan; Jansen, P.G.

    1997-01-01

    Emphasis of our research lies on the application of realtime multimedia technology: tele-teaching, teleconferencing and collaborative work. To support this research we need a real-time environment that supports rapid prototyping of distributed multimedia applications. Because other systems were not

  15. Distributed analysis in ATLAS

    CERN Document Server

    Dewhurst, Alastair; The ATLAS collaboration

    2015-01-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are daily running on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...

  16. The Normal Distribution

    Indian Academy of Sciences (India)

    tion in statistics, velocity distribution of an ideal gas, and the phenomenon of Brownian motion is briefly illustrated. Introduction. To compensate for the hard work done in part I of this series, we basically pontificate in this article. Mathematical details are side-stepped and we indulge in a lot of 'hand-waving', especially in the ...

  17. Hyperfinite representation of distributions

    Indian Academy of Sciences (India)

    A nonstandard treatment of the theory of distributions in terms of a hyperfinite representation has been presented in papers [2,3] by Kinoshita. A further exploitation of this treatment in an N-dimensional context has been given by Grenier [1]. In the present paper we offer a different approach to the hyperfinite representation, ...

  18. World distribution of Owlaholics

    Science.gov (United States)

    Heimo Mikkola

    1997-01-01

    Owlaholics are people who collect anything with an owl on it. This paper gives the most common reasons how and why people become addicted to owls and shows their known distribution. Although thousands of owl collectors and enthusiasts reside all over the world, the majority live in Europe and the United States. While no evidence exists of owl collecting clubs in Latin...

  19. Multiagent distributed watershed management

    Science.gov (United States)

    Giuliani, M.; Castelletti, A.; Amigoni, F.; Cai, X.

    2012-04-01

    Deregulation and democratization of water along with increasing environmental awareness are challenging integrated water resources planning and management worldwide. The traditional centralized approach to water management, as described in much of water resources literature, is often unfeasible in most of the modern social and institutional contexts. Thus it should be reconsidered from a more realistic and distributed perspective, in order to account for the presence of multiple and often independent Decision Makers (DMs) and many conflicting stakeholders. Game theory based approaches are often used to study these situations of conflict (Madani, 2010), but they are limited to a descriptive perspective. Multiagent systems (see Wooldridge, 2009), instead, seem to be a more suitable paradigm because they naturally allow to represent a set of self-interested agents (DMs and/or stakeholders) acting in a distributed decision process at the agent level, resulting in a promising compromise alternative between the ideal centralized solution and the actual uncoordinated practices. Casting a water management problem in a multiagent framework allows to exploit the techniques and methods that are already available in this field for solving distributed optimization problems. In particular, in Distributed Constraint Satisfaction Problems (DCSP, see Yokoo et al., 2000), each agent controls some variables according to his own utility function but has to satisfy inter-agent constraints; while in Distributed Constraint Optimization Problems (DCOP, see Modi et al., 2005), the problem is generalized by introducing a global objective function to be optimized that requires a coordination mechanism between the agents. In this work, we apply a DCSP-DCOP based approach to model a steady state hypothetical watershed management problem (Yang et al., 2009), involving several active human agents (i.e. agents who make decisions) and reactive ecological agents (i.e. agents representing
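    The DCOP formulation described above can be reduced to a minimal sketch: each agent controls one discrete withdrawal variable, an inter-agent constraint caps the total withdrawal, and a global objective sums the agents' utilities. The levels, capacity and utility function below are hypothetical; actual DCOP algorithms such as those in Modi et al. (2005) distribute this search across the agents rather than enumerating it centrally.

```python
import itertools
import math

# Two water-user agents each choose a discrete withdrawal level; a shared
# streamflow constraint caps the total, and the global objective sums
# concave (diminishing-returns) utilities. All numbers are hypothetical.
levels = [0, 1, 2, 3, 4]
capacity = 5

def utility(x):
    return math.sqrt(x)

feasible = [p for p in itertools.product(levels, repeat=2) if sum(p) <= capacity]
best = max(feasible, key=lambda p: utility(p[0]) + utility(p[1]))
print("optimal withdrawals:", best)  # concavity favours a near-even split
```

Dropping the global objective and keeping only the capacity constraint turns this into the DCSP variant, where any feasible pair is acceptable.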

  20. A distribution management system

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P.; Verho, P.; Kaerenlampi, M.; Pitkaenen, M. [Tampere Univ. of Technology (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    The development of new distribution automation applications is considerably wide nowadays. One of the most interesting areas is the development of a distribution management system (DMS) as an expansion to the traditional SCADA system. At the power transmission level such a system is called an energy management system (EMS). The idea of these expansions is to provide supporting tools for control center operators in system analysis and operation planning. Nowadays the SCADA is the main computer system (and often the only) in the control center. However, the information displayed by the SCADA is often inadequate, and several tasks cannot be solved by a conventional SCADA system. A need for new computer applications in control center arises from the insufficiency of the SCADA and some other trends. The latter means that the overall importance of the distribution networks is increasing. The slowing down of load-growth has often made network reinforcements unprofitable. Thus the existing network must be operated more efficiently. At the same time larger distribution areas are for economical reasons being monitored at one control center and the size of the operation staff is decreasing. The quality of supply requirements are also becoming stricter. The needed data for new applications is mainly available in some existing systems. Thus the computer systems of utilities must be integrated. The main data source for the new applications in the control center are the AM/FM/GIS (i.e. the network database system), the SCADA, and the customer information system (CIS). The new functions can be embedded in some existing computer system. This means a strong dependency on the vendor of the existing system. An alternative strategy is to develop an independent system which is integrated with other computer systems using well-defined interfaces. The latter approach makes it possible to use the new applications in various computer environments, having only a weak dependency on the

  1. Summer Steelhead Distribution [ds341

    Data.gov (United States)

    California Department of Resources — Summer Steelhead Distribution October 2009 Version This dataset depicts observation-based stream-level geographic distribution of anadromous summer-run steelhead...

  2. Summer Steelhead Distribution [ds341

    Data.gov (United States)

    California Natural Resource Agency — Summer Steelhead Distribution October 2009 Version This dataset depicts observation-based stream-level geographic distribution of anadromous summer-run steelhead...

  3. Distribution management system

    Energy Technology Data Exchange (ETDEWEB)

    Verho, P.; Kaerenlampi, M.; Pitkaenen, M.; Jaerventausta, P.; Partanen, J.

    1997-12-31

    This report comprises a general description of the results obtained in the research projects 'Information system applications of a distribution control center', 'Event analysis in primary substation', and 'Distribution management system' of the EDISON research program during the years of 1993 - 1997. The different domains of the project are presented in more detail in other reports. An operational state analysis of a distribution network has been made from the control center point of view and the functions which can not be solved by a conventional SCADA system are determined. The basis for new computer applications is shown to be integration of the computer systems. The main result of the work is a distribution management system (DMS), which is an autonomous system integrated to the existing information systems, SCADA and AM/FM/GIS. The system uses a large number of modelling and computation methods and provides an extensive group of advanced functions to support the distribution network monitoring, fault management, operations planning and optimization. The development platform of the system consists of a Visual C++ programming environment, Windows NT operating system and PC. During the development the DMS has been tested in a pilot utility and it is nowadays in practical use in several Finnish utilities. The use of a DMS improves the quality and economy of power supply in many ways; the outage times can, in particular, be reduced using the system. Based on the achieved experiences some parts of the DMS reached the commercialization phase, too. Initially the commercial products were developed by a software company, Versoft Oy. At present the research results are the basis of a worldwide software product supplied by ABB Transmit Co. (orig.) EDISON Research Programme. 28 refs.

  4. The distributional properties of the family of logistic distributions ...

    African Journals Online (AJOL)

    The distributional properties of half logistic distribution and Type I generalized logistic distribution were studied, bringing out the L-moments (up to order four) of each of these. Skewness and Kurtosis were obtained. Keywords: Logistic distribution, L-moments ...

  5. Distribution view: a tool to write and simulate distributions

    OpenAIRE

    Coelho, José; Branco, Fernando; Oliveira, Teresa

    2006-01-01

    In our work we present a tool to write and simulate distributions. This tool allows the user to write mathematical expressions which can contain not only functions and variables, but also statistical distributions, including mixtures. Each time the expression is evaluated, a value is generated for every inner distribution according to that distribution and used to determine the expression's value. The inversion method can be used in this language, allowing all distributions to be generated...
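    The inversion method mentioned above can be sketched in a few lines: draw U ~ Uniform(0,1) and apply the inverse CDF of the target distribution; a mixture is handled by first sampling which component applies. The exponential example and 50/50 mixture weights are illustrative choices, not taken from the tool.

```python
import math
import random

random.seed(1)

def sample_exponential(lam):
    # Inversion: if U ~ Uniform(0,1), then -ln(1-U)/lam has CDF 1 - exp(-lam*x).
    return -math.log(1.0 - random.random()) / lam

def sample_mixture():
    # 50/50 mixture of Exp(2) and Exp(0.5): pick the component, then invert.
    lam = 2.0 if random.random() < 0.5 else 0.5
    return sample_exponential(lam)

draws = [sample_mixture() for _ in range(100_000)]
# Analytic mixture mean: 0.5*(1/2) + 0.5*(1/0.5) = 1.25
print(f"empirical mean = {sum(draws) / len(draws):.3f}")
```

Any distribution with an invertible CDF can be plugged in the same way, which is what lets such a language evaluate expressions containing arbitrary inner distributions.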

  6. Loss optimization in distribution networks with distributed generation

    DEFF Research Database (Denmark)

    Pokhrel, Basanta Raj; Nainar, Karthikeyan; Bak-Jensen, Birgitte

    2017-01-01

    This paper presents a novel power loss minimization approach in distribution grids considering network reconfiguration, distributed generation and storage installation. Identification of the optimum configuration in such a scenario is one of the main challenges faced by distribution system operators in highly active distribution grids. This issue is tackled by formulating a hybrid loss optimization problem solved using the Interior Point Method. Sensitivity analysis is used to identify the optimum location of storage units. Different scenarios of reconfiguration, storage and distributed generation...

  7. Light Meson Distribution Amplitudes

    CERN Document Server

    Arthur, R.; Brommel, D.; Donnellan, M.A.; Flynn, J.M.; Juttner, A.; de Lima, H.Pedroso; Rae, T.D.; Sachrajda, C.T.; Samways, B.

    2010-01-01

    We calculated the first two moments of the light-cone distribution amplitudes for the pseudoscalar mesons ($\\pi$ and $K$) and the longitudinally polarised vector mesons ($\\rho$, $K^*$ and $\\phi$) as part of the UKQCD and RBC collaborations' $N_f=2+1$ domain-wall fermion phenomenology programme. These quantities were obtained with a good precision and, in particular, the expected effects of $SU(3)$-flavour symmetry breaking were observed. Operators were renormalised non-perturbatively and extrapolations to the physical point were made, guided by leading order chiral perturbation theory. The main results presented are for two volumes, $16^3\\times 32$ and $24^3\\times 64$, with a common lattice spacing. Preliminary results for a lattice with a finer lattice spacing, $32^3\\times64$, are discussed and a first look is taken at the use of twisted boundary conditions to extract distribution amplitudes.

  8. "Just Another Distribution Channel?"

    Science.gov (United States)

    Lemstra, Wolter; de Leeuw, Gerd-Jan; van de Kar, Els; Brand, Paul

    The telecommunications-centric business model of mobile operators is under attack due to technological convergence in the communication and content industries. This has resulted in a plethora of academic contributions on the design of new business models and service platform architectures. However, a discussion of the challenges that operators are facing in adopting these models is lacking. We assess these challenges by considering the mobile network as part of the value system of the content industry. We will argue that from the perspective of a content provider the mobile network is ‘just another’ distribution channel. Strategic options available for the mobile communication operators are to deliver an excellent distribution channel for content delivery or to move upwards in the value chain by becoming a content aggregator. To become a mobile content aggregator operators will have to develop or acquire complementary resources and capabilities. Whether this strategic option is sustainable remains open.

  9. Navigating Distributed Services

    DEFF Research Database (Denmark)

    Beute, Berco

    2002-01-01

    This thesis explores the impact of three current trends which, when taken together, are fundamentally changing the way in which the task of navigating virtual environments is accomplished. The first concerns the changeover from a situation in which all data and functionality reside locally to the user ... of devices used to access information on the Internet. The focal point of the thesis is an initial exploration of the effects of the trends on users as they navigate the virtual environment of distributed documents and services. To begin, the thesis uses scenarios as a heuristic device to identify and analyse the main effects of the trends. This is followed by an exploration of the theory of navigation in Information Spaces, which is in turn followed by an overview of theories and the state of the art in navigating distributed services. These explorations of both theory and practice resulted in a large number of topics...

  10. Distribution load estimation (DLE)

    Energy Technology Data Exchange (ETDEWEB)

    Seppaelae, A.; Lehtonen, M. [VTT Energy, Espoo (Finland)

    1998-08-01

    The load research has produced customer class load models to convert the customers' annual energy consumption to hourly load values. The reliability of load models applied from a nation-wide sample is limited in any specific network because many local circumstances are different from utility to utility and time to time. Therefore there is a need to find improvements to the load models or, in general, improvements to the load estimates. In Distribution Load Estimation (DLE) the measurements from the network are utilized to improve the customer class load models. The results of DLE will be new load models that better correspond to the loading of the distribution network but are still close to the original load models obtained by load research. The principal data flow of DLE is presented
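A minimal sketch of the DLE idea under invented toy numbers (the real method uses statistical estimation rather than the simple proportional scaling shown here): class load models give model-based hourly loads, which are then adjusted so their aggregate reproduces the network measurement.

```python
import numpy as np

def estimate_hourly_loads(annual_energy, class_profiles, measured_total):
    """Scale customer-class load models so their aggregate reproduces a
    measured feeder total, hour by hour."""
    # Model-based hourly loads from each class's annual energy and profile.
    model = {c: annual_energy[c] * class_profiles[c] for c in annual_energy}
    total_model = sum(model.values())
    # Distribute the measurement over classes in proportion to the models.
    scale = measured_total / total_model
    return {c: load * scale for c, load in model.items()}

# Toy example: two customer classes over three hours.
profiles = {"domestic":   np.array([0.3, 0.5, 0.2]),
            "commercial": np.array([0.2, 0.2, 0.6])}
annual = {"domestic": 100.0, "commercial": 50.0}    # annual energies
measured = np.array([44.0, 66.0, 55.0])             # feeder measurement
estimated = estimate_hourly_loads(annual, profiles, measured)
# The class estimates now sum exactly to the measurement in every hour.
```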

  11. DIRAC distributed computing services

    International Nuclear Information System (INIS)

    Tsaregorodtsev, A

    2014-01-01

    DIRAC Project provides a general-purpose framework for building distributed computing systems. It is now used in several HEP and astrophysics experiments, as well as by user communities in other scientific domains. There is large interest from smaller user communities in having a simple tool like DIRAC for accessing grid and other types of distributed computing resources. However, small experiments cannot afford to install and maintain dedicated services. Therefore, several grid infrastructure projects are providing DIRAC services for their respective user communities. These services are used for user tutorials as well as to help port applications to the grid for practical day-to-day work. The services typically give access to several grid infrastructures as well as to standalone computing clusters accessible by the target user communities. In the paper we present the experience of running DIRAC services provided by the France-Grilles NGI and other national grid infrastructure projects.

  12. Bounding species distribution models

    Directory of Open Access Journals (Sweden)

    Thomas J. STOHLGREN, Catherine S. JARNEVICH, Wayne E. ESAIAS, Jeffrey T. MORISETTE

    2011-10-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for “clamping” model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642–647, 2011].
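The bounding step described in the abstract amounts to clamping each predictor of a new location to the range observed in the training data. A sketch, assuming predictors arrive as a NumPy array with one column per environmental variable (the variable names are hypothetical):

```python
import numpy as np

def clamp_predictors(new_env, train_env):
    """Clamp each environmental predictor (column) of new_env to the
    minimum/maximum observed in the model-training data."""
    lo = train_env.min(axis=0)
    hi = train_env.max(axis=0)
    return np.clip(new_env, lo, hi)

# Hypothetical predictors: column 0 = temperature, column 1 = rainfall.
train = np.array([[10.0, 200.0],
                  [25.0, 800.0],
                  [18.0, 500.0]])
new = np.array([[ 5.0, 900.0],    # both values outside the training bounds
                [30.0, 100.0]])
clamped = clamp_predictors(new, train)
# clamped == [[10.0, 800.0], [25.0, 200.0]]
```

The model is then evaluated on the clamped predictors, so it never extrapolates beyond the environmental envelope of its training sample.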

  13. Bounding Species Distribution Models

    Science.gov (United States)

    Stohlgren, Thomas J.; Jarnevich, Catherine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].

  14. Distributed environmental control

    Science.gov (United States)

    Cleveland, Gary A.

    1992-01-01

    We present an architecture of distributed, independent control agents designed to work with the Computer Aided System Engineering and Analysis (CASE/A) simulation tool. CASE/A simulates behavior of Environmental Control and Life Support Systems (ECLSS). We describe a lattice of agents capable of distributed sensing and overcoming certain sensor and effector failures. We address how the architecture can achieve the coordinating functions of a hierarchical command structure while maintaining the robustness and flexibility of independent agents. These agents work between the time steps of the CASE/A simulation tool to arrive at command decisions based on the state variables maintained by CASE/A. Control is evaluated according to both effectiveness (e.g., how well temperature was maintained) and resource utilization (the amount of power and materials used).

  15. Distributed Representation of Subgraphs

    OpenAIRE

    Adhikari, Bijaya; Zhang, Yao; Ramakrishnan, Naren; Prakash, B. Aditya

    2017-01-01

    Network embeddings have become very popular in learning effective feature representations of networks. Motivated by the recent successes of embeddings in natural language processing, researchers have tried to find network embeddings in order to exploit machine learning algorithms for mining tasks like node classification and edge prediction. However, most of the work focuses on finding distributed representations of nodes, which are inherently ill-suited to tasks such as community detection w...

  16. Distributed Problem-Solving

    DEFF Research Database (Denmark)

    Chemi, Tatiana

    2016-01-01

    a perspective that is relevant to higher education. The focus here is on how artists solve problems in distributed paths and on the elements of creative collaboration. Creative problem-solving will be looked at as an ongoing dialogue that artists engage in with themselves, with others, and with recipients ... What can educators in higher education learn from the ways creative groups solve problems? How can artists contribute to inspiring higher education? ...

  17. Distribution, Abundance and Assemblages

    African Journals Online (AJOL)

    Cephalopod Species in Mozambican Waters Caught in the “Mozambique 0307” Survey: Distribution, Abundance and Assemblages. Luis Silva, Eduardo Balguerías, Paula Santana Afonso, Ignacio Sobrino, Juan Gil and Candelaria Burgos. Instituto Español de Oceanografía, Unidad de ... E-mail: luis.silva@cd.ieo.es

  18. Distributional Watson transforms

    NARCIS (Netherlands)

    Dijksma, A.; Snoo, H.S.V. de

    1974-01-01

    For all Watson transforms W in L2(R+) a triple of Hilbert space LG ⊂ L2(R+) ⊂ L'G is constructed such that W may be extended to L'G. These results allow the construction of a triple L ⊂ L2(R+) ⊂ L', where L is a Gelfand-Fréchet space. This leads to a theory of distributional Watson transforms.

  19. Air Distribution in Rooms

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    The research on air distribution in rooms is often done as full-size investigations, scale-model investigations or by Computational Fluid Dynamics (CFD). New activities have taken place within all three areas, and this paper draws comparisons between the different methods. The outcome of the IEA sponsored research "Air Flow Pattern within Buildings" is used for comparisons in some parts of the paper because various types of experiments and many countries are involved.

  20. Migration and income distribution.

    OpenAIRE

    Rodgers G

    1981-01-01

    ILO pub-WEP pub. Working paper based on a conference paper on models for analysis of interrelationships between labour mobility of migrant workers (migration) and income distribution in developing countries - includes a literature survey of empirical research, and covers labour market absorption of migrant rural workers, effects of rural areas-urban areas wage differentials on migration, impact of migration on wages, etc. References. Conference held in Ahmedabad 1981 Jan.

  1. Structure functions and parton distributions

    International Nuclear Information System (INIS)

    Olness, F.; Tung, Wu-Ki

    1991-04-01

    The activities of the structure functions and parton distributions group are summarized. The impact of the scheme-dependence of parton distributions (especially sea-quarks and gluons) on the quantitative formulation of the QCD parton model is highlighted. Recent progress on the global analysis of parton distributions is summarized. Issues in the proper use of the next-to-leading parton distributions are stressed

  2. Majorization and extremal PH distributions

    NARCIS (Netherlands)

    He, Q.M.; Zhang, H.; Vera, J.C.; Latouche, G.; Ramaswami, V.; Sethuraman, J.; Sigman, K.; Squillante, M.S.; Yao, D.D.

    2013-01-01

    This chapter presents majorization results for PH generators. Based on the majorization results, bounds on the moments and Laplace–Stieltjes transforms of phase-type distributions are found. Exponential distributions and Coxian distributions are identified to be extremal PH distributions with

  3. Reactor power distribution monitor

    International Nuclear Information System (INIS)

    Sekimizu, Koichi

    1980-01-01

    Purpose: To improve the performance and secure the safety of a nuclear reactor by rapidly computing and displaying the power density in the reactor using a plurality of processors. Constitution: Plant data for the reactor, containing the measured values from a local power monitor (LPRM), are sent to and recorded on a magnetic disc. They are also sent to a core performance computer, in which the burn-up degree distribution and the like are computed, and the results are sent to and recorded on the magnetic disc. A central processor loads programs into each of the processors and supplies the data recorded on the magnetic disc to each of them. Each processor computes the corresponding power distribution in the four fuel assemblies surrounding its LPRM string from this information. The central processor compiles the computation results and displays them on a display. In this way, the power distribution in the fuel assemblies can be computed rapidly, thereby improving the performance and safety of the reactor. (Seki, T.)

  4. Distributed sensor networks

    CERN Document Server

    Rubin, Donald B; Carlin, John B; Iyengar, S Sitharama; Brooks, Richard R; University, Clemson

    2014-01-01

    An Overview, S.S. Iyengar, Ankit Tandon, and R.R. Brooks; Microsensor Applications, David Shepherd; A Taxonomy of Distributed Sensor Networks, Shivakumar Sastry and S.S. Iyengar; Contrast with Traditional Systems, R.R. Brooks; Digital Signal Processing Background, Yu Hen Hu; Image-Processing Background, Lynne Grewe and Ben Shahshahani; Object Detection and Classification, Akbar M. Sayeed; Parameter Estimation, David Friedlander; Target Tracking with Self-Organizing Distributed Sensors, R.R. Brooks, C. Griffin, D.S. Friedlander, and J.D. Koch; Collaborative Signal and Information Processing: An Information-Directed Approach, Feng Zhao, Jie Liu, Juan Liu, Leonidas Guibas, and James Reich; Environmental Effects, David C. Swanson; Detecting and Counteracting Atmospheric Effects, Lynne L. Grewe; Signal Processing and Propagation for Aeroacoustic Sensor Networks, Richard J. Kozick, Brian M. Sadler, and D. Keith Wilson; Distributed Multi-Target Detection in Sensor Networks, Xiaoling Wang, Hairong Qi, and Steve Beck; Foundations of Data Fusion f...

  5. Angular Distribution of GRBs

    Directory of Open Access Journals (Sweden)

    L. G. Balázs

    2012-01-01

    We studied the complete randomness of the angular distribution of BATSE gamma-ray bursts (GRBs). Based on their durations and peak fluxes, we divided the BATSE sample into 5 subsamples (short1, short2, intermediate, long1, long2) and studied the angular distributions separately. We used three methods to search for non-randomness in the subsamples: Voronoi tesselation, minimal spanning tree, and multifractal spectra. To study any non-randomness in the subsamples we defined 13 test-variables (9 from the Voronoi tesselation, 3 from the minimal spanning tree and one from the multifractal spectrum). We made Monte Carlo simulations taking into account BATSE's sky-exposure function. We tested the randomness by introducing squared Euclidean distances in the parameter space of the test-variables. We recognized that the short1 and short2 groups deviate significantly (99.90%, 99.98%) from the fully random case in the distribution of the squared Euclidean distances, but this is not true for the long samples. In the intermediate group, the squared Euclidean distances also give a significant deviation (98.51%).
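The Monte Carlo logic of such a randomness test can be sketched with a single test-variable, the mean edge length of the minimal spanning tree; the paper's full machinery (13 test-variables, the BATSE sky-exposure function) is omitted, and the sample below is synthetic, not GRB data:

```python
import numpy as np

def random_directions(n, rng):
    """Uniformly distributed unit vectors (an isotropic sky)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def mst_mean_edge(points):
    """Mean edge length of the Euclidean minimum spanning tree (Prim)."""
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = dist[0].copy()       # cheapest connection of each node to the tree
    total = 0.0
    for _ in range(n - 1):
        best[in_tree] = np.inf  # nodes already in the tree are ineligible
        j = int(np.argmin(best))
        total += best[j]
        in_tree[j] = True
        best = np.minimum(best, dist[j])
    return total / (n - 1)

rng = np.random.default_rng(1)
observed = random_directions(60, rng)   # stand-in for one GRB subsample
stat_obs = mst_mean_edge(observed)

# Reference distribution of the statistic under full isotropy:
sims = np.array([mst_mean_edge(random_directions(60, rng)) for _ in range(200)])
# Clustered samples have shorter MST edges, so a very small fraction here
# would signal significant non-randomness.
frac_smaller = np.mean(sims <= stat_obs)
```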

  6. Multivariate Matrix-Exponential Distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2010-01-01

    be written as linear combinations of the elements in the exponential of a matrix. For this reason we shall refer to multivariate distributions with rational Laplace transform as multivariate matrix-exponential distributions (MVME). The marginal distributions of an MVME are univariate matrix......-exponential distributions. We prove a characterization that states that a distribution is an MVME distribution if and only if all non-negative, non-null linear combinations of the coordinates have a univariate matrix-exponential distribution. This theorem is analog to a well-known characterization theorem...

  7. GASIFICATION FOR DISTRIBUTED GENERATION

    Energy Technology Data Exchange (ETDEWEB)

    Ronald C. Timpe; Michael D. Mann; Darren D. Schmidt

    2000-05-01

    A recent emphasis in gasification technology development has been directed toward reduced-scale gasifier systems for distributed generation at remote sites. The domestic distributed power generation market over the next decade is expected to be 5-6 gigawatts per year. The global increase is expected at 20 gigawatts over the next decade. The economics of gasification for distributed power generation are significantly improved when fuel transport is minimized. Until recently, gasification technology has been synonymous with coal conversion. Presently, however, interest centers on providing clean-burning fuel to remote sites that are not necessarily near coal supplies but have sufficient alternative carbonaceous material to feed a small gasifier. Gasifiers up to 50 MW are of current interest, with emphasis on those of 5-MW generating capacity. Internal combustion engines offer a more robust system for utilizing the fuel gas, while fuel cells and microturbines offer higher electric conversion efficiencies. The initial focus of this multiyear effort was on internal combustion engines and microturbines as more realistic near-term options for distributed generation. In this project, we studied emerging gasification technologies that can provide gas from regionally available feedstock as fuel to power generators under 30 MW in a distributed generation setting. Larger-scale gasification, primarily coal-fed, has been used commercially for more than 50 years to produce clean synthesis gas for the refining, chemical, and power industries. Commercial-scale gasification activities are under way at 113 sites in 22 countries in North and South America, Europe, Asia, Africa, and Australia, according to the Gasification Technologies Council. Gasification studies were carried out on alfalfa, black liquor (a high-sodium waste from the pulp industry), cow manure, and willow on the laboratory scale and on alfalfa, black liquor, and willow on the bench scale. Initial parametric tests

  8. Estimating cost ratio distribution between fatal and non-fatal road accidents in Malaysia

    Science.gov (United States)

    Hamdan, Nurhidayah; Daud, Noorizam

    2014-07-01

    Road traffic crashes are a major global problem and should be treated as a shared responsibility. In Malaysia, road accident tragedies killed 6,917 people and injured or disabled 17,522 people in 2012, and the government spent about RM9.3 billion in 2009, with reported annual costs of approximately 1 to 2 percent of gross domestic product (GDP). The current cost ratio for fatal and non-fatal accidents used by the Ministry of Works Malaysia is simply based on the arbitrary value of 6:4 (equivalently 1.5:1), reflecting the fact that six factors are involved in the accident cost calculation for a fatal accident while four factors are used for a non-fatal accident. This simple indication used by the authority to calculate the cost ratio is doubtful, since there is a lack of mathematical and conceptual evidence to explain how the ratio is determined. The main aim of this study is to determine a new accident cost ratio for fatal and non-fatal accidents in Malaysia based on a quantitative statistical approach. The cost ratio distributions are estimated based on the Weibull distribution. Due to the unavailability of official accident cost data, insurance claim data for both fatal and non-fatal accidents have been used as proxy information for the actual accident cost. Two types of parameter estimates are used in this study: maximum likelihood (MLE) and robust estimation. The findings of this study reveal that the accident cost ratio for fatal to non-fatal claims is 1.33 when using MLE, while for robust estimates the cost ratio is slightly higher, at 1.51. This study will help the authority to determine a more accurate cost ratio between fatal and non-fatal accidents than the official ratio set by the government, since the cost ratio is an important element used as a weighting in modeling road accident related data. Therefore, this study provides some guidance tips for revising the insurance claim set by the Malaysian road authority, hence the appropriate method
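The MLE step of such a study can be sketched on synthetic claim data; the shapes, scales and sample sizes below are invented for illustration, and the fitting routine is a textbook fixed-point iteration rather than the paper's own procedure. The mean of a Weibull with shape c and scale b is b·Γ(1 + 1/c), so a cost ratio follows from the two fitted means:

```python
import math
import numpy as np

def weibull_mle(x, iters=200):
    """MLE of Weibull shape c and scale b via damped fixed-point
    iteration on the profile likelihood equation for c."""
    x = np.asarray(x, dtype=float)
    logx = np.log(x)
    c = 1.0
    for _ in range(iters):
        xc = x ** c
        c_new = 1.0 / (np.sum(xc * logx) / np.sum(xc) - logx.mean())
        c = 0.5 * (c + c_new)  # damping for stability
    b = np.mean(x ** c) ** (1.0 / c)
    return c, b

def weibull_mean(c, b):
    # Mean of a two-parameter Weibull: b * Gamma(1 + 1/c)
    return b * math.gamma(1.0 + 1.0 / c)

rng = np.random.default_rng(42)
# Synthetic stand-ins for insurance claims (shape 1.5; scales 30 and 20):
fatal = rng.weibull(1.5, 2000) * 30.0
nonfatal = rng.weibull(1.5, 2000) * 20.0

c_f, b_f = weibull_mle(fatal)
c_n, b_n = weibull_mle(nonfatal)
cost_ratio = weibull_mean(c_f, b_f) / weibull_mean(c_n, b_n)  # true ratio: 1.5
```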

  9. Distributed Optimization System

    Science.gov (United States)

    Hurtado, John E.; Dohrmann, Clark R.; Robinett, III, Rush D.

    2004-11-30

    A search system and method for controlling multiple agents to optimize an objective using distributed sensing and cooperative control. The search agent can be one or more physical agents, such as a robot, and can be software agents for searching cyberspace. The objective can be: chemical sources, temperature sources, radiation sources, light sources, evaders, trespassers, explosive sources, time dependent sources, time independent sources, function surfaces, maximization points, minimization points, and optimal control of a system such as a communication system, an economy, a crane, and a multi-processor computer.

  10. Distributed Systems Technology Survey.

    Science.gov (United States)

    1987-03-01

    Distributed Systems Technology Survey, Carnegie-Mellon University, Pittsburgh, PA, Software Engineering Institute, E. C. Cooper, March 1987, CMU/SEI-87-TR-5. ... a generalization of single-level atomic transactions, in order to allow them to mesh properly with the concepts of composition and abstraction supported by program...

  11. DEM - distribution energy management

    Energy Technology Data Exchange (ETDEWEB)

    Seppaelae, A.; Kekkonen, V.; Koreneff, G. [VTT Energy, Espoo (Finland)] [and others

    1998-08-01

    The electricity market was de-regulated in Finland at the end of 1995 and the customers can now freely choose their power suppliers. The national grid and local distribution network operators are now separated from the energy business. The network operators transmit the electric power to the customers on equal terms regardless of from whom the power is purchased. The Finnish national grid is owned by one company, Finnish Power Grid PLC (Fingrid). The major shareholders of Fingrid are the state of Finland, two major power companies and institutional investors. In addition there are about 100 local distribution utilities operating the local 110 kV, 20 kV and 0.4 kV networks. The distribution utilities are mostly owned by the municipalities and towns. In each network one energy supplier is always responsible for the hourly energy balance in the network (a 'host') and it also has the obligation to provide public energy prices accessible to any customer in the network's area. The Finnish regulating authorities nominate such a supplier who has a dominant market share in the network's area as the supplier responsible for the network's energy balance. A regulating authority, called the Electricity Market Centre, ensures that the market is operating properly. The transmission prices and public energy prices are under the Electricity Market Centre's control. For domestic and other small customers the cost of hourly metering (ca. 1000 US$) would be prohibitive, and therefore the use of conventional energy metering and load models is under consideration by the authorities. Small customer trade with the load models (instead of hourly energy recording) is scheduled to start in the first half of 1998. In this presentation, the problems of energy management from the standpoint of the energy trading and distributing companies in the new situation are first discussed. The topics covered are: the hourly load data management, the forecasting and estimation of hourly energy demands

  12. FMCG companies specific distribution channels

    Directory of Open Access Journals (Sweden)

    Ioana Barin

    2009-12-01

    Distribution includes all activities undertaken by the producer, alone or in cooperation, from the moment the products or services are finished until they are in the possession of consumers. Distribution consists of the following major components: distribution channels or marketing channels, which together form a distribution network, and logistics or physical distribution. To be achieved effectively, the distribution of goods requires a number of activities and operational processes related to the transit of goods from producer to consumer under the best conditions, using the existing distribution channels and logistics system. One of the essential functions of distribution is performing acts of sale through which, along with the actual movement of goods, their change of ownership takes place, that is, the successive transfer of ownership from producer to consumer. This is an itinerary in the economic cycle of goods, called the distribution channel.

  13. SYVAC3 parameter distribution package

    International Nuclear Information System (INIS)

    Andres, T.; Skeet, A.

    1995-01-01

    SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes a software object type (a generalization of a data type) called Parameter Distribution. This object type is used in SYVAC3, and can also be used independently. Parameter Distribution has the following subtypes: beta distribution; binomial distribution; constant distribution; lognormal distribution; loguniform distribution; normal distribution; piecewise uniform distribution; triangular distribution; and uniform distribution. Some of these distributions can be altered by correlating two parameter distribution objects. This report provides complete specifications for parameter distributions, and also explains how to use them. It should meet the needs of casual users, reviewers, and programmers who wish to add their own subtypes. (author). 30 refs., 75 tabs., 56 figs
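The subtype hierarchy can be illustrated with a minimal sketch covering three of the nine subtypes; the class and parameter names are paraphrased for illustration and are not SYVAC3's actual interfaces:

```python
import math
import random

class ParameterDistribution:
    """Base object type: each subtype supplies one random draw."""
    def sample(self, rng):
        raise NotImplementedError

class ConstantDistribution(ParameterDistribution):
    def __init__(self, value):
        self.value = value
    def sample(self, rng):
        return self.value

class UniformDistribution(ParameterDistribution):
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def sample(self, rng):
        return rng.uniform(self.lo, self.hi)

class LogUniformDistribution(ParameterDistribution):
    """Uniform in log space, for parameters spanning orders of magnitude."""
    def __init__(self, lo, hi):
        self.log_lo, self.log_hi = math.log(lo), math.log(hi)
    def sample(self, rng):
        return math.exp(rng.uniform(self.log_lo, self.log_hi))

# One simulation draw for a set of hypothetical transport parameters:
rng = random.Random(7)
parameters = {"porosity":  UniformDistribution(0.1, 0.4),
              "half_life": LogUniformDistribution(1e2, 1e6),
              "depth":     ConstantDistribution(500.0)}
draw = {name: d.sample(rng) for name, d in parameters.items()}
```

Repeating the final draw many times is exactly the "simulate the system many times" loop the abstract describes; new subtypes only need to implement `sample`.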

  14. State Electricity Regulatory Policy and Distributed Resources: Distribution System Cost Methodologies for Distributed Generation; Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Shirley, W.; Cowart, R.; Sedano, R.; Weston, F.; Harrington, C.; Moskovitz, D.

    2002-10-01

    Designing and implementing credit-based pilot programs for distributed resources distribution is a low-cost, low-risk opportunity to find out how these resources can help defer or avoid costly electric power system (utility grid) distribution upgrades. This report describes implementation options for deaveraged distribution credits and distributed resource development zones. Developing workable programs implementing these policies can dramatically increase the deployment of distributed resources in ways that benefit distributed resource vendors, users, and distribution utilities. This report is one in the State Electricity Regulatory Policy and Distributed Resources series developed under contract to NREL (see Annual Technical Status Report of the Regulatory Assistance Project: September 2000-September 2001, NREL/SR-560-32733). Other titles in this series are: (1) Accommodating Distributed Resources in Wholesale Markets, NREL/SR-560-32497; (2) Distributed Resources and Electric System Reliability, NREL/SR-560-32498; (3) Distribution System Cost Methodologies for Distributed Generation, NREL/SR-560-32500; (4) Distribution System Cost Methodologies for Distributed Generation Appendices, NREL/SR-560-32501.

  15. State Electricity Regulatory Policy and Distributed Resources: Distribution System Cost Methodologies for Distributed Generation

    Energy Technology Data Exchange (ETDEWEB)

    Shirley, W.; Cowart, R.; Sedano, R.; Weston, F.; Harrington, C.; Moskovitz, D.

    2002-10-01

    Designing and implementing credit-based pilot programs for distributed resources distribution is a low-cost, low-risk opportunity to find out how these resources can help defer or avoid costly electric power system (utility grid) distribution upgrades. This report describes implementation options for deaveraged distribution credits and distributed resource development zones. Developing workable programs implementing these policies can dramatically increase the deployment of distributed resources in ways that benefit distributed resource vendors, users, and distribution utilities. This report is one in the State Electricity Regulatory Policy and Distributed Resources series developed under contract to NREL (see Annual Technical Status Report of the Regulatory Assistance Project: September 2000-September 2001, NREL/SR-560-32733). Other titles in this series are: (1) Accommodating Distributed Resources in Wholesale Markets, NREL/SR-560-32497; (2) Distributed Resources and Electric System Reliability, NREL/SR-560-32498; (3) Distribution System Cost Methodologies for Distributed Generation, NREL/SR-560-32500; (4) Distribution System Cost Methodologies for Distributed Generation Appendices, NREL/SR-560-32501.

  16. The Operational Risk Assessment for Distribution Network with Distributed Generations

    Science.gov (United States)

    Hua, Xie; Yaqi, Wu; Yifan, Wang; Qian, Sun; Jianwei, Ma

    2017-05-01

    Distribution network is an important part of the power system and is connected to the consumers directly. Many distributed generators with discontinuous output power are connected to distribution networks, which may adversely impact the network. Therefore, to ensure the security and reliability of distribution networks with numerous distributed generators, risk analysis is necessary for this kind of distribution network. After studying the stochastic load flow algorithm, this paper applies it to static security risk assessment. Probabilistic models of the wind and photovoltaic output are built. The voltage over-limit is chosen to calculate the risk indicators. As a case study, the IEEE 33 system is simulated to analyze the impact of distributed generation on system risk using the proposed method.
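A voltage over-limit risk indicator of the "probability times severity" kind can be sketched with a linearized toy feeder; all distributions, limits and the sensitivity coefficient below are assumptions for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Assumed probabilistic input models (illustrative only):
wind_pu = 0.6 * rng.weibull(2.0, n)        # wind DG output, per unit
load_pu = rng.normal(0.6, 0.1, n)          # bus load, per unit

# Linearized voltage at the DG bus: injections raise it, load lowers it.
sensitivity = 0.1                          # p.u. voltage per p.u. power (assumed)
v_bus = 1.0 + sensitivity * (wind_pu - load_pu)

v_max = 1.05                               # upper voltage limit
over = v_bus > v_max
probability = over.mean()                  # chance of a violation
severity = (v_bus[over] - v_max).mean() if over.any() else 0.0
risk = probability * severity              # risk indicator: probability x severity
```

A full study would replace the linearized voltage with a stochastic load flow over the whole network and repeat the indicator for every bus.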

  17. Distributed coordination of energy storage with distributed generators

    NARCIS (Netherlands)

    Yang, Tao; Wu, Di; Stoorvogel, Antonie Arij; Stoustrup, Jakob

    2016-01-01

    With a growing emphasis on energy efficiency and system flexibility, a great effort has been made recently in developing distributed energy resources (DER), including distributed generators and energy storage systems. This paper first formulates an optimal DER coordination problem considering

  18. The Defense Distribution Center's Future Role in Theater Distribution Operations

    National Research Council Canada - National Science Library

    Newton, Clayton T

    2007-01-01

    ... the requirements of our National Military Strategy. In addition to exploring how the Defense Distribution Center's far reaching capabilities and distribution related core competencies present a unique opportunity to significantly improve intra-theater...

  19. Distributed System Design Checklist

    Science.gov (United States)

    Hall, Brendan; Driscoll, Kevin

    2014-01-01

    This report describes a design checklist targeted to fault-tolerant distributed electronic systems. Many of the questions and discussions in this checklist may be generally applicable to the development of any safety-critical system. However, the primary focus of this report covers the issues relating to distributed electronic system design. The questions that comprise this design checklist were created with the intent to stimulate system designers' thought processes in a way that hopefully helps them to establish a broader perspective from which they can assess the system's dependability and fault-tolerance mechanisms. While best effort was expended to make this checklist as comprehensive as possible, it is not (and cannot be) complete. Instead, we expect that this list of questions and the associated rationale for the questions will continue to evolve as lessons are learned and further knowledge is established. In this regard, it is our intent to post the questions of this checklist on a suitable public web-forum, such as the NASA DASHLink AFCS repository. From there, we hope that it can be updated, extended, and maintained after our initial research has been completed.

  20. Coping with distributed computing

    International Nuclear Information System (INIS)

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given

  1. Differentially Private Distributed Sensing

    Energy Technology Data Exchange (ETDEWEB)

    Fink, Glenn A.

    2016-12-11

    The growth of the Internet of Things (IoT) creates the possibility of decentralized systems of sensing and actuation, potentially on a global scale. IoT devices connected to cloud networks can offer Sensing and Actuation as a Service (SAaaS) enabling networks of sensors to grow to a global scale. But extremely large sensor networks can violate privacy, especially in the case where IoT devices are mobile and connected directly to the behaviors of people. The thesis of this paper is that by adapting differential privacy (adding statistically appropriate noise to query results) to groups of geographically distributed sensors privacy could be maintained without ever sending all values up to a central curator and without compromising the overall accuracy of the data collected. This paper outlines such a scheme and performs an analysis of differential privacy techniques adapted to edge computing in a simulated sensor network where ground truth is known. The positive and negative outcomes of employing differential privacy in distributed networks of devices are discussed and a brief research agenda is presented.
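
    The core idea above, adding statistically calibrated noise at each group of sensors so that no raw values ever reach the central curator, can be sketched with the Laplace mechanism. The group sizes, privacy budget, and sensitivity below are illustrative assumptions:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_local_sum(values, epsilon, sensitivity):
    """An edge node perturbs its local sum before reporting it, so the
    central curator never receives raw sensor values."""
    return sum(values) + laplace_noise(sensitivity / epsilon)

# Hypothetical network: three edge groups of sensors reporting values in [0, 1].
random.seed(42)
groups = [[random.random() for _ in range(1000)] for _ in range(3)]
epsilon = 0.5        # per-query privacy budget
sensitivity = 1.0    # adding/removing one reading changes a sum by at most 1

noisy_total = sum(private_local_sum(g, epsilon, sensitivity) for g in groups)
true_total = sum(sum(g) for g in groups)
print(round(noisy_total - true_total, 2))  # small error relative to a ~1500 total
```

Because each group adds its own noise locally, the aggregate stays accurate (noise grows with the number of groups, not the number of sensors) while individual readings remain protected.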

  2. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale Data production using Grid flavors in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  3. Parton Distributions: Summary Report

    CERN Document Server

    Dittmar, M; Altarelli, Guido; Andersen, J; Ball, R D; Blümlein, J; Böttcher, Helmut B; Carli, T; Ciafaloni, Marcello; Colferai, D; Cooper-Sarkar, A; Corcella, Gennaro; Del Debbio, L; Dissertori, G; Feltesse, J; Forte, S; Glazov, A; Guffanti, A; Gwenlan, C; Huston, J; Ingelman, G; Klein, M; Lastoviicka, T; Lastoviicka-Medin, G; Latorre, J I; Magnea, L; Moch, S; Piccione, A; Pumplin, J; Ravindran, V; Reisert, B; Rojo, J; Salam, Gavin P; Siegert, F; Stasto, A M; Stenzel, H; Targett-Adams, C; Thorne, R S; Tricoli, A; Sabio Vera, Agustin; Vermaseren, J A M; Vogt, A

    2005-01-01

    We provide an assessment of the impact of parton distributions on the determination of LHC processes, and of the accuracy with which parton distributions (PDFs) can be extracted from data, in particular from current and forthcoming HERA experiments. We give an overview of reference LHC processes and their associated PDF uncertainties, and study in detail W and Z production at the LHC. We discuss the precision which may be obtained from the analysis of existing HERA data, tests of consistency of HERA data from different experiments, and the combination of these data. We determine further improvements on PDFs which may be obtained from future HERA data (including measurements of $F_L$), and from combining present and future HERA data with present and future hadron collider data. We review the current status of knowledge of higher (NNLO) QCD corrections to perturbative evolution and deep-inelastic scattering, and provide reference results for their impact on parton evolution, and we briefly examine non-perturbat...

  4. Optimizing electrical distribution systems

    International Nuclear Information System (INIS)

    Scott, W.G.

    1990-01-01

    Electrical utility distribution systems are in the middle of an unprecedented technological revolution in planning, design, maintenance and operation. The prime movers of the revolution are the major economic shifts that affect decision making. The major economic influence on the revolution is the cost of losses (technical and nontechnical). The vehicle of the revolution is the computer, which enables decision makers to examine alternatives in greater depth and detail than their predecessors could. The more important elements of the technological revolution are: system planning, computers, load forecasting, analytical systems (primary systems, transformers and secondary systems), system losses and coming technology. The paper is directed towards the rather unique problems encountered by engineers of utilities in developing countries - problems that are being solved through high technology, such as the recent World Bank-financed engineering computer system for Sri Lanka. This system includes a DEC computer, digitizer, plotter and engineering software to model the distribution system via a digitizer, analyse the system and plot single-line diagrams. (author). 1 ref., 4 tabs., 6 figs

  5. Distributed optimal coordination for distributed energy resources in power systems

    DEFF Research Database (Denmark)

    Wu, Di; Yang, Tao; Stoorvogel, A.

    2017-01-01

    Driven by smart grid technologies, distributed energy resources (DERs) have been rapidly developing in recent years for improving reliability and efficiency of distribution systems. Emerging DERs require effective and efficient coordination in order to reap their potential benefits. In this paper, we consider an optimal DER coordination problem over multiple time periods subject to constraints at both system and device levels. Fully distributed algorithms are proposed to dynamically and automatically coordinate distributed generators with multiple/single storages. With the proposed algorithms...

  6. The exponential age distribution and the Pareto firm size distribution

    OpenAIRE

    Coad, Alex

    2008-01-01

    Recent work drawing on data for large and small firms has shown a Pareto distribution of firm size. We mix a Gibrat-type growth process among incumbents with an exponential distribution of firms' age to obtain the empirical Pareto distribution.
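
    The mechanism in the abstract can be illustrated by simulation: Gibrat-type (geometric Brownian) growth stopped at an exponentially distributed age yields a heavy, approximately Pareto upper tail, which a Hill estimator can detect. All parameter values below are illustrative assumptions:

```python
import math
import random

def simulate_firm_sizes(n=200_000, lam=0.1, mu=0.05, sigma=0.2, seed=7):
    """Gibrat growth (geometric Brownian motion of log-size) stopped at an
    exponentially distributed firm age T ~ Exp(lam). Parameters are toy
    values chosen to make the heavy tail visible."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(n):
        t = rng.expovariate(lam)                               # firm age
        log_s = mu * t + sigma * math.sqrt(t) * rng.gauss(0, 1)
        sizes.append(math.exp(log_s))
    return sizes

def tail_exponent(sizes, top=1000):
    """Hill estimator of the upper-tail Pareto exponent."""
    xs = sorted(sizes)[-top:]
    x_min = xs[0]
    return top / sum(math.log(x / x_min) for x in xs)

sizes = simulate_firm_sizes()
print(round(tail_exponent(sizes), 2))  # a finite, small exponent => power-law tail
```

A pure lognormal (fixed age) would show a much faster-decaying tail; the exponential age mixture is what produces the Pareto behaviour.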

  7. Communication Facilities for Distributed Systems

    Directory of Open Access Journals (Sweden)

    V. Barladeanu

    1997-01-01

    The design of physical networks and communication protocols in Distributed Systems can have a direct impact on system efficiency and reliability. This paper tries to identify efficient mechanisms and paradigms for communication in distributed systems.

  8. Scaling of misorientation angle distributions

    DEFF Research Database (Denmark)

    Hughes, D.A.; Chrzan, D.C.; Liu, Q.

    1998-01-01

    The measurement of misorientation angle distributions following different amounts of deformation in cold-rolled aluminum and nickel and compressed stainless steel is reported. The scaling of the dislocation cell boundary misorientation angle distributions is studied. Surprisingly, the distributio...

  9. 2016 Distributed Wind Market Report

    Energy Technology Data Exchange (ETDEWEB)

    Orrell, Alice C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Foster, Nikolas F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Morris, Scott L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Horner, Juliet S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-08-07

    The U.S. Department of Energy's (DOE's) annual Distributed Wind Market Report provides stakeholders with statistics and analysis of the distributed wind market, along with insight into its trends and characteristics.

  10. Process evaluation distributed system

    Science.gov (United States)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  11. Distributed road assessment system

    Science.gov (United States)

    Beer, N. Reginald; Paglieroni, David W

    2014-03-25

    A system that detects damage on or below the surface of a paved structure or pavement is provided. A distributed road assessment system includes road assessment pods and a road assessment server. Each road assessment pod includes a ground-penetrating radar antenna array and a detection system that detects road damage from the return signals as the vehicle on which the pod is mounted travels down a road. Each road assessment pod transmits to the road assessment server occurrence information describing each occurrence of road damage that is newly detected on a current scan of a road. The road assessment server maintains a road damage database of occurrence information describing the previously detected occurrences of road damage. After the road assessment server receives occurrence information for newly detected occurrences of road damage for a portion of a road, the road assessment server determines which newly detected occurrences correspond to which previously detected occurrences of road damage.

  12. Distributed Project Work

    DEFF Research Database (Denmark)

    Borch, Ole; Kirkegaard, B.; Knudsen, Morten

    1998-01-01

    Project work has been used for many years at Aalborg University to improve learning of the theory and methods given in courses. In a closed environment where the students form a group in a single room, the interaction behaviour is more or less given from the natural life. Group work performed in a distributed fashion over the Internet needs more attention to the interaction protocol, since the physical group room does not exist. The purpose of this paper is to develop a method for online project work using the product Basic Support for Cooperative Work (BSCW). An analysis of a well-proven protocol for information exchange in the traditional project environment is performed. A group of teachers and a student group test the method using small project examples. The first test group used a prototype for testing and found the new activity synchronization difficult to adapt to, so the method was finally adjusted...

  13. Nuclear Parton Distribution Functions

    Energy Technology Data Exchange (ETDEWEB)

    I. Schienbein, J.Y. Yu, C. Keppel, J.G. Morfin, F. Olness, J.F. Owens

    2009-06-01

    We study nuclear effects of charged current deep inelastic neutrino-iron scattering in the framework of a χ² analysis of parton distribution functions (PDFs). We extract a set of iron PDFs which are used to compute x_Bj-dependent and Q²-dependent nuclear correction factors for iron structure functions which are required in global analyses of free nucleon PDFs. We compare our results with nuclear correction factors from neutrino-nucleus scattering models and correction factors for charged-lepton--iron scattering. We find that, except for very high x_Bj, our correction factors differ in both shape and magnitude from the correction factors of the models and charged-lepton scattering.

  14. Power distribution arrangement

    DEFF Research Database (Denmark)

    2010-01-01

    An arrangement and a method for distributing power supplied by a power source to two or more loads (e.g., electrical vehicular systems) are disclosed, where a representation of the power taken by a particular one of the loads from the source is measured. The measured representation of the amount of power taken from the source by that load is compared to a threshold to provide an overload signal in the event the representation exceeds the threshold. Control signals dependent on the occurrence of the overload signal are provided such that the control signal decreases the output power of the power circuit in case the overload signal occurs.
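
    The threshold comparison and output back-off described in the abstract reduce to a small control function. This is a minimal sketch; the function names, the dictionary shape, and the linear back-off rule are illustrative assumptions, not the patented arrangement:

```python
def control_signal(measured_power: float, threshold: float, gain: float = 0.5):
    """Compare the measured power drawn by one load against a threshold;
    on overload, emit a control signal that scales down the power circuit's
    output. The proportional back-off rule is an illustrative choice."""
    if measured_power <= threshold:
        return {"overload": False, "output_scale": 1.0}
    # Reduce output in proportion to how far the load exceeds the threshold.
    excess = (measured_power - threshold) / threshold
    return {"overload": True, "output_scale": max(0.0, 1.0 - gain * excess)}

print(control_signal(120.0, 100.0))  # 20% over threshold -> output scaled down
print(control_signal(80.0, 100.0))  # under threshold -> full output
```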

  15. Distributive outcomes matter

    DEFF Research Database (Denmark)

    Svenningsen, Lea Skræp

    ... to make 16 donation choices among different climate policy options. The climate policies are described in terms of two main outcome variables: future effects on income in 2100 and present co-benefits from mitigation action. Both outcomes are described for three specific regions of the world: Western Europe, Southeast Asia and Sub-Saharan Africa. For each participant, one policy choice was drawn at random to be realised, and the total amount donated by participants was used to purchase and withdraw CO2 quotas and credits in the European Emission Trading Scheme and as a donation to the UN Adaptation Fund. A random parameter logit model shows that distributional concerns matter for people when they donate to climate policy and that elements of both inequity aversion and general altruism influence the choice of climate policy. The results underscore the importance of considering preferences...

  16. Distributed usability evaluation

    DEFF Research Database (Denmark)

    Christensen, Lars; Frøkjær, Erik

    2010-01-01

    We present DUE (Distributed Usability Evaluation), a technique for collecting and evaluating usability data. The DUE infrastructure involves a client-server network. A client-based tool resides on the workstation of each user, providing screen video recording and microphone input of voice commentary, and giving access to an evaluator (usability expert) and to product developers or managers who want to review the incidents and analyse them. DUE supports evaluation in the development stages from running prototypes and onwards. A case study of the use of DUE in a corporate environment is presented. The study indicates that the DUE technique is effective in terms of low bias, high efficiency, and clear communication of usability issues among users, evaluators and developers. Further, DUE supports long-term evaluations, making possible empirical studies of learnability.

  17. Laplacians on smooth distributions

    Science.gov (United States)

    Kordyukov, Yu. A.

    2017-10-01

    Let M be a compact smooth manifold equipped with a positive smooth density μ and let H be a smooth distribution endowed with a fibrewise inner product g. We define the Laplacian Δ_H associated with (H, μ, g) and prove that it gives rise to an unbounded self-adjoint operator in L²(M, μ). Then, assuming that H generates a singular foliation F, we prove that, for any function φ in the Schwartz space S(ℝ), the operator φ(Δ_H) is a smoothing operator in the scale of longitudinal Sobolev spaces associated with F. The proofs are based on pseudodifferential calculus on singular foliations, which was developed by Androulidakis and Skandalis, and on subelliptic estimates for Δ_H. Bibliography: 35 titles.
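
    A standard way to define such a Laplacian (a sketch of the usual construction; the paper's precise definition may differ in details) is through the horizontal Dirichlet form: Δ_H is the non-negative self-adjoint operator satisfying

```latex
\langle \Delta_H u,\, u \rangle_{L^2(M,\mu)} \;=\; \int_M |d_H u|_g^2 \, d\mu,
\qquad u \in C^\infty(M),
```

    where d_H u denotes the restriction of the differential du to the subbundle H, with its length measured by the fibrewise inner product g.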

  18. Density Distribution Sunflower Plots

    Directory of Open Access Journals (Sweden)

    William D. Dupont

    2003-01-01

    Density distribution sunflower plots are used to display high-density bivariate data. They are useful for data where a conventional scatter plot is difficult to read due to overstriking of the plot symbol. The x-y plane is subdivided into a lattice of regular hexagonal bins of width w specified by the user. The user also specifies the values of l, d, and k that affect the plot as follows. Individual observations are plotted when there are fewer than l observations per bin, as in a conventional scatter plot. Each bin with from l to d observations contains a light sunflower. Other bins contain a dark sunflower. In a light sunflower each petal represents one observation. In a dark sunflower, each petal represents k observations. (A dark sunflower with p petals represents between pk - k/2 and pk + k/2 observations.) The user can control the sizes and colors of the sunflowers. By selecting appropriate colors and sizes for the light and dark sunflowers, plots can be obtained that give both the overall sense of the data density distribution as well as the number of data points in any given region. The use of this graphic is illustrated with data from the Framingham Heart Study. A documented Stata program, called sunflower, is available to draw these graphs. It can be downloaded from the Statistical Software Components archive at http://ideas.repec.org/c/boc/bocode/s430201.html . (Journal of Statistical Software 2003; 8(3): 1-5. Posted at http://www.jstatsoft.org/index.php?vol=8 .)
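
    The l/d/k plotting rules in the abstract reduce to a small per-bin decision function. A minimal sketch, with illustrative default thresholds (the abstract leaves the defaults to the user):

```python
def bin_glyph(count, l=3, d=13, k=5):
    """Decide how one hexagonal bin is drawn in a density distribution
    sunflower plot: fewer than l points -> plot points individually;
    l..d points -> light sunflower, one petal per observation;
    more -> dark sunflower, one petal per k observations (rounded, so a
    dark sunflower with p petals covers pk - k/2 .. pk + k/2 points)."""
    if count < l:
        return ("individual", count)
    if count <= d:
        return ("light", count)           # one petal per observation
    return ("dark", round(count / k))     # one petal per k observations

print(bin_glyph(2), bin_glyph(8), bin_glyph(40))
# ('individual', 2) ('light', 8) ('dark', 8)
```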

  19. Zhengzhou Distribution Research and Countermeasures

    Directory of Open Access Journals (Sweden)

    Cao Wujun

    2017-01-01

    With the acceleration of urbanization in China, urban logistics has become increasingly important. This paper analyzes the current development of logistics and distribution in Zhengzhou, uses the SWOT model to analyze the strengths, weaknesses, opportunities and challenges of Zhengzhou logistics and distribution, presents Zhengzhou's three-tier distribution network architecture, and finally summarizes countermeasures and suggestions for Zhengzhou logistics distribution.

  20. Distributed terascale volume visualization using distributed shared virtual memory

    KAUST Repository

    Beyer, Johanna

    2011-10-01

    Table 1 illustrates the impact of different distribution unit sizes, different screen resolutions, and numbers of GPU nodes. We use two and four GPUs (NVIDIA Quadro 5000 with 2.5 GB memory) and a mouse cortex EM dataset (see Figure 2) of resolution 21,494 x 25,790 x 1,850 = 955GB. The size of the virtual distribution units significantly influences the data distribution between nodes. Small distribution units result in a high depth complexity for compositing. Large distribution units lead to a low utilization of GPUs, because in the worst case only a single distribution unit will be in view, which is rendered by only a single node. The choice of an optimal distribution unit size depends on three major factors: the output screen resolution, the block cache size on each node, and the number of nodes. Currently, we are working on optimizing the compositing step and network communication between nodes. © 2011 IEEE.

  1. MULTICHANNEL DISTRIBUTION METER: A VERITABLE ...

    African Journals Online (AJOL)

    eobe

    Keywords: Apartments, Power Theft, Multi-channel, Microcomputers, Distribution. 1. INTRODUCTION. Quality power distribution is at present a major challenge in many locations in most developing countries [1]. This is a result of many factors, which range from difficulties of how distribution wires.

  2. Statistical distribution of quantum particles

    Indian Academy of Sciences (India)

    In this work, the statistical distribution functions for bosons, fermions and their mixtures have been derived, and it is found that the distribution functions follow the symmetry features of the β distribution. If the occupation index is greater than unity, then it is easy in the present approach to visualise condensations in terms of intermediate ...

  3. Distributed visualization framework architecture

    Science.gov (United States)

    Mishchenko, Oleg; Raman, Sundaresan; Crawfis, Roger

    2010-01-01

    An architecture for distributed and collaborative visualization is presented. The design goals of the system are to create a lightweight, easy to use and extensible framework for research in scientific visualization. The system provides both a single-user and a collaborative distributed environment. System architecture employs a client-server model. Visualization projects can be synchronously accessed and modified from different client machines. We present a set of visualization use cases that illustrate the flexibility of our system. The framework provides a rich set of reusable components for creating new applications. These components make heavy use of leading design patterns. All components are based on the functionality of a small set of interfaces. This allows new components to be integrated seamlessly with little to no effort. All user input and higher-level control functionality interface with proxy objects supporting a concrete implementation of these interfaces. These light-weight objects can be easily streamed across the web and even integrated with smart clients running on a user's cell phone. The back-end is supported by concrete implementations wherever needed (for instance for rendering). A middle-tier manages any communication and synchronization with the proxy objects. In addition to the data components, we have developed several first-class GUI components for visualization. These include a layer compositor editor, a programmable shader editor, a material editor and various drawable editors. These GUI components interact strictly with the interfaces. Access to the various entities in the system is provided by an AssetManager. The asset manager keeps track of all of the registered proxies and responds to queries on the overall system. This allows all user components to be populated automatically. Hence if a new component is added that supports the IMaterial interface, any instances of this can be used in the various GUI components that work with this

  4. Review of criteria for the selection of probability distributions for wind speed data and introduction of the moment and L-moment ratio diagram methods, with a case study

    International Nuclear Information System (INIS)

    Ouarda, T.B.M.J.; Charron, C.; Chebana, F.

    2016-01-01

    Highlights: • Review of criteria used to select probability distributions to model wind speed data. • Classical and L-moment ratio diagrams are applied to wind speed data. • The diagrams allow to select the best distribution to model each wind speed sample. • The goodness-of-fit statistics are more consistent with the L-moment ratio diagram. - Abstract: This paper reviews the different criteria used in the field of wind energy to compare the goodness-of-fit of candidate probability density functions (pdfs) to wind speed records, and discusses their advantages and disadvantages. The moment ratio and L-moment ratio diagram methods are also proposed as alternative methods for the choice of the pdfs. These two methods have the advantage of allowing an easy comparison of the fit of several pdfs for several time series (stations) on a single diagram. Plotting the position of a given wind speed data set in these diagrams is instantaneous and provides more information than a goodness-of-fit criterion since it provides knowledge about such characteristics as the skewness and kurtosis of the station data set. In this paper, it is proposed to study the applicability of these two methods for the selection of pdfs for wind speed data. Both types of diagrams are used to assess the fit of the pdfs for wind speed series in the United Arab Emirates. The analysis of the moment ratio diagrams reveals that the Kappa, Log-Pearson type III and Generalized Gamma are the distributions that fit best all wind speed series. The Weibull represents the best distribution among those with only one shape parameter. Results obtained with the diagrams are compared with those obtained with goodness-of-fit statistics and a good agreement is observed especially in the case of the L-moment ratio diagram. It is concluded that these diagrams can represent a simple and efficient approach to be used as complementary method to goodness-of-fit criteria.
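
    An L-moment ratio diagram plots each station's sample L-skewness (t3) against its L-kurtosis (t4) on top of the theoretical curves of candidate distributions. A minimal sketch of computing (t3, t4) from probability-weighted moments, checked against the exponential distribution's known theoretical point (1/3, 1/6); the sample here is simulated, not wind speed data:

```python
def l_moment_ratios(sample):
    """Sample L-skewness t3 and L-kurtosis t4 via probability-weighted
    moments b_r; plotting (t3, t4) per station gives the L-moment ratio
    diagram used to pick a distribution family."""
    x = sorted(sample)
    n = len(x)
    b = [0.0] * 4
    for i, xi in enumerate(x):  # i = 0..n-1 over the ascending order statistics
        b[0] += xi
        b[1] += xi * i / (n - 1)
        b[2] += xi * i * (i - 1) / ((n - 1) * (n - 2))
        b[3] += xi * i * (i - 1) * (i - 2) / ((n - 1) * (n - 2) * (n - 3))
    b = [bi / n for bi in b]
    l2 = 2 * b[1] - b[0]
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return l3 / l2, l4 / l2

# Exponential samples should land near the theoretical point (1/3, 1/6).
import random
rng = random.Random(3)
t3, t4 = l_moment_ratios([rng.expovariate(1.0) for _ in range(50_000)])
print(round(t3, 2), round(t4, 2))
```

Because the ratios are dimensionless, many stations can be compared on one diagram regardless of their mean wind speeds, which is the advantage the abstract highlights over single-sample goodness-of-fit statistics.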

  5. LHCb Distributed Conditions Database

    CERN Document Server

    Clemencic, Marco

    2007-01-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which are running on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored on LFC (LCG File Catalog) and managed with the interface provided by the LCG developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications are using the Conditions Database framework on a production basis since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions handling functionalities) and the distribution framework itself. Stress tests on the CNAF hosted replica o...

  6. Distributed Merge Trees

    Energy Technology Data Exchange (ETDEWEB)

    Morozov, Dmitriy; Weber, Gunther

    2013-01-08

    Improved simulations and sensors are producing datasets whose increasing complexity exhausts our ability to visualize and comprehend them directly. To cope with this problem, we can detect and extract significant features in the data and use them as the basis for subsequent analysis. Topological methods are valuable in this context because they provide robust and general feature definitions. As the growth of serial computational power has stalled, data analysis is becoming increasingly dependent on massively parallel machines. To satisfy the computational demand created by complex datasets, algorithms need to effectively utilize these computer architectures. The main strength of topological methods, their emphasis on global information, turns into an obstacle during parallelization. We present two approaches to alleviate this problem. We develop a distributed representation of the merge tree that avoids computing the global tree on a single processor and lets us parallelize subsequent queries. To account for the increasing number of cores per processor, we develop a new data structure that lets us take advantage of multiple shared-memory cores to parallelize the work on a single node. Finally, we present experiments that illustrate the strengths of our approach as well as help identify future challenges.

  7. Distributed Deliberative Recommender Systems

    Science.gov (United States)

    Recio-García, Juan A.; Díaz-Agudo, Belén; González-Sanz, Sergio; Sanchez, Lara Quijano

    Case-Based Reasoning (CBR) is one of the most successful applied AI technologies of recent years. Although many CBR systems reason locally on a previous experience base to solve new problems, in this paper we focus on distributed retrieval processes working on a network of collaborating CBR systems. In such systems, each node in a network of CBR agents collaborates with other nodes, exchanging arguments and counterarguments about its local results, to improve the performance of the system's global response. We describe D2ISCO: a framework to design and implement deliberative and collaborative CBR systems that is integrated as a part of jcolibritwo, an established framework in the CBR community. We apply D2ISCO to one particular simplified type of CBR system: recommender systems. We perform a first case study for a collaborative music recommender system and present the results of an experiment on the accuracy of the system results using a fuzzy version of the argumentation system AMAL and a network topology based on a social network. Besides individual recommendation, we also discuss how D2ISCO can be used to improve recommendations to groups, and we present a second case study based on the movie recommendation domain with heterogeneous groups according to the group personality composition and a group topology based on a social network.

  8. LHCb distributed conditions database

    International Nuclear Information System (INIS)

    Clemencic, M

    2008-01-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which run on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored in the LFC (LCG File Catalog) and managed with the interface provided by the LCG-developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications have been using the Conditions Database framework in production since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions-handling functionality) and the distribution framework itself. Stress tests on the CNAF-hosted replica of the Conditions Database have been performed and the results will be summarized here

  9. Protection of Distribution Systems with Distributed Energy Resources

    DEFF Research Database (Denmark)

    Bak-Jensen, Birgitte; Browne, Matthew; Calone, Roberto

    The usage of Distributed Energy Resources (DER) in utilities around the world is expected to increase significantly. The existing distribution systems have been generally designed for unidirectional power flow, and feeders are opened and locked out for any fault within. However, in the future...... this practice may lead to a loss of significant generation where each feeder may have significant DER penetration. Also, utilities have started to investigate islanding operation of distribution systems with DER as a way to improve the reliability of the power supply to customers. This report is the result...... of 17 months of work of the Joint Working Group B5/C6.26/CIRED “Protection of Distribution Systems with Distributed Energy Resources”. The working group used the CIGRE report TB421 “The impact of Renewable Energy Sources and Distributed Generation on Substation Protection and Automation”, published...

  10. Distribution effects of electricity tax illustrated by different distribution concepts

    International Nuclear Information System (INIS)

    Halvorsen, Bente; Larsen, Bodil M.; Nesbakken, Runa

    2001-01-01

    This study demonstrates the significance of the choice of distribution concept in analyses of the distribution effects of electricity tax. Distribution effects here mean changes in households' life circumstances. The focus is on different income concepts, income being an important element in the life circumstances of households. The distribution effects are studied by focusing on general income before and after tax, pensionable earnings before and after tax, and total consumption expenditure. The authors study how increased electricity expenses caused by a proportional increase of the electricity tax affect households in various income groups. It is found that the burden of such a tax increase, measured by the budget share set aside for electricity, decreases with income no matter which distribution concept is used. By calculating measures of inequality for income minus electricity tax before and after the tax increase, it is found that the measures of inequality depend significantly on the choice of distribution concept
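    The inequality measures this abstract relies on (e.g. the Gini index) can be computed directly from an income sample. A minimal sketch with made-up incomes, not the study's data, using a stylized flat charge rather than the paper's proportional tax increase:

    ```python
    def gini(incomes):
        """Gini index via the sorted-sample formula:
        G = (2 * sum_i i*x_i) / (n^2 * mean) - (n + 1) / n, i = 1..n ascending."""
        xs = sorted(incomes)
        n = len(xs)
        mean = sum(xs) / n
        total = sum((i + 1) * x for i, x in enumerate(xs))
        return 2 * total / (n * n * mean) - (n + 1) / n

    # Effect of a flat electricity charge on measured inequality:
    incomes = [120, 250, 300, 480, 900]   # hypothetical household incomes
    tax = 30                              # same absolute charge for every household
    after = [x - tax for x in incomes]
    print(gini(incomes), gini(after))     # inequality rises under a flat charge
    ```

    The comparison of the index before and after the charge is exactly the kind of distribution-concept-sensitive calculation the study performs on real household data.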

  11. Income Distribution Impacts of Irrigation Water Distribution Policy

    Science.gov (United States)

    Sampath, Rajan K.

    1984-06-01

    In the majority of less developed countries (LDCs) there is acute inequality in income distribution in the rural sector, particularly between large and small farms on the one hand and between landowners and the landless on the other. The irrigation water distribution policy of a government is both an economic and a political problem: it has both equity and efficiency implications, and it affects both the level and the distribution of income. This paper deals with the conditions under which water redistribution can be used as an effective governmental policy variable to reduce inequality in the distribution of income. It also examines the relationship between the objectives of equity and efficiency in water distribution under different objective realities, such as dualistic versus nondualistic conditions, two-sector versus three-sector modeling, and optimum versus equal water distribution, specifically to derive the conditions under which promoting equity promotes efficiency and vice versa, and the conditions under which it does not.

  12. Performance of Distributed CFAR Processors in Pearson Distributed Clutter

    Directory of Open Access Journals (Sweden)

    Faouzi Soltani

    2007-01-01

    Full Text Available This paper deals with the distributed constant false alarm rate (CFAR) radar detection of targets embedded in heavy-tailed Pearson distributed clutter. In particular, we extend the results obtained for the cell averaging (CA), order statistics (OS), and censored mean level detector (CMLD) CFAR processors operating on positive alpha-stable (P&S) random variables to more general situations, specifically to the presence of interfering targets and distributed CFAR detectors. The receiver operating characteristics of the greatest-of (GO) and smallest-of (SO) CFAR processors are also determined. The performance characteristics of distributed systems are presented and compared, both in homogeneous situations and in the presence of interfering targets. We demonstrate, via simulation results, that when the clutter is modelled as a positive alpha-stable distribution, the distributed systems offer robustness against multiple-target situations, especially when using the “OR” fusion rule.
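    The CA processor this record extends is simple to sketch. A minimal cell-averaging CFAR detector in Python; note that the threshold factor below assumes exponentially distributed noise power, not the Pearson clutter studied in the paper, and all parameter values are illustrative:

    ```python
    def ca_cfar(samples, guard, train, pfa):
        """Cell-averaging CFAR: for each cell under test (CUT), estimate the
        noise level from `train` cells on each side (skipping `guard` cells)
        and declare a detection when the CUT exceeds alpha times that estimate.
        alpha is the classic CA-CFAR factor for exponential noise power."""
        n = len(samples)
        ntrain = 2 * train
        alpha = ntrain * (pfa ** (-1.0 / ntrain) - 1.0)
        hits = []
        for i in range(guard + train, n - guard - train):
            lead = samples[i - guard - train : i - guard]
            lag = samples[i + guard + 1 : i + guard + train + 1]
            noise = sum(lead + lag) / ntrain
            if samples[i] > alpha * noise:
                hits.append(i)
        return hits

    # Flat noise floor with one strong target at index 20:
    power = [1.0] * 40
    power[20] = 50.0
    print(ca_cfar(power, 2, 8, 1e-3))   # -> [20]
    ```

    A distributed system of the kind studied in the paper would run several such detectors and fuse their binary decisions, e.g. with the "OR" rule mentioned in the abstract.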

  13. Performance of Distributed CFAR Processors in Pearson Distributed Clutter

    Directory of Open Access Journals (Sweden)

    Messali Zoubeida

    2007-01-01

    Full Text Available This paper deals with the distributed constant false alarm rate (CFAR) radar detection of targets embedded in heavy-tailed Pearson distributed clutter. In particular, we extend the results obtained for the cell averaging (CA), order statistics (OS), and censored mean level detector (CMLD) CFAR processors operating on positive alpha-stable (P&S) random variables to more general situations, specifically to the presence of interfering targets and distributed CFAR detectors. The receiver operating characteristics of the greatest-of (GO) and smallest-of (SO) CFAR processors are also determined. The performance characteristics of distributed systems are presented and compared, both in homogeneous situations and in the presence of interfering targets. We demonstrate, via simulation results, that when the clutter is modelled as a positive alpha-stable distribution, the distributed systems offer robustness against multiple-target situations, especially when using the "OR" fusion rule.

  14. Transformation of Bayesian posterior distribution into a basic analytical distribution

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Vrbanic, I.

    2002-01-01

    Bayesian estimation is a well-known approach that is widely used in Probabilistic Safety Analyses for the estimation of input model reliability parameters, such as component failure rates or probabilities of failure upon demand. In this approach, a prior distribution, which contains some generic knowledge about a parameter, is combined with a likelihood function, which contains plant-specific data about the parameter. Depending on the type of prior distribution, the resulting posterior distribution can be estimated numerically or analytically. In many instances only a numerical Bayesian integration can be performed. In such a case the posterior is provided in the form of a tabular discrete distribution. On the other hand, it is much more convenient for a parameter's uncertainty distribution that is input into a PSA model to be provided in the form of some basic analytical probability distribution, such as a lognormal, gamma or beta distribution. One reason is that this enables much more convenient propagation of parameter uncertainties through the model up to the so-called top events, such as plant system unavailability or core damage frequency. Additionally, software tools used to run PSA models often require that a parameter's uncertainty distribution be defined as one of several allowed basic types of distributions. In such a case the posterior distribution that comes as a product of Bayesian estimation needs to be transformed into an appropriate basic analytical form. In this paper, some approaches to transforming a posterior distribution into a basic probability distribution are proposed and discussed. They are illustrated by an example from the NPP Krsko PSA model. (author)
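    One simple way to carry out the transformation the abstract describes, mapping a tabular posterior onto a basic analytical distribution, is moment matching. A sketch for the gamma case; the tabular posterior below is hypothetical, and the paper's actual procedure may differ:

    ```python
    def fit_gamma_by_moments(values, probs):
        """Match a tabular (discrete) posterior to a gamma(shape k, scale theta)
        with the same mean and variance: mean = k*theta, var = k*theta**2."""
        total = sum(probs)
        probs = [p / total for p in probs]   # normalize the tabular posterior
        mean = sum(v * p for v, p in zip(values, probs))
        var = sum((v - mean) ** 2 * p for v, p in zip(values, probs))
        theta = var / mean
        k = mean / theta
        return k, theta

    # Hypothetical tabular posterior for a failure rate (per hour):
    lam = [1e-4, 2e-4, 3e-4, 4e-4]
    post = [0.1, 0.4, 0.35, 0.15]
    k, theta = fit_gamma_by_moments(lam, post)
    print(k, theta, k * theta)   # k*theta reproduces the posterior mean
    ```

    The fitted gamma can then be propagated through the PSA model with ordinary analytical or Monte Carlo uncertainty propagation, which is the convenience the abstract points out.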

  15. Multicriteria Reconfiguration of Distribution Network with Distributed Generation

    OpenAIRE

    Voropai, N. I.; Bat-Undraal, B.

    2012-01-01

    The paper addresses the problem of multicriteria reconfiguration of distribution network with distributed generation according to the criterion of minimum power loss under normal conditions and the criterion of power supply reliability under postemergency conditions. Efficient heuristic and multicriteria methods are used to solve the problem including advanced ant colony algorithm for minimum loss reconfiguration of distribution network, the sorting-out algorithm of cell formation for island ...

  16. Superclusters and hadronic multiplicity distributions

    International Nuclear Information System (INIS)

    Shih, C.C.; Carruthers, P.

    1986-01-01

    The multiplicity distribution is expressed in terms of supercluster production in hadronic processes at high energy. This process creates unstable clusters at intermediate stages and hadrons in the final stage. It includes Poisson-transform distributions (with the partially coherent distribution as a special case) and is very flexible for phenomenological analyses. The associated Koba, Nielsen, and Olesen limit and the behavior of cumulant moments are analyzed in detail for finite and/or infinite cluster size and particle size per cluster. In general, a supercluster distribution does not need to be equivalent to a negative binomial distribution to fit experimental data well. Furthermore, the requirement of such equivalence leads to many solutions, in which the average size of the cluster is not logarithmic: e.g., it may show a power behavior instead. Superclustering is defined as a two- or multi-stage process underlying observed global multiplicity distributions. At the first stage of the production process, individual clusters are produced according to a given statistical law. For example, the clustering distribution may be described by partially coherent (or even sub-Poissonian) distribution models. At the second stage, the clusters are considered as the sources of particle production. The corresponding distribution may then be as general as the clustering distribution just mentioned. 8 refs

  17. Loss Allocation in a Distribution System with Distributed Generation Units

    DEFF Research Database (Denmark)

    Lund, Torsten; Nielsen, Arne Hejde; Sørensen, Poul Ejnar

    2007-01-01

    In Denmark, a large part of the electricity is produced by wind turbines and combined heat and power plants (CHPs). Most of them are connected to the network through distribution systems. This paper presents a new algorithm for allocation of the losses in a distribution system with distributed...... generation. The algorithm is based on a reduced impedance matrix of the network and current injections from loads and production units. With the algorithm, the effect of the covariance between production and consumption can be evaluated. To verify the theoretical results, a model of the distribution system...
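    An allocation scheme built from a network impedance matrix and bus current injections, in the spirit of the algorithm described above though not necessarily the authors' exact formulation, can be sketched as follows (impedances and injections are made up):

    ```python
    def zbus_loss_allocation(Z, I):
        """Allocate the total network loss Re{conj(I)^T Z I} to individual buses:
        L_k = Re{ conj(I_k) * sum_j R_kj * I_j }, with R the real part of Z.
        (Z-bus style loss allocation; a generic sketch.)"""
        n = len(I)
        R = [[Z[k][j].real for j in range(n)] for k in range(n)]
        return [(I[k].conjugate() * sum(R[k][j] * I[j] for j in range(n))).real
                for k in range(n)]

    # Tiny 2-bus illustration (per-unit values, hypothetical network):
    Z = [[0.02 + 0.06j, 0.01 + 0.03j],
         [0.01 + 0.03j, 0.025 + 0.075j]]
    I = [1.0 - 0.2j, -0.6 + 0.1j]   # load and generation injections
    shares = zbus_loss_allocation(Z, I)
    total = sum(shares)
    print(shares, total)   # the bus shares sum to the total network loss
    ```

    The covariance effect between production and consumption that the paper evaluates would enter through the time series of the injections I, with the allocation recomputed or averaged over time.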

  18. Distributed Embodied Team Play, a Distributed Interactive Pong Playground

    OpenAIRE

    van Delden, Robertus Wilhelmus; Gerritsen, Steven; Reidsma, Dennis; Heylen, Dirk K.J.; Poppe, Ronald; Meyer, John-Jules; Veltkamp, Remco; Destani, Mehdi

    2016-01-01

    This paper presents work in the field of distributed exertion games, which are controlled by moving the body. People play these games together while being located at different places in the world. The novel contribution of this paper is the introduction of distributed team play in which both collocated and distributed players participate. In our Distributed Interactive Pong Playground (DIPP) players bounce a ball towards a goal by moving, walking, and running around in a 5.3 by 5.3 meter inte...

  19. Constraining the double gluon distribution by the single gluon distribution

    Energy Technology Data Exchange (ETDEWEB)

    Golec-Biernat, Krzysztof [Institute of Nuclear Physics Polish Academy of Sciences, 31-342 Cracow (Poland); Faculty of Mathematics and Natural Sciences, University of Rzeszów, 35-959 Rzeszów (Poland); Lewandowska, Emilia; Serino, Mirko [Institute of Nuclear Physics Polish Academy of Sciences, 31-342 Cracow (Poland); Snyder, Zachary [Penn State University, University Park, PA 16802 (United States); Staśto, Anna M., E-mail: astasto@phys.psu.edu [Institute of Nuclear Physics Polish Academy of Sciences, 31-342 Cracow (Poland); Penn State University, University Park, PA 16802 (United States)

    2015-11-12

    We show how to consistently construct initial conditions for the QCD evolution equations for double parton distribution functions in the pure gluon case. We use the momentum sum rule for this purpose and a specific form of the known single gluon distribution function in the MSTW parameterization. The resulting double gluon distribution satisfies the momentum sum rule exactly and is parameter free. We also study numerically its evolution with a hard scale and show the approximate factorization into a product of two single gluon distributions at small values of x, whereas at large values of x the factorization is always violated, in agreement with the sum rule.

  20. 75 FR 8920 - Grant of Authority for Subzone Status; IKEA Distribution Services (Distribution of Home...

    Science.gov (United States)

    2010-02-26

    ... Status; IKEA Distribution Services (Distribution of Home Furnishings and Accessories); Baltimore, MD... subzone at the warehouse and distribution facility of IKEA Distribution Services, located in Perryville... and distribution at the facility of IKEA Distribution Services, located in Perryville, Maryland...

  1. CEATI distribution roadmap : what's next?

    International Nuclear Information System (INIS)

    2006-01-01

    The future of the electric distribution utility environment over the next 20 years was discussed. A study was conducted to assist utilities in developing future implementation plans. Twenty-one scenarios were created in order to obtain a list of technologies that may impact the future of the distribution grid. Scenarios considered potential policies and regulation, and investigated technologies required to implement each scenario. The scenarios considered future energy markets; business environments; distribution assets; and workforce developments. A distribution value chain classification was used to identify potential synergies. Results of the study showed that distributed resources will become more important in the next 20 years. Employee and system safety will require active consideration as the electricity grid becomes more complex. A second phase of the project will identify key technologies, common infrastructure needs, and guidelines for transforming distribution utilities in the future

  2. Method of forecasting power distribution

    International Nuclear Information System (INIS)

    Kaneto, Kunikazu.

    1981-01-01

    Purpose: To obtain forecasting results at high accuracy by reflecting the signals from neutron detectors disposed in the reactor core on the forecasting results. Method: An on-line computer transfers, to a simulator, those process data such as temperature and flow rate for coolants in each of the sections and various measuring signals such as control rod positions from the nuclear reactor. The simulator calculates the present power distribution before the control operation. The signals from the neutron detectors at each of the positions in the reactor core are estimated from the power distribution and errors are determined based on the estimated values and the measured values to determine the smooth error distribution in the axial direction. Then, input conditions at the time to be forecast are set by a data setter. The simulator calculates the forecast power distribution after the control operation based on the set conditions. The forecast power distribution is corrected using the error distribution. (Yoshino, Y.)

  3. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and many num...

  4. Degree distribution in discrete case

    International Nuclear Information System (INIS)

    Wang, Li-Na; Chen, Bin; Yan, Zai-Zai

    2011-01-01

    Vertex degree in many network models and real-life networks is limited to non-negative integers. By means of measure and integral, the relation between the degree distribution and the cumulative degree distribution in the discrete case is analyzed. The degree distribution, obtained by differentiating its cumulative, is only suitable for the continuous case or the discrete case with constant degree change. When the degree change is not constant but proportional to the degree itself, the power-law degree distribution and its cumulative have the same exponent, and the mean value is finite for power-law exponents greater than 1. -- Highlights: → Degree change is the crux of using the cumulative degree distribution method. → It suits the discrete case with constant degree change. → If degree change is proportional to degree, the power-law degree distribution and its cumulative have the same exponent. → In addition, the mean value is finite for power-law exponents greater than 1.

  5. Exploring Trajectories of Distributed Development

    DEFF Research Database (Denmark)

    Slepniov, Dmitrij; Wæhrens, Brian Vejrum; Niang, Mohamed

    2014-01-01

    While some firms have successfully turned their global operations into a formidable source of competitive advantage, others have failed to do so. A lot depends on which activities are globally distributed and how they are configured and coordinated. An emerging body of literature and practice suggests...... of practices used by the companies in order to achieve control and coordination of distributed development activities. Three propositions are developed to advance our understanding of the continual search for an optimal organizational form for managing distributed development....

  6. On the Folded Normal Distribution

    Directory of Open Access Journals (Sweden)

    Michail Tsagris

    2014-02-01

    Full Text Available The characteristic function of the folded normal distribution and its moment function are derived. The entropy of the folded normal distribution and the Kullback–Leibler divergences from the normal and half-normal distributions are approximated using Taylor series. The accuracy of the results is also assessed using different criteria. The maximum likelihood estimates and confidence intervals for the parameters are obtained using asymptotic theory and the bootstrap method. The coverage of the confidence intervals is also examined.
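    The folded normal moments discussed above have closed forms. A quick sketch checking the standard mean formula against simulation (the parameter values are arbitrary):

    ```python
    import math
    import random

    def folded_normal_mean(mu, sigma):
        """Closed-form mean of |X| for X ~ N(mu, sigma^2):
        E|X| = sigma*sqrt(2/pi)*exp(-mu^2/(2*sigma^2)) + mu*erf(mu/(sigma*sqrt(2)))."""
        return (sigma * math.sqrt(2.0 / math.pi) * math.exp(-mu**2 / (2 * sigma**2))
                + mu * math.erf(mu / (sigma * math.sqrt(2))))

    # Monte Carlo check of the formula:
    random.seed(0)
    mu, sigma = 1.3, 0.7
    sim = sum(abs(random.gauss(mu, sigma)) for _ in range(200_000)) / 200_000
    print(folded_normal_mean(mu, sigma), sim)   # the two agree to ~1e-2
    ```

    The same pattern (closed form versus simulation) is a cheap sanity check for the entropy and Kullback–Leibler approximations the paper derives.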

  7. Moment Distributions of Phase Type

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2011-01-01

    Moment distributions of phase-type and matrix-exponential distributions are shown to remain within their respective classes. We provide a probabilistic phase-type representation for the former case and an alternative representation, with an analytically appealing form, for the latter. First order...... moment distributions are of special interest in areas like demography and economics, and we calculate explicit formulas for the Lorenz curve and Gini index used in these disciplines....

  8. Statistical distribution of quantum particles

    Science.gov (United States)

    Khasare, S. B.; Khasare, Shashank S.

    2018-03-01

    In this work, the statistical distribution functions for bosons, fermions and their mixtures have been derived, and it is found that the distribution functions follow the symmetry features of the β distribution. If the occupation index is greater than unity, then it is easy in the present approach to visualise condensation in terms of intermediate values of the mixing parameter. There are some applications of intermediate values of the mixing parameter.
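    The standard occupation numbers that such distribution functions interpolate between can be sketched directly. Here the statistics are reduced to the fixed ±1 sign; the abstract's intermediate mixing parameter would interpolate between the two cases:

    ```python
    import math

    def occupation(e, mu, kT, kind):
        """Mean occupation number of a single-particle level at energy e:
        Bose-Einstein: 1/(exp((e-mu)/kT) - 1); Fermi-Dirac: 1/(exp((e-mu)/kT) + 1)."""
        s = {"bose": -1.0, "fermi": +1.0}[kind]
        return 1.0 / (math.exp((e - mu) / kT) + s)

    # At the chemical potential, fermions occupy the level with probability 1/2:
    print(occupation(1.0, 1.0, 0.1, "fermi"))   # -> 0.5
    # Bosons pile up as e approaches mu from above (occupation index > 1):
    print(occupation(1.05, 1.0, 0.1, "bose"))
    ```

    An occupation index above unity, as in the second call, is exactly the regime where the abstract's approach visualises condensation.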

  9. Distributed Energy Resources Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NREL's Distributed Energy Resources Test Facility (DERTF) is a working laboratory for interconnection and systems integration testing. This state-of-the-art facility...

  10. On positivity of parton distributions

    International Nuclear Information System (INIS)

    Altarelli, G.; Forte, S.; Ridolfi, G.

    1998-01-01

    We discuss the bounds on polarized parton distributions which follow from their definition in terms of cross section asymmetries. We spell out how the bounds obtained in the naive parton model can be derived within perturbative QCD at leading order when all quark and gluon distributions are defined in terms of suitable physical processes. We specify a convenient physical definition for the polarized and unpolarized gluon distributions in terms of Higgs production from gluon fusion. We show that these bounds are modified by subleading corrections, and we determine them up to NLO. We examine the ensuing phenomenological implications, in particular in view of the determination of the polarized gluon distribution. (orig.)

  11. On positivity of parton distributions

    CERN Document Server

    Altarelli, Guido; Ridolfi, G; Altarelli, Guido; Forte, Stefano; Ridolfi, Giovanni

    1998-01-01

    We discuss the bounds on polarized parton distributions which follow from their definition in terms of cross section asymmetries. We spell out how the bounds obtained in the naive parton model can be derived within perturbative QCD at leading order when all quark and gluon distributions are defined in terms of suitable physical processes. We specify a convenient physical definition for the polarized and unpolarized gluon distributions in terms of Higgs production from gluon fusion. We show that these bounds are modified by subleading corrections, and we determine them up to NLO. We examine the ensuing phenomenological implications, in particular in view of the determination of the polarized gluon distribution.

  12. Theoretically Optimal Distributed Anomaly Detection

    Data.gov (United States)

    National Aeronautics and Space Administration — A novel general framework for distributed anomaly detection with theoretical performance guarantees is proposed. Our algorithmic approach combines existing anomaly...

  13. Beam distribution function after filamentation

    Energy Technology Data Exchange (ETDEWEB)

    Raubenheimer, T.O.; Decker, F.J.; Seeman, J.T.

    1995-05-01

    In this paper, the authors calculate the beam distribution function after filamentation (phase-mixing) of a focusing mismatch. This distribution is relevant when interpreting beam measurements and sources of emittance dilution in linear colliders. It is also important when considering methods of diluting the phase space density, which may be required for the machine protection system in future linear colliders, and it is important when studying effects of trapped ions which filament in the electron beam potential. Finally, the resulting distribution is compared with measured beam distributions from the SLAC linac.
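    The phase mixing of a mismatch analyzed in this record can be illustrated with a toy simulation; the mismatch factor, tune, tune shift, and turn count below are invented for illustration and are not taken from the SLAC linac:

    ```python
    import math
    import random

    def rms_emittance(xs, ps):
        """RMS emittance sqrt(<x^2><p^2> - <xp>^2) about the beam centroid."""
        n = len(xs)
        mx, mp = sum(xs) / n, sum(ps) / n
        sxx = sum((x - mx) ** 2 for x in xs) / n
        spp = sum((p - mp) ** 2 for p in ps) / n
        sxp = sum((x - mx) * (p - mp) for x, p in zip(xs, ps)) / n
        return math.sqrt(sxx * spp - sxp * sxp)

    random.seed(1)
    # Mismatched beam: x is 3x wider than the matched size of 1.
    xs = [random.gauss(0, 3.0) for _ in range(20_000)]
    ps = [random.gauss(0, 1.0) for _ in range(20_000)]
    e0 = rms_emittance(xs, ps)

    # Filamentation: an amplitude-dependent phase advance decoheres the
    # mismatch oscillation (the tune shift with amplitude is made up).
    for i in range(len(xs)):
        j = 0.5 * (xs[i] ** 2 + ps[i] ** 2)          # action-like amplitude
        phi = 2 * math.pi * (0.25 + 0.02 * j) * 50   # 50 turns, amplitude-dependent tune
        c, s = math.cos(phi), math.sin(phi)
        xs[i], ps[i] = c * xs[i] + s * ps[i], -s * xs[i] + c * ps[i]

    e1 = rms_emittance(xs, ps)
    print(e0, e1)   # the rms emittance grows as the mismatch phase-mixes
    ```

    After full decoherence the second moments approach the phase-space average of the mismatched beam, which is the filamented distribution the paper computes analytically.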

  14. On Size Biased Kumaraswamy Distribution

    Directory of Open Access Journals (Sweden)

    Dreamlee Sharma

    2016-08-01

    Full Text Available In this paper, we introduce and study the size-biased form of the Kumaraswamy distribution. The Kumaraswamy distribution, which has drawn considerable attention in hydrology and related areas, was proposed by Kumaraswamy. The new distribution is derived under size-biased probability sampling, taking the weights as the variate values. Various distributional and characterizing properties of the model are studied. The methods of maximum likelihood and matching quantiles estimation are employed to estimate the parameters of the proposed model. Finally, we apply the proposed model to simulated and real data sets.
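    The size-biased construction described above, f_w(x) = x f(x) / E[X], can be checked numerically. A sketch using the Kumaraswamy density and its known mean b·B(1 + 1/a, b):

    ```python
    import math

    def kumaraswamy_pdf(x, a, b):
        """Kumaraswamy density on (0,1): f(x) = a*b*x^(a-1)*(1-x^a)^(b-1)."""
        return a * b * x ** (a - 1) * (1 - x ** a) ** (b - 1)

    def kumaraswamy_mean(a, b):
        """E[X] = b * B(1 + 1/a, b), with B the Beta function (via gammas)."""
        return b * math.gamma(1 + 1 / a) * math.gamma(b) / math.gamma(1 + 1 / a + b)

    def size_biased_pdf(x, a, b):
        """Size-biased density: f_w(x) = x * f(x) / E[X]."""
        return x * kumaraswamy_pdf(x, a, b) / kumaraswamy_mean(a, b)

    # Midpoint-rule check that the size-biased density integrates to one:
    a, b, n = 2.0, 3.0, 100_000
    h = 1.0 / n
    area = sum(size_biased_pdf((i + 0.5) * h, a, b) for i in range(n)) * h
    print(area)   # -> approximately 1.0
    ```

    Dividing by the mean is what makes the weighted density proper, which is the defining property of the size-biased form studied in the paper.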

  15. Reliability Evaluation of Distribution System with Distributed Generation

    Science.gov (United States)

    Chen, Guoyan; Zhang, Feng; You, Dahai; Wang, Yong; Lu, Guojun; Zou, Qi; Liu, Hengwei; Qian, Junjie; Xu, Heng

    2017-07-01

    Distribution system reliability assessment is an important part of power system reliability assessment. In recent years, distributed generation (DG) has been connected to distribution systems in growing amounts because of its flexible and environmentally friendly features, which strongly influences distribution system reliability. Hence, a reliability evaluation method suitable for distribution systems with DG is needed, and one is proposed in this paper. First, a probabilistic model of DG output is established based on the generation characteristics of DG. Second, the island operation mode of a distribution system with DG is studied, and a method for calculating the probability of successful island operation is put forward on the basis of the DG model and the load model. Third, a reliability assessment methodology for distribution systems with DG is proposed by improving the traditional minimal path algorithm for distribution system reliability evaluation. Finally, results obtained by applying the proposed method to the IEEE-RBTS Bus6 system are consistent with well-known facts, demonstrating that the method is reasonable and effective.

  16. The Impact of Distributed Generation on Distribution Networks ...

    African Journals Online (AJOL)

    Their advantages are the ability to reduce or postpone the need for investment in the transmission and distribution infrastructure when optimally located; the ability to reduce technical losses within the transmission and distribution networks as well as general improvement in power quality and system reliability. This paper ...

  17. Islanding Operation of Distribution System with Distributed Generations

    DEFF Research Database (Denmark)

    Mahat, Pukar; Chen, Zhe; Bak-Jensen, Birgitte

    2010-01-01

    The growing interest in distributed generations (DGs), due to environmental concern and various other reasons, has resulted in significant penetration of DGs in many distribution systems worldwide. DGs come with many benefits. One of the benefits is improved reliability by supplying load during power...

  18. Kaplan-Meier type distributions for linear contact distributions

    NARCIS (Netherlands)

    Hansen, M.B.; Gill, R.D.; Baddeley, A.

    1996-01-01

    The linear contact distribution function is shown to be continuously differentiable for any stationary random closed set, which implies the existence of a continuous density and hazard rate. Moreover, it is proved that the density is monotone decreasing. When the linear contact distribution function is
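    A Kaplan-Meier type (product-limit) estimator of the kind named in the title can be sketched for ordinary right-censored data; the sample below is made up:

    ```python
    def kaplan_meier(times, events):
        """Kaplan-Meier product-limit estimator.
        times: observed times; events: 1 = event observed, 0 = censored.
        Returns (t, S(t)) pairs at each observed event time."""
        data = sorted(zip(times, events))
        n_at_risk = len(data)
        surv, out = 1.0, []
        i = 0
        while i < len(data):
            t = data[i][0]
            deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
            ties = sum(1 for tt, e in data[i:] if tt == t)
            if deaths:
                surv *= 1.0 - deaths / n_at_risk
                out.append((t, surv))
            n_at_risk -= ties
            i += ties
        return out

    times = [1, 2, 2, 3, 4, 5]
    events = [1, 1, 0, 1, 0, 1]   # 0 marks a censored observation
    result = kaplan_meier(times, events)
    print(result)
    ```

    Censored observations shrink the risk set without producing a step, which is exactly the product-limit behaviour that makes such estimators useful for contact distributions observed through a bounded window.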

  19. Distributed Embodied Team Play, a Distributed Interactive Pong Playground

    NARCIS (Netherlands)

    van Delden, Robertus Wilhelmus; Gerritsen, Steven; Reidsma, Dennis; Heylen, Dirk K.J.; Poppe, Ronald; Meyer, John-Jules; Veltkamp, Remco; Destani, Mehdi

    2016-01-01

    This paper presents work in the field of distributed exertion games, which are controlled by moving the body. People play these games together while being located at different places in the world. The novel contribution of this paper is the introduction of distributed team play in which both

  20. Capacity of Distribution Feeders for Hosting Distributed Energy Resources

    DEFF Research Database (Denmark)

    Papathanassiou, S.; Hatziargyriou, N.; Anagnostopoulos, P.

    The last two decades have seen an unprecedented development of distributed energy resources (DER) all over the world. Several countries have adopted a variety of support schemes (feed-in tariffs, green certificates, direct subsidies, tax exemptions etc.) so as to promote distributed generation (DG...

  1. Comparação de modelos matemáticos para o traçado de curvas granulométricas Comparison of mathematical models for fitting particle-size distribution curves

    Directory of Open Access Journals (Sweden)

    Euzebio Medrado da Silva

    2004-04-01

    Full Text Available Particle-size distribution is fundamental for characterizing construction materials, soil mechanics, soil physics, sediment flux in rivers, and other areas. The techniques used to determine the particle-size distribution of a sample are point-wise, demanding posterior interpolation to fit the complete particle-size distribution curve and to obtain values of specific diameters. The transformation of discrete points into continuous functions can be made by mathematical models. However, there are few studies to determine the best model to fit particle-size distribution curves. The objective of this work was to test and compare 14 different models with feasibility to fit the cumulative particle-size distribution curve based on four measured points. The parameter used to compare the models was the sum of squared errors between measured and calculated values. The models most recommended for fitting the particle-size distribution curve from four points are those of Skaggs et al. 3P, Lima & Silva 3P, Weibull 3P and Morgan et al. 3P, all with three fitting parameters.
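    A least-squares fit of one plausible 3-parameter cumulative model to four measured points, in the spirit of the comparison above, can be sketched as follows; the model form, the sieve data, and the parameter grid are all illustrative, not the paper's:

    ```python
    import math

    def weibull3p(d, a, b, c):
        """A Weibull-type 3-parameter cumulative curve: F(d) = a*(1 - exp(-b*d**c)).
        (One plausible 3P form; the paper compares 14 such models.)"""
        return a * (1.0 - math.exp(-b * d ** c))

    def fit_sse(points, a, b, c):
        """Sum of squared errors between measured and calculated values."""
        return sum((f - weibull3p(d, a, b, c)) ** 2 for d, f in points)

    # Four hypothetical sieve points: (diameter in mm, cumulative fraction passing)
    points = [(0.05, 0.18), (0.25, 0.40), (1.0, 0.68), (4.0, 0.92)]

    # Crude grid search for the least-squares parameters:
    a_grid = [0.90 + 0.01 * i for i in range(11)]   # 0.90 .. 1.00
    b_grid = [0.5 + 0.1 * j for j in range(26)]     # 0.5 .. 3.0
    c_grid = [0.2 + 0.05 * k for k in range(17)]    # 0.2 .. 1.0
    best = min((fit_sse(points, a, b, c), a, b, c)
               for a in a_grid for b in b_grid for c in c_grid)
    sse, a, b, c = best
    print(round(sse, 5), a, b, c)   # small residual: the curve tracks the points
    ```

    The sum of squared errors computed here is the same comparison criterion the study uses to rank its 14 candidate models.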

  2. Protection of Distribution Systems with Distributed Energy Resources

    DEFF Research Database (Denmark)

    Bak-Jensen, Birgitte; Browne, Matthew; Calone, Roberto

    The usage of Distributed Energy Resources (DER) in utilities around the world is expected to increase significantly. The existing distribution systems have been generally designed for unidirectional power flow, and feeders are opened and locked out for any fault within. However, in the future... This report presents the results of 17 months of work of the Joint Working Group B5/C6.26/CIRED “Protection of Distribution Systems with Distributed Energy Resources”. The working group used the CIGRE report TB421 “The impact of Renewable Energy Sources and Distributed Generation on Substation Protection and Automation”, published by WG B5.34, as the entry document for the work on this report. In doing so, the group aligned the content and the scope of this report, the network structures considered, possible islanding, standardized communication and adaptive protection, interface protection, connection schemes and protection...

  3. Current Distribution Mapping for PEMFCs

    International Nuclear Information System (INIS)

    Lilavivat, V.; Shimpalee, S.; Van Zee, J.W.; Xu, H.; Mittelsteadt, C.K.

    2015-01-01

    A developed measurement system for current distribution mapping has enabled a new approach to operational measurements in proton exchange membrane fuel cells (PEMFCs). Existing methods for measuring current distribution have many issues; among the problems are the need to break up fuel cell components and the high cost of the measurements. The work presented here offers a cost-effective and simple technique for mapping the current distribution within a fuel cell without disrupting reactant flow. In this method, a current distribution board is inserted between an anode flow field plate and a gas diffusion layer, and the current distribution is measured directly from the board. This novel technique can be applied simply to different fuel cell hardware; it can also be used in a fuel cell stack by inserting multiple current distribution boards into the stack cells. The results of the current distribution measurements, together with electrochemical predictions from computational fluid dynamics modeling, were used to analyze water transport inside the fuel cell. The developed system can serve as a basis for understanding and optimizing fuel cell design and operating modes.

  4. The Normal Distribution

    Indian Academy of Sciences (India)

    An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and DeMoivre-Laplace theorem ...
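The sample-size calculation via the normal (DeMoivre-Laplace) approximation can be made concrete. The sketch below uses the standard worst-case p = 0.5 formula; the article's exact criterion may differ.

```python
import math

def poll_sample_size(margin, confidence_z=1.96):
    """Sample size for a simple random opinion poll via the normal
    approximation to the binomial: n = z^2 * p(1-p) / E^2, taking
    the worst case p = 0.5 so p(1-p) = 0.25.
    margin is the desired half-width E of the confidence interval;
    z = 1.96 corresponds to 95% confidence."""
    return math.ceil((confidence_z ** 2) * 0.25 / margin ** 2)

# A 3-percentage-point margin at 95% confidence.
print(poll_sample_size(0.03))  # -> 1068
```

Note that the required n depends only on the margin and confidence level, not on the population size, which is why national polls of roughly a thousand respondents are common.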

  5. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

    entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models...

  6. Brine Distribution after Vacuum Saturation

    DEFF Research Database (Denmark)

    Hedegaard, Kathrine; Andersen, Bertel Lohmann

    1999-01-01

    Experiments with the vacuum saturation method for brine in plugs of chalk showed that a homogeneous distribution of brine cannot be ensured at saturations below 20% volume. Instead of a homogeneous volume distribution the brine becomes concentrated close to the surfaces of the plugs...

  7. Water Treatment Technology - Distribution Systems.

    Science.gov (United States)

    Ross-Harrington, Melinda; Kincaid, G. David

    One of twelve water treatment technology units, this student manual on distribution systems provides instructional materials for six competencies. (The twelve units are designed for a continuing education training course for public water supply operators.) The competencies focus on the following areas: types of pipe for distribution systems, types…

  8. Easy and flexible mixture distributions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Mabit, Stefan L.

    2013-01-01

    We propose a method to generate flexible mixture distributions that are useful for estimating models such as the mixed logit model using simulation. The method is easy to implement, yet it can approximate essentially any mixture distribution. We test it with good results in a simulation study...

  9. Characterizations of univariate continuous distributions

    CERN Document Server

    Ahsanullah, Mohammad

    2017-01-01

    Provides in an organized manner characterizations of univariate probability distributions with many new results published in this area since the 1978 work of Galambos & Kotz "Characterizations of Probability Distributions" (Springer), together with applications of the theory in model fitting and predictions.

  10. Reduplication and Distributivity in Kannada

    Science.gov (United States)

    Anderson, Janet Katherine

    2012-01-01

    Reduplication of numerals and pronouns in Kannada is shown to be subject to locality conditions similar to those constraining binding. This dissertation explores an account of distributivity which exploits the similarity to binding, arguing that the source of the distributive reading in Numeral Reduplication is a bound element. [The dissertation…

  11. Pressure distribution on spinning spinnerets

    Directory of Open Access Journals (Sweden)

    Zhang Li

    2013-01-01

    Full Text Available A two-dimensional model is used to study the pressure distribution in a chamber of a spinneret system. Darcy’s law is adopted for determining the inlet and outlet velocities of the flow. The pressure distribution on the spinneret plate is obtained, and the dead zone, where no nozzle exists, can be optimally determined.
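The Darcy's-law velocities mentioned in this abstract have a simple closed form. The sketch below uses hypothetical permeability, viscosity, and pressure-gradient values that are not taken from the paper.

```python
def darcy_velocity(k, mu, dp, dx):
    """Darcy's law: superficial velocity q = -(k / mu) * (dp / dx),
    with permeability k [m^2], dynamic viscosity mu [Pa.s],
    pressure change dp [Pa] over length dx [m]."""
    return -(k / mu) * (dp / dx)

# Hypothetical values: k = 1e-12 m^2, mu = 1e-3 Pa.s,
# pressure dropping by 2e4 Pa over 0.01 m of the porous medium.
v = darcy_velocity(1e-12, 1e-3, -2e4, 0.01)
print(v)  # 0.002 m/s, flow in the direction of decreasing pressure
```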

  12. Authority, Power and Distributed Leadership

    Science.gov (United States)

    Woods, Philip A.

    2016-01-01

    A much greater understanding is needed of power in the practice of distributed leadership. This article explores how the concept of social authority might be helpful in achieving this. It suggests that the practice of distributed leadership is characterized by multiple authorities which are constructed in the interactions between people. Rather…

  13. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side offering shifting processing...

  14. Multisensor estimation: New distributed algorithms

    Directory of Open Access Journals (Sweden)

    Plataniotis K. N.

    1997-01-01

    Full Text Available The multisensor estimation problem is considered in this paper. New distributed algorithms, which are able to locally process the information and which deliver identical results to those generated by their centralized counterparts are presented. The algorithms can be used to provide robust and computationally efficient solutions to the multisensor estimation problem. The proposed distributed algorithms are theoretically interesting and computationally attractive.
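The claim that distributed algorithms can deliver results identical to their centralized counterparts can be illustrated with scalar Gaussian fusion in information form, where the sums are order-independent. This is a hedged sketch with made-up sensor values, not the paper's algorithms.

```python
def fuse(estimates):
    """Fuse independent Gaussian estimates (mean, variance) in
    information form: precisions (1/variance) add, and means are
    precision-weighted. Because both sums are associative and
    commutative, partial fusions at local nodes followed by a final
    combination give the same answer as one centralized fusion."""
    info = sum(1.0 / v for _, v in estimates)
    info_mean = sum(m / v for m, v in estimates)
    return info_mean / info, 1.0 / info

# Three sensors measuring the same scalar quantity (mean, variance).
sensors = [(10.2, 4.0), (9.8, 1.0), (10.5, 2.0)]

central = fuse(sensors)                       # all data in one place
partial = fuse(sensors[:2])                   # local node fuses two
distributed = fuse([partial, sensors[2]])     # then combines with third
print(central, distributed)                   # identical results
```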

  15. eqpair: Electron energy distribution calculator

    Science.gov (United States)

    Coppi, Paolo S.

    2018-02-01

    eqpair computes the electron energy distribution resulting from a balance between heating and direct acceleration of particles, and cooling processes. Electron-positron pair balance, bremsstrahlung, and Compton cooling, including external soft photon input, are among the processes considered, and the final electron distribution can be hybrid, thermal, or non-thermal.

  16. Quality monitored distributed voting system

    Science.gov (United States)

    Skogmo, David

    1997-01-01

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.

  17. Distributed intelligence versus central laboratory

    International Nuclear Information System (INIS)

    Halling, H.

    1983-01-01

    After a discussion of the main tasks in a nuclear laboratory like overall experiment planning, sample production and management, performance of experiments, data processing and report generation the essential disadvantages of centralised systems are shown and the proper measures for overcoming those by distribution are discussed. Finally possible shortcomings due to improper design and management of distributed architectures are shown. (author)

  18. Determinants of Dentists' Geographic Distribution.

    Science.gov (United States)

    Beazoglou, Tryfon J.; And Others

    1992-01-01

    A model for explaining the geographic distribution of dentists' practice locations is presented and applied to particular market areas in Connecticut. Results show geographic distribution is significantly related to a few key variables, including demography, disposable income, and housing prices. Implications for helping students make practice…

  19. Parametrization of nuclear parton distributions

    Indian Academy of Sciences (India)

    data on electron and muon deep inelastic scattering (DIS). The distributions are given at Q² ... analysis of experimental data. The data are restricted to the inclusive electron and muon deep inelastic ...... [13] Nuclear parton-distribution subroutines could be obtained at the web site: http://www-hs.phys.saga-u.ac.jp. Pramana ...

  20. Distributed generation and distribution market diversity in Europe

    International Nuclear Information System (INIS)

    Lopes Ferreira, H.; Costescu, A.; L'Abbate, A.; Minnebo, P.; Fulli, G.

    2011-01-01

    The unbundling of the electric power system will play a key role in the deployment of distributed generation (DG) in European distribution systems evolving towards Smart Grids. The present paper first reviews the relevant European Union (EU) regulatory framework; specific attention is paid to the concept of unbundling in the European power distribution sector. Afterwards, the focus is on the current state of penetration of DG technologies in the EU Member States and the corresponding interrelations with distribution features. A comparison between the unbundling of the distribution and supply markets using econometric indicators such as the Herfindahl-Hirschmann (I_HH) and the Shannon-Wiener (I_SW) indices is then presented. Finally, a comparative analysis between these indices and the current level of penetration of distributed generation in most EU Member States is shown; policy recommendations conclude the paper. - Highlights: →The EU regulatory framework and the concept of unbundling are addressed. →A comparison between the unbundling of the distribution and supply markets is shown. →The Herfindahl-Hirschmann and the Shannon-Wiener econometric indices are applied. →A comparison between the indices and the penetration level of DG in EU is presented.
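The Herfindahl-Hirschmann and Shannon-Wiener indices used in this paper have standard closed forms. The sketch below uses hypothetical market shares expressed as fractions; the paper's exact normalization (e.g. shares in percent) may differ.

```python
import math

def herfindahl(shares):
    """Herfindahl-Hirschmann index: sum of squared market shares.
    With shares as fractions, 1.0 indicates a monopoly and values
    near 1/N indicate N equally sized firms."""
    return sum(s ** 2 for s in shares)

def shannon_wiener(shares):
    """Shannon-Wiener index: -sum(s * ln(s)); higher values indicate
    a more diverse (less concentrated) market."""
    return -sum(s * math.log(s) for s in shares if s > 0)

# Hypothetical distribution-market shares in one member state.
shares = [0.5, 0.3, 0.2]
print(herfindahl(shares), shannon_wiener(shares))
```

A monopoly ([1.0]) gives HHI = 1 and Shannon-Wiener = 0, so the two indices move in opposite directions as concentration rises.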

  1. Reactor power distribution monitoring device

    International Nuclear Information System (INIS)

    Uematsu, Hitoshi

    1988-01-01

    Purpose: To calculate the power distribution with high accuracy by detecting failures of LPRMs and, if they occur, providing a reliable substitute value in place of the measured value. Constitution: Means for detecting the failure of LPRMs and inserting a TIP at that position, and means for calculating the power distribution in the reactor core based on output signals from the TIP, are provided. The power distribution is calculated as usual if no failure is detected. However, if a failure is detected, the power distribution is calculated using the corresponding values from the TIP instead. In this way, even if several LPRMs happen to fail, the power distribution can be determined with the same high accuracy as in normal operation. Further, it is also possible to exactly evaluate fuel integrity and recognize the reactor core state, and thereby to monitor the reactor core exactly. (Horiuchi, T.)

  2. Evolution of broadcast content distribution

    CERN Document Server

    Beutler, Roland

    2017-01-01

    This book discusses opportunities for broadcasters that arise with the advent of broadband networks, both fixed and mobile. It discusses how the traditional way of distributing audio-visual content over broadcasting networks has been complemented by the usage of broadband networks. The author shows how this also gives the possibility to offer new types of interactive or so-called nonlinear services. The book illustrates how change in distribution technology is accelerating the need for broadcasters around the world to adapt their content distribution strategy and how it will impact the portfolios of content they offer. Outlines the shift in broadcast content distribution paradigms and related strategic issues Provides an overview of the new broadcasting ecosystem encompassing new types of content, user habits, expectations, and devices Discusses complementary usage of different distribution technologies and platforms.

  3. Intensity distributions in fiber diffraction

    International Nuclear Information System (INIS)

    Millane, R.P.

    1990-01-01

    The probability distributions of X-ray intensities in fiber diffraction are different from those for single crystals (Wilson statistics) because of the cylindrical averaging of the diffraction data. Stubbs has recently determined the intensity distributions on a fiber diffraction pattern for a fixed number of overlapping Fourier-Bessel terms. Some properties of the amplitude and intensity distributions are derived here. It is shown that the amplitudes and intensities are approximately normally distributed (the distributions being asymptotically normal with increasing number of Fourier-Bessel terms). Improved approximations using an Edgeworth series are derived. Other statistical properties and some asymptotic expansions are also derived, and normalization of fiber diffraction amplitudes is discussed. The accuracies of the normal approximations are illustrated for particular fiber structures, and possible applications of intensity statistics in fiber diffraction are discussed. (orig.)

  4. 2014 Distributed Wind Market Report

    Energy Technology Data Exchange (ETDEWEB)

    Orell, A; Foster, N.

    2015-08-01

    According to the 2014 Distributed Wind Market Report, distributed wind reached a cumulative capacity of almost 1 GW (906 MW) in the United States in 2014, reflecting nearly 74,000 wind turbines deployed across all 50 states, Puerto Rico, and the U.S. Virgin Islands. In total, 63.6 MW of new distributed wind capacity was added in 2014, representing nearly 1,700 units and $170 million in investment across 24 states. In 2014, America's distributed wind energy industry supported a growing domestic industrial base as exports from United States-based small wind turbine manufacturers accounted for nearly 80% of United States-based manufacturers' sales.

  5. Applying Distributed Object Technology to Distributed Embedded Control Systems

    DEFF Research Database (Denmark)

    Jørgensen, Bo Nørregaard; Dalgaard, Lars

    2012-01-01

    In this paper, we describe our Java RMI inspired Object Request Broker architecture MicroRMI for use with networked embedded devices. MicroRMI relieves the software developer from the tedious and error-prone job of writing communication protocols for interacting with such embedded devices. MicroRMI supports easy integration of high-level application specific control logic with low-level device specific control logic. Our experience from applying MicroRMI in the context of a distributed robotics control application clearly demonstrates that it is feasible to use distributed object technology in developing control systems for distributed embedded platforms possessing severe resource restrictions.

  6. Review on Islanding Operation of Distribution System with Distributed Generation

    DEFF Research Database (Denmark)

    Mahat, Pukar; Chen, Zhe; Bak-Jensen, Birgitte

    2011-01-01

    The growing environmental concern and various benefits of distributed generation (DG) have resulted in significant penetration of DG in many distribution systems worldwide. One of the major expected benefits of DG is the improvement in the reliability of power supply by supplying load during a power outage by operating in island mode. However, there are many challenges to overcome before islanding operation of a distribution system with DG can become a viable solution in the future. This paper reviews some of the major challenges with islanding operation and explores some possible solutions...

  7. Control and operation of distributed generation in distribution systems

    DEFF Research Database (Denmark)

    Mahat, Pukar; Chen, Zhe; Bak-Jensen, Birgitte

    2011-01-01

    Many distribution systems nowadays have significant penetration of distributed generation (DG), and thus islanding operation of these distribution systems is becoming a viable option for economical and technical reasons. The DG should operate optimally during both grid-connected and island... algorithm, which uses average rate of change of frequency (Af5) and real power shift (RPS), in the islanded mode. RPS will increase or decrease the power set point of the generator with increasing or decreasing system frequency, respectively. Simulation results show that the proposed method can operate...
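The quantities named in this abstract (average rate of change of frequency, real power shift) can be sketched as follows. The gains, window length, and frequency samples below are hypothetical, and this is not the authors' implementation.

```python
def average_rocof(freq_samples, dt):
    """Average rate of change of frequency (Hz/s) over a sliding
    window of frequency samples taken dt seconds apart."""
    diffs = [(b - a) / dt for a, b in zip(freq_samples, freq_samples[1:])]
    return sum(diffs) / len(diffs)

def real_power_shift(setpoint, freq, nominal=50.0, gain=0.5):
    """Real-power-shift sketch: per the abstract, the generator power
    set point is raised when system frequency rises and lowered when
    it falls (gain in MW/Hz; all values hypothetical)."""
    return setpoint + gain * (freq - nominal)

# Island frequency decaying after grid disconnection (samples 0.1 s apart).
freqs = [50.00, 49.96, 49.91, 49.85]
print(average_rocof(freqs, dt=0.1))          # about -0.5 Hz/s
print(real_power_shift(2.0, freqs[-1]))      # set point nudged downward
```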

  8. Impact of microbial distributions on food safety

    NARCIS (Netherlands)

    Bassett, J.; Jackson, T.; Jewell, K.; Jongenburger, I.; Zwietering, M.H.

    2010-01-01

    This document discusses mechanisms impacting physical distributions of microorganisms in foods, characteristics and suitability of frequency distributions employed to model microbial distributions, and the impact of both physical and frequency distributions on illness risk and food safety.

  9. Distributed Cognition and Distributed Morality: Agency, Artifacts and Systems.

    Science.gov (United States)

    Heersmink, Richard

    2017-04-01

    There are various philosophical approaches and theories describing the intimate relation people have to artifacts. In this paper, I explore the relation between two such theories, namely distributed cognition and distributed morality theory. I point out a number of similarities and differences in these views regarding the ontological status they attribute to artifacts and the larger systems they are part of. Having evaluated and compared these views, I continue by focussing on the way cognitive artifacts are used in moral practice. I specifically conceptualise how such artifacts (a) scaffold and extend moral reasoning and decision-making processes, (b) have a certain moral status which is contingent on their cognitive status, and (c) whether responsibility can be attributed to distributed systems. This paper is primarily written for those interested in the intersection of cognitive and moral theory as it relates to artifacts, but also for those independently interested in philosophical debates in extended and distributed cognition and ethics of (cognitive) technology.

  10. The Impact of Connecting Distributed Generation to the Distribution System

    Directory of Open Access Journals (Sweden)

    E. V. Mgaya

    2007-01-01

    Full Text Available This paper deals with the general problem of utilizing renewable energy sources to generate electric energy. Recent advances in renewable energy power generation technologies, e.g., wind and photovoltaic (PV) technologies, have led to increased interest in the application of these generation devices as distributed generation (DG) units. This paper presents the results of an investigation into possible improvements in the system voltage profile and reduction of system losses when adding wind power DG (wind-DG) to a distribution system. Simulation results are given for a case study, and these show that properly sized wind DGs, placed at carefully selected sites near key distribution substations, could be very effective in improving the distribution system voltage profile and reducing power losses, and hence could improve the effective capacity of the system.

  11. Integrated Transmission and Distribution Control

    Energy Technology Data Exchange (ETDEWEB)

    Kalsi, Karanjit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fuller, Jason C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lian, Jianming [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Wei [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Marinovici, Laurentiu D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fisher, Andrew R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chassin, Forrest S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hauer, Matthew L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-01-01

    Distributed generation, demand response, distributed storage, smart appliances, electric vehicles and renewable energy resources are expected to play a key part in the transformation of the American power system. Control, coordination and compensation of these smart grid assets are inherently interlinked. Advanced control strategies to warrant large-scale penetration of distributed smart grid assets do not currently exist. While many of the smart grid technologies proposed involve assets being deployed at the distribution level, most of the significant benefits accrue at the transmission level. The development of advanced smart grid simulation tools, such as GridLAB-D, has led to a dramatic improvement in the models of smart grid assets available for design and evaluation of smart grid technology. However, one of the main challenges to quantifying the benefits of smart grid assets at the transmission level is the lack of tools and framework for integrating transmission and distribution technologies into a single simulation environment. Furthermore, given the size and complexity of the distribution system, it is crucial to be able to represent the behavior of distributed smart grid assets using reduced-order controllable models and to analyze their impacts on the bulk power system in terms of stability and reliability.

  12. The Distributed Wind Cost Taxonomy

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, Trudy; Jimenez, Tony; Preus, Robert; Tegen, Suzanne; Baring-Gould, Ian

    2017-03-28

    To date, there has been no standard method or tool to analyze the installed and operational costs for distributed wind turbine systems. This report describes the development of a classification system, or taxonomy, for distributed wind turbine project costs. The taxonomy establishes a framework to help collect, sort, and compare distributed wind cost data that mirrors how the industry categorizes information. The taxonomy organizes costs so they can be aggregated from installers, developers, vendors, and other sources without losing cost details. Developing a peer-reviewed taxonomy is valuable to industry stakeholders because a common understanding of the details of distributed wind turbine costs and balance-of-station costs is a first step to identifying potential high-value cost reduction opportunities. Addressing cost reduction potential can help increase distributed wind's competitiveness and propel the U.S. distributed wind industry forward. The taxonomy can also be used to perform cost comparisons between technologies and track trends for distributed wind industry costs in the future. As an initial application and piloting of the taxonomy, preliminary cost data were collected for projects of different sizes and from different regions across the contiguous United States. Following the methods described in this report, these data are placed into the established cost categories.

  13. Power Generation and Distribution via Distributed Coordination Control

    OpenAIRE

    Kim, Byeong-Yeon; Oh, Kwang-Kyo; Ahn, Hyo-Sung

    2014-01-01

    This paper presents power coordination, power generation, and power flow control schemes for supply-demand balance in distributed grid networks. Consensus schemes using only local information are employed to generate power coordination, power generation and power flow control signals. For the supply-demand balance, it is required to determine the amount of power needed at each distributed power node. Also due to the different power generation capacities of each power node, coordination of pow...

  14. Distribution planning with reliability options for distributed generation

    Energy Technology Data Exchange (ETDEWEB)

    Trebolle, David [Union Fenosa Distribucion, C/Antonio Lopez, 19, 28026 Madrid (Spain); Gomez, Tomas; Cossent, Rafael; Frias, Pablo [Instituto de Investigacion Tecnologica, Escuela Tecnica Superior de Ingenieria, Universidad Pontificia Comillas, C/Quintana 21, 28008 Madrid (Spain)

    2010-02-15

    The promotion of electricity generation from renewable energy sources (RES) and combined heat and power (CHP) has resulted in increasing penetration levels of distributed generation (DG). However, large-scale connection of DG involves profound changes in the operation and planning of electricity distribution networks. Distribution System Operators (DSOs) play a key role since these agents have to provide flexibility to their networks in order to integrate DG. Article 14.7 of EU Electricity Directive states that DSOs should consider DG as an alternative to new network investments. This is a challenging task, particularly under the current regulatory framework where DSOs must be legally and functionally unbundled from other activities in the electricity sector. This paper proposes a market mechanism, referred to as reliability options for distributed generation (RODG), which provides DSOs with an alternative to the investment in new distribution facilities. The mechanism proposed allocates the firm capacity required to DG embedded in the distribution network through a competitive auction. Additionally, RODG make DG partly responsible for reliability and provide DG with incentives for a more efficient operation taking into account the network conditions. (author)

  15. Distribution planning with reliability options for distributed generation

    International Nuclear Information System (INIS)

    Trebolle, David; Gomez, Tomas; Cossent, Rafael; Frias, Pablo

    2010-01-01

    The promotion of electricity generation from renewable energy sources (RES) and combined heat and power (CHP) has resulted in increasing penetration levels of distributed generation (DG). However, large-scale connection of DG involves profound changes in the operation and planning of electricity distribution networks. Distribution System Operators (DSOs) play a key role since these agents have to provide flexibility to their networks in order to integrate DG. Article 14.7 of EU Electricity Directive states that DSOs should consider DG as an alternative to new network investments. This is a challenging task, particularly under the current regulatory framework where DSOs must be legally and functionally unbundled from other activities in the electricity sector. This paper proposes a market mechanism, referred to as reliability options for distributed generation (RODG), which provides DSOs with an alternative to the investment in new distribution facilities. The mechanism proposed allocates the firm capacity required to DG embedded in the distribution network through a competitive auction. Additionally, RODG make DG partly responsible for reliability and provide DG with incentives for a more efficient operation taking into account the network conditions. (author)

  16. The Distributed-SDF Domain

    DEFF Research Database (Denmark)

    Cuadrado, Daniel Lázaro; Ravn, Anders Peter; Koch, Peter

    2005-01-01

    that will constitute the distributed platform. In order to make this distributed platform dynamic and transparent for the user, we use JINI as a peer discovery protocol. The server processes register with a lookup service and they are discovered by the Distributed-SDF director whenever a simulation is to be performed...... have different processing times, after every schedule level, a synchronization step is introduced. When an actor is executed remotely, it is embedded in a model with a director and its inputs and outputs are connected to modified relations that make them believe they are embedded in the original model...

  17. Designing Distributed Generation in Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Linvill, Carl [Regulatory Assistance Project, Montepelier, VT (United States); Brutkoski, Donna [Regulatory Assistance Project, Montepelier, VT (United States)

    2017-05-15

    Mexico's energy reform will have far-reaching effects on how people produce and consume electricity in the country. Market liberalization will open the door to an increasing number of options for Mexican residential, commercial, and industrial consumers, and distributed generation (DG), which for Mexico includes generators of less than 500 kilowatts (kW) of capacity connected to the distribution network. Distributed generation is an option for consumers who want to produce their own electricity and provide electricity services to others. This report seeks to provide guidance to Mexican officials on designing DG economic and regulatory policies.

  18. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book covers reliability engineering: the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation under the exponential, normal, and Weibull distribution types; reliability sampling tests; system reliability; reliability design; and functional failure analysis by FTA.
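The distribution families listed in this abstract can be illustrated with the standard Weibull reliability and hazard functions, where the shape parameter separates the CFR and IFR regimes. This is a minimal sketch with arbitrary parameters, not material from the book.

```python
import math

def weibull_reliability(t, eta, beta):
    """Weibull reliability (survival) function R(t) = exp(-(t/eta)^beta),
    with scale eta and shape beta."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, eta, beta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)^(beta - 1):
    decreasing for beta < 1 (DFR), constant for beta = 1 (CFR, which
    reduces to the exponential distribution), increasing for beta > 1
    (IFR, e.g. wear-out failures)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# With beta = 1 the hazard is constant at 1/eta, the exponential case.
print(weibull_reliability(100, 1000, 1.0))   # survival at t = 100
print(weibull_hazard(100, 1000, 1.0))        # constant rate 0.001
```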

  19. "SEA ARCHER" Distributed Aviation Platform

    National Research Council Canada - National Science Library

    Calvano, Charles

    2001-01-01

    The emergence of Unmanned Air Vehicles (UAVs) / Unmanned Combat Air Vehicles (UCAVs), the continued U.S. Navy focus on the littorals, the desire for force distribution, the need for operational cost reductions, and the advent of Network Centric Warfare...

  20. DIAMONDS: Engineering Distributed Object Systems

    National Research Council Canada - National Science Library

    Cheng, Evan

    1997-01-01

    This report describes DIAMONDS, a research project at Syracuse University, that is dedicated to producing both a methodology and corresponding tools to assist in the development of heterogeneous distributed software...